
Posts

Showing posts from April, 2018

[Post Event] Global Azure Bootcamp 2018, Cluj-Napoca

Last Saturday (April 21, 2018), ITCamp Community hosted Global Azure Bootcamp in Cluj-Napoca. This year we tried to do things a little differently, offering a functional application running in Azure at the end of the workshops. We decided to implement a weather alert system end-to-end, from ingesting alerts from an external system to a web application that can notify users when a warning appears in their region. We had not only the app but also the ARM templates and CD configured for it.

From the Azure services perspective, we used the following:
- Azure Service Bus
- Azure Function Apps
- Azure Web Apps
- Azure Redis Cache

There were three workshops that covered:
- Developing the backend application inside Azure Functions (Radu Vunvulea)
- Designing the web application (Radu Pascal)
- Creating the automation scripts using ARM (Florin Loghiade)

Slides and pictures can be found below. See you next year!

Slides: Global Azure Bootcamp 2018 from Radu Vunvulea
Pictures:

[Post Event] Cloud Conference 2018, Bucharest

Today (April 19, 2018) I had the great opportunity to be invited as a speaker at "Conferinta de Cloud." I was amazed to find out that around 300 people joined this event, which is dedicated to the cloud and Microsoft Azure. The event had two tracks, one dedicated to the business side and a second one dedicated to the technical side. I was surprised to discover that almost half of the attendees were interested in the business track. This is a good indicator that the cloud is here to stay and that Microsoft Azure is used more and more in business. I delivered a session focused on the things we need to take into account at the moment we start a project in the cloud. People forget to check the most basic things when they kick off a project. Because of this, they end up in situations where consumption is over the limits or data is publicly accessible. Session slides can be found below.
Title: Day Zero In A Cloud Project
Abstract: In this session, we will take

Read-only replicas - Taking advantage of free DTUs

A new feature of Azure SQL enables us to simplify how we do our day-to-day business when we need near-real-time analytics capabilities on our databases.

Scenarios
A common scenario is when you have an Azure SQL database that is heavily hit by clients, and you need reporting or analytics capabilities at the same time. A common solution is to create a read-only replica that is used for reporting, data aggregation and the other daily or weekly small things that you have to do with data. Even when you have a data warehouse or a reporting layer, for some narrow cases you still need to go directly to the live database for (near) real-time analytics. Another case is when you have many read operations on data that does not change very often, and you cannot integrate a cache layer. It sounds odd, but there are some country regulations that might force you to do that.

Current solution
For all these scenarios, the usual solution involves creating a replica in the same or another Azure Region
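For context, a read-only replica of an Azure SQL database is reached by adding ApplicationIntent=ReadOnly to the connection string. A minimal sketch, assuming read scale-out is enabled on the database; the server, database, credentials and table name below are placeholders:

```csharp
// Minimal sketch: route reporting queries to the read-only replica of an
// Azure SQL database by setting ApplicationIntent=ReadOnly. Server, database,
// credentials and the dbo.Alerts table are placeholders.
using System;
using System.Data.SqlClient;

class ReportingReader
{
    static void Main()
    {
        var connectionString =
            "Server=tcp:myserver.database.windows.net,1433;" +
            "Database=mydb;User ID=reporting;Password=<secret>;" +
            "Encrypt=True;ApplicationIntent=ReadOnly;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT COUNT(*) FROM dbo.Alerts", connection))
        {
            connection.Open();
            // When read scale-out is enabled, this query is served by the replica.
            var count = (int)command.ExecuteScalar();
            Console.WriteLine($"Rows visible on the replica: {count}");
        }
    }
}
```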

Azure Redis Cache and connection management

These days I encountered an application deployed on Azure that had connectivity issues with Azure Redis Cache. The application is web-based, with most of the logic inside Azure Functions. The system uses Azure Redis Cache for data exchange between the web application and the functions behind the scenes that crunch the data. The deployment is stable and works as expected for 5-10 minutes. After that, it is down for the next 20-30 minutes. The cycle repeats over and over again with a generic error on both sides (Azure Web App and Azure Functions) that indicates that the source of the problem is Azure Redis Cache. The errors are similar to the one below:
No connection is available to service this operation: RPUSH ST; UnableToConnect on lola.redis.cache.windows.net:6380/Interactive, origin: ResetNonConnected, input-buffer: 0, outstanding: 0, last-read: 5s ago, last-write: 5s ago, unanswered-write: 459810s ago, keep-alive: 60s, pending:
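This class of error is usually tied to how connections to Azure Redis Cache are managed; the commonly recommended pattern is to share a single ConnectionMultiplexer per process instead of opening one per request. A minimal sketch with StackExchange.Redis, where the environment variable name is a placeholder:

```csharp
// Minimal sketch, assuming StackExchange.Redis: one ConnectionMultiplexer shared
// by the whole process (web app or function app) instead of one per request.
// The environment variable name is a placeholder.
using System;
using StackExchange.Redis;

public static class RedisConnection
{
    private static readonly Lazy<ConnectionMultiplexer> LazyConnection =
        new Lazy<ConnectionMultiplexer>(() =>
        {
            var configuration = ConfigurationOptions.Parse(
                Environment.GetEnvironmentVariable("RedisConnectionString"));
            configuration.AbortOnConnectFail = false; // keep retrying instead of failing fast
            return ConnectionMultiplexer.Connect(configuration);
        });

    public static ConnectionMultiplexer Connection => LazyConnection.Value;
}

// Usage: RedisConnection.Connection.GetDatabase().StringSet("key", "value");
```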

Serialized headaches when you combine .NET Core and .NET 4.6

When I'm involved in an application development project, I try all the time to keep the language ecosystem as small as possible. Even if we are living in a world where interoperability is higher than ever, there are times when you can lose a lot of time because of it. Let's take the example below:
- An Azure Function application written in .NET 4.6 that sends messages to an Azure Service Bus Topic. Inside the message, the body content is serialized in JSON format.
- An Azure Web App application written in .NET Core that receives messages from the Azure Service Bus Topic and displays them in the UI.
Everything is okay until you run the code and an error occurs during deserialization. When you take a look at what you send and receive, you see the following:
.NET 4.6 OUTPUT: @ string 3http://schemas.microsoft.com/2003/10/Serialization/�k{"CountryCode":"RO","AlertType":0,"Level":5,"ValidFrom":"2018-04-0
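The preamble in that output suggests the .NET 4.6 sender is letting BrokeredMessage serialize the object with the DataContractSerializer, while the .NET Core receiver expects a plain UTF-8 JSON body. One way around this (a sketch, not necessarily the exact fix from the original post) is to serialize to JSON yourself and send the body as a raw stream; the topic client and payload type are placeholders:

```csharp
// Sketch for the .NET 4.6 sender (WindowsAzure.ServiceBus package): serialize the
// payload to JSON explicitly and pass a stream, so no DataContract preamble is
// added to the message body.
using System.IO;
using System.Text;
using Microsoft.ServiceBus.Messaging;
using Newtonsoft.Json;

public static class AlertSender
{
    public static void Send(TopicClient topicClient, object alert)
    {
        string json = JsonConvert.SerializeObject(alert);
        var stream = new MemoryStream(Encoding.UTF8.GetBytes(json));

        // ownsStream: true lets the message dispose the stream after sending.
        var message = new BrokeredMessage(stream, true)
        {
            ContentType = "application/json"
        };

        topicClient.Send(message);
    }
}

// On the .NET Core side (Microsoft.Azure.ServiceBus package), the body arrives as byte[]:
// string json = Encoding.UTF8.GetString(message.Body);
```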

Azure Policy - An excellent tool for resource governance inside Azure

The number of Azure Regions increases every day. Just a few days ago two new locations were announced. All the things that are happening now around the cloud are not only crazy but also scary, scary from the governance and legal perspective. For example, inside an Azure Subscription a user can create storage accounts in any location around the globe. What happens if you are based in the UK and you have customer information that is not allowed to leave the country? You might say that you would train the team that has the rights to create new resources to use only Azure Regions based in the UK. This solution is not enough, because from a governance perspective you don't have a mechanism that enforces it. To enforce something like this, Microsoft Azure offers Azure Policy. This service allows us to define a specific list of rules and actions that are applied automatically to resources created under a particular Azure Subscription or Azure Resource Group (or even
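As an illustration of the kind of rule involved (this mirrors the built-in "Allowed locations" policy rather than anything specific from the post), a policy rule that denies resources created outside UK regions looks roughly like this:

```json
{
  "if": {
    "not": {
      "field": "location",
      "in": [ "uksouth", "ukwest" ]
    }
  },
  "then": {
    "effect": "deny"
  }
}
```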

Different cases of Azure VM 'reboots'

This post focuses on the different types of maintenance activities that can happen on your Azure Virtual Machine (VM). Depending on the actions that need to be done on your VM, they can have a direct or indirect impact on the machine's availability or performance.

Impact types
First of all, let's identify how a machine can be impacted by different maintenance activities. We do not include in this discussion cases like outages or disasters.
- Performance decrease: During a live migration, the performance of the machine can be lower than usual. This happens because moving a machine from one physical host to another consumes resources.
- Pause: For a short period of time the machine is paused, but without data loss or connectivity problems (e.g., the RDP connection is maintained, but delayed for a short period of time).
- Unplanned reboot: There are isolated cases when the VM needs a restart. In this situation, there is a short period of downtime