
Posts

Showing posts from 2013

NuGet packages, project references and wrong use of packages

In today's post I will talk about something you should never do when using NuGet. We have an application developed 2-3 years ago. Moving to V2 also means updating the versions of the different stacks used, like Entity Framework, Unity and so on. When the first version was developed, each NuGet package was installed in one or two projects and added as a plain assembly reference in the other ones. Until now everything sounds good. But there is a small mistake. When you reference the assemblies in the other projects, you refer to them with a path like “../packages/FooPackage 1.2.3”. When you upgrade the package using NuGet, the references in the projects where the package was installed are updated to the new version. But in the projects that only hold plain assembly references to the package, nothing happens. This means that you end up with two versions of the same stack (package) in the same application. This is the dream of every develo
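A minimal sketch of the problem described above, using a hypothetical package named FooPackage: after upgrading to 1.3.0 via NuGet, a project that never had the package installed still carries a hand-added reference pinned to the old folder (paths illustrative):

```xml
<!-- Project A (package installed via NuGet) - updated by the upgrade -->
<Reference Include="FooPackage">
  <HintPath>..\packages\FooPackage.1.3.0\lib\net45\FooPackage.dll</HintPath>
</Reference>

<!-- Project B (assembly referenced by hand) - still points at the old version -->
<Reference Include="FooPackage">
  <HintPath>..\packages\FooPackage.1.2.3\lib\net45\FooPackage.dll</HintPath>
</Reference>
```

Installing the package via NuGet in every project that needs it keeps both HintPaths in sync on upgrade.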

Windows Azure Emulator, Web Apps & 500 Errors caused by IIS Express

Windows Azure Emulator is a tool that gives us the ability to debug and run applications for Windows Azure on our own machine. When we have a web application, by default it will run on IIS Express when we start it on the local machine. 90% of people will say: that's great, I don't need a full IIS for this. This is true until you use features of IIS that are not supported by IIS Express. For example the “ipSecurity” section, which gives us the ability to control which IPs are allowed to access our application. If we run a web application in a web role with this configuration, we will receive a 500 error code. After this we will spend an hour activating the feature that allows us to see the custom error, and in the end we will see the 500.19 internal error: “This configuration section cannot be used at this path. This happens when the section is locked at a parent level. Locking is either by default…”. What should we do next? An option is to comment out the configuration but WAIT. If we open the pr
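The lock behind that 500.19 error lives in IIS Express's applicationHost.config (typically under the user's Documents\IISExpress\config folder; location may vary). A sketch of the relevant line and the change that unlocks the section:

```xml
<!-- applicationHost.config (IIS Express): the section is locked by default -->
<section name="ipSecurity" overrideModeDefault="Deny" />

<!-- switching overrideModeDefault allows web.config to use <ipSecurity> -->
<section name="ipSecurity" overrideModeDefault="Allow" />
```

Note this only changes the local IIS Express configuration; the web role's full IIS in the cloud has its own applicationHost.config.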

Windows Azure Storage and Read-Access Geo Redundant Feature (Part 2)

Part 1 In the last post we described the new Windows Azure Storage feature that gives us the possibility not only to replicate the storage content to another datacenter, but also to access it in read-only mode when the primary storage is down - Read Access Geo Redundant Storage (RA-GRS). In this post we'll talk about some things we should be aware of when starting to use RA-GRS. Retry Policy The first thing we need to be aware of is the retry policy. With the geo redundant feature we need another retry policy that can automatically fall back to the secondary account when the primary one cannot be accessed. For this purpose a new interface called “IExtendedRetryPolicy” was created. This new interface comes with a method called “Evaluate” that decides whether the operation should be retried or not. Because we have two storage accounts that need to be checked, there is a small change in behavior. We need a mechanism to switch between the two accounts and also to take into consideration
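A skeleton of such a policy might look like the sketch below (based on the `IExtendedRetryPolicy` interface from the Azure Storage client 3.x, `Microsoft.WindowsAzure.Storage.RetryPolicies`; retry counts and intervals are illustrative, not the post's values):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.RetryPolicies;

// Sketch: a retry policy that lets the storage client alternate between
// the primary and the secondary (RA-GRS) location on failure.
public class FallbackRetryPolicy : IExtendedRetryPolicy
{
    private const int MaxRetries = 3;

    public RetryInfo Evaluate(RetryContext retryContext, OperationContext operationContext)
    {
        if (retryContext.CurrentRetryCount >= MaxRetries)
        {
            return null; // null = stop retrying
        }

        // RetryContext.NextLocation already alternates between Primary and
        // Secondary; building RetryInfo from the context honors that switch.
        return new RetryInfo(retryContext)
        {
            RetryInterval = TimeSpan.FromSeconds(2)
        };
    }

    public IRetryPolicy CreateInstance()
    {
        return new FallbackRetryPolicy();
    }

    // Legacy IRetryPolicy member, still required by the interface.
    public bool ShouldRetry(int currentRetryCount, int statusCode,
        Exception lastException, out TimeSpan retryInterval,
        OperationContext operationContext)
    {
        retryInterval = TimeSpan.FromSeconds(2);
        return currentRetryCount < MaxRetries;
    }
}
```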

Windows Azure Storage and Read-Access Geo Redundant Feature

The end of this year brought me great news. The Azure team announced that we have support for Read Access Geo Redundant Storage. Until now, we could activate the geo redundant feature, which guarantees that all the content is duplicated to another data center and, in case of a disaster, Azure is able to recover our content from storage. But until that recovery was made, we would not be able to access the content, not until the cluster failover storage recovery mechanism was triggered. Read Access Geo Redundant Storage is a long and complicated name. But more important than the name is what the feature offers us. From now on, if we activate it and the data center used to store our data goes down, we will be able to use the second one (the one used for geo-redundancy) to access our content. We'll not be able to write content, but we can read content from it – in read-only mode. For applications that store updates, configuration or resources this is a crucial th
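Reading from the secondary endpoint can be sketched with the storage client's `LocationMode` setting (names from the .NET storage library 3.x; `connectionString` and the container name are placeholders):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.RetryPolicies;

// Sketch: let reads fall back to the RA-GRS secondary endpoint.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();

// PrimaryThenSecondary: try the primary first, then the read-only replica.
blobClient.DefaultRequestOptions.LocationMode = LocationMode.PrimaryThenSecondary;

CloudBlobContainer container = blobClient.GetContainerReference("resources");
```

Writes still go only to the primary; the secondary is read-only, as described above.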

[Post Event] Global Day of Code Retreat 2013, December 14th,

Last Saturday I participated in a Code Retreat. This event was part of the “Global Day of Code Retreat 2013”, in which over 2,200 people participated. The Cluj-Napoca event was organized by RABS in collaboration with Codecamp, Cluj.rb, Agile Works and Functional Programmers Cluj-Napoca. We had 5 rounds where we solved the classic Code Retreat problem – the Game of Life. During these rounds I had the opportunity to practice and learn new things, not only related to TDD but also to how you should approach a problem. I had the opportunity to solve the problem in C#, Java, JavaScript and TypeScript. The round I enjoyed the most was the one where you could not talk with your colleague. In situations like this it is crucial to name your classes, methods and UNIT TEST methods very explicitly and well. From the code quality perspective and the way we solved the problem, I had the feeling that that round was the best one ever. The most important thing that I learned was: sma

MVC - How to log requests that are coming to our web application

In this blog post we will see where we can hook our code when we need to log all the requests that are coming to our web application. There are moments when we want to log every request, from the request URL to the query or form parameters. This task can be done in different ways; in this blog post we will see four of them. Action Filter One solution is to create an action filter that has access to all the request information. From the request we can read the request URL, query params, form params and so on.

public class FooActionFilterAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        HttpRequestBase request = filterContext.RequestContext.HttpContext.Request;
        string requestType = request.HttpMethod;
        string formParams = request.Form.ToString();
        string queryParams = request.QueryString.ToString();
        str
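For an action filter like FooActionFilterAttribute to see every request (not just decorated actions), it can be registered globally; a sketch using ASP.NET MVC's global filter collection:

```csharp
// Global.asax.cs - one registration covers all controllers and actions.
protected void Application_Start()
{
    GlobalFilters.Filters.Add(new FooActionFilterAttribute());
    // ... route and bundle registration as usual
}
```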

Digging through SignalR - Command Line Arguments

SignalR contains a console application that can be used for stress tests. The solution used to parse the command-line parameters is ‘cmdline – Command Line Parser’. This is a pretty nice library that can be installed from NuGet. This command line parser works perfectly, which is probably why the last time it was updated was September 2012. Besides this solution, you can find another one on CodePlex and GitHub with a very similar name, “Command Line Parser Library”. Both solutions are great and solve the same problem in a very similar way. The first time I saw these two solutions I could have sworn they were the same one (judging from the available API). I have seen a lot of projects where people started to implement their own argument parsers, even though we have plenty of them on the market. It is not so important which command line parser you use, as long as you use one and don't rewrite it again and again. In general, the use cases that we need to support are pretty simple and are covered by
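A typical usage sketch for the second library (“Command Line Parser Library”, 1.9.x-era API; the option names here are hypothetical, not taken from the SignalR tool):

```csharp
using System;
using CommandLine;

// Attributes map command-line switches to properties.
class Options
{
    [Option('c', "clients", Required = true, HelpText = "Number of clients to simulate.")]
    public int Clients { get; set; }

    [Option('u', "url", DefaultValue = "http://localhost", HelpText = "Target url.")]
    public string Url { get; set; }
}

class Program
{
    static void Main(string[] args)
    {
        var options = new Options();
        if (Parser.Default.ParseArguments(args, options))
        {
            Console.WriteLine("Running {0} clients against {1}",
                options.Clients, options.Url);
        }
    }
}
```

Invoked as, for example, `stress.exe -c 50 -u http://foo.com`.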

[Post Event] ITDays , December 5-6, 2013 - Extension Points in Enterprise Application

Yesterday I had the opportunity to participate at ITDays as a speaker. I had a 30-minute session where I talked about Extension Points in Enterprise Applications. It was pretty hard to fit all the content I wanted into a 30-minute session; because of this, I will try in the next days to come up with 2-3 posts related to this subject. Until then, I have attached my slides from this session to this blog post.

Service Bus and Scalability

One of the beauties of the cloud is scalability. You can scale as much as you want, without problems. Of course, this is the theory. A cloud provider offers this feature, but a developer/architect needs to know how to use it to design a highly scalable system. Each cloud provider offers different scalability points. We need to know very well what these points are and how we can use them to our benefit. As a messaging system, Windows Azure offers us Windows Azure Service Bus. This is a powerful service when an application needs messaging. But of course, Service Bus, like other services, has its own limitations. At this moment, the maximum number of subscriptions for a specific topic is 2,000. If we had one client for each subscription, then the maximum number of clients per topic would be 2,000. What should we do if we had 10,000 clients? It is clear that we cannot have 10,000 subscriptions per topic. We could send an email to Microsoft to ask them why they
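One way around the 2,000-subscription limit (a sketch of the general partitioning idea, since the post's own answer is cut off above) is to spread clients over several topics and derive the topic name from the client id:

```csharp
// Partition clients across topics so no topic exceeds the subscription limit.
public static class TopicPartitioner
{
    public const int MaxSubscriptionsPerTopic = 2000;

    // For 10,000 clients and a 2,000-subscription limit we need 5 topics.
    public static int TopicCountFor(int clientCount)
    {
        return (clientCount + MaxSubscriptionsPerTopic - 1) / MaxSubscriptionsPerTopic;
    }

    // Deterministically map a client to one of the topics, e.g. "orders-2".
    public static string TopicFor(int clientId, string baseName)
    {
        int topicIndex = clientId / MaxSubscriptionsPerTopic;
        return baseName + "-" + topicIndex;
    }
}
```

A publisher then sends each message to every partitioned topic (or only to the one holding the target client).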

Service Bus - Optimize consumers using prefetch and maximum concurrent calls features

Since a few versions ago, Windows Azure Service Bus supports 'event notification'. This means that we can register for an event that will be triggered each time a new message is available for us.

QueueClient client = QueueClient.Create("queue1");
client.OnMessage(
    OnMsgReceived,
    new OnMessageOptions());
...
void OnMsgReceived(BrokeredMessage message)
{
    ...
}

This is a great feature that usually makes our life easier. By default, when we consume messages this way, we make a roundtrip to the Service Bus for each message. When we have applications that handle hundreds of messages, the roundtrip to the server for each message can cost us time and resources. Windows Azure Service Bus offers us the possibility to specify the number of messages that we want to prefetch. This means that we can fetch 5, 10, 100, ... messages from the server using a single request. We could say that it is similar to the batch mechanism, but can be used wit
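Putting the two features from the title together might look like this sketch (Service Bus .NET SDK; the prefetch and concurrency numbers are illustrative):

```csharp
// Sketch: prefetch messages in batches and process several concurrently.
QueueClient client = QueueClient.Create("queue1");

// Fetch up to 100 messages per roundtrip instead of one at a time.
client.PrefetchCount = 100;

OnMessageOptions options = new OnMessageOptions
{
    MaxConcurrentCalls = 10,  // up to 10 callbacks running in parallel
    AutoComplete = true       // complete each message when its callback returns
};

client.OnMessage(OnMsgReceived, options);
```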

Extract relative Uri using MakeRelativeUri method

Did you ever need to compare two web addresses and extract the relative part of them? For example, if we have the addresses "http://foo.com" and "http://foo.com/car/color/power" we would like to get "car/color/power". But if we had "http://foo.com/car/" and "http://foo.com/car/color/power" we would like to get "color/power". For these cases we should use "MakeRelativeUri". This instance method of the Uri class determines the delta between two URL addresses (it works on path segments, so the base path needs a trailing slash for its last segment to be taken into account). This simple method extracts the difference between two URLs. Code examples:

Uri uri = new Uri("http://foo.com");
Uri result = uri.MakeRelativeUri(new Uri("http://foo.com/car/color/power"));
// result: "car/color/power"

Uri uri = new Uri("http://foo.com/car/");
Uri result = uri.MakeRelativeUri(new Uri("http://foo.com/car/color/power"));
// result: "color/power"

Uri uri = new Uri("http://foo.com");
Uri result = U

Sync Group - Let's talk about Performance

In one of my latest posts I talked about the synchronization functionality that is available for SQL Azure. There was a question related to the performance of this service. So, I decided to run a performance test to see what the numbers look like. Please take into account that this service is in preview and the performance will change when the service is released. For this test I had the following setup:

Database
- Size: 7.2 GB
- 15 tables
- 2 tables with more than 30,000,000 rows (one table around 3.2 GB, the other around 2.7 GB)
- 34,378,980 rows in total

Database instances
- 1 DB in West Europe (Hub)
- 1 DB in West Europe
- 1 DB in North Europe
- 1 DB in North Central US

Agent
- 1 agent in West Europe

Configuration
- Hub Wins
- Sync From Hub

Scenario One: Initialize Setup
I started from the presumption that your data was not yet duplicated on all the databases. The first hit of the Sync button will duplicate the database schema of the tables that need to be

[PostEvent] Slides from MSSummit 2013, Bucharest

Last week I had the opportunity to participate at MSSummit. During this event I had the opportunity to present and talk about SignalR and load testing using Windows Azure and Visual Studio 2013. More about this event: http://vunvulearadu.blogspot.ro/2013/11/postevent-mssummit-2013-bucharest.html You can find my session slides below: Real time fluent communication using SignalR and Cloud (Windows Azure) from Radu Vunvulea Load tests using Visual Studio 2013 and Cloud from Radu Vunvulea

How to get the instance index of a web role or worker role - Windows Azure

When we have one or more instances of a specific web role or worker role in the cloud, there are moments when we want to know from code how many instances of a specific type we have, or the index of the current instance. The total number of instances can be obtained using:

RoleEnvironment.CurrentRoleInstance.Role.Instances.Count

To detect the index of the current instance we need to parse the id of the role instance. Usually the id of the current instance ends with the index number. Before this number we have the ‘.’ character if the instance is in the cloud, or ‘_’ when we are using the emulator. Because of this we end up with the following code when we need to get the index of the current instance:

int currentIndex = 0;
string instanceId = RoleEnvironment.CurrentRoleInstance.Id;
bool withSuccess = int.TryParse(
    instanceId.Substring(instanceId.LastIndexOf(".") + 1),
    out currentIndex);
if (!withSuccess)
{
    withSuccess = int.TryParse(
        instanceId.Substring(instanceId.LastIndexOf("_") + 1),
        out currentIndex);
}
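The parsing described above can be isolated into a small helper that handles both separators; being pure string handling, it runs without the Azure runtime (the helper name and sample ids are ours, not from the post):

```csharp
using System;

public static class InstanceIndex
{
    // Extracts the trailing numeric index from a role instance id.
    // Per the post: cloud ids end in ".<n>", emulator ids end in "_<n>".
    public static int Parse(string instanceId)
    {
        int separator = Math.Max(
            instanceId.LastIndexOf('.'),
            instanceId.LastIndexOf('_'));

        int index;
        if (separator >= 0 &&
            int.TryParse(instanceId.Substring(separator + 1), out index))
        {
            return index;
        }
        return -1; // id did not end with a numeric index
    }
}
```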

Sync Group - A good solution to synchronize SQL Databases using Windows Azure infrastructure

Working with Windows Azure becomes more and more pleasant. In this post we will see how you can synchronize multiple databases hosted on Azure or on-premises using Sync Group. First of all, let's start with the following requirement: we have an application with a database that needs to be replicated in each datacenter. What should we do to replicate all the content in all the datacenters? A good answer could be Sync Group (SDS). Using this feature we can define one or more instances of SQL Databases (in the cloud or on-premises) that will be synchronized. For this group we can specify the tables and columns that will be synchronized. Creating a SDS can be done very easily from the Windows Azure portal. This feature can be found under the SQL DATABASES tab – SYNC. I will not explain how to create this group, because it is very easy; you only need to know the database server addresses, user names and passwords. I think that more important than th

Throttling and Availability over Windows Azure Service Bus

In today's post we will talk about different redundancy mechanisms for Windows Azure Service Bus. Because we are using Windows Azure Service Bus as a service, we need to be prepared for when something goes wrong. This is a very stable service, but when you design a solution that needs to handle millions of messages every day you need to be prepared for worst case scenarios. By default, Service Bus is not geo-replicated across data centers; because of this, if something happens in the data center where your namespace is hosted, then you are in big trouble. The most important thing you need to cover is the case when the Service Bus node is down and clients cannot send messages anymore. We will see later on how we can handle this problem. First of all, let's see why a service like Service Bus can go down. Well, like other services, it has dependencies on databases, storage, other services and resources. There are cases when we can detect pretty easily the cause of th

How to monitor clients that access your blob storage?

Some time ago I wrote about the monitoring and logging support available for Windows Azure Storage. In this post we will talk about how we can use this feature to detect which clients are accessing storage. Let's assume that we have a storage account that is accessed by 100,000 users. At the end of the month we should be able to detect the users that downloaded a specific content with success. What should we do in this case? (Classic Solution) Well, in a classic solution, we would create an endpoint that is called by the client after he downloads the specific content with success. In this case we would need to create a public endpoint, host it, persist the call messages, manage and maintain the solution and so on. In the end we would have additional costs. What should we do in this case? (Windows Azure Solution) Using Windows Azure Storage, we can change the rules of the game. Windows Azure Storage offers us out-of-the-box support for logging. All the requests that ar
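Turning the logging on programmatically might look like this sketch (storage client library, `Microsoft.WindowsAzure.Storage.Shared.Protocol`; the retention value is illustrative and `connectionString` is a placeholder):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

// Sketch: enable logging of read operations on the blob service.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();

ServiceProperties properties = blobClient.GetServiceProperties();
properties.Logging.LoggingOperations = LoggingOperations.Read;
properties.Logging.RetentionDays = 30;
properties.Logging.Version = "1.0";

blobClient.SetServiceProperties(properties);
// The log entries land as blobs under the special "$logs" container,
// where they can be downloaded and analyzed per client request.
```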

Digging through SignalR - Dependency Resolver

I’m continuing the series of posts related to SignalR, this time with dependency injection. When you have complicated business logic you can start to group different functionalities in classes. Because of this you can very easily end up with classes that accept 4-5 or even 10 parameters in the constructor.

public abstract class PersistentConnection
{
    public PersistentConnection(
        IMessageBus messageBus,
        IJsonSerializer jsonSerializer,
        ITraceManager traceManager,
        IPerformanceCounterManager performanceCounterManager,
        IAckHandler ackHandler,
        IProtectedData protectedData,
        IConfigurationManager configurationManager,
        ITransportManager transportManager,
        IServerCommandHandler serverCommandHandler,
        HostContext hostContext)
    {
    }
    ...
}

Of course you have a decoupled solution that can be tested very easily, but at the same time you have a fat constructor. People would say: “Well, we have a dependency injector, the resolver will handle the constructo
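Swapping one of those dependencies for your own implementation goes through SignalR's dependency resolver; a sketch (`GlobalHost` is SignalR's API, `MyJsonSerializer` is a hypothetical implementation of ours):

```csharp
using Microsoft.AspNet.SignalR;

// Sketch: register a custom implementation so SignalR resolves it
// instead of its built-in IJsonSerializer.
GlobalHost.DependencyResolver.Register(
    typeof(IJsonSerializer),
    () => new MyJsonSerializer());
```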

[Event] Global Day of Coderetreat in Cluj-Napoca! - December 14th, 2013 - Codecamp

We invite you to the Global Day of Coderetreat in Cluj-Napoca! Global Day of Coderetreat is a world-wide event celebrating passion and software craftsmanship. Last year, over 70 passionate software developers in Cluj-Napoca joined the 2000 developers in 150 cities around the world in spending the day practicing the craft of software development using the coderetreat format. This year, we want to repeat the experience and extend it even more. We will focus our practice on XP techniques like pair programming, unit testing or TDD, and we'll give special attention to OOD and good code in general. Find out more about this event! Reserve your seat here! Co-organized by: RABS, Code Camp, Cluj.rb, Agile Works and Functional Programmers Cluj-Napoca

Windows Azure Service Bus - What ports are used

Windows Azure Service Bus is a great mechanism to distribute messages across the components of your own system or to different clients. When you want to use this service for enterprise projects you will meet the IT guys. For them it is very important to control the ports that are used by applications. Because of this, one of the first questions they will ask you is: what ports are used by Service Bus? When this question comes, you should have the answer prepared. Looking over the documentation from MSDN, the following ports are used by Windows Azure Service Bus:

- 80 – HTTP connection mode
- 443 – HTTPS connection mode
- 5671 – Advanced Message Queuing Protocol (AMQP)
- 5672 – AMQP
- 9350 – WCF with connection mode Auto or TCP using .NET SDK
- 9351 – WCF with connection mode Auto or TCP using .NET SDK
- 9352 – WCF with connection mode Auto or TCP using .NET SDK
- 9353 – WCF with connection mode Auto or TCP using .NET SDK
- 9354 – WCF with connection mode Auto or TCP
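When the IT department only opens 80/443, the .NET SDK can be forced onto those ports via the connectivity mode; a one-line sketch (`ServiceBusEnvironment` is part of the Service Bus .NET SDK):

```csharp
using Microsoft.ServiceBus;

// Sketch: tunnel Service Bus traffic over HTTP/HTTPS (ports 80/443)
// instead of the native TCP ports 9350-9354.
ServiceBusEnvironment.SystemConnectivity.Mode = ConnectivityMode.Http;
```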

Debugging in production

Abstract In this article we discover how we can create a dump file and which basic tools can analyze it. By means of dump files we can access information we could not normally reach. Some data can only be accessed through these dumps and not by other means (Visual Studio debugging). We could state that these tools are very powerful, but they are rather difficult to use, as they require quite a high degree of knowledge. How many times has it happened to you to have a problem in production or in the testing environment which you are unable to reproduce on the development machine? When this happens, things can go off track, and we try out different ways of remote debugging. Without even knowing, helpful tools can be right at hand, but we ignore them or simply don't know how to use them. In this article I will present different ways in which we can debug without having to use Visual Studio. Why not use Visual Studio? Though Visual Studio is an extremely go

Simple load balancer for SQL Server Database

Two weeks ago I started to work on a PoC where the bottleneck is the database itself. We had a lot of complicated and expensive queries over the database. Because of this, if we wanted good performance we had to find a solution with more than one instance of SQL Server. Because the solution was on Windows Azure, we wanted to test different configurations and solutions. We wanted to see what the performance is if we use 1, 2 and 3 SQL endpoints. Not only this, we had different types of SQL endpoints – Azure SQL (SaaS), a Virtual Machine with SQL Server and Azure SQL Premium (SaaS but with dedicated resources). The good part in our scenario was that the data doesn't change very often. We could start from the assumption that the database is updated only one time per day. Because of this we could use different methods to create load balancing over our SQL instances. Because the time was very limited and the only thing that we needed was to distribu
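With read-only traffic like this, the simplest balancing scheme is a round-robin over the connection strings (a sketch of the general idea, not necessarily what the PoC used; endpoint names are placeholders):

```csharp
using System.Threading;

// Round-robin dispatcher over a fixed set of read-only SQL endpoints.
public class SqlEndpointBalancer
{
    private readonly string[] connectionStrings;
    private int counter = -1;

    public SqlEndpointBalancer(string[] connectionStrings)
    {
        this.connectionStrings = connectionStrings;
    }

    // Thread-safe: Interlocked.Increment hands out 0, 1, 2, 0, 1, 2, ...
    public string Next()
    {
        int index = Interlocked.Increment(ref counter) % connectionStrings.Length;
        return connectionStrings[index];
    }
}
```

Each query then opens its SqlConnection with `balancer.Next()`, spreading the expensive reads across the 1-3 endpoints under test.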

[PostEvent] MSSummit 2013, Bucharest

WOW Why? Because in the last two days (6-7 November 2013) I had the opportunity to participate at MSSummit 2013, which was held in Bucharest. It was one of the biggest IT events organized in Romania this year. There were more than 1000 people, 5 simultaneous tracks and more than 45 speakers. It's been a while since I've seen such an event in Romania. I was impressed by the event itself, the location and the number of people. The session list was extremely interesting; we had sessions held by David Chappell, Chris Capossela, Michael Palermo and so on. All the sessions were great, and we had the opportunity to learn and discover new stuff. After this event we could say that Microsoft is full of surprises (in a good way). At this event I was invited as a speaker and held two sessions where I talked about how we can write and run load tests using Visual Studio 2013 and Windows Azure. In the second session I talked about fluent communication between client – server – client using SignalR. Sp

How to read response time when you run a performance test

Measuring the performance of an application is mandatory before releasing it. Measuring the performance of a PoC is also mandatory, to validate the base concepts and ideas. The performance of a system can be measured in different ways, from processing time to number of users, processor load, memory consumption and so on. Before starting a performance test you should know exactly what you want to measure. When you want to measure the response time of a specific service/endpoint you should know how to interpret the results. Statistical information can be viewed in different ways, and each view can give you a different perspective on the results. Average The average of the response times is calculated. This information is important if you want to know the average request time. Even if it can be very useful, this information is misleading. For example, out of 1000 requests, you can have an average response time of 16 seconds even if your chances to have a  reques
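The claim about the misleading average can be checked with a quick calculation (the split between fast and slow requests below is made up for illustration):

```csharp
using System;
using System.Linq;

class AverageDemo
{
    static void Main()
    {
        // 990 fast requests at 1s plus 10 slow ones at 1501s:
        // almost every request is fast, yet the average is 16s.
        double[] responseTimes = new double[1000];
        for (int i = 0; i < 990; i++) responseTimes[i] = 1.0;      // seconds
        for (int i = 990; i < 1000; i++) responseTimes[i] = 1501.0;

        double average = responseTimes.Average();
        Console.WriteLine(average); // prints 16
    }
}
```

99% of users saw a one-second response, yet the average alone suggests a 16-second service, which is why it should be read together with other views of the data.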

VM and load balancer, direct server return, availability set and virtual network on Windows Azure

In today's post we will talk about how we can configure a load balancer, direct server return, availability sets and virtual networks when working with virtual machines on Windows Azure. For virtual machines we cannot talk about the load balancer without talking about endpoints. Endpoints give you the possibility to specify the port (endpoint) that can be publicly accessed. For each endpoint you need to specify the public port, the private port and the protocol (TCP, UDP). The private and public ports can be very useful when you want to redirect specific calls to different ports. If you don't specify the endpoints for a virtual machine, the machine cannot be accessed from outside (for example when you host a web page and you need to have port 80 open). For web and worker roles this load balancer support is out of the box. We could say that it is the same thing for virtual machines also, but you need to make some custom configuration. You will need to specify that you will use the load balancer func