

Showing posts from December, 2013

NuGet packages, project references and wrong use of packages

In today's post I will talk about what you should never do when you are using NuGet. We have an application developed 2-3 years ago. Coming to V2 also means upgrading different versions of the stacks used, like Entity Framework, Unity and so on. When the first version was developed, each package from NuGet was installed in one or two projects and added as an assembly reference in the other ones. Until now everything sounds good. But there is a small mistake. When you reference the assemblies in the other projects, you are referencing them with something like “../packages/FooPackage 1.2.3”. When you upgrade the package using NuGet, the references in the projects where the package was installed are updated to the new version of the package. But in the projects where there are only references to the assemblies from the package, nothing will happen. This means that you will end up with two versions of the same stack (package) in the same application. This is the dream of every developer…
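A minimal sketch of the problem (package name, version and paths are hypothetical): in the project where the package was installed, NuGet rewrites the version-specific hint path on upgrade, while a manually added reference in another project keeps pointing at the old package folder.

```xml
<!-- Project A: package installed via NuGet; this HintPath IS rewritten on upgrade -->
<Reference Include="FooPackage">
  <HintPath>..\packages\FooPackage.1.2.3\lib\net45\FooPackage.dll</HintPath>
</Reference>

<!-- Project B: assembly referenced by hand; after upgrading Project A to 2.0.0
     this still points at the 1.2.3 folder, so both versions ship with the app -->
<Reference Include="FooPackage">
  <HintPath>..\packages\FooPackage.1.2.3\lib\net45\FooPackage.dll</HintPath>
</Reference>
```

The straightforward fix is to install the package through NuGet in every project that consumes it, so all hint paths are managed together.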

Windows Azure Emulator, Web Apps & 500 Errors caused by IIS Express

Windows Azure Emulator is a tool that gives us the ability to debug and run applications for Windows Azure on our own machine. When we have a web application and run it on the local machine, by default it will use IIS Express. 90% of people will say: that’s great, I don’t need a full IIS for this. This is true until you use features of IIS that are not supported by IIS Express. For example the “ipSecurity” section, which gives us the ability to control which IPs are allowed to access our application. If we run a web application in a web role with this configuration, we will receive a 500 error code. After this we will spend an hour activating the feature that allows us to see the detailed error, and in the end we will see the 500.19 internal error: “This configuration section cannot be used at this path. This happens when the section is locked at a parent level. Locking is either by default…”. What should we do next? An option is to comment out the configuration, but wait…
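When we control the IIS configuration (for example, full IIS on the web role, where this could run from a startup task), one way to get past the 500.19 error is to unlock the section at the server level so web.config may set it. A sketch using the standard appcmd tool that ships with IIS:

```shell
REM Unlock the ipSecurity section in applicationHost.config
REM so that web.config files are allowed to configure it.
%windir%\system32\inetsrv\appcmd unlock config -section:system.webServer/security/ipSecurity
```

Note that this requires administrator rights and full IIS; IIS Express keeps its own applicationHost.config per user, which is why the behavior differs between the two.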

Windows Azure Storage and Read-Access Geo Redundant Feature (Part 2)

Part 1 In the last post we described the new feature of Windows Azure Storage that gives us the possibility not only to replicate the storage content to another datacenter, but also to access it in read-only mode when the primary storage is down – Read-Access Geo Redundant Storage (RA-GRS). In this post we’ll talk about some things that we should be aware of when we are starting to use RA-GRS. Retry Policy The first thing that we need to be aware of is the retry policy. With the geo-redundant feature we need a retry policy that can automatically fall back to the secondary account when the primary one cannot be accessed. For this purpose a new interface called “IExtendedRetryPolicy” was created. This new interface comes with a method called “Evaluate” that detects whether the operation should be retried or not. Because there are two storage accounts that need to be checked, there is a small change in behavior. We need a mechanism to switch between the two accounts and also to take into consideration…
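As a sketch only, assuming the Windows Azure Storage Client Library 3.0+ (verify the names against the version you use), wiring a blob client to fall back to the secondary endpoint on reads might look like this:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.RetryPolicies;

class RaGrsSample
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...");
        CloudBlobClient client = account.CreateCloudBlobClient();

        // Exponential back-off between attempts; the built-in policies implement
        // IExtendedRetryPolicy, whose Evaluate method decides whether to retry.
        client.DefaultRequestOptions.RetryPolicy =
            new ExponentialRetry(TimeSpan.FromSeconds(2), 3);

        // Try the primary endpoint first; if it cannot be reached, retry the
        // read against the secondary (read-only) endpoint.
        client.DefaultRequestOptions.LocationMode = LocationMode.PrimaryThenSecondary;
    }
}
```

Only read operations can be served by the secondary; writes always require the primary, which is the behavior change the retry policy has to take into account.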

Windows Azure Storage and Read-Access Geo Redundant Feature

The end of this year brought me some great news. The Azure team announced that we have support for Read Access of Geo Redundant Storage. Until now, we could activate the geo-redundant feature, which guarantees us that all the content is duplicated to another data center and that, in case of a disaster, Azure is able to recover our content. But we would not be able to access the content until the cluster failover recovery mechanism was triggered. Read Access of Geo Redundant Storage is a long and complicated name. But more important than the name is what the feature offers us. From now on, if we activate it and the data center used to store our data goes down, we will be able to use the second one (the one used for geo-redundancy) to access our content. We’ll not be able to write content, but we can read content from it – in read-only mode. For applications that store updates, configuration or resources this is a crucial thing…

[Post Event] Global Day of Code Retreat 2013, December 14th

Last Saturday I participated in a Code Retreat. This event was part of the “Global Day of Code Retreat 2013”, in which over 2,200 people participated. The Cluj-Napoca event was organized by RABS in collaboration with Codecamp, Cluj.rb, Agile Works and Functional Programmers Cluj-Napoca. We had 5 rounds in which we solved the classic Code Retreat problem – the Game of Life. During these rounds I had the opportunity to exercise and learn new things related not only to TDD but also to how you should approach a problem. I had the opportunity to solve the problem in C#, Java, JavaScript and TypeScript. The round that I enjoyed the most was the one where you could not talk with your colleague. In a situation like this it is crucial to name your classes, methods and unit-test methods very explicitly and well. From the code quality perspective and the way we solved the problem, I had the feeling that that round was the best one ever. The most important thing that I learned was…

MVC - How to log requests that are coming to our web application

In this blog post we will see where we can hook our code if we need to log all the requests that are coming to our web application – from the request URL to the query or form parameters. This task can be done in different ways; we will look at four of them. Action Filter A solution could be to create an action filter that has access to all the request information. From the request we can access information like the request URL, query params, form params and so on.

public class FooActionFilterAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        HttpRequestBase request = filterContext.RequestContext.HttpContext.Request;
        string requestType = request.HttpMethod;
        string formParams = request.Form.ToString();
        string queryParams = request.QueryString.ToString();
        // ... log the collected values here
    }
}
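Once such a filter exists, one way to apply it to every request instead of decorating each controller is a global registration in Global.asax. A sketch, assuming the standard ASP.NET MVC Application_Start hook:

```csharp
using System.Web.Mvc;

public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // Apply the logging filter to every controller action in the application
        GlobalFilters.Filters.Add(new FooActionFilterAttribute());
        // ... route and bundle registration continues here as usual
    }
}
```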

Digging through SignalR - Command Line Arguments

SignalR contains a console application that can be used to run stress tests. The solution used to parse the command-line parameters is ‘cmdline – Command Line Parser’. This is a pretty nice library that can be installed using NuGet. This command-line parser works so well that the last time it was updated was in September 2012. Beside this solution, you can find another, very similarly named solution on CodePlex and GitHub called “Command Line Parser Library”. Both solutions are great and solve the same problem in a very similar way. The first time I saw these two solutions I could have sworn they were the same (judging from the available API). I have seen a lot of projects where people started to implement their own argument parsers, even though we have plenty of them on the market. It is not so important which command-line parser you use, as long as you use one and don’t rewrite it again and again. In general, the use cases that we need to support are pretty simple and are covered by…
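As a sketch of the declarative style these libraries offer (the exact API differs between versions; this follows roughly the shape of the “Command Line Parser Library” around 2013, and the option names are hypothetical):

```csharp
using CommandLine;

class Options
{
    [Option('u', "url", Required = true, HelpText = "Server URL to stress-test.")]
    public string Url { get; set; }

    [Option('c', "clients", DefaultValue = 1, HelpText = "Number of concurrent clients.")]
    public int Clients { get; set; }
}

class Program
{
    static void Main(string[] args)
    {
        var options = new Options();
        // Attributes describe the arguments; the parser does the string handling
        if (Parser.Default.ParseArguments(args, options))
        {
            System.Console.WriteLine("{0} clients against {1}", options.Clients, options.Url);
        }
    }
}
```

The point is that the argument grammar lives in attributes on a plain options class, so there is no hand-rolled string splitting to maintain.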

[Post Event] ITDays, December 5-6, 2013 - Extension Points in Enterprise Applications

Yesterday I had the opportunity to participate at ITDays as a speaker. I had a 30-minute session where I talked about Extension Points in Enterprise Applications. It was pretty hard to fit all the content that I wanted into a 30-minute session, so over the next few days I will try to come up with 2-3 posts related to this subject. Until then, I have attached my slides from this session to this blog post.

Service Bus and Scalability

One of the beauties of the cloud is scalability. You can scale as much as you want, without any problems. Of course, this is the theory. A cloud provider offers this feature, but a developer/architect needs to know how to use it to design a highly scalable system. Each cloud provider offers different scalability points. We need to know very well what these points are and how we can use them to our benefit. As a messaging system, Windows Azure offers us Windows Azure Service Bus. This is a powerful service for when an application needs a messaging system. But of course Service Bus, like any other service, has its own limitations. At this moment, the maximum number of subscriptions for a specific topic is 2,000. If we have one client for each subscription, then the maximum number of clients per topic is 2,000. What should we do if we have 10,000 clients? It is clear that we cannot have 10,000 subscriptions per topic. We could send an email to Microsoft to ask them why they…
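One common workaround (a sketch under my own assumptions, not an official recommendation; the topic naming scheme is hypothetical) is to fan clients out over several topics, picking the topic deterministically from the client id so that each topic stays below the 2,000-subscription limit:

```csharp
using System;

static class TopicPartitioner
{
    // Map a client id to one of `topicCount` topics deterministically,
    // so that no single topic ever needs more than 2,000 subscriptions.
    public static string TopicFor(string clientId, int topicCount)
    {
        if (topicCount <= 0) throw new ArgumentOutOfRangeException("topicCount");

        // Stable hash: string.GetHashCode() is not guaranteed to be stable
        // across runtimes, so compute a simple one by hand.
        int hash = 0;
        foreach (char c in clientId)
            hash = unchecked(hash * 31 + c);

        int index = (hash & 0x7FFFFFFF) % topicCount;
        return "orders-" + index; // e.g. orders-0 .. orders-4 for topicCount = 5
    }
}
```

With 10,000 clients and 5 topics, each topic carries roughly 2,000 subscriptions, and both publishers and subscribers can compute the topic name locally without any lookup service.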