
First impression of Visual Studio Online

2015 started in full force. I’m starting a new project with an interesting setup.
For political reasons we decided to go with Visual Studio Online. The initial setup was pretty okay; we successfully configured everything we needed, from permissions and rights to the folder structure and the list of tasks.
I was impressed by how nice and useful the task board in Visual Studio Online is. Nothing complicated, simple and easy to use. You have the ability to define custom flows, as in the on-premises version of TFS, with Scrum and Agile support.
This is the second project I am involved in where Visual Studio Online is used.

Document Management
The first thing that hit us pretty hard was the integration with a document management system. It seems that at this moment there is no full support for creating and integrating a SharePoint web site with Visual Studio Online.
I expected to be able to create a SharePoint web site for my project directly from Visual Studio Online, or at least to have another mechanism for document management. So far there is no integration with Office 365 or other systems.
We could have created a project web site on an on-premises instance of SharePoint or on Office 365 to store and manage the documentation related to the project, but we said NO.
Our approach was a basic one. In source control we created a folder dedicated to documentation – “Documentation”. Inside it we add and manage all the documents that we need.
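For reference, adding and checking in such a folder with the TFVC command-line client (tf.exe) looks roughly like this – the folder name and comment are just the ones we used, adjust them to your own structure:

mkdir Documentation
tf add Documentation /recursive
tf checkin Documentation /recursive /comment:"Add Documentation folder for project documents"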

Team Size
In Visual Studio Online, with an MSDN subscription, you can add a maximum of 5 people to the team who have access to VSO for free. For each additional user you will need to pay 33.52 per month (remark: the Visual Studio license is included in this price).
For all user accounts that have an MSDN subscription, the cost is FREE (smile).
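As a quick example of how this adds up (a hypothetical team, assuming the free quota and the MSDN accounts combine as described above): a team of 8 people where 2 have MSDN subscriptions would pay nothing for those 2, nothing for the next 5 users covered by the free quota, and 33.52 per month for the one remaining user.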

Continuous Integration – Automated builds
The build configuration for a project is very simple and straightforward. It is done in the same way as for TFS on-premises. Nothing special or extraordinary. The basic build setup was done in 5 minutes, including unit tests and custom email alerts on build failure.
And now it was time to calculate the cost of automated builds. In Visual Studio Online you get 60 free build minutes every month. Everything beyond this point needs to be paid. The price is €0.0373 per minute up to 1,200 minutes and €0.0075 per minute beyond 1,200 minutes.
Surprise! We calculated how much automated builds on Visual Studio Online would cost us…
Build duration: 20 minutes
Working window per day: 10 hours
Number of builds per day: 30 builds
Number of developers: 4-7 people
Number of days per month: 23 days
Total minutes per month: 13,800 minutes
Minute cost: 60 minutes free + 1,140 minutes at €0.0373 + 12,600 minutes at €0.0075
Cost: 0 + 1,140 × 0.0373 + 12,600 × 0.0075
Total cost: €137.022 per month
This is the maximum build cost, because we configured only one build agent. A single agent can run at most 10 hours × 23 days = 13,800 minutes per month, so even if a build takes longer than 20 minutes the cost should stay the same – there are no parallel builds.
At this moment I don’t know what to say about the costs – whether they are low, high or just okay.
In my own opinion the build cost is acceptable, because we don’t need to manage the build server, and if we consume fewer minutes we pay less. In 6-8 months I will come back with some statistical data. A small sketch of the cost calculation is included below.
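For those who want to play with the numbers, here is a minimal sketch of the same calculation in Python, assuming the tiers quoted above (60 free minutes, €0.0373 per minute up to 1,200 minutes, €0.0075 per minute beyond that); the prices may change, so treat it only as an estimate:

FREE_MINUTES = 60
TIER1_LIMIT = 1200    # total minutes covered by the first paid tier (free minutes included)
TIER1_PRICE = 0.0373  # EUR per minute between 60 and 1,200 minutes
TIER2_PRICE = 0.0075  # EUR per minute beyond 1,200 minutes

def monthly_build_cost(total_minutes):
    # Split the consumed minutes over the tiers and sum the paid ones.
    tier1 = min(max(total_minutes - FREE_MINUTES, 0), TIER1_LIMIT - FREE_MINUTES)
    tier2 = max(total_minutes - TIER1_LIMIT, 0)
    return tier1 * TIER1_PRICE + tier2 * TIER2_PRICE

# Our scenario: 20-minute builds, 30 builds per day, 23 working days per month.
print(monthly_build_cost(20 * 30 * 23))   # 13,800 minutes -> ~€137.02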

Conclusion
Until now the experience with Visual Studio Online has been nice, but I am keeping my eyes on the costs.

Comments

  1. On "political issues" - I think the most difficult problem to overcome is for companies to trust and external hosting company with the source code. However, for startups and personal projects, it might be an alternative.

    Replies
    1. We didn't have any kind of issues. And take into account that we are talking about the client that you also work for (smile). In our case one of our partners preferred VS Online.

    2. I know, I was talking in general.. :)

