
Posts

Showing posts from October, 2016

Where can I back up my content - Azure File Storage

Nowadays taking pictures is so easy that you can easily end up with 20-30 shots per day. After a few years of using external HDDs as backups, I realized it was time for a change. Until now I used OneDrive to back up pictures from my phone, plus three external HDDs kept in mirror. This setup works great as long as you have enough storage on OneDrive and on your external HDDs. But now my OneDrive storage is 99.8% full (out of 30GB) and the external drives are almost full as well. What do I need? I looked for a solution that is cheap, can be accessed from anywhere, and doesn't require installing custom software on my machine. The ideal solution would let me attach the storage as a partition. Buying external HDDs is not a solution for me anymore. I would prefer a cloud storage provider that can offer reliable storage for a good price. Azure Offer An interesting offer comes from Microsoft Azure, more exactly from Azure File Storage. The s
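As a rough illustration of what the File Storage option looks like from code, here is a minimal sketch (my own, not from the post) that uploads a photo to an Azure file share with the WindowsAzure.Storage client library; the connection string, share name and file paths are placeholders.

using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

class BackupSample
{
    static void Main()
    {
        // Placeholder connection string, taken from the storage account's access keys.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...");
        var fileClient = account.CreateCloudFileClient();

        // A file share behaves like a network drive and can also be mounted over SMB.
        var share = fileClient.GetShareReference("photo-backup");
        share.CreateIfNotExists();

        var rootDir = share.GetRootDirectoryReference();
        var cloudFile = rootDir.GetFileReference("2016-10-01-001.jpg");

        // Upload a local picture to the share.
        using (var stream = File.OpenRead(@"C:\Photos\2016-10-01-001.jpg"))
        {
            cloudFile.UploadFromStream(stream);
        }
    }
}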

Let’s be inventive on how we can process and collect data

Let's discover together another approach to collecting and transforming the information that is sent by devices. Context In the world of smart devices, devices are more and more chatty. Let's assume that we have a smart device that needs to send a Location Heartbeat every 30 seconds containing <device ID, GPS location, time sample>. Worldwide we have 1,000,000 devices that send this information to our backend. At the global level, the backend runs in 4 different Azure Regions, with an equal distribution of devices. This means that each instance of our backend serves 250,000 devices that send heartbeats with their locations. From a load perspective, it means that each Azure Region receives around 8,300 heartbeat requests every second. 8K messages per second may or may not be an acceptable load; it depends on what actions we need to take for each request. Requirements From the end user there are two clear requirements that need to be fulfilled: Full history of device
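To make the numbers concrete, here is a small sketch (not from the original post) of what such a heartbeat payload could look like in C#, together with the back-of-envelope load calculation: 1,000,000 devices / 4 regions = 250,000 devices per region, and 250,000 / 30 seconds ≈ 8,333 requests per second.

using System;

// Hypothetical shape of the Location Heartbeat described above.
public class LocationHeartbeat
{
    public string DeviceId { get; set; }
    public double Latitude { get; set; }
    public double Longitude { get; set; }
    public DateTime TimeSample { get; set; }
}

class LoadEstimate
{
    static void Main()
    {
        const int totalDevices = 1000000;     // worldwide
        const int regions = 4;                // Azure Regions
        const int heartbeatIntervalSec = 30;  // one heartbeat per device every 30 seconds

        int devicesPerRegion = totalDevices / regions;                              // 250,000
        double requestsPerSecond = (double)devicesPerRegion / heartbeatIntervalSec; // ~8,333

        Console.WriteLine("Devices per region: " + devicesPerRegion);
        Console.WriteLine("Heartbeats per second per region: " + requestsPerSecond.ToString("F0"));
    }
}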

Windows Docker Containers can make WIN32 API calls, use COM and ASP.NET WebForms

After the last post, I received two interesting questions related to Docker and Windows. People were interested whether we can make Win32 API calls from a Docker container and whether there is support for COM. WIN32 Support To test calls to the WIN32 API, let's try to populate the SYSTEM_INFO struct.

[StructLayout(LayoutKind.Sequential)]
public struct SYSTEM_INFO
{
    public uint dwOemId;
    public uint dwPageSize;
    public uint lpMinimumApplicationAddress;
    public uint lpMaximumApplicationAddress;
    public uint dwActiveProcessorMask;
    public uint dwNumberOfProcessors;
    public uint dwProcessorType;
    public uint dwAllocationGranularity;
    public uint dwProcessorLevel;
    public uint dwProcessorRevision;
}
...
[DllImport("kernel32")]
static extern void GetSystemInfo(ref SYSTEM_INFO pSI);
...
SYSTEM_INFO pSI = new SYSTEM_INFO(
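For reference, here is a self-contained sketch of the same GetSystemInfo call; it is my reconstruction of the idea rather than the full code from the post, and it declares the pointer-sized fields as IntPtr/UIntPtr so the struct layout is also correct in a 64-bit process.

using System;
using System.Runtime.InteropServices;

class Win32Sample
{
    // SYSTEM_INFO with pointer-sized fields declared as IntPtr/UIntPtr
    // so the marshaled layout matches the native struct on 32-bit and 64-bit.
    [StructLayout(LayoutKind.Sequential)]
    public struct SYSTEM_INFO
    {
        public uint dwOemId;
        public uint dwPageSize;
        public IntPtr lpMinimumApplicationAddress;
        public IntPtr lpMaximumApplicationAddress;
        public UIntPtr dwActiveProcessorMask;
        public uint dwNumberOfProcessors;
        public uint dwProcessorType;
        public uint dwAllocationGranularity;
        public ushort dwProcessorLevel;
        public ushort dwProcessorRevision;
    }

    [DllImport("kernel32")]
    static extern void GetSystemInfo(ref SYSTEM_INFO pSI);

    static void Main()
    {
        var pSI = new SYSTEM_INFO();
        GetSystemInfo(ref pSI);

        // Print a couple of the populated fields to verify the call worked.
        Console.WriteLine("Processors: " + pSI.dwNumberOfProcessors);
        Console.WriteLine("Page size:  " + pSI.dwPageSize);
    }
}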

Run native .NET application in Docker (.NET Framework 4.6.2)

Scope The main scope of this post is to see how we can run a legacy application written in .NET Framework in Docker. Context First of all, let's define what a legacy application is in our context. By a legacy application we mean an application that runs on .NET Framework 3.5 or higher in a production environment where we no longer have the people or the documentation that would help us understand what is happening behind the scenes. In such scenarios, you might want to migrate the current solution from a standard environment to Docker. There are many advantages to such a migration, like: Continuous Deployment, Testing, Isolation, Security at container level, Versioning Control, Environment Standardization. Until now, we didn't have the possibility to run a full .NET Framework application in Docker. With .NET Core there was Docker support, but migrating from the full .NET Framework to .NET Core can be costly and even impossible. Not only because of lack of features, but
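As a rough sketch of what such a migration can look like (my own example, not the one from the post), a Windows container image for an ASP.NET application built on the full .NET Framework can be described with a short Dockerfile; the microsoft/aspnet base image name and the publish folder path are assumptions based on the images available at the time.

# Windows Server Core base image with IIS and the full .NET Framework (image name as of 2016).
FROM microsoft/aspnet

# Copy the already-published ASP.NET application into the default IIS site.
COPY ./publish/ /inetpub/wwwroot

The image would then be built with "docker build -t legacy-app ." and started with "docker run -d -p 80:80 legacy-app" on a Windows host with Windows containers enabled.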

Vampires | How to clean your Azure Subscription

Playing with and testing different scenarios and use cases in Azure can generate a lot of ‘garbage’ under your subscriptions. Vampires When I say garbage I’m referring to resources that you create but forget to remove. Don't think of resources like Storage, Web Apps or Worker Roles; many times, when you remove things in an odd order, you can end up with resources that are still allocated but that you no longer use and forget about. For example, the storage of a VM, or a Traffic Manager that was used for a Web App. I call this kind of resources 'vampires'. Why? They consume a small amount of money every month. Even though the value is not high, you can end up at the end of the month with $10 or $20 spent on them. They are like the electrical vampires that we have in our houses – a TV in standby, a phone/tablet charger, an audio system. Best Practice The best practice is very clear and simple: always remove the resources that you no longer use. But, sometime
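One possible way to hunt for these vampires (a sketch of my own, assuming the 2016-era AzureRM PowerShell module) is to list every resource in the subscription grouped by resource group and then delete the resource groups that are clearly leftovers.

# List everything in the subscription so forgotten resources become visible.
Login-AzureRmAccount
Get-AzureRmResource |
    Sort-Object ResourceGroupName |
    Format-Table ResourceGroupName, Name, ResourceType

# Once a leftover resource group is identified, removing it removes everything inside it.
Remove-AzureRmResourceGroup -Name "my-old-test-rg" -Force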

Team Services & TFS - Release Management and Multiple Environment Support

I was impressed to discover a new feature in Team Services and TFS 2015 - Release Management. Release Management allows us two things that can improve our release procedures: define multiple environments and push the binaries (content) from one environment to another, and request manual approval before pushing the packages from one environment to another. Environments Definition Once CI finishes the build, we can specify to what environment we want to push the output. Once the content is pushed to a specific environment, we can run any kind of steps, from simple ones like Integration Tests to more complex ones where human interaction is required. Team Services lets us define one or more release definitions. Each release definition represents the tasks that need to be run in each environment. It can be a simple script that changes a version, or one that adds/removes content from the build. In the end it can be any action that needs to be executed when we deploy content to a sp

MVP for another year

Happy to be part of the Microsoft MVP program for another year. I received the news in Atlanta, during the Microsoft Ignite Conference, where I was excited to get this great news. Proud and happy to be part of a group of people that is open and ready to support online and local communities with their technical expertise. As in the last year, I plan to sustain the local community. On top of this I have two new initiatives on my radar: one blog post per week reviewing an Azure service (more details next week), and more activity in the online communities