
Posts

Showing posts from September, 2017

Microsoft Ignite at a glance (Orlando, FL, 2017)

What a week! The last week of September was crazy for everyone working with the Microsoft stack. The biggest Microsoft conference took place in Orlando, Florida. More than 25,000 people attended Microsoft Ignite this year and, as usual, it was an event with all the tickets sold out. There were many announcements that make Microsoft a strong player for today's needs, but there is also a clear vision of where they want to go. Not only that, but it seems that the road is already defined and clear. The current needs of the market are covered by Microsoft with Azure Stack, which offers good ground for hybrid solutions. We can now use the core services of Microsoft Azure not only in Azure, but also on our on-premises infrastructure using Azure Stack. What is even more interesting from a DevOps and IT perspective is that you have the same experience, the same dashboard, and you use the same scripts (no change is required). Mixed reality and AI are now closer to the field. Many…

Why you should never put a junior in a team that is stressed and bad-mannered

There is nothing abnormal about a team being stressed. During those times they might not have time for anything else. The focus of the team is to deliver the business requirements in the IT solution. Unfortunately, these periods can last longer than expected, even if we know that this is not good. I have seen teams spend more than 18 months in a phase like this. After a while in this state, they do not even realize what state they are in, or what the impact is at the project and people level. In this post, I will focus on the impact that such a phase can have on juniors and mid-levels in unhealthy teams. Why? When you are a junior, you are at a moment in your career when you want to learn. You know that you have things to learn; you are usually a fresh graduate with good theoretical knowledge and you want to put it into practice. I like to compare smart juniors with birds that have big and powerful wings but do not yet know how to fly very well. They can reach the sky and accomplish…

Azure Blob Storage - More storage and throughput

One of the core services of Microsoft Azure is Azure Storage, which is used to store binary content, key-value pairs (Azure Tables) or message queues (Azure Queues). In today's post, we will see how a small change in Azure Storage capabilities changes our life and simplifies our IT solutions. Current solutions: Until now, the maximum capacity of Azure Blob Storage was 500TB per account. Even if this might sound like a lot, there are multiple cases when you have to overcome this limit. If you have a system where devices and users upload content, you can easily reach 2-5TB per day, which would force you to use a new Azure Storage account every 3 months. To overcome this limitation, your solution needs to be able to manage Azure Storage accounts automatically. Besides being able to clean and archive content automatically, you will need a system that can create a storage account on the fly and redirect traffic to it. When you use multiple Storage Accounts, you are…
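When a single account can no longer hold all the content, one common workaround is to spread uploads across a pool of accounts and rotate between them. Below is a minimal sketch of that idea in Python; the account names and the quarterly rotation rule are purely illustrative, and in a real solution the accounts would be created on the fly through the Azure management API.

```python
from datetime import date

# Hypothetical pool of pre-provisioned storage account names; in a real
# solution these would be created automatically via the management API.
STORAGE_ACCOUNTS = ["contosostore01", "contosostore02", "contosostore03"]

def pick_storage_account(upload_day: date) -> str:
    """Rotate to a different storage account every quarter so that no
    single account approaches the capacity limit."""
    quarter_index = upload_day.year * 4 + (upload_day.month - 1) // 3
    return STORAGE_ACCOUNTS[quarter_index % len(STORAGE_ACCOUNTS)]

if __name__ == "__main__":
    # Uploads made in Q3 2017 would all land in the same account.
    print(pick_storage_account(date(2017, 9, 28)))
```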

Less than 1 week until Microsoft Ignite 2017

The first time I took part in Microsoft TechEd was in 2012 in Amsterdam. It was one of my first conferences with more than 5k attendees. It was a wow, from every perspective. Since then I have participated in every TechEd and Microsoft Ignite. At Microsoft Ignite, attendees have the opportunity not only to learn and discover new stuff, but also to meet people from all around the globe. It is that week of the year when you can meet face to face with Program Managers from Microsoft, together with the people you talk to over Twitter, from Japan, Australia, the UK and the USA, all in one place. This year things will be a little different. It will be the first time I participate at Microsoft Ignite not only as an attendee, but also as a speaker. It is a joy to be invited to speak at a conference with more than 23,000 attendees. If this is not enough, I will have 3 sessions where I will share my knowledge and experience related to IoT, security and NoSQL. If you want to find out more about these subjects, feel free to join my sessions…

Is security and data privacy important for tracker devices like Fitbit?

A few days ago, I read about how insecure Fitbit devices are. There was a lot of noise created around it, explaining different ways you can hack a Fitbit device to gain access to personal data. My first reaction when I saw the title of the article was “So what!?”, and let me explain why I don’t see this as life-threatening or as something that will stop me from using my Fitbit. Personal data: It is true that a tracker contains personal data, but let us be realistic and look at what data it actually has. Most trackers contain information related to your past activity, heart rate, number of steps and, in some cases, GPS data. Except for the GPS data, the rest is not that sensitive. What do you think a hacker can do if he knows that you did 10k steps this morning? Yes, he might learn your habits and break into your house while you are jogging or walking the dog. This scenario can be real, but the truth is that there are so many other ways to find out what your habits are that you would be…

The scope of a PoC

Let us talk about what the scope of a PoC should be and what you should or should not have in a PoC. Purpose of a PoC: First, we need to define what the purpose of a PoC is. The main purpose is to demonstrate the principles that are covered in the technical documents (to show that it is not just theory and diagrams). Reusability: It is already a déjà vu for me to hear people say that they want to reuse the PoC output in the main project. This happens because many times the PoC scope is too big and does not cover only the ideas that need to be demonstrated. When you have a PoC that covers more than 15% of the implementation effort, you might have a problem. That is not a PoC anymore, it is a PILOT, which represents a system with limited functionality that goes into production. The Pilot might have many restrictions, from NFRs to the business use cases that are covered, but it has some part that works. You never want to invest more in a PoC than is necessary, and you should always push…

Containerization without a microservices approach

The current trends are clear: we should develop software applications using only a microservices approach. This sounds good for new applications, where the system requirements guide us towards microservices. But what happens with the other types of systems? We might need to develop a normal web application, with some backend processing behind it. No crazy NFRs, no need to scale to 100,000 RPS or similar stuff. Monolithic application: As an example, let us imagine that we need to develop a web application that resizes our pictures to Instagram size (1x1). There are no special requirements related to availability or scalability, and the load on the system is low. The system is used just by our employees (fewer than 5,000) for company images that need to be published on commercial web sites. Of course, we can imagine a state-of-the-art microservices implementation, with different services that scale by themselves. What if we do not need something like this, but it is very appealing…
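To give a feeling of how small the core of such a monolith can be, here is a minimal sketch of the 1x1 resize step in Python; it assumes the Pillow package is available, and the file names and output size are illustrative.

```python
from PIL import Image  # assumes the Pillow package is installed

def to_instagram_square(src_path: str, dst_path: str, size: int = 1080) -> None:
    """Center-crop an image to a 1x1 aspect ratio and resize it."""
    with Image.open(src_path) as img:
        side = min(img.size)
        left = (img.width - side) // 2
        top = (img.height - side) // 2
        square = img.crop((left, top, left + side, top + side))
        square.resize((size, size), Image.LANCZOS).save(dst_path)

if __name__ == "__main__":
    # Illustrative file names only.
    to_instagram_square("company-banner.jpg", "company-banner-1x1.jpg")
```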

List of IPs used by each Azure Resource (service)

It is not uncommon to configure the firewall and other security and control mechanisms like User Defined Routes (UDR) and NSGs (Network Security Groups) to restrict access to your Azure Resources. The moment we want to do such a thing, we need to know the IPs that are used by the Azure infrastructure. Let’s take as an example a web application that is hosted inside App Service (using VNETs, Traffic Manager, Azure Storage, Azure SQL and many more). To be able to properly configure the access rules, we need to know the IPs used by Azure Storage and Azure SQL in that region, the Traffic Manager IPs used for probing, and so on. Azure Region IP Range: Most of this information can be found in an XML file provided by Microsoft ( https://www.microsoft.com/en-us/download/details.aspx?id=41653 ), but I expect this will not be enough. Inside the document you will find the IP ranges used by each Azure Region, but without a tag that specifies which IP ranges are used by each Azure Resource…
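As a quick illustration, here is a minimal Python sketch that lists the IP ranges per region from that file once it has been downloaded locally; it assumes the XML keeps its Region/IpRange layout, and the local file name is illustrative.

```python
import xml.etree.ElementTree as ET

def ranges_per_region(xml_path: str) -> dict:
    """Return {region name: [subnet, ...]} from the Azure Datacenter IP
    Ranges XML file (assumed <Region Name="..."><IpRange Subnet="..."/> layout)."""
    tree = ET.parse(xml_path)
    result = {}
    for region in tree.getroot().findall("Region"):
        result[region.get("Name")] = [r.get("Subnet") for r in region.findall("IpRange")]
    return result

if __name__ == "__main__":
    # "PublicIPs.xml" is a placeholder for the downloaded file name.
    for region, subnets in ranges_per_region("PublicIPs.xml").items():
        print(region, len(subnets), "ranges")
```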

Is the RDP connection open by default for VMs inside Azure?

I saw a discussion on Twitter related to Azure VMs and RDP connections that are open by default. The main purpose of this topic is to present different use cases where the RDP connection is (or is not) available by default. Use Case 1: Single VM (VM with Public IP inside a default VNET) – RDP active by default for public access. In this context, we have a VM that is created from the Azure Portal (or a script) as a single entity. It is not part of any scale set or other type of custom configuration. It is just a simple Windows Server 2016 Datacenter machine, which is part of a default VNET with a Public IP allocated to it. In this case, RDP will be configured by default. The default Network Security Group (NSG) that is created together with our VM will allow RDP connections to the machine. The default VNET allows RDP connections to our VM because there are no custom NSG rules to restrict it and we have a Public IP attached to our VM. Use Case 2: Single VM (VM without Public IP in…
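For reference, the rule that opens RDP in this scenario roughly looks like the structure below, expressed here as a plain Python dictionary; the property names mirror the ARM securityRules schema, while the rule name and priority shown are only illustrative.

```python
import json

# Sketch of an inbound NSG rule that allows RDP to the VM; the name and
# priority are illustrative, the property names follow the ARM schema.
default_allow_rdp = {
    "name": "default-allow-rdp",
    "properties": {
        "protocol": "Tcp",
        "sourcePortRange": "*",
        "destinationPortRange": "3389",
        "sourceAddressPrefix": "*",
        "destinationAddressPrefix": "*",
        "access": "Allow",
        "priority": 1000,
        "direction": "Inbound",
    },
}

if __name__ == "__main__":
    print(json.dumps(default_allow_rdp, indent=2))
```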

Configure Traffic Manager and Web Apps over SSL (HTTPS) using custom domain

In this topic we will cover what to do when we configure Azure Traffic Manager on top of Web Applications hosted inside App Services, over HTTPS, with a custom domain and client certificates. Context: When we use HTTPS in combination with App Services, everything goes smoothly. You just need to activate HTTPS and upload the client certificate, if you want to use a custom one. Things are a little different when you want to configure HTTPS on top of Traffic Manager. In theory, the steps are clear and it should work as expected, but combine this with a custom domain and client certificates and things can end up with a 404 error code. Initial Setup. Prerequisite: the Web Apps are configured over SSL using the custom domain and work as expected. Let’s take a look at the base steps that need to be done for such a configuration: create an instance of Traffic Manager inside the Azure Portal and add the Web Apps that you already configured for HTTPS, then add your custom domain…
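Before touching Traffic Manager, it is worth confirming which certificate the endpoint actually serves for the custom domain, since a mismatch there is a common source of errors. Below is a minimal Python sketch using only the standard library; the host name is a placeholder.

```python
import socket
import ssl

def peer_certificate_subject(host: str, port: int = 443) -> dict:
    """Open a TLS connection to the host and return the subject of the
    certificate it presents (SNI is taken from server_hostname)."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return dict(pair[0] for pair in tls.getpeercert()["subject"])

if __name__ == "__main__":
    # Placeholder custom domain; replace with the domain mapped to the Web App.
    print(peer_certificate_subject("www.example-custom-domain.com"))
```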