
Storing IoT data - Another perspective

The current trend in the IT industry, especially in the IoT world, is to store all the data produced by the devices and systems connected to a network. Storage is so cheap that companies prefer to store and archive all of this data, without asking whether it can be used now or in the future.
The main goal of this approach is to have all the information that might be needed later, for example when a system like Machine Learning or Hadoop is used. You never know which parameter or log might become relevant in the future for an insight or a trend.

Things become interesting when your devices contain more than 100 or 200 sensors each, with a sample rate of 1s or 0.1s.
Let’s take as an example a bottle manufacturer that has plants all around the globe. Each plant has 800-1000 connected devices that run 24/7.
The quantity of data produced every day can easily reach 0.2-0.5TB. Each day new data arrives at the platform in the warehouse. Collected data might be processed and stored for later use, but not all of the data is processed all the time. You might want to keep data for later use in systems like Machine Learning.
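A quick back-of-envelope check shows how these numbers add up. This is only a sketch; the device count, sensor count, and bytes-per-sample figure are assumptions chosen to match the scale described above:

```python
# Rough estimate of daily data volume for one plant.
# All figures are illustrative assumptions, not measured values.
devices_per_plant = 1000
sensors_per_device = 200
sample_interval_s = 1        # one reading per sensor per second
bytes_per_sample = 16        # timestamp + value + a little metadata

samples_per_day = devices_per_plant * sensors_per_device * (86_400 / sample_interval_s)
bytes_per_day = samples_per_day * bytes_per_sample

print(f"{bytes_per_day / 1e12:.2f} TB/day")  # → 0.28 TB/day
```

With heavier metadata per sample, or a 0.1s interval on some sensors, the figure quickly climbs toward the upper end of the 0.2-0.5TB range.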

The question that should pop into our mind is:
Do we need to store all the data that a plant produces?

At machine and gateway level, it might be very important to collect metrics at a 0.01s or 0.05s time interval. But at plant level, this can be irrelevant; a 0.5s or 1s time interval can be more than we need. This means we have already reduced the size of the data produced by a device in a plant by a factor of 5-10x.
When we look at the global level, the information produced by devices is not relevant at second granularity. Not only this, but not all metrics are useful outside the plant. This means that from 100 or 200 counters we can end up with only half of them being relevant at global level.
As a result, the amount of data we store at global level decreases drastically, but at the same time we keep all the relevant information that we might use in the future.
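One simple way to apply this reduction is to downsample readings before forwarding them to the next level. The sketch below is an assumption about how such a step could look; the window-mean aggregate is one choice among many (min/max or last-value would work just as well):

```python
from statistics import mean

def downsample(samples, factor):
    """Reduce a list of (timestamp, value) samples by `factor`,
    keeping the first timestamp and the mean value of each window."""
    out = []
    usable = len(samples) - len(samples) % factor  # drop a trailing partial window
    for i in range(0, usable, factor):
        window = samples[i:i + factor]
        out.append((window[0][0], mean(v for _, v in window)))
    return out

# Ten readings at 0.1s intervals collapse into one 1s reading (factor 10).
raw = [(i / 10, 20.0 + i * 0.1) for i in range(10)]
plant_level = downsample(raw, 10)  # one averaged sample instead of ten
```

Applying a factor of 10 at the gateway and dropping half of the counters at global level gives exactly the kind of 5-10x (and more) reduction described above.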

This approach doesn’t mean that we need a complex processing system at plant level. On-premises or in the cloud, the system will look the same. It is our decision when and where we want to store data - at gateway, plant or global level. We could even have the gateway in the cloud.

As an example, we could store all the information produced by devices at gateway level for 7 days. All the information older than 7 days is deleted automatically. The gateway can have a virtual plant in the cloud, where all data is stored for 1 year. At these two levels the time interval of collected metrics is very low (0.1s and 1s). From there, the information is moved into the global repository, which stores only a subset of the counters at a 1s time interval. This data can later be used for analytics and prediction.
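The tiered layout above could be expressed as a small retention policy. This is a sketch under assumptions: the tier names, intervals, and 7-day/1-year periods mirror the example, while the global retention period and the counter names in the filter are hypothetical placeholders:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Tier:
    name: str
    sample_interval_s: float             # granularity kept at this tier
    retention_days: int                  # data older than this is deleted
    counter_filter: Optional[Callable]   # None = keep all counters

# The three storage levels from the example above.
tiers = [
    Tier("gateway", 0.1, 7, None),          # full detail, 7 days
    Tier("plant",   1.0, 365, None),        # 1s samples, 1 year
    Tier("global",  1.0, 10 * 365,          # assumed long retention,
         lambda c: c in {"temperature", "pressure"}),  # relevant counters only
]

def keep(tier, counter, age_days):
    """Decide whether a sample belongs in a tier's store."""
    if age_days > tier.retention_days:
        return False
    return tier.counter_filter is None or tier.counter_filter(counter)
```

A periodic job at each level can then call `keep` to prune old samples and decide which counters get forwarded upstream.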

As we can see, there are different possible approaches. There is not always a need to store all the data produced by the devices forever. We can filter the data or decide what the sample rate for each counter should be.



