
[IoT Home Project] Part 8 - Connecting to Azure Function and to a virtual heat pump

In this post we will discover how to:
  • Push content from Azure Stream Analytics to Azure Service Bus
  • Write an Azure Function in Node.js that fetches data from an Azure Service Bus Topic and pushes it to Azure Table storage
  • Develop an ASP.NET Core application that plays the role of a heating system, starting/stopping the heating in a house
Story:
  • Use temperature data collected from sensors connected to a Raspberry Pi to start/stop the heating system of a house
Previous post: [IoT Home Project] Part 7 - Read/Write data to device twin
GitHub source code: https://github.com/vunvulear/IoTHomeProject 

Push content from Azure Stream Analytics to Azure Service Bus
This step is the simplest one. We just need to add a new output to Stream Analytics and specify the Service Bus Topic where we want to push data. After this, we need to update the Stream Analytics query by adding:
SELECT 
    *
INTO 
    outputSensorDataTopic
FROM
    avgdata
where 'outputSensorDataTopic' is the name of the output that we created a few seconds ago.

Remarks: Don't forget to create a subscription for the topic. I send all the information to the topic (not only temperature) because I plan to add filters to each subscription in the future and route data based on these parameters.
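To make the routing idea concrete, here is a minimal sketch of the kind of per-subscription filtering I have in mind, written as a plain predicate. This is an assumption about the future design: in Service Bus itself this would be a SQL filter rule attached to the subscription and evaluated server-side, not JavaScript code.

```javascript
// Sketch (assumption): route only messages that actually carry an average
// temperature reading. In Service Bus this logic would live in a SQL filter
// rule on the subscription, not in application code.
function isTemperatureMessage(message) {
    return typeof message.avgtemp === 'number';
}

console.log(isTemperatureMessage({ avgtemp: 21.5 })); // true
console.log(isTemperatureMessage({ humidity: 40 }));  // false
```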

Azure Function to push data from Service Bus Topic to Azure Table
Off-topic: this step is artificial in our case; we could forward content directly to Azure Table storage from Stream Analytics. I did it this way because the main goal is to play with as many Azure services as possible. In addition, inside the Azure Function we could aggregate information from other sources to decide whether to start/stop the heating system.

Now, we will create an Azure Table in our storage account with two rows that contain the current temperature and the desired temperature. We could also add another row containing a flag that tells the system to start/stop the heating, but for now we will not add it.
The table should look like this (one row per value, both under the same partition):

    PartitionKey    RowKey                Status
    system          currenttemperature    (latest average temperature)
    system          desiredtemperature    (target temperature)
From the Azure Portal, when we create the Function itself we specify Service Bus Topic as the trigger. For the output we specify Azure Table storage. At this step you need to provide the connection information for Service Bus and Azure Table storage. Once you have done this, you are ready to write your first Azure Function in Node.js.
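Behind the portal UI, this wiring ends up in the function's function.json. Below is a rough sketch of what it may look like; the topic name, subscription name, and connection setting names are illustrative assumptions — only the binding names ('myQueueItem', 'outputtable'), the trigger/output types, and the 'systemstatus' table come from this post.

```json
{
  "bindings": [
    {
      "type": "serviceBusTrigger",
      "direction": "in",
      "name": "myQueueItem",
      "topicName": "sensordata",
      "subscriptionName": "functionsubscription",
      "connection": "ServiceBusConnection"
    },
    {
      "type": "table",
      "direction": "out",
      "name": "outputtable",
      "tableName": "systemstatus",
      "connection": "StorageConnection"
    }
  ]
}
```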
module.exports = function (context, myQueueItem) {
    // Write the average temperature to the 'outputtable' output binding.
    // Note that 'myQueueItem' is case sensitive and the table binding
    // expects 'partitionKey'/'rowKey' casing.
    context.bindings.outputtable = {
        "partitionKey": "system",
        "rowKey": "currenttemperature",
        "status": myQueueItem.avgtemp
    };

    context.log('Avg. temp: ', myQueueItem.avgtemp);
    context.done();
};
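Before deploying, the function can be sanity-checked locally by invoking it with a stubbed context. The stub below is an assumption for testing purposes only — it mimics just the pieces of the real Azure Functions runtime context that the function touches (bindings, log, done).

```javascript
// Hand-rolled stand-in for the Azure Functions runtime context (assumption:
// only bindings/log/done are needed by the function under test).
var handler = function (context, myQueueItem) {
    context.bindings.outputtable = {
        partitionKey: "system",
        rowKey: "currenttemperature",
        status: myQueueItem.avgtemp
    };
    context.log('Avg. temp: ', myQueueItem.avgtemp);
    context.done();
};

var context = {
    bindings: {},
    log: function () {},                        // swallow logs in the test
    done: function () { this.completed = true; } // mark completion
};

handler(context, { avgtemp: 21.5 });
console.log(context.bindings.outputtable.rowKey); // currenttemperature
console.log(context.bindings.outputtable.status); // 21.5
```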

Surprise! Even though the function is written by the book, an error pops up. This happens because, at this moment, Azure Functions only allows us to add new rows to an Azure Table through the output binding. Updating an existing one is not possible.
The good news is that there is a small hack we can use. Nothing blocks us from installing the Azure Storage module from npm and using it directly inside our function. To do this, you'll need to access the Kudu portal of your Azure Function app. The URL is https://[FunctionAppName].scm.azurewebsites.net/ and you authenticate with your Azure credentials.
From there, go to 'Debug Console' and run 'npm install azure-storage'. Once the module is installed you can make any call to Azure Storage from Node.js.
process.env["AZURE_STORAGE_ACCOUNT"] = "vunvuleariotstorage";
process.env["AZURE_STORAGE_ACCESS_KEY"] = "@@@";
process.env["AZURE_STORAGE_CONNECTION_STRING"] = "@@@";


module.exports = function (context, myQueueItem) {
    var azure = require('azure-storage');
    var entGen = azure.TableUtilities.entityGenerator;
    // Picks up the account/key from the environment variables set above.
    var tableSvc = azure.createTableService();

    tableSvc.createTableIfNotExists('systemstatus', function (error, result, response) {
        if (!error) {
            // Entity properties go through the entity generator so they
            // carry their Edm type information.
            var entityToUpdate = {
                PartitionKey: entGen.String('system'),
                RowKey: entGen.String('currenttemperature'),
                Status: entGen.String(myQueueItem.avgtemp.toString())
            };

            // insertOrReplaceEntity updates the row if it already exists.
            tableSvc.insertOrReplaceEntity('systemstatus', entityToUpdate, function (error, result, response) {
                if (!error) {
                    context.log('Avg. temp: ', myQueueItem.avgtemp);
                }
                // Signal completion only after the async write finishes.
                context.done();
            });
        } else {
            context.done();
        }
    });
};

I preferred to use process environment variables to specify the storage connection information. There is another way to do this, by passing this data directly when calling 'createTableService'. There is no error handling, this being sample code.

ASP.NET Core application that plays the role of the heating system
There is not too much to say about it. It is a simple web application that displays the minimum temperature, the current temperature, and the heating system status.
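The decision the heating-system app makes from the two table rows can be sketched in a few lines. This is a simplification under my own assumptions — the real app is an ASP.NET Core web application, and the function name below is illustrative; JavaScript is used here only for consistency with the rest of the post.

```javascript
// Sketch (assumption): start the heating while the current temperature is
// below the desired one, stop it otherwise.
function heatingStatus(currentTemp, desiredTemp) {
    return currentTemp < desiredTemp ? 'start' : 'stop';
}

console.log(heatingStatus(19, 22)); // start
console.log(heatingStatus(23, 22)); // stop
```

A real system would likely add hysteresis (a small dead band around the target) so the heater does not toggle on every reading, but that is beyond this sample.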

Remember:

  • There is not yet full support for all Azure Storage actions from Azure Functions
  • You can load any module into an Azure Function using npm
  • You can add or remove inputs and outputs of a Stream Analytics job as you wish
