Using the Sensor Emulator to Apply Custom Methods in DT Studio

Introduction

Designed to be a simple yet powerful interface for our customers to interact with and organize their sensors and related data, Disruptive Technologies (DT) Studio aims to get you up and running quickly when buying our products. With its customizable dashboard, multiple sensors can be monitored and compared in real time, and notifications can be configured to trigger on certain events.

We continuously improve and implement new features for our users to interact with and explore their data. Many requested features are, however, quite user-specific and might take some time before receiving official support. Fortunately, DT Studio provides users with tools to tailor their own custom solutions, allowing quick prototyping and experimentation with data.

In this application note, custom functionality is added to DT Studio to model the core temperature of food within a fridge using only the ambient air temperature. By utilizing tools already built into DT Studio, the aim is to inspire users to experiment with their own data to extract even more information. An example code repository has also been provided to get you started.

Modeling Temperature Inertia

Those interested in the details surrounding the temperature model applied in this application note may refer to our article on Modeling Fridge Content Temperatures. In summary, Newton's Law of Cooling, a well-established mathematical model, can be applied to the ambient air temperature inside a fridge to estimate the contents' core temperature. By configuring notifications to be triggered by the modeled temperature instead of the ambient temperature measured directly by a sensor, alarm fatigue caused by sharp temperature spikes from opening and closing the fridge door may be reduced.
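For reference, a minimal discrete-time sketch of such a model in Python could look as follows. The function and parameter names here are illustrative, not necessarily those used in the example repository.

import math

def inertia_model(t_core: float, t_ambient: float, dt: float, k: float) -> float:
    """Advance the modeled core temperature one step using Newton's Law of Cooling.

    t_core    -- previous modeled core temperature [deg C]
    t_ambient -- latest ambient temperature sample [deg C]
    dt        -- time elapsed since the previous sample [s]
    k         -- heat transfer coefficient [1/s]
    """
    # The core temperature decays exponentially toward the ambient temperature.
    return t_ambient + (t_core - t_ambient) * math.exp(-k * dt)

A small heat transfer coefficient yields a slow-moving core estimate that shrugs off brief door-opening spikes, which is exactly the behavior we want for notifications.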

Implementation Overview

When visualizing sensor data in DT Studio, the information is fetched directly from DT Cloud. Customers cannot influence or modify the chain of events in this transfer. Therefore, to apply our custom function, we use a Data Connector to forward each new sensor event to a cloud function, where the data can be processed freely. For sensors provided with a certain label and value, an emulated twin device is created to contain the modeled data.

Each event is sent as an HTTP POST request which, on arrival, spawns a new compute instance containing an implementation of our model in Python. Once the modeled temperature value has been calculated, the data is sent back to the emulated twin in DT Studio using the REST API, where it can be plotted or used for notifications. Figure 3 shows a simplified overview of the implementation flow.
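As a minimal sketch, the entry point of such a cloud function might look like the following. The name main matches the --entry-point flag used during deployment later; the rest of the structure is illustrative.

def main(request):
    """HTTP entry point invoked by the Data Connector's POST request.

    `request` is a flask.Request object provided by the Cloud Functions
    Python runtime.
    """
    # 1. Verify the request signature (see the Data Connector section below).
    # 2. Parse the forwarded event.
    event = request.get_json(silent=True)
    if event is None:
        return ("Bad Request", 400)

    # 3. Run the temperature model on the new sample, then
    # 4. push the result to the emulated twin via the REST API.
    # ... model and forwarding logic ...
    return ("OK", 200)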

This flow can, in practice, be reused for any number of different solutions. Whether implementing a simple moving average, an enveloping function, or some other model, everything except the temperature model step can essentially remain unchanged. This allows for easy copy-and-paste expansion of more features should that be necessary, as sketched below.
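For instance, a hypothetical drop-in replacement of the model step with a moving average could look like this. Note that module-level state only survives per warm instance in a serverless environment; any state carried between events must be persisted externally, for example by reading it back from the emulated twin's event history.

from collections import deque

# Hypothetical drop-in model: a moving average over the last 10 samples.
# Only persists per warm instance; persist state externally in production.
window: deque = deque(maxlen=10)

def moving_average_model(t_ambient: float) -> float:
    """Return the mean of the most recent ambient temperature samples."""
    window.append(t_ambient)
    return sum(window) / len(window)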

Serverless Data Processing

Google Cloud has been chosen as the cloud platform on which our function is hosted. Any of the large cloud service providers could be used, and it is also possible to host the function locally. However, Google Cloud offers a free trial and a very easy-to-use CLI for deployment, which saves a lot of hassle. The service also scales dynamically with load, allowing any number of sensors to be modeled simultaneously.

Example Code Repository

An example implementation is provided on the official DT GitHub page for free under the MIT license. It can be accessed by following this link. It contains the Python code necessary to receive, authenticate, and model the temperature data in real time. You are encouraged to understand and modify this code to suit your needs.

Deploying the Function

Before the function can be deployed, which is done from the repository root, the environment variables needed to authenticate the request must be set. Create a file .env.yaml, in which you set the following variables.

SERVICE_ACCOUNT_KEY_ID: <YOUR_SERVICE_ACCOUNT_KEY_ID>
SERVICE_ACCOUNT_SECRET: <YOUR_SERVICE_ACCOUNT_SECRET>
SERVICE_ACCOUNT_EMAIL: <YOUR_SERVICE_ACCOUNT_EMAIL>
DT_SIGNATURE_SECRET: <YOUR_DATACONNECTOR_SIGNATURE>
AUTH_ENDPOINT: https://identity.disruptive-technologies.com/oauth2/token
API_URL_BASE: https://api.disruptive-technologies.com/v2
EMU_URL_BASE: https://emulator.disruptive-technologies.com/v2

The values for the Service Account key ID, secret, and email are provided when creating your Service Account and are explained in the next section. The signature secret should be a strong and unique password of your choice, used to validate the content of the Data Connector's requests. After the variables are set, the function can be deployed to Google Cloud.
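Once deployed with the --env-vars-file flag, these values are exposed to the function as ordinary environment variables. A minimal sketch of reading them in Python:

import os

# The deployed function reads the variables set in .env.yaml from its environment.
SERVICE_ACCOUNT_KEY_ID = os.environ["SERVICE_ACCOUNT_KEY_ID"]
SERVICE_ACCOUNT_SECRET = os.environ["SERVICE_ACCOUNT_SECRET"]
SERVICE_ACCOUNT_EMAIL = os.environ["SERVICE_ACCOUNT_EMAIL"]
DT_SIGNATURE_SECRET = os.environ["DT_SIGNATURE_SECRET"]
AUTH_ENDPOINT = os.environ["AUTH_ENDPOINT"]
API_URL_BASE = os.environ["API_URL_BASE"]
EMU_URL_BASE = os.environ["EMU_URL_BASE"]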

gcloud functions deploy function-name \
    --entry-point main \
    --runtime python37 \
    --trigger-http \
    --allow-unauthenticated \
    --timeout 30s \
    --ignore-file .gcloudignore \
    --project your-project \
    --region your-region \
    --env-vars-file .env.yaml
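Once deployment finishes, the function's HTTPS trigger URL, which the Data Connector will later point to, can be retrieved with, for example:

gcloud functions describe function-name \
    --project your-project \
    --region your-region \
    --format 'value(httpsTrigger.url)'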

DT Studio Setup

To securely forward data to a cloud service, a Service Account and Data Connector should be used. Both can be configured from within DT Studio, assuming you have developer privileges or higher.

Service Account

For our cloud service to interface with the Disruptive Technologies API, an authenticated Service Account is needed. Without it, the authentication step in our cloud function will be rejected and execution terminated. If you do not already have a Service Account or are unfamiliar with the concept, read the Basics of Service Accounts to learn how to create one.
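For reference, a minimal sketch of this authentication step, here using the PyJWT and requests packages against the variables set in .env.yaml, might look as follows. See the REST API authentication documentation for the authoritative description of the flow.

import os
import time

import jwt  # PyJWT
import requests

def get_access_token() -> str:
    """Exchange Service Account credentials for a short-lived access token.

    A sketch of DT's OAuth2 JWT-bearer flow using the variables set in
    .env.yaml.
    """
    auth_endpoint = os.environ["AUTH_ENDPOINT"]
    now = int(time.time())

    # Build a JWT signed with the Service Account secret, identified by its key ID.
    assertion = jwt.encode(
        {
            "iss": os.environ["SERVICE_ACCOUNT_EMAIL"],
            "aud": auth_endpoint,
            "iat": now,
            "exp": now + 3600,
        },
        os.environ["SERVICE_ACCOUNT_SECRET"],
        algorithm="HS256",
        headers={"kid": os.environ["SERVICE_ACCOUNT_KEY_ID"]},
    )

    # Exchange the signed JWT for an access token at the identity endpoint.
    response = requests.post(auth_endpoint, data={
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": assertion,
    })
    response.raise_for_status()
    return response.json()["access_token"]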

Data Connector

In DT Studio, create a Data Connector to forward your events to the HTTP endpoint. You should also include a Signature Secret for added security. If you are unfamiliar with the concept of Data Connectors or how to create them, our Basics of Data Connectors and Data Connectors Reference will get you started.
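The cloud function should verify this secret on every request before processing the payload. The following sketch assumes the signature arrives as a JWT in an X-Dt-Signature header, signed with the shared secret and carrying a hex-encoded SHA-1 body checksum in a "checksum" claim; confirm the exact scheme against the Data Connectors Reference.

import hashlib
import os

import jwt  # PyJWT

def verify_signature(headers, body: bytes) -> bool:
    """Check that a forwarded event originates from our Data Connector.

    A sketch only: header name and claim layout are assumptions to be
    verified against the Data Connectors Reference.
    """
    token = headers.get("X-Dt-Signature", "")
    try:
        payload = jwt.decode(token, os.environ["DT_SIGNATURE_SECRET"],
                             algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return False
    # Reject the request if the body was tampered with in transit.
    return payload.get("checksum") == hashlib.sha1(body).hexdigest()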

Applying the Model

With a Cloud Function running in the background, waiting for POST requests from our Data Connector, modeling the temperature data of a new sensor is as simple as adding a label. In our implementation, providing a sensor with the label "inertia-model" and a value equal to the heat transfer coefficient causes an emulated device twin to be spawned immediately. Following this, each new temperature event from the sensor is accompanied by an update to the emulated twin with a modeled version of said event.
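To illustrate this final step, a sketch of pushing a modeled sample to the emulated twin could look as follows. The endpoint path and payload layout should be verified against the Emulator API reference, and get_access_token() is the helper from the Service Account sketch above.

import os

import requests

def publish_modeled_event(project_id: str, twin_id: str, t_modeled: float) -> None:
    """Push one modeled temperature sample to the emulated twin."""
    url = "{}/projects/{}/devices/{}:publish".format(
        os.environ["EMU_URL_BASE"], project_id, twin_id)
    response = requests.post(
        url,
        headers={"Authorization": "Bearer " + get_access_token()},
        json={"temperature": {"value": t_modeled}},
    )
    response.raise_for_status()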
