With nio, you don’t need a PhD in computer or data science to make use of cutting-edge technology. The industrial sector is full of legacy, passive data repositories ripe for machine learning.

Distributed computing

The distributed nature of the nio platform enables us to train machine learning models on powerful computers and then distribute the refined models to run on lightweight nodes to improve edge intelligence.

Unlimited extensibility

We determined this challenge could be solved with one of the many available machine learning frameworks. In the case below, we elected to create a TensorFlow block using Google’s open source library. That TensorFlow block can now be reused in any other nio system.

User friendly

Neural networks are complex. With nio, instead of digging through white papers and complex code, you select parameters from dropdown menus. Even an experienced data scientist will appreciate the efficient iteration and rapid prototyping that the block provides.

Faster time-to-market

nio blocks can be reused and quickly reconfigured for each unique solution. With this sort of accessibility, the daunting task of implementing machine learning solutions has been streamlined.

Want to see for yourself? Watch this video.

How we are applying machine learning using nio

Let’s take a closer look at the machine learning solution. The following service shows how you can incorporate neural networks into your nio systems.


The subscriber block pulls data from other services, which retrieve the raw data from disparate sources. The input could be any data stream, such as a database, cloud resource, or spreadsheet.
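To make the decoupling concrete, here is a minimal in-memory sketch of the publish/subscribe pattern the service relies on. The `Broker` class and topic names are hypothetical illustrations, not the nio API; in nio, the platform itself routes signals between publisher and subscriber blocks.

```python
from collections import defaultdict

class Broker:
    """Minimal in-memory publish/subscribe hub (illustrative only)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # A subscriber registers interest in a topic.
        self._subscribers[topic].append(callback)

    def publish(self, topic, signal):
        # Every subscriber on the topic receives the signal.
        for callback in self._subscribers[topic]:
            callback(signal)

broker = Broker()
received = []
broker.subscribe("raw_data", received.append)

# Another service publishes raw readings from any source
# (database, cloud resource, spreadsheet, ...).
broker.publish("raw_data", {"sensor": "press_01", "value": 3.2})
```

The point of the pattern is that the machine learning service never needs to know where the raw data came from; it simply subscribes to the stream.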


Data preparation is fundamental to successful machine learning and neural network processing. The modifier block enables us to cleanse and transform data into virtually any form that we need.
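As a sketch of the kind of cleansing and transformation described above, the snippet below drops incomplete records and converts units. The record fields and the `modify` function are hypothetical examples, not the modifier block’s configuration syntax.

```python
# Hypothetical raw signals; one record is incomplete.
raw_signals = [
    {"machine": "A", "temp_f": 212.0},
    {"machine": "B", "temp_f": None},   # missing reading, will be dropped
    {"machine": "C", "temp_f": 98.6},
]

def modify(signal):
    """Convert Fahrenheit to Celsius, a modifier-style transform."""
    return {"machine": signal["machine"],
            "temp_c": round((signal["temp_f"] - 32) * 5 / 9, 2)}

# Cleanse (drop incomplete records), then transform.
cleaned = [modify(s) for s in raw_signals if s["temp_f"] is not None]
```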

Data compressor

Neural networks require all samples in a dataset to be the same size for processing. In this example, we had variable-length time-series data that we normalized using the DataCompressor block.
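A common way to normalize variable-length series is to pad short ones and truncate long ones to a single target length. The sketch below illustrates that idea in plain Python; the function name and padding scheme are assumptions for illustration, not the DataCompressor block’s actual implementation.

```python
def compress(series, target_len, pad_value=0.0):
    """Pad or truncate a time series to a fixed length."""
    if len(series) >= target_len:
        return series[:target_len]
    return series + [pad_value] * (target_len - len(series))

# Variable-length time-series samples...
batch = [[1.0, 2.0], [3.0, 4.0, 5.0, 6.0, 7.0], [8.0]]

# ...normalized so every sample has the same size.
fixed = [compress(s, 4) for s in batch]
```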


The filter block allows us to separate data for training or prediction. This is a necessary step in order to ensure we are feeding the neural network with the proper information for processing.
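The separation can be pictured as routing each signal by whether it carries a label: labeled samples go to training, unlabeled ones to prediction. The field names below are hypothetical, not the filter block’s configuration.

```python
# Hypothetical signals: labeled samples are for training,
# unlabeled samples are awaiting prediction.
signals = [
    {"features": [0.1, 0.5], "label": 1},
    {"features": [0.9, 0.2], "label": None},
    {"features": [0.4, 0.7], "label": 0},
]

training = [s for s in signals if s["label"] is not None]
prediction = [s for s in signals if s["label"] is None]
```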

Neural network
(powered by TensorFlow)

TensorFlow is an open source machine learning library that grew out of Google’s deep learning research, and we packaged it into a nio block. In the block configuration, we construct a neural network by selecting our preferred algorithms, defining our layers, and configuring network parameters.
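To illustrate what those dropdown choices configure, here is a small sketch of a feed-forward network whose layers and activations are specified as data. This uses plain NumPy rather than TensorFlow, and the names (`LAYERS`, `build_network`, `forward`) are illustrative assumptions, not the block’s actual API.

```python
import numpy as np

# Dropdown-style configuration: (units, activation) per layer.
LAYERS = [(4, "relu"), (3, "relu"), (1, "sigmoid")]

ACTIVATIONS = {
    "relu": lambda x: np.maximum(0.0, x),
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
}

def build_network(n_inputs, layers, seed=0):
    """Create (weights, bias, activation) tuples for each layer."""
    rng = np.random.default_rng(seed)
    params = []
    fan_in = n_inputs
    for units, activation in layers:
        weights = rng.normal(0.0, 0.1, size=(fan_in, units))
        bias = np.zeros(units)
        params.append((weights, bias, ACTIVATIONS[activation]))
        fan_in = units
    return params

def forward(params, x):
    """One forward pass through the configured layers."""
    for weights, bias, activation in params:
        x = activation(x @ weights + bias)
    return x

net = build_network(n_inputs=2, layers=LAYERS)
out = forward(net, np.array([[0.5, -0.3]]))  # shape (1, 1), value in (0, 1)
```

In the real block, TensorFlow handles the layer construction and training; the value of the block is that this configuration is selected from menus rather than written by hand.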


Here is an example of nio’s reusability: this is the same filter block and configuration used earlier in the service. Again, we want to act differently on training data and prediction data.

Plotly dash

Dash by Plotly is a Python framework for building analytical web applications. We packaged this framework in the PlotlyDash block and used it to view the output of the loss function as the neural network iterates through the learning process.


The publisher block makes this information available to other nio services. These services can act on the data in any way including sending the data to a database, performing statistical post-processing, or even feeding this data into a separate neural network.

What we are finding

Before: Without machine learning, the output for each job is essentially random. The plot below illustrates the model’s failure to improve over time.

After: Once we applied machine learning, the plot below shows the loss tending toward zero. After roughly 200 iterations of learning, the error decreases rapidly, and the model continues to improve gradually in subsequent iterations.

Retrospective: After training, each job’s output can be validated and, if necessary, presented to a human for manual correction. Approximately 96% of the output was validated, and fewer than 10% measured an error in excess of 5%. Applying the results from this model increases throughput, lowers overhead, and improves the bottom line.

Hacking with nio

We aren't called niolabs for nothing! Check out the other ways we have experimented with nio on our blog.