IoT Milestone: 10 Billion Signals Processed

The nio Platform just crossed a huge milestone: it processed its 10 billionth signal over the weekend! And that is only since we started counting in February of this year.

That works out to an average of around 767 signals per second processed by nio blocks around the world over the last five months.
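The arithmetic behind that figure is straightforward. Here is a quick sanity check in Python; the exact start date in February isn't stated, so the five-month window is an assumption:

```python
# Back-of-the-envelope check on the average rate. The counting window started
# "in February", so the exact number of days is an assumption here.
SIGNALS_PROCESSED = 10_000_000_000
WINDOW_DAYS = 151  # assumed ~5 months

window_seconds = WINDOW_DAYS * 24 * 60 * 60
print(round(SIGNALS_PROCESSED / window_seconds))    # ~766 signals/second, in line with the ~767 quoted

# Working backwards from the quoted average gives the implied window length:
print(round(SIGNALS_PROCESSED / 767 / 86_400, 1))   # ~150.9 days
```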



Any way you slice it, a number like this demonstrates that a cloud-first strategy won’t work for the Internet of Things. There is simply too much data, and too many signals to process, to send everything to the cloud.

Here is some additional context about the 10 billion signals and how we compute this number:

We use Google BigQuery to store anonymous analytics from nio instances that opt in to sharing diagnostic data with us. Rather than store every single signal (we’ve already told you that’s a bad idea!), we store aggregate metrics about signal volume and block usage instead. For reference, the dataset that contains the analytics for these 10 billion signals is currently sitting at around 250 MB.
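To make that concrete, here is a rough sketch of what a per-block aggregation could look like. The field names, record shape, and flush behavior are illustrative assumptions, not nio's actual schema:

```python
# Illustrative sketch only: a per-block counter that is flushed periodically
# as a single aggregate row, instead of persisting every raw signal.
# Field names and the flush mechanism are assumptions, not nio's actual schema.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class BlockUsageRecord:
    block_type: str            # e.g. "Join"
    signals_processed: int     # signals received by the block in this window
    signals_emitted: int       # signals the block notified downstream
    window_start: str
    window_end: str

class BlockUsageCounter:
    def __init__(self, block_type: str):
        self.block_type = block_type
        self.processed = 0
        self.emitted = 0
        self.window_start = datetime.now(timezone.utc)

    def record(self, processed: int, emitted: int) -> None:
        self.processed += processed
        self.emitted += emitted

    def flush(self) -> dict:
        """Return one aggregate row (e.g. for BigQuery) and reset the counters."""
        now = datetime.now(timezone.utc)
        row = BlockUsageRecord(
            block_type=self.block_type,
            signals_processed=self.processed,
            signals_emitted=self.emitted,
            window_start=self.window_start.isoformat(),
            window_end=now.isoformat(),
        )
        self.processed = 0
        self.emitted = 0
        self.window_start = now
        return asdict(row)
```

Aggregating this way means a handful of small rows per block per reporting window, which helps explain why analytics covering 10 billion signals fit in roughly 250 MB.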

We use Google Data Studio to build a dashboard, connected to our BigQuery dataset, that we can track and share internally to visualize the data. Here is a screenshot of that dashboard right after it crossed the 10 billion mark.



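Behind the dashboard, the headline number itself is just a simple aggregate over the BigQuery dataset. As a rough sketch of what such a query might look like with the BigQuery Python client (the project, dataset, table, and column names are hypothetical stand-ins, not our actual schema):

```python
# Hypothetical example: totaling processed signals from an aggregates table.
# The project/dataset/table/column names are stand-ins, not the real nio schema.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT SUM(signals_processed) AS total_processed
    FROM `nio-analytics.block_usage.daily_aggregates`
"""

for row in client.query(sql).result():
    print(f"Total signals processed: {row.total_processed:,}")
```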
There are many interesting insights we can gain from basic analytics like these. For example, we can see that the Join block processes the most signals but does not emit the highest volume. That makes sense, given that the Join block takes lists of signals and reduces them down to a single signal. Based on these metrics, we can see that the average “rate of joining” is about 12 to 1.
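That ratio falls straight out of the processed and emitted counts for the block. A minimal sketch, with made-up counts for illustration (only the roughly 12-to-1 ratio comes from the dashboard):

```python
# Illustrative only: the "rate of joining" is processed / emitted for the
# Join block. The counts below are hypothetical placeholders; the dashboard
# shows a ratio of roughly 12 to 1.
join_signals_processed = 2_400_000_000   # hypothetical figure
join_signals_emitted = 200_000_000       # hypothetical figure

rate_of_joining = join_signals_processed / join_signals_emitted
print(f"Average rate of joining: {rate_of_joining:.0f} to 1")   # ~12 to 1
```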

Stay tuned for our 100 billion announcement; it will be here before you know it!