Edge-AI in the Real World: Part 1

Everything is “smart” these days, but few of these devices are actually smart themselves. The term is used loosely to describe anything that can be accessed from an app on your phone or some other dashboard, without requiring that the “smart” thing do any work with the data it generates. Even more of this market space is filled by point solutions, around which it can be needlessly complex and time-consuming to build a complete solution or product. This blog series will explore deploying computer vision and artificial intelligence to contextualize an environment in real time, making a small, low-cost device actually smart. Using the newest generations of edge-AI hardware, such as the Google Coral, NVIDIA Jetson, and Intel Movidius Myriad, with embedded nio software running at the edge, we can extract real insights and deliver value with minimal overhead, complexity, and startup cost.

Putting a Novelty to Work

The Google AIY Vision Kit is a surprisingly capable piece of hardware at an appealing price. Built on top of a Raspberry Pi Zero, the innocuous cardboard box is equipped with a Myriad Vision Processing Unit onto which TensorFlow models can be loaded and executed. This blog will show you how I used the nio Platform’s low-code, flowchart-based programming environment, the System Designer, to rapidly develop, test, and deploy an idea to the real world.

The AIY Vision Kit comes pre-loaded with several models optimized to run on this device, among them a face detector, a general image classifier (based on MobileNet), and, notably, a bird classifier from the Nature Explorer demo (trained on the iNaturalist 2017 competition dataset).

Having found a site to study (a bird feeder on my back patio), we’ll combine this model with logic and real-time analytics -- all running at the edge, on the Raspberry Pi Zero -- to deliver meaningful statistics on species distribution and daily popularity, plus snapshots of visitors, to a user interface built with the niolabs UI Kit. Communications between a web browser and the Vision Kit are managed by Pubkeeper.

Read on to learn how published models like this can be leveraged and customized using the nio Platform to transform your world, and check out the live running demo!

Deploying the Bird Spotter

Once the Vision Kit is assembled and set up, installing nio onto it is as simple as following our official docs for a Raspberry Pi. When installation is complete (estimated time: one cup of coffee), the Vision Kit is a fully equipped nio node, ready to process and share data across its system, including user interfaces, other nio instances, and more. The running nio instance can be managed from the System Designer, which is where we’ll use blocks to build services that process and aggregate data before publishing results to peers -- such as a web page.


While occupied, the prediction’s confidence rises above background noise.

There are well over 200 nio Blocks available in the official library, and among these are blocks specific to the AIY Vision Kit. Installation is a simple click from the System Designer, adding the NatureExplorer and AIYCamera blocks, among others. These blocks are used to build a service that evaluates the stream of signals coming from the NatureExplorer block as predictions are made by the running AI model. When a prediction satisfies all of the conditions of a Filter block, such as a minimum overall confidence and the exclusion of background noise, that signal is time-stamped and combined with a high-resolution image of the sighted bird. Some statistics are aggregated, such as species distribution and hourly popularity, and the results are published. This new state of the bird feeder is held in memory as long as the bird is present, and when a new visitor arrives the process is triggered again.
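To make the filtering concrete, here is a minimal plain-Python sketch of the logic the Filter block applies. This is illustrative only; the threshold value, label names, and signal shape are assumptions, not the actual nio block configuration:

    from datetime import datetime, timezone

    CONFIDENCE_THRESHOLD = 0.4       # assumed cutoff; tune per site
    IGNORED_LABELS = {"background"}  # labels treated as noise

    def filter_prediction(signal):
        """Pass a prediction only when it clears the confidence
        threshold and is not background noise, then time-stamp it."""
        if signal["label"] in IGNORED_LABELS:
            return None
        if signal["score"] < CONFIDENCE_THRESHOLD:
            return None
        # The service then attaches a high-resolution snapshot to
        # the surviving, time-stamped signal before publishing.
        signal["timestamp"] = datetime.now(timezone.utc).isoformat()
        return signal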

From Noisy Data to Useful Summaries


Storage solutions are implemented by adding blocks to a flowchart-like service.

As it performs inferences on real-time video from the on-board camera, the AI classifier model will output an individual prediction, or confidence score, for each of the ~960 bird species on which it has been trained. On this platform we get about 10 inferences per second, and the overwhelming majority of them are not useful to us. For example, while the bird feeder is unoccupied, the classifier will continue to make predictions with relatively low scores. We’re not interested in studying an empty bird feeder in the middle of the night, so converting this stream of noise into a descriptive state (for example, occupied vs. unoccupied) is the first task for our edge solution.
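One way to picture that conversion is a tiny state machine with hysteresis: a high score is required to enter the “occupied” state, and a lower one to leave it, so a single noisy frame can’t flip the state. The sketch below is a plain-Python illustration with assumed thresholds, not the exact service configuration:

    class FeederState:
        """Collapse noisy per-frame predictions into a stable
        occupied/unoccupied state using hysteresis."""

        def __init__(self, enter=0.5, leave=0.2):
            self.enter = enter    # score needed to declare "occupied"
            self.leave = leave    # score below which the feeder empties
            self.occupant = None  # current species label, or None

        def update(self, label, score):
            """Return a state-change event, or None if nothing changed."""
            if self.occupant is None and score >= self.enter:
                self.occupant = label
                return ("occupied", label)
            if self.occupant is not None and score <= self.leave:
                previous, self.occupant = self.occupant, None
                return ("unoccupied", previous)
            return None

At roughly 10 inferences per second, most calls to update() return None; only genuine arrivals and departures produce events for the rest of the service to act on.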

As birds visit the feeder, the predictions grow in confidence and trigger a change in state: for example, from "unoccupied" to "occupied by a Finch". This event triggers an image capture and the accumulation of some statistics. For example, we’re interested in how many visits there have been so far this hour, which times of day have the highest traffic, and which types of birds have been most predominant. As I shop for bird seed, I can use these insights to experiment with types and brands of seed and evaluate the effects on feeding patterns at my patio.
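Aggregates like these are small enough to keep in memory on the Pi Zero. Here is a hypothetical sketch using only the Python standard library, to show the shape of the bookkeeping rather than the actual block configuration:

    from collections import Counter
    from datetime import datetime

    species_counts = Counter()  # species distribution
    hourly_counts = Counter()   # visits per hour of day

    def record_visit(species, when=None):
        """Fold one sighting into the running statistics."""
        when = when or datetime.now()
        species_counts[species] += 1
        hourly_counts[when.hour] += 1

    def summary():
        """Snapshot of the statistics, ready to publish to subscribers."""
        return {
            "top_species": species_counts.most_common(5),
            "busiest_hours": hourly_counts.most_common(3),
        }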


Bird feeder popularity at a glance.

Adding Nodes to the System (Distribution of Tasks)

So far this system consists of only a single subscriber, or consumer of data: the user interface. It may be necessary to take additional action when a bird is sighted, for example sending a notification or alert, or inserting some information into a database. These tasks are often handled by services running in another (or the same) nio instance. I have a Raspberry Pi 3 that lives on my desk with a multi-color USB light, the Blink(1) by ThingM. On this device, a running nio instance and service subscribe to state changes of the bird feeder. For every bird sighting, the light flashes to alert me that there’s a bird outside my office window. Just as the service running on my desk Pi uses a block to control the Blink(1), there are blocks available for all popular databases. Whatever action needs to be taken, from the mundane to the complex, nio Blocks offer a diverse collection of pre-made components for interacting with the physical and virtual worlds.
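In spirit, that desk-Pi service reduces to a small subscriber callback. The actual demo uses Pubkeeper and nio blocks; in the sketch below, the paho-mqtt client stands in as an assumed transport, and flash_light(), the hostname, and the topic are all hypothetical, shown only to illustrate the shape of the logic:

    import json

    import paho.mqtt.client as mqtt  # assumed stand-in for Pubkeeper

    def flash_light():
        # Hypothetical helper: drive the Blink(1) however your setup
        # allows, e.g. via ThingM's Python library or a nio block.
        print("Bird outside the office window!")

    def on_message(client, userdata, message):
        """Fires on every published state change from the bird feeder."""
        event = json.loads(message.payload)
        if event.get("state") == "occupied":
            flash_light()

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("feeder-pi.local")     # hypothetical hostname
    client.subscribe("birdfeeder/state")  # hypothetical topic
    client.loop_forever()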


Subscribers include a Raspberry Pi (with multi-color notification light) and a user interface. Optional databases can be used to store and catalog bird sightings.

Conclusions

Extrapolating from this small-scale implementation, with a model fine-tuned for a particular species of interest and select locations, we could gather information about nesting or migration patterns without having to directly observe endless hours of raw footage or sift through false positives from motion detectors. The edge device is making observations and publishing reports on what it’s seen to any subscribers in its system. For example, the user interface provides a quick overview of species distribution and hourly traffic, as well as an image of the last visitor and the predictions that triggered that state change. A database could be added to the solution to store images and build a slide show of notable visitors!

Furthermore, as each AIY Vision Kit is now a self-contained bird-counting machine, this solution can be scaled out to as many nodes as I need -- perhaps as a commercial product. The services running on each can be managed remotely from the System Designer, including updates and monitoring. What started as a simple hack of some clever hardware and publicly available models is beginning to sound a lot like a marketable, maintainable, scalable product. All stages of development are served by this single platform, a cycle we’ve often called the IoT Wheel; it’s similar to the path of continuous improvement often found in modern lean processes, workplaces, and industries.


The nio Platform provides a means for continual improvement via our "IoT Wheel" methodology.

Stay tuned for Part 2 of this series: "Analyzing Pedestrian Traffic Without a Cloud"


About the Author


Tom Young

Application Engineer at niolabs

If you broke it, Tom can fix it. If you dream it, Tom can build it. Bonus fun fact: He also loves Jurassic Park.

Email: tyoung@n.io
GitHub: https://github.com/shadetree01010100