Guest Blog by Arnon Amir, Brian Taba, and Timothy Melano
The Conference on Computer Vision and Pattern Recognition (CVPR) is widely considered the preeminent conference for computer vision. This year the IBM Brain Inspired Computing team had the pleasure of demonstrating our latest technology at the CVPR 2016 Industry Expo, held in the air-conditioned conference halls of Caesars Palace, Las Vegas. The expo was co-located with the academic poster sessions, which created an excellent opportunity for us not only to meet very interesting academics but also to see the latest demos from other amazing companies, both large and small.
We too were excited to demonstrate our new Runtime API for TrueNorth. To showcase it, we connected an event-based vision sensor, the DVS128 (made by iniLabs), over USB to our NS1e board.
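For readers new to event cameras: the DVS128 does not produce frames. Each of its 128x128 pixels independently emits an "address-event" whenever the brightness at that pixel changes, so the sensor's output is a sparse, asynchronous stream of events. The sketch below shows roughly what one event carries; the field names are ours for illustration, not the actual iniLabs driver structures.

```python
from typing import NamedTuple

class DVSEvent(NamedTuple):
    """One address-event from the 128x128 DVS sensor (illustrative fields)."""
    x: int             # pixel column, 0..127
    y: int             # pixel row, 0..127
    polarity: bool     # True = brightness increased (ON), False = decreased (OFF)
    timestamp_us: int  # event time in microseconds, assigned by the sensor
```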
We used our Eedn framework to train a convolutional neural network on hand and arm gestures collected from our team, including air-drums and air-guitar! The trained Eedn network was then used to configure the TrueNorth chip on the NS1e board. At runtime, the system received asynchronous pixel events from the DVS128 sensor and passed them to TrueNorth, which produced a new classification every millisecond, that is, 1,000 classifications per second.
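Conceptually, the glue between sensor and chip is a small loop that bins the incoming events into 1 ms windows (one TrueNorth tick each), injects each window's spikes, and reads back one classification per tick. The minimal sketch below assumes hypothetical `sensor` and `truenorth` objects with `read_events`, `send_spikes`, and `read_classification` methods; these are placeholders for illustration, not the actual Runtime API calls.

```python
TICK_US = 1_000  # one TrueNorth tick = 1 ms, hence 1,000 classifications/second

def run_pipeline(sensor, truenorth, labels):
    """Bin DVS events into 1 ms ticks and stream them to TrueNorth (sketch)."""
    window, window_end = [], None
    while True:
        for ev in sensor.read_events():              # placeholder sensor call
            if window_end is None:
                window_end = ev.timestamp_us + TICK_US
            while ev.timestamp_us >= window_end:
                # A full tick has elapsed: inject its spikes, read one result.
                truenorth.send_spikes(window)        # placeholder runtime call
                print(labels[truenorth.read_classification()])
                window, window_end = [], window_end + TICK_US
            window.append((ev.x, ev.y, ev.polarity))
```

In the real demo the NS1e handled spike injection and readout through the Runtime API; the point of the sketch is simply that TrueNorth's fixed 1 ms tick maps naturally onto the sensor's asynchronous event stream.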
The reaction to the real-time gesture classification was very positive and drew large crowds (and other hardware vendors ;). People were blown away by the fact that we were running a convnet in real time at 1,000 classifications per second while consuming only milliwatts of power. We invited anyone who was interested to come behind our table to play with the gesture recognition. With a little adjustment, people were able to interact with TrueNorth and have their gestures recognized. To many in the audience, the entire concept of neuromorphic engineering was new. Their visit to our booth was a great opportunity to introduce them to the DVS128, a spiking sensor inspired by the human retina, and TrueNorth, a spiking neural network chip inspired by the human brain!
A video can be seen here.
We have previously demonstrated that TrueNorth can perform more than 1,000 classifications per second on benchmark datasets. The new Runtime API now opens the interface to the NS1e board and the TrueNorth chip for many exciting real-time applications that process complex data at very high rates while consuming very little power.
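One way to appreciate the power figure: energy per classification is just power divided by classification rate. Assuming, purely for illustration, a draw of 50 mW (the actual draw depends on the network), 50 mW divided by 1,000 classifications per second works out to only 50 microjoules per classification.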
We give special thanks to our teammates David Berg, Carmelo di Nolfo and Michael Debole for leading efforts to develop the Runtime API, to Jeff Mckinstry for performing the Eedn training, to Guillaume Garreau for his help with data preparation, and to the entire Brain Inspired Computing team for volunteering to create the training data set!