A new paper has been posted to arXiv: “Spiking Optical Flow for Event-based Sensors Using IBM’s TrueNorth Neurosynaptic System.”
This paper describes a fully spike-based neural network for optical flow estimation from Dynamic Vision Sensor data. A low-power embedded implementation is presented that combines the Asynchronous Time-based Image Sensor (ATIS) with IBM’s TrueNorth Neurosynaptic System. The sensor generates spikes with sub-millisecond resolution in response to scene illumination changes. These spikes are processed by a spiking neural network running on TrueNorth at a 1 millisecond resolution to determine the order and time difference of spikes from neighboring pixels, and thereby infer velocity. The spiking neural network is a variant of the Barlow-Levick method for optical flow estimation. The system is evaluated on two recordings for which ground truth motion is available, and achieves an Average Endpoint Error of 11% at an estimated power budget of under 80 mW for the sensor and computation combined.
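For intuition, here is a minimal, non-spiking Python sketch of the Barlow-Levick-style time-of-flight idea described above: the arrival order of spikes at neighboring pixels gives the motion direction, and the time difference over the one-pixel baseline gives the speed. The event format, the `barlow_levick_flow` name, and the `max_dt` cutoff are assumptions made for illustration; the paper’s actual implementation realizes this logic with spiking neurons on TrueNorth cores rather than conventional code.

```python
import numpy as np

def barlow_levick_flow(events, width, height, max_dt=0.05):
    """Sketch of a Barlow-Levick-style flow estimate from event-camera data.

    `events` is an iterable of (x, y, t) tuples, with t in seconds.
    For each incoming event, compare the most recent spike times of the
    horizontal and vertical neighbors: whichever neighbor fired first
    indicates the motion direction, and speed is 1 pixel / dt.
    """
    last_spike = np.full((height, width), -np.inf)  # per-pixel last spike time
    flows = []  # (x, y, vx, vy) estimates, in pixels per second

    for x, y, t in events:
        vx = vy = 0.0
        # Horizontal: if the left neighbor spiked shortly before this
        # pixel, an edge is moving rightward, and vice versa.
        if 0 < x < width - 1:
            dt_l = t - last_spike[y, x - 1]
            dt_r = t - last_spike[y, x + 1]
            if 0.0 < dt_l <= max_dt:
                vx = 1.0 / dt_l
            elif 0.0 < dt_r <= max_dt:
                vx = -1.0 / dt_r
        # Vertical: same comparison for the up/down neighbors.
        if 0 < y < height - 1:
            dt_u = t - last_spike[y - 1, x]
            dt_d = t - last_spike[y + 1, x]
            if 0.0 < dt_u <= max_dt:
                vy = 1.0 / dt_u
            elif 0.0 < dt_d <= max_dt:
                vy = -1.0 / dt_d
        if vx or vy:
            flows.append((x, y, vx, vy))
        last_spike[y, x] = t

    return flows
```

The `max_dt` window plays the role a neuron’s decay time constant would play on TrueNorth: neighbor spikes older than the window are too stale to pair with the current event, which bounds how slow a motion the detector can report.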
Please see the blog post by Brian Taba on our CVPR 2017 paper, “A Low Power, Fully Event-Based Gesture Recognition System.” Here is the PDF of the paper “A Low Power, Fully Event-Based Gesture Recognition System.” A video of the system in action follows: