Neuromorphic Embedded Systems

Neuromorphic Computers: SpiNNaker, TrueNorth, and Spikey

SpiNNaker (developed at the University of Manchester), TrueNorth (IBM Research), and Spikey (Heidelberg University, EU FET BrainScaleS) are all research neuromorphic computing systems designed for the efficient emulation of large-scale spiking neural networks. SpiNNaker tackles this problem by connecting thousands of low-power ARM processor cores through a mesh network of dedicated spike routers. The photo shows our local set-up of six interconnected SpiNNaker boards with more than 5000 independent processors, which simulate more than one million spiking neurons in real time. In contrast, by implementing the neuronal computations in state-of-the-art digital logic instead of general-purpose processors, IBM's TrueNorth potentially achieves similar performance on a single chip with roughly 1000x higher energy efficiency, at the price of reduced flexibility. Spikey is an analog system that emulates a small number of neurons with significantly reduced energy consumption (currents in the pA range) at speeds orders of magnitude faster than real time.
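
Spiking networks for SpiNNaker (and, via a different backend, Spikey) are commonly described with the PyNN API. The following is a minimal sketch, assuming a working PyNN installation with the sPyNNaker backend; all population sizes and parameters are illustrative, not a configuration from our lab.

```python
# Minimal PyNN sketch, assuming the sPyNNaker backend is installed.
import pyNN.spiNNaker as sim

sim.setup(timestep=1.0)  # 1 ms time step

# 100 Poisson spike sources driving 100 leaky integrate-and-fire neurons
stimulus = sim.Population(100, sim.SpikeSourcePoisson(rate=50.0))
neurons = sim.Population(100, sim.IF_curr_exp())

sim.Projection(stimulus, neurons, sim.OneToOneConnector(),
               synapse_type=sim.StaticSynapse(weight=0.5, delay=1.0))

neurons.record("spikes")
sim.run(1000)                        # simulate one second
spikes = neurons.get_data("spikes")  # Neo data structure with spike trains
sim.end()
```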

We use all of these systems and their unprecedented real-time performance to study, for example, bio-inspired motor control and low-latency event-based stereo vision with spiking neural networks.

OmniBot with Robot Arm

The mobile manipulator developed at NST for adaptive sensorimotor systems consists of an omnidirectional (holonomic) mobile platform with embedded low-level motor control and multimodal sensors. The on-board microcontroller receives desired motion commands via WiFi and continuously adapts the platform's velocity controller. The robot's integrated sensors include wheel encoders for odometry, a 9-DOF inertial measurement unit, a ring of proximity bump sensors, and three embedded event-based dynamic vision sensors (eDVS) for visual input.
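
As a sketch of the low-level motor control involved, the following function maps a desired body velocity to individual wheel speeds for a holonomic base with three omni wheels. The wheel layout and geometry constants are hypothetical, not the OmniBot's actual parameters.

```python
import math

def omni_wheel_speeds(vx, vy, omega,
                      wheel_angles_deg=(90.0, 210.0, 330.0),  # hypothetical layout
                      base_radius=0.15, wheel_radius=0.04):   # hypothetical geometry [m]
    """Map a desired body twist (vx, vy in m/s, omega in rad/s) to the
    angular velocities (rad/s) of three omni wheels. Wheel i, mounted at
    angle theta_i, rolls along the tangential direction, so its rim speed is
        v_i = -sin(theta_i) * vx + cos(theta_i) * vy + base_radius * omega.
    """
    speeds = []
    for deg in wheel_angles_deg:
        th = math.radians(deg)
        rim_speed = -math.sin(th) * vx + math.cos(th) * vy + base_radius * omega
        speeds.append(rim_speed / wheel_radius)
    return speeds

# e.g. pure sideways motion at 0.2 m/s:
print(omni_wheel_speeds(0.2, 0.0, 0.0))
```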

The mobile platform carries an optional six-axis robotic arm with a reach of more than 40 cm. The arm is composed of a chain of links connected by revolute joints and can lift objects of up to 800 grams. The platform contains an on-board 360 Wh battery, which allows autonomous operation for well over five hours.
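
A quick back-of-the-envelope check, using only the figures quoted above and ignoring the links' own masses, of the static torque the base joint must hold with the maximum payload at full reach:

```python
# Back-of-the-envelope check, not a specification.
g = 9.81        # gravitational acceleration, m/s^2
payload = 0.8   # kg, maximum payload from the text
reach = 0.40    # m, arm reach from the text

torque = payload * g * reach  # payload held horizontally at full reach
print(f"static holding torque at the base joint: {torque:.2f} Nm")  # ~3.1 Nm
```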

SpiNNaker OmniBot

The "SpomniBot" is a small holonomic robot with various on-board sensors and actuators, equipped with on-board neuronal computing abilities. Sensors include a stereo pair of event based embedded dynamic vision sensors (eDVS), a 9 DOF inertial measurement unit (IMU), a proximity bump-sensor ring, and wheel encoders for odometry. All sensors and actuators are directly interfaced to an integrated SpiNNaker neuromorphic computing board, which allows the simulation of up to 1 million simple neurons in real-time on board. The combination of (event-based) sensing and neuromorphic spike-based computing and actuation provides a simple yet very capable fully autonomous mobile robot under “brain control”. We use this platform to run various brain-inspired neuronal network approaches for sensing, reasoning, and motor control; e.g. learning to avoid obstacles based on stereoscopic vision, centering within a hallway using optic flow, or autonomous simultaneous localization and mapping.

RobotHead

The robot head is a 6-DOF platform that provides human-like oculomotor control for visual applications. The system is equipped with a pair of embedded dynamic vision sensors (eDVS) and an inertial measurement unit (IMU) mounted on top of the head. High-precision feedback servo motors provide six degrees of freedom for arbitrary motion of the eyes and head: horizontal and vertical rotation for each eye and for the head itself. To emulate the physiology of human eye motion, the retinas' lenses are aligned with the servos' rotation axes.
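
One canonical use of the IMU on such a head is gaze stabilization via a vestibulo-ocular reflex (VOR), in which the eyes counter-rotate against measured head motion. A minimal sketch, with a hypothetical unity gain and abstract velocity commands rather than the platform's actual servo interface:

```python
def vor_eye_velocity(head_yaw_rate, head_pitch_rate, vor_gain=1.0):
    """Vestibulo-ocular reflex sketch: command both eyes to counter-rotate
    against the head rotation measured by the IMU, so that gaze stays fixed
    in space. Returns (yaw, pitch) eye velocity commands in rad/s; a real
    controller would also clamp to the servos' range and rate limits.
    """
    return (-vor_gain * head_yaw_rate, -vor_gain * head_pitch_rate)
```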

The goal of this neuromorphic robot is to provide a reliable, realistic platform for testing and evaluating a wide range of algorithms for oculomotor and sensorimotor tasks.

PushBot

The PushBot is a small mobile robotic platform with limited on-board processing capabilities (ARM Cortex microcontroller, LPC4337). The on-board sensory system consists of an embedded event-based dynamic vision sensor (eDVS), a 9-DOF inertial measurement unit (IMU), and wheel encoders for odometry. Actuators include adjustable LEDs at the front and back of the top, a beeper, and two on-board laser pointers. The top LED allows the robot to be tracked by other robots (see the robot chain project), and the lasers can be used in conjunction with the eDVS for distance estimation (a triangulation sketch follows below). The robot optionally communicates with external (PC) systems via an on-board WiFi module.
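
To illustrate the laser-based distance estimation mentioned above: with a laser mounted parallel to the camera's optical axis at a known baseline, the pixel offset of the projected spot (its disparity) shrinks inversely with distance. The sketch below uses the eDVS's 128x128 resolution but otherwise hypothetical calibration values, not the robot's actual calibration.

```python
def laser_spot_distance(spot_col, principal_col=64.0,    # image center of 128x128 eDVS
                        focal_px=80.0, baseline_m=0.05):  # hypothetical calibration
    """Structured-light triangulation: a laser mounted parallel to the
    camera's optical axis at a known baseline projects a spot whose column
    offset from the principal point (the disparity) shrinks with distance:
        Z = baseline * focal_length / disparity.
    """
    disparity = abs(spot_col - principal_col)
    if disparity < 1e-6:
        return float("inf")  # spot at the principal point: beyond usable range
    return baseline_m * focal_px / disparity
```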

The goal of the platform is to advance the development of resource-aware algorithms. Due to the limited capabilities of the LPC4337, any algorithm running on board in real time is restricted in memory usage (max. 136 KB SRAM) and CPU time (204 MHz, 32 bit). Still, mainly thanks to the sparse event-based data representation of the eDVS, sophisticated algorithms remain feasible, for instance robots that follow each other, or settings in which multiple robots cooperatively map an environment.
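
A rough throughput estimate shows why the sparse eDVS representation matters on such a small CPU. Only the clock rate below comes from the text; the per-event cycle cost is a hypothetical figure for illustration.

```python
cpu_hz = 204e6           # LPC4337 core clock, from the text
cycles_per_event = 200   # hypothetical cost of processing one DVS event

max_event_rate = cpu_hz / cycles_per_event
print(f"sustainable event rate: {max_event_rate:.2e} events/s")  # ~1e6 events/s
```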

Miniature Indoor Quadcopter

The Neuromorphic Mini Quadcopter is based on a modified RC toy (JXD385, ~20 US$) with custom open-source on-board flight stabilization firmware. An independent on-board miniaturized main control board (216 MHz 32-bit ARM CPU, 64 MB SRAM) allows real-time implementation of neuromorphic algorithms, e.g. event-based autonomous simultaneous localization and mapping (SLAM). The main control CPU interfaces up to five orthogonally mounted miniature eDVS (meDVS) to obtain raw or preprocessed vision events, directs the flight control board, and optionally communicates through high-speed WLAN with a remote PC for debugging. The quadcopter can be controlled through a user's RC remote control (which overrides on-board control) or act completely autonomously based on on-board real-time vision processing algorithms. Ultimately, we envision a complete swarm of such mini quadcopters that quickly explores and maps unknown indoor environments.
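
As an illustration of the kind of event-stream handling the main control CPU performs, the sketch below decodes a compact two-byte-per-event stream into (x, y, polarity) tuples. The packing used here is a hypothetical encoding invented for illustration, not the meDVS's documented wire format.

```python
def decode_events(buf):
    """Decode a raw byte stream into (x, y, polarity) event tuples.
    Hypothetical two-byte packing, for illustration only: first byte =
    sync bit (1) plus 7-bit y address, second byte = polarity bit plus
    7-bit x address.
    """
    events = []
    i = 0
    while i + 1 < len(buf):
        first, second = buf[i], buf[i + 1]
        if not (first & 0x80):  # lost sync: skip until a first byte appears
            i += 1
            continue
        y = first & 0x7F
        polarity = (second >> 7) & 0x01
        x = second & 0x7F
        events.append((x, y, polarity))
        i += 2
    return events
```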

Compliant Musculoskeletal Robotic Actuators

The control of highly coupled and compliant musculoskeletal systems is a complex task. In contrast to the well-established control concepts for the stiff robots widely used in industry, control strategies and algorithms for musculoskeletal systems are still an area of active research. We explore brain-inspired models for compliant motor control running in real time on neuromorphic hardware. The two-joint setup (four degrees of freedom) allows exploring such controllers on real hardware for tasks such as pointing, trajectory following, or building up energy for a controlled throw.
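
To make the control problem concrete, here is a minimal sketch of antagonistic actuation: two opposing "muscles" act on one joint, the difference of their activations sets the net torque, and their sum (co-contraction) sets the stiffness, which is exactly the compliance dimension that stiff industrial robots lack. All constants are illustrative, not parameters of our test rig.

```python
def antagonistic_joint(u_flexor, u_extensor,
                       moment_arm=0.02, max_force=100.0):  # illustrative constants
    """Two opposing 'muscles' acting on one revolute joint: the difference
    of their activations (each in [0, 1]) sets the net torque, while their
    sum (co-contraction) sets a stiffness-like quantity. Co-contracting
    more makes the joint stiffer without changing the net torque.
    Returns (net torque in Nm, co-contraction measure).
    """
    u_flexor = min(max(u_flexor, 0.0), 1.0)
    u_extensor = min(max(u_extensor, 0.0), 1.0)
    torque = moment_arm * max_force * (u_flexor - u_extensor)
    co_contraction = moment_arm * max_force * (u_flexor + u_extensor)
    return torque, co_contraction
```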

Neuroglasses

Neuroglasses combines the low-latency, sparse data streams of embedded dynamic vision sensors (eDVS) with their low power consumption in a wearable mobile device. It aims to enable visually impaired people to move independently by representing the environment acoustically and warning of obstacles.

The system consists of a stereoscopic pair of eDVS. Their vision streams are combined to extract depth information on a small compute stick with a latency of only a few milliseconds. The obtained depth information is then conveyed to the user as simulated 3D sounds via perforated headphones. Using head-related transfer functions, the sounds appear to originate externally from the locations at which the eDVS estimated obstacles, i.e. objects are transformed from visual percepts into virtual sound sources. This lets users hear their environment and orient themselves independently.
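
As a sketch of how a virtual sound source can be placed in an obstacle's direction (standing in for a full measured head-related transfer function), the following uses Woodworth's spherical-head model for the interaural time difference; the head radius and the level-difference curve are illustrative assumptions, not values from the project.

```python
import math

def interaural_cues(azimuth_rad, head_radius=0.0875, speed_of_sound=343.0):
    """Approximate binaural cues for a virtual source at a given azimuth
    (0 = straight ahead, positive = to the right). The interaural time
    difference follows Woodworth's spherical-head model,
        ITD = (a / c) * (azimuth + sin(azimuth));
    the level-difference curve is a crude illustrative approximation.
    """
    itd_s = (head_radius / speed_of_sound) * (azimuth_rad + math.sin(azimuth_rad))
    ild_db = 6.0 * math.sin(azimuth_rad)  # hypothetical level difference
    return itd_s, ild_db

# e.g. an obstacle 45 degrees to the right:
print(interaural_cues(math.radians(45)))
```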