Dextra

Dextra (Latin for "right hand") beats humans at rock-paper-scissors.

It demonstrates fast, low-cost, activity-driven neuromorphic perception.

Dextra is composed of a DVS event camera, a small convolutional neural network (CNN), and a fast tendon-driven robot hand.

The key principle is activity-driven computing: the faster you move, the sooner Dextra computes its response.
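The principle can be sketched in code. The following is a minimal, hypothetical illustration (the class, threshold value, and event format are assumptions, not Dextra's actual implementation): DVS events are accumulated into a frame, and the CNN is invoked as soon as enough events have arrived. A fast-moving hand produces events at a higher rate, so the threshold is reached sooner and the symbol is classified with lower latency.

```python
import numpy as np

# Illustrative event-count threshold (assumed value, not from Dextra).
EVENT_THRESHOLD = 2000


class ActivityDrivenClassifier:
    """Accumulates DVS events and triggers CNN inference by activity,
    not by a fixed frame rate. All names here are illustrative."""

    def __init__(self, cnn, threshold=EVENT_THRESHOLD, size=(64, 64)):
        self.cnn = cnn                # any callable: frame -> label
        self.threshold = threshold
        self.frame = np.zeros(size, dtype=np.float32)
        self.count = 0

    def on_event(self, x, y, polarity):
        """Process one DVS event; run the CNN once enough events arrive."""
        self.frame[y, x] += 1.0 if polarity else -1.0
        self.count += 1
        if self.count >= self.threshold:
            label = self.cnn(self.frame)
            self.frame[:] = 0.0       # reset accumulator for the next symbol
            self.count = 0
            return label              # e.g. "rock", "paper", or "scissors"
        return None                   # not enough activity yet
```

With this scheme, compute cost scales with scene activity: a still scene generates almost no events and triggers no inference at all.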

Datasets and Code

Papers describing Dextra

History of development

We originally developed Dextra to demonstrate fast neuromorphic perception and inference during the NPP project, and we gave the first public demonstration at NIPS 2016 in Barcelona. Since then, Dextra has evolved through many iterations: first to demonstrate NullHop, then to use a cheap laser-cut hand (which proved too fragile), and finally to its current robust and very quick tendon-driven hand. Dextra was demonstrated in the ETH pavilion during the 2017 World Economic Forum and at multiple public science events since then.

Dextra at 2017 WEF
Dextra at 2016 NIPS, Barcelona
Dextra on back cover of ETH Life magazine
Dextra in 2016 with CNN computed on NullHop FPGA accelerator
Dextra with CNN computed by the CSEM VIPS accelerator