
Piloting a drone using hand movements and gestures


Key points from article:

The Conduct-A-Bot system uses wearable sensors to turn human muscle movements and gestures into commands that pilot a robot.

Electromyography (EMG) and motion sensors are attached to the biceps, triceps, and forearms.

The system measures signals from the moving muscles and processes them to detect gestures in real time.

These movements and gestures were used to move a drone horizontally and vertically, rotate it, and stop it.

The drone responded to 82 percent of gestures, and the system correctly identified 94 percent of cued gestures.

The researchers plan to test the system on more subjects, and to give robots more predictive assistance or greater autonomy.

The research, by MIT, was published in the Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction.

This work could make remote exploration and assistive personal robots a possibility.
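The pipeline described above can be sketched in code: sensor windows are reduced to simple features, classified into gestures, and mapped to drone commands. This is a minimal illustrative sketch, not the authors' actual algorithm; all function names, thresholds, and the gesture-to-command table are assumptions.

```python
# Hypothetical sketch of an EMG/motion gesture-to-command pipeline.
# Thresholds, names, and command mappings are illustrative assumptions,
# not the published Conduct-A-Bot implementation.

def emg_envelope(samples):
    """Mean absolute value of an EMG window -- a common muscle-activation feature."""
    return sum(abs(s) for s in samples) / len(samples)

def classify_gesture(emg_activation, wrist_angle_deg):
    """Map simple features to a gesture label (illustrative thresholds)."""
    if emg_activation > 0.6:        # strong clench of the forearm muscles
        return "fist"
    if wrist_angle_deg > 30:        # wrist flexed upward
        return "wrist_up"
    if wrist_angle_deg < -30:       # wrist flexed downward
        return "wrist_down"
    return "rest"

# Gesture-to-command table: stop, vertical motion, or hover in place.
COMMANDS = {
    "fist": "stop",
    "wrist_up": "ascend",
    "wrist_down": "descend",
    "rest": "hover",
}

def command_for(emg_window, wrist_angle_deg):
    """Full pipeline for one sensor window: features -> gesture -> drone command."""
    gesture = classify_gesture(emg_envelope(emg_window), wrist_angle_deg)
    return COMMANDS[gesture]
```

For example, a window of strong EMG activity maps to "stop", while a relaxed arm with the wrist raised maps to "ascend". The real system runs this kind of loop continuously so the drone reacts to gestures in real time.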

Mentioned in this article:



A system for real-time, wearable hand-movement and gesture sensing

Daniela Rus

Professor of Electrical Engineering and Computer Science at MIT

Joseph DelPreto

Postdoctoral researcher working in the field of robotics at MIT

Massachusetts Institute of Technology (MIT)

Private land-grant research university

Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction