MindDrone

Cal Hacks 2014

We wanted to hack a professional-grade quadcopter to give it an intuitive interface. With the Thalmic Myo, the drone is flown as an extension of your arm, with pitch and roll set by natural arm rotations. Because the radio controller has no digital inputs, we had to explore hardware-level workarounds to communicate with the drone: after converting the Myo's digital output into an analog signal, we sent serial data through an Arduino to the radio controller's slave controller port.
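The core of that conversion is mapping an arm angle onto the analog stick range the radio controller expects. A minimal sketch of that mapping, assuming conventional RC channel values of 1000-2000 us centered at 1500 us and full deflection at 45 degrees of tilt (illustrative numbers, not necessarily the exact ones we used):

```python
def angle_to_channel(angle_deg, max_angle=45.0, lo=1000, hi=2000):
    """Map an arm angle in degrees to an RC channel value.

    RC transmitters conventionally encode stick position as a
    1000-2000 us pulse width with 1500 us as center. Here, tilting
    the arm max_angle degrees gives full stick deflection.
    (Hypothetical ranges for illustration.)
    """
    # Clamp so extreme arm motion saturates instead of overflowing.
    a = max(-max_angle, min(max_angle, angle_deg))
    center = (lo + hi) / 2
    half = (hi - lo) / 2
    return int(round(center + half * (a / max_angle)))
```

With these assumed ranges, a level arm yields the neutral value 1500, and tilting past 45 degrees saturates at the channel limits.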

We then decided to push the interaction one step further: controlling the drone with brain waves. Using current research techniques in signal processing, feature detection, and machine learning, we were able to distinguish the thought of moving your left hand from the thought of moving your right hand, without any physical movement. We used this to control the drone's roll: thinking "left" moves the drone left, and thinking "right" moves the drone right.
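Turning a decoded thought into a flight command then reduces to a small lookup. A sketch of that last step, assuming the same hypothetical 1000-2000 us roll channel and a fixed deflection per detected thought:

```python
# Hypothetical channel values for illustration.
ROLL_CENTER = 1500   # neutral stick position (us)
ROLL_STEP = 200      # deflection applied per detected thought

def thought_to_roll(label):
    """Turn a decoded motor-imagery label into a roll channel value."""
    if label == "left":
        return ROLL_CENTER - ROLL_STEP   # bank left
    if label == "right":
        return ROLL_CENTER + ROLL_STEP   # bank right
    return ROLL_CENTER                   # baseline / unknown -> hold level
```

Any label other than "left" or "right" (including the resting baseline) holds the drone level, which is a safer default than repeating the last command.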

BCI Explained

We used an OpenBCI board to obtain raw EEG data from 6 channels placed over the motor cortex, spaced 5 cm apart in a 2x3 grid: two over the left hemisphere, two over the right, and two along the longitudinal fissure.

Signal processing was applied to remove noise and unwanted features, and activity recorded from the center electrodes was subtracted out. We then analyzed activity in the 14-20 Hz band and extracted the relevant features. Pierre developed a k-nearest neighbors classifier, trained to distinguish a resting baseline from motor imagery of Tomas "thinking" about moving his left or right hand.
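The pipeline above can be sketched in a few lines of NumPy: compute the power in the 14-20 Hz band for each channel as the feature vector, then classify a new window by majority vote of its nearest training examples. This is a toy reconstruction under stated assumptions (a 250 Hz sample rate, which is typical for OpenBCI, and Euclidean distance between feature vectors), not our exact implementation:

```python
import numpy as np

FS = 250  # sample rate in Hz (typical for OpenBCI; an assumption here)

def band_power(window, fs=FS, lo=14.0, hi=20.0):
    """Mean spectral power in the lo-hi Hz band, per channel.

    window: array of shape (n_channels, n_samples).
    Returns one feature per channel.
    """
    spec = np.abs(np.fft.rfft(window, axis=1)) ** 2
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spec[:, mask].mean(axis=1)

def knn_predict(train_feats, train_labels, feat, k=3):
    """Label one feature vector by majority vote of its k nearest neighbors."""
    dists = np.linalg.norm(train_feats - feat, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = [train_labels[i] for i in nearest]
    return max(set(votes), key=votes.count)
```

In practice the training set is built from labeled windows recorded while the subject rests or imagines each hand movement, and each incoming EEG window is reduced to its band-power features before being passed to `knn_predict`.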
