Teslapathic. What does that even mean? It’s the name a team at Cal Hacks 3.0 (a hackathon held at UC Berkeley) gave to the mind-controlled Tesla Model S tech that they’re working on.
The system the team created allows for some basic control of a Tesla Model S through the use of an EEG headset and some custom software (to oversimplify things), hence the term “Teslapathic.”
Rather than tease this too much, just watch the video below to get a better idea of what I’m talking about.
So, the team’s goal, as they put it, is “Mind control of a Tesla Model S.”
Rather than paraphrase too much, I’ll just quote a bit from the group’s own rundown of what they’re doing (via Devpost):
tl;dr — An EEG headset determines whether the user is thinking “Stop” or “Go,” which is translated into an analog signal, then broadcast by an RC radio, and articulated by actuators on the pedals and a motor on the steering wheel.
Teslapathic is comprised of three primary systems: Machine learning with OpenBCI, a digital to analog interface through Arduino, and a hardware control system.
OpenBCI: We created a machine learning training program that compiles averages of the user’s neural activity when thinking “Stop” and “Go.” The user is also encouraged to assign the thought of a physical action with each command when creating their activity profile, as focusing the EEG nodes around the brain’s motor cortex while imagining physical motion in tandem with the desired command had the highest rate of success. For example, Casey would think of tapping his right foot for “Go” and clenching his left hand for “Stop.” A k-nearest neighbors algorithm was employed to reduce signal noise. After ascertaining the user’s intent, corresponding variables are then generated and passed off to an Arduino for conversion to an analog signal.
Analog conversion: In order for our digital system to interact with our analog hardware, we leveraged an off the shelf RC radio — a Futaba T9CHP — and exploited its trainer feature to allow for communication between the OpenBCI and the driving hardware. By having an Arduino mimic the PPM timings sent by a slave radio, the T9CHP effectively becomes an analog pass-through and delivery method. The PPM signal is manipulated in accordance with the user’s intent, which results in articulation of the driving hardware. The head-mounted gyro was spliced inline between the Arduino and the radio and results in additional signal manipulation.
Hardware control: Linear actuators were affixed to the pedals, and a windshield wiper motor fitted with a potentiometer was mounted to the steering wheel. “Go” (in the form of the corresponding analog signal) results in the brake actuator receding and the accelerator actuator engaging, “Stop” results in the opposite. Left and right movement from the gyro results in left and right movement from the wheel. Still reading? Congratulations! You made it through my convoluted explanation!
Safety: We implemented multiple safety measures: an emergency brake in the Arduino portion of the code in case of failure; the user needs to be holding a dead-man’s switch in order for the signal to broadcast; we wedged a physical block behind the accelerator to prevent it from going too fast; the user can take manual control through the radio at any time; and if all else fails, the actuators were pressure-fit so the user could reach their leg into the driver footwell and kick them away from the pedals.
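The trainer-port trick described in the team’s rundown hinges on PPM: an RC radio’s trainer input expects a repeating frame of pulses whose widths encode each channel’s value, which is why an Arduino that reproduces those timings can stand in for a second radio. Here is a rough sketch of the timing math only, in Python for readability. The 1000–2000 µs pulse range and 20 ms frame period are typical RC values, and the channel ordering is an illustrative assumption, not the team’s actual Arduino code (which would do this with hardware timers):

```python
FRAME_US = 20000                     # one PPM frame roughly every 20 ms
CH_MIN_US, CH_MAX_US = 1000, 2000    # typical RC channel pulse-width range

def channel_us(value):
    """Map a -1.0..1.0 command to a channel pulse width in microseconds."""
    value = max(-1.0, min(1.0, value))
    return int(CH_MIN_US + (value + 1.0) / 2.0 * (CH_MAX_US - CH_MIN_US))

def ppm_frame(channels):
    """Return the pulse widths for each channel plus the sync gap that
    pads the frame out to FRAME_US, as a timer-driven Arduino sketch would."""
    widths = [channel_us(v) for v in channels]
    sync = FRAME_US - sum(widths)
    return widths + [sync]

# Hypothetical "Go" frame: throttle partly forward, brake fully
# released, steering centered.
print(ppm_frame([0.5, -1.0, 0.0]))   # prints [1750, 1000, 1500, 15750]
```

Manipulating the user’s intent into the signal, as the team puts it, then amounts to changing these per-channel values before each frame is emitted.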
Interestingly, the group noted that training the algorithms to clearly differentiate between “Go” and “Stop” was a laborious process, but they now think that they’ve “achieved a high degree of accuracy.”
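For readers curious what the “Stop”/“Go” classification step might look like in principle, here is a toy k-nearest-neighbors classifier over averaged EEG feature vectors, in the spirit of the team’s description. The two-element feature vectors, labels, and sample values are all made up for illustration; the team’s actual feature extraction and code aren’t shown in their rundown:

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify an EEG feature vector by majority vote among its
    k nearest labeled training samples (Euclidean distance)."""
    dists = sorted(
        (math.dist(features, query), label) for features, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy "activity profile": averaged feature vectors recorded while the
# user imagined each command (values are invented for this sketch).
train = [
    ([0.90, 0.10], "go"),   ([0.80, 0.20], "go"),   ([0.85, 0.15], "go"),
    ([0.10, 0.90], "stop"), ([0.20, 0.80], "stop"), ([0.15, 0.85], "stop"),
]

print(knn_predict(train, [0.87, 0.12]))   # prints "go"
```

The majority vote is what gives k-NN its noise tolerance here: a single stray reading that lands near the wrong cluster gets outvoted by its neighbors, which lines up with the team’s note that the approach reduced signal noise.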
Not bad for a 36-hour project. 🙂