In this article, I will talk about a research side project I’ve been working on, which shows how AI can be applied to EEG signals read from the human brain in order to create new applications.
By Giacomo Sardo
9 June 2021
The goal of the research
In 2018, a friend and I became interested in EEG and in how these instruments could make our lives easier. We came up with the idea of using brain waves to trigger sets of commands, so that people could, for instance, play video games or turn on devices with their brains alone, without actually touching anything.
What is an EEG?
EEG stands for electroencephalography: a monitoring method that records the electrical activity of the surface layer of the brain. We used a popular device called Muse, a wearable brain-sensing headband that measures brain activity via four sensors.
How we set up the research
To transform brain impulses into commands, we connected the Muse headband to a computer and programmed an interface using JS, C#, and Muse’s library. We then asked some friends to try our device. To give them something to control, we developed a 2D video game in which a running character had to jump over randomly placed obstacles.
At this point we had gathered a substantial amount of data, but we were struggling to achieve full autonomy in performing the jumps: users had to actually blink their eyes to trigger the jump. In 2019 we managed to take part in an IT fair, where lots of people tried our machine. It was important for us to get as many people as possible to test it, so we could gather a larger data set.
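To make the blink-triggered jump concrete, here is a minimal, purely illustrative sketch in JS. It assumes the Muse library delivers raw EEG samples as plain numbers; a blink produces a large amplitude artifact on the frontal sensors, so a simple threshold with a short debounce window is enough to fire one jump per blink. The threshold and debounce values below are made up for the example, not taken from our actual project.

```javascript
// Hypothetical blink detector: fires once when a sample exceeds the
// threshold, then ignores the next few samples so one blink artifact
// triggers exactly one jump.
const BLINK_THRESHOLD = 150; // illustrative amplitude value

function makeBlinkDetector(threshold = BLINK_THRESHOLD) {
  let cooldown = 0; // samples to skip after a detected blink
  return function onSample(value) {
    if (cooldown > 0) {
      cooldown -= 1;
      return false;
    }
    if (Math.abs(value) > threshold) {
      cooldown = 50; // debounce: swallow the rest of the artifact
      return true;
    }
    return false;
  };
}

// Feeding it a fake sample stream, as the game loop would:
const detect = makeBlinkDetector();
const stream = [5, -12, 8, 210, 180, 9, 3]; // fake EEG samples
const jumps = stream.filter(detect); // samples that triggered a jump
```

In a real setup the samples would come from the headband’s streaming API rather than an array, and the threshold would need per-user calibration, which is exactly the problem the next section addresses.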
How we used AI to master our inputs
Now we needed to use that data to build a model that would reliably transform the right signals into the right actions. This is where AI came into play. We implemented it on the collected data with TensorFlow. In this setup we plugged in the value ranges we had defined after various internal tests, describing the incoming impulses that should make the character jump. If the brain data were similar to the ranges we had defined and the action was successful, TensorFlow saved them; if not, it discarded them. The AI also learned from errors: if an impulse fell outside our ranges but the action was still successful, the AI would average the data and adjust the ranges instead of sticking to our default sets. Ultimately, we were able to make the character jump using brain impulses alone.
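The keep/discard and range-adjustment logic described above can be sketched in a few lines of JS. This is only a toy model of the idea, not our actual TensorFlow pipeline: each action has an accepted signal range, in-range samples from successful actions are kept as training data, and when an action succeeds on a sample outside the range, the nearest bound is widened by averaging toward that sample. All names and values are invented for the example.

```javascript
// Hypothetical adaptive range for one action (e.g. "jump").
class ActionRange {
  constructor(min, max) {
    this.min = min;
    this.max = max;
    this.accepted = []; // samples kept as training data
  }

  observe(sample, actionSucceeded) {
    const inRange = sample >= this.min && sample <= this.max;
    if (inRange && actionSucceeded) {
      this.accepted.push(sample); // good data point: keep it
    } else if (!inRange && actionSucceeded) {
      // The action worked anyway: nudge the nearest bound toward the
      // sample by averaging, so the range adapts to this user.
      if (sample < this.min) this.min = (this.min + sample) / 2;
      else this.max = (this.max + sample) / 2;
    }
    // failed actions are simply discarded
  }
}

const jumpRange = new ActionRange(100, 200);
jumpRange.observe(150, true);  // in range, success: kept
jumpRange.observe(240, true);  // out of range, success: max widens
jumpRange.observe(90, false);  // failure: discarded
```

After these three observations the accepted set holds only the in-range success, and the upper bound has moved halfway toward the out-of-range success, which captures the averaging behaviour described above in its simplest form.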
AI is a big part of what we do at Extendi, and this side project of mine reflects our passion for the subject and how much we love what we do. AI is already a big part of our lives, and in the future it will radically change the way we interact with devices. Just imagine getting home and being able, with your mind alone, to turn on the heating, set the alarm, or switch off a lightbulb. Of course, we are still far from that (we are only now starting to see voice assistants spread), but with our project we proved that what sounded like sci-fi a few years ago is possible today.
Giacomo is a brilliant developer with a strong passion for enabling technology. When he is not coding... You can find him coding on side projects!