This weekend I will be joining the Hack the Brain hackathon, organized by Waag Society, Donders Institute, TNO and Total Active Media. During three days 45 programmers, designers, artists and scientists will try to hack the human brain in the centre of Amsterdam.
During the hackathon several EEG scanners, including the Emotiv EPOC, will be available to use. With these devices it’s possible to read electrical activity from your scalp directly into your computer. Many projects making use of these devices create a Brain Computer Interface (BCI): a way to control software on your computer or an interactive installation by detecting your brain activity. Controlling your computer using only your thoughts is a very exciting idea to many people. Generally, though, I’m a bit skeptical about BCIs using EEG.
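To give an idea of what “reading electrical activity into your computer” amounts to in practice: many EEG hacks boil down to computing band power over a short window of samples, for example the alpha band (8–12 Hz) often used as a relaxation measure. This is a minimal sketch with a synthetic signal standing in for real headset data; the 128 Hz sample rate matches the EPOC’s spec, but everything else is made up for illustration:

```python
import numpy as np

def alpha_band_power(samples, fs=128):
    """Estimate relative alpha (8-12 Hz) power in a 1-D EEG window.

    `samples` is a window of raw readings; `fs` is the sample rate in Hz.
    """
    # Power spectrum via FFT of the de-meaned window.
    spectrum = np.abs(np.fft.rfft(samples - samples.mean())) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    alpha = spectrum[(freqs >= 8) & (freqs <= 12)].sum()
    total = spectrum[freqs >= 1].sum()  # ignore DC offset and slow drift
    return alpha / total

# Synthetic stand-in for a 2-second EEG window: a 10 Hz "alpha" rhythm
# buried in noise. A real BCI would read this from the headset's SDK.
fs = 128
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
window = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))
print(alpha_band_power(window, fs))  # high fraction: alpha dominates
```

A relaxation game would map this single number onto difficulty or some visual parameter, which hints at why the bandwidth of such an interface is so limited.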
The reason for this skepticism is probably that I’ve seen quite a few installations that attempted to use EEG data in a meaningful way. This turns out to be quite hard: only a few mental events can be measured accurately. The attempts I’ve seen fall roughly into three categories:
- Games in which your level of relaxation is measured to increase or decrease the difficulty of the game, to enhance a certain experience, or to beat your opponent. While these games are quite fun to play, I don’t think they really live up to the expectations we have of a BCI.
- Interactive art installations, where I’m not sure exactly where the creator’s artistic freedom begins. Some of these installations are visually very interesting, such as Eunoia, a project by Lisa Park.
This article describes how, in this project, the artist’s brainwaves are translated into sound waves played by five speakers, each representing an emotion. Although she definitely seems dedicated to controlling her mind so that the EEG scanner receives ‘pure’ signals, I wonder whether the measured activity can really be correlated accurately with these five different emotions.
- Projects that detect the electrical activity of your brain when you want to move a certain part of your body. After the program has learned the pattern it receives when you’re thinking about moving your right arm, this pattern can be used as input to move the mouse cursor, control the lights in your house, or change the direction of your game character.
This last type of application can be very useful for people who suffer from paralysis and can’t use their body to control computers with a mouse or touchscreen. Or the occasional brain in a vat might benefit from this. For the rest of us, who do have working sensorimotor functions, this type of application isn’t very interesting though. When you think about moving your right arm, you might as well actually move your right arm. And when you actually use your body instead of just thinking about it, far more precise measurements can be made by motion-sensing devices such as Microsoft’s Kinect or the Myo by Thalmic Labs. When making a Brain Computer Interface, it shouldn’t be forgotten that our brains have natively supported input and output modules: our body and senses.
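The learn-then-match loop behind that third category — record labeled trials, learn a per-class pattern, then map new activity to the closest pattern — can be sketched as a nearest-centroid classifier over band-power features. This is a toy illustration with synthetic, made-up feature clusters, not the pipeline any real BCI toolkit uses:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "band power" feature vectors for two imagined movements.
# In a real BCI these would be extracted from EEG channels over the
# motor cortex; here they are just invented clusters for illustration.
left_trials = rng.normal(loc=[1.0, 3.0], scale=0.3, size=(20, 2))
right_trials = rng.normal(loc=[3.0, 1.0], scale=0.3, size=(20, 2))

# "Training": learn one template (centroid) per imagined movement.
templates = {
    "move cursor left": left_trials.mean(axis=0),
    "move cursor right": right_trials.mean(axis=0),
}

def classify(features):
    """Map a new feature vector to the command whose template is closest."""
    return min(templates, key=lambda cmd: np.linalg.norm(features - templates[cmd]))

# A new trial that resembles the imagined-right-arm pattern.
print(classify(np.array([2.8, 1.2])))  # → move cursor right
```

The point of the sketch is how indirect the whole chain is: several seconds of noisy signal per decision, collapsed into one of a handful of commands — which is exactly why just moving your arm in front of a Kinect beats imagining the movement.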
Despite my skepticism I think some cool projects are going to come out of the Hack the Brain hackathon! Although EEG sensors have their limitations, their unique characteristics also offer possibilities for novel, interesting ways of interaction. I’m especially curious to see projects which combine the EEG devices with other sensors, or use them to provide neurofeedback for the user.