A Brain Connection Device for Education, Feedback, Gaming, Hands-free Interaction, Joy, Know-how, Learning and More.....
New techniques based on brain signals aim to make navigation through virtual worlds more intuitive
Controlling your movement through a virtual world can be a cognitively demanding task. To make navigation more intuitive, controllers based on hand and body movements, such as the Nintendo Wii's, were recently introduced. The next generation of game controllers aims to be even more intuitive by directly translating brain signals into navigation commands.
Traditional navigation interfaces such as joysticks and gamepads are often not intuitive. Users either have to invest many hours of training or allocate significant cognitive resources to the navigation task, reducing overall task performance. To make navigation more intuitive, this project investigates the possibilities of using brain signals for navigation. Our goal is to implement hands-free navigation with at least three commands (left, right, rotate). The ultimate goal is that the user's cognitive resources can be devoted fully to the content of the game instead of to the interaction with the interface.
Brain-Computer Interfaces (BCIs) enable direct communication between the brain and a computer and come in many varieties. We are developing passive BCIs: BCIs that use the brain's reaction to specific probe stimuli. The advantage of passive BCIs is that they do not require training but tap into the brain's normal responses to, for instance, stimuli of particular interest. However, these brain responses are still under voluntary control of the user, making them well suited for BCIs. We are exploring several types of probe stimuli and brain responses. One of these brain responses is the Steady State Evoked Potential (SSEP). SSEPs are induced by probe stimuli that, for instance, flicker at a specific frequency; the flicker frequency can be recovered from the brain signal. When multiple probe stimuli (each with a different frequency) are presented, the user's attention determines which of the stimuli has the stronger effect on the brain signal. In other words, the user can choose among several options by paying attention to one of the probe stimuli. So far, we have developed and tested a BCI based on visual SSEPs. However, visual probe stimuli have a disadvantage: they require eye movements and may interfere with the visual game environment.
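The core of such a frequency-based SSEP selection can be sketched in a few lines. The code below is an illustrative assumption, not the project's actual pipeline: it synthesizes a noisy single-channel "EEG" signal dominated by the attended flicker frequency, measures the power at each candidate frequency with the Goertzel algorithm, and picks the candidate with the most power.

```python
import math
import random

def goertzel_power(samples, fs, freq):
    """Power of one frequency component of `samples` (Goertzel algorithm)."""
    n = len(samples)
    k = round(n * freq / fs)          # nearest DFT bin for the target frequency
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def classify_ssep(eeg, fs, candidate_freqs):
    """Return the candidate flicker frequency with the most power in the signal."""
    return max(candidate_freqs, key=lambda f: goertzel_power(eeg, fs, f))

# Demo: 1 s of synthetic "EEG" at fs = 256 Hz, dominated by an attended 15 Hz
# flicker response plus Gaussian noise (frequencies chosen for illustration).
fs = 256
rng = random.Random(0)
attended = 15.0
eeg = [math.sin(2 * math.pi * attended * t / fs) + 0.5 * rng.gauss(0, 1)
       for t in range(fs)]
print(classify_ssep(eeg, fs, [12.0, 15.0, 20.0]))  # → 15.0
```

A real system would use properly referenced multi-channel EEG, artifact rejection, and longer integration windows, but the decision rule — compare narrow-band power at the known probe frequencies and take the strongest — is the same.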
To overcome the disadvantages of a BCI based on visual probe stimuli, we have started to develop BCIs based on touch stimuli. The sense of touch is often underutilized in gaming and does not require (the equivalent of) eye movements. We are now investigating the feasibility of using tactile stimuli with different vibration frequencies as probe stimuli. Of special interest is a set-up in which the probe stimuli are placed inside a belt worn around the user's torso. Choosing left, right, forward, and backward may become very intuitive this way, bringing us a step closer to our goal of hands-free, intuitive navigation. We are also exploring the possibilities of combining visual and touch stimuli to see whether this increases the speed or accuracy of the BCI.
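One simple way to combine visual and touch probes is late fusion: tag each navigation command with one visual flicker frequency and one tactile vibration frequency, normalize the measured SSEP power within each modality so neither dominates, and choose the command with the strongest summed evidence. The command set, frequencies, and scoring below are hypothetical placeholders, not the project's design.

```python
# Hypothetical mapping: each command is probed visually and by one belt tactor.
COMMANDS = {
    "forward":  {"visual_hz": 12.0, "tactile_hz": 23.0},
    "backward": {"visual_hz": 15.0, "tactile_hz": 27.0},
    "left":     {"visual_hz": 20.0, "tactile_hz": 31.0},
    "right":    {"visual_hz": 24.0, "tactile_hz": 35.0},
}

def fuse_and_decide(visual_power, tactile_power):
    """visual_power / tactile_power: dicts of frequency -> measured SSEP power.
    Normalize each modality to sum to 1, then pick the command whose combined
    (visual + tactile) evidence is strongest."""
    def normalize(powers):
        total = sum(powers.values()) or 1.0
        return {f: p / total for f, p in powers.items()}
    v = normalize(visual_power)
    t = normalize(tactile_power)
    def score(cmd):
        spec = COMMANDS[cmd]
        return v.get(spec["visual_hz"], 0.0) + t.get(spec["tactile_hz"], 0.0)
    return max(COMMANDS, key=score)

# Example: visual evidence weakly favors "left", tactile strongly favors "left".
visual = {12.0: 1.0, 15.0: 1.1, 20.0: 1.4, 24.0: 0.9}
tactile = {23.0: 0.2, 27.0: 0.3, 31.0: 2.5, 35.0: 0.4}
print(fuse_and_decide(visual, tactile))  # → left
```

Because the modalities are normalized independently, a user who attends only to the tactile belt (eyes free for the game) still produces a usable decision, which is one motivation for adding touch probes in the first place.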
TNO Human Factors
Jan van Erp, TNO Human Factors