Articles in Control Magazine

Download our regular contributions to Control Magazine.

Issue 30, March 2012
Issue 29, January 2012
Issue 28, December 2011
Issue 27, November 2011
Issue 26, August 2011
Issue 25, July 2011
Issue 24, April 2011
Issue 23, February 2011
Issue 22, January 2011
Issue 21, November 2010
Issue 20, October 2010
Issue 18, June 2010
Issue 17, April 2010
Issue 16, January 2010

Control International Edition July 2011
Control International Edition March 2011
Control International Edition August 2010
Control International Edition April 2010

About GATE

GATE final publication 2012
Results from the GATE research project
a 75-page overview (pdf 4.7 Mb)

GATE Magazine 2010
a 36-page overview of the GATE project (pdf 5.3 Mb)

Research themes:
Theme 1: Modeling the virtual world
Theme 2: Virtual characters
Theme 3: Interacting with the world
Theme 4: Learning with simulated worlds

Pilots:
Pilot Education Story Box
Pilot Education Carkit
Pilot Safety Crisis management
Pilot Healthcare Scottie
Pilot Healthcare Wiihabilitainment

Knowledge Transfer Projects:
Sound Design 
CIGA 
Agecis 
CycART 
VidART
Motion Controller
Compliance
Mobile Learning
Glengarry Glen Ross
CASSIB
EIS
Enriching Geo-Specific Terrain
Pedestrian and Vehicle Traffic Interactions
Semantic Building Blocks for Declarative Virtual World Creation 
Computer Animation for Social Signals and Interactive Behaviors

Address

Center for Advanced Gaming and Simulation
Department of Information and Computing Sciences
Utrecht University
P.O. Box 80089
3508 TB Utrecht
The Netherlands
Tel +31 30 2537088

Acknowledgement

ICTRegie is a compact, independent organisation consisting of a Supervisory Board, an Advisory Council, a director and a bureau. The Minister of Economic Affairs and the Minister of Education, Culture and Science bear the political responsibility for ICTRegie. The organisation is supported by the Netherlands Organisation for Scientific Research (NWO) and SenterNovem.

WP 3.3 Brain Connection Devices

A Brain Connection Device for Education, Feedback, Gaming, Hands-free Interaction, Joy, Know-how, Learning and More.....

New techniques based on brain signals aim to make navigation through virtual worlds more intuitive

Controlling your movement through a virtual world can be a cognitively demanding task. To make navigation more intuitive, controllers based on hand and body movements, such as the Nintendo Wii, were recently introduced. The next generation of game controllers aims to be even more intuitive by directly translating brain signals into navigation commands.

Traditional navigation interfaces such as joysticks and gamepads are often not intuitive. This means that users either have to invest many hours of training or allocate significant cognitive resources to the navigation task, reducing overall task performance. To make navigation more intuitive, this project investigates the use of brain signals for navigation. Our goal is to implement hands-free navigation with at least three commands (left, right, rotate). The ultimate goal is for the user's cognitive resources to be devoted entirely to the content of the game rather than to interacting with the interface.

Brain-Computer Interfaces (BCIs) enable direct communication between the brain and a computer, and they come in many varieties. We are developing passive BCIs: BCIs that use the brain's reaction to specific probe stimuli. The advantage of passive BCIs is that they do not require training; they tap into the brain's normal responses to, for instance, stimuli of particular interest. At the same time, these brain responses remain under the voluntary control of the user, which makes them well suited for BCIs.

We explore several types of probe stimuli and brain responses. One of these brain responses is the Steady State Evoked Potential (SSEP). SSEPs are induced by probe stimuli that, for instance, flicker at a specific frequency, and these flicker frequencies can be distilled from the brain signal. When multiple probe stimuli (each with a different frequency) are presented, the user's attention affects which of the stimuli has the stronger effect on the brain signal. In other words: the user can choose from several options by paying attention to one of several probe stimuli. So far, we have developed and tested a BCI based on a visual SSEP. However, the disadvantage of visual probe stimuli is that they require eye movements and may interfere with the visual game environment.
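As a rough illustration of how such an SSEP-based selection could work in software, the sketch below compares EEG band power at each probe's flicker frequency and returns the probe the user is most likely attending to. This is a minimal sketch under assumed conditions (a single EEG channel, illustrative flicker frequencies and band width, a plain FFT-based detector); it is not the project's actual implementation.

import numpy as np

def classify_ssep(eeg, fs, probe_freqs):
    """Return the index of the probe frequency with the largest EEG band power."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2            # power spectrum of the EEG segment
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)       # frequency axis in Hz
    band_power = []
    for f in probe_freqs:
        band = (freqs >= f - 0.5) & (freqs <= f + 0.5)  # narrow band around the flicker frequency
        band_power.append(spectrum[band].mean())
    return int(np.argmax(band_power))

# Example: 2 s of synthetic "EEG" at 256 Hz in which the 15 Hz probe dominates.
fs = 256.0
t = np.arange(0, 2, 1 / fs)
eeg = (0.2 * np.sin(2 * np.pi * 12 * t)
       + 1.0 * np.sin(2 * np.pi * 15 * t)
       + 0.1 * np.random.randn(t.size))
print(classify_ssep(eeg, fs, [12.0, 15.0, 20.0]))       # prints 1: the 15 Hz stimulus

In real recordings the attended frequency is only relatively stronger, not dominant as in this synthetic example, so a practical detector would average over several EEG channels and longer time windows; the selection principle, however, is the same.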

To overcome the disadvantages of a BCI based on visual probe stimuli, we have started to develop BCIs based on touch stimuli. The sense of touch is often underutilized in gaming and does not require (the equivalent of) eye movements. We are now looking into the feasibility of using tactile stimuli with different vibration frequencies as probe stimuli. Of special interest is a set-up in which the probe stimuli are placed inside a belt worn around the user's torso. Choosing left, right, forward, and backward may become very intuitive this way, bringing us a step closer to our goal of hands-free, intuitive navigation. We are also exploring the possibilities of combining visual and touch stimuli to see whether this increases the speed or quality of the BCI.
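To sketch how such a tactile set-up could be turned into navigation commands, the example below assumes a hypothetical four-tactor belt in which each tactor vibrates at its own frequency, and it picks the direction whose frequency dominates the EEG segment. The belt layout, frequencies, and detector are illustrative assumptions, not the project's actual design.

import numpy as np

# Hypothetical belt layout: one tactor per direction, each with its own vibration frequency (Hz).
TACTOR_FREQS = {"forward": 17.0, "right": 19.0, "backward": 21.0, "left": 23.0}

def navigation_command(eeg, fs):
    """Return the direction whose tactile probe frequency dominates the EEG segment."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = {direction: spectrum[(freqs >= f - 0.5) & (freqs <= f + 0.5)].mean()
             for direction, f in TACTOR_FREQS.items()}
    return max(power, key=power.get)                    # e.g. "left" if the 23 Hz tactor is attended

A game loop could call navigation_command on each new window of EEG data and pass the returned direction to the avatar controller, which is what would make the navigation hands-free.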

Workpackage
3.3 A Brain Connection Device for Education, Feedback, Gaming, Hands-free Interaction, Joy, Know-how, Learning and More.....

Partners
TNO Human Factors
Utrecht University

Key Publications
Brouwer, A.-M. et al. (2010). A tactile P300 brain-computer interface. Frontiers in Neuroscience, 4:19, 1-12.
Thurlings, M.E. et al. (2010). EEG-based navigation from a Human Factors perspective. In: D.S. Tan & A. Nijholt (Eds.), Brain-Computer Interfaces, Human-Computer Interaction Series, pp. 117-132. London: Springer-Verlag.
Coffey, E.B.J. et al. (2010). Brain-Machine Interfaces in space: using spontaneous rather than intentionally generated brain signals. Acta Astronautica, 67, 1-11.
More publications

Contact details
Jan van Erp, TNO Human Factors
Jan.vanerp(at)tno.nl