Publication details
Eyetracking for two-person tasks with manipulation of a virtual world
Jean Carletta, Robin L. Hill, Craig Nicol, Tim Taylor, Jan Peter de Ruiter, Ellen Gurman Bard
2010
Abstract
Eyetracking facilities are typically restricted to monitoring a single person viewing static images or prerecorded video. In the present article, we describe a system that makes it possible to study visual attention in coordination with other activity during joint action. The software links two eyetracking systems in parallel and provides an on-screen task. By locating eye movements against dynamic screen regions, it permits automatic tracking of moving on-screen objects. Using existing SR technology, the system can also cross-project each participant’s eyetrack and mouse location onto the other’s on-screen work space. Keeping a complete record of eyetrack and on-screen events in the same format as subsequent human coding, the system permits the analysis of multiple modalities. The software offers new approaches to spontaneous multimodal communication: joint action and joint attention. These capacities are demonstrated using an experimental paradigm for cooperative on-screen assembly of a two-dimensional model. The software is available under an open source license.
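The automatic tracking of moving on-screen objects described in the abstract amounts to hit-testing each gaze sample against the object regions as they move frame by frame. The Python sketch below illustrates that idea only; the names (Region, label_gaze_sample) and the axis-aligned rectangular regions are illustrative assumptions, not the data structures or API of the published software.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Region:
    """Axis-aligned bounding box for one on-screen object at a given moment.

    Hypothetical structure for illustration; the published system defines
    its own dynamic screen regions.
    """
    label: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, gx: float, gy: float) -> bool:
        # True if the gaze point (gx, gy) falls inside this region.
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)


def label_gaze_sample(gx: float, gy: float, regions: List[Region]) -> Optional[str]:
    """Return the label of the first region containing the gaze point,
    or None if the gaze falls on empty background."""
    for region in regions:
        if region.contains(gx, gy):
            return region.label
    return None


# Usage: the region list would be refreshed every frame from the task's
# current object positions, so a moving part keeps catching the gaze
# samples that land on it.
frame_regions = [Region("part_7", 320, 240, 64, 64),
                 Region("tray", 0, 500, 800, 100)]
print(label_gaze_sample(350.0, 260.0, frame_regions))  # -> "part_7"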
Reference
Carletta, J., Hill, R. L., Nicol, C., Taylor, T., de Ruiter, J. P., & Bard, E. G. (2010). Eyetracking for two-person tasks with manipulation of a virtual world. Behavior Research Methods, 42(1), 254–265. https://doi.org/10.3758/brm.42.1.254
BibTeX
@article{carletta2010eyetracking,
  author    = {Carletta, Jean and Hill, Robin L. and Nicol, Craig and Taylor, Tim and de Ruiter, Jan Peter and Bard, Ellen Gurman},
  title     = {Eyetracking for two-person tasks with manipulation of a virtual world},
  journal   = {Behavior Research Methods},
  year      = {2010},
  month     = feb,
  publisher = {Springer},
  volume    = {42},
  number    = {1},
  pages     = {254--265},
  doi       = {10.3758/brm.42.1.254},
  category  = {journal},
  keywords  = {psychology}
}