P2-12 Eye Tracking for Collaborative Music Experiences: A Framework for Recording and Analysing Collective Attention in Naturalistic Concert Settings

Name: Shreshth Saxena

School/Affiliation: Department of Psychology, Neuroscience and Behaviour, McMaster University

Co-Authors: Areez Visram, Neil Lobo, Zahid Mirza, Mehak Rafi Khan, Biranugan Pirabaharan, Alexander Nguyen, Lauren K. Fink

Virtual or In-person: In-person

Abstract:

Eye movements are widely used across scientific and applied fields as a multimodal indicator of attention and as an interaction tool. As a result, their adoption in music cognition has opened new avenues for studying human behaviour and for developing novel music generation tools. Music, however, is a highly collaborative and social activity that has proven a challenging operational context for eye-tracking tools, restricting most applications of eye tracking to single-person, controlled settings. In our work, we scale eye tracking to naturalistic multi-person settings. We designed and developed a software framework (SocialEyes) to record, synchronise, and analyse eye-tracking measures from portable eye-tracking glasses. We validated our framework at a real-world public event in which 30 people simultaneously watched a film screening and movie concert. Across two days, we collected gaze, blink, and pupil data from a total of 60 participants. Clock drifts recorded between independent eye-tracking devices show that our framework achieves time synchronisation on the order of 10 ms. The framework also projects each participant's individual gaze into a common coordinate system, enabling spatial gaze comparisons across participants. We present our framework architecture, as well as new tools for visualising and analysing collective gaze data in such social settings. Illustrating the utility of our approach, we analysed collective gaze metrics during each musical section of the performance (17 movements of 1–5 min each) and found consistent patterns of gaze, blink, and pupil activity across participants on both days. Overall, our method scales eye tracking to real-world social contexts and promotes ecological validity in music and eye-tracking research. Our framework additionally lays the foundation for the development of innovative music collaboration tools that use eye movements as an interface.
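The abstract reports cross-device time synchronisation on the order of 10 ms but does not detail the protocol. As a minimal sketch of one common approach, the snippet below estimates a device clock's offset from the local clock using NTP-style round-trip measurements; `query_device_clock` is a hypothetical callable standing in for however a given headset exposes its clock over the network.

```python
import time
import statistics

def estimate_clock_offset(query_device_clock, n_samples=20):
    """Estimate a remote device clock's offset from the local clock.

    `query_device_clock` is a hypothetical callable that requests a
    timestamp (in seconds) from the eye-tracking device.
    """
    offsets = []
    for _ in range(n_samples):
        t_send = time.monotonic()        # local time at request
        t_device = query_device_clock()  # device clock reading
        t_recv = time.monotonic()        # local time at response
        # Assuming a symmetric network delay, the device reading
        # corresponds to the midpoint of the round trip.
        offsets.append(t_device - (t_send + t_recv) / 2)
    # The median is robust to occasional network-delay outliers.
    return statistics.median(offsets)
```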
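Mapping each wearer's egocentric gaze into a shared frame is typically done with a planar homography from the scene-camera image to the screen plane. The sketch below uses OpenCV to illustrate the idea, assuming the four screen corners have already been located in a scene-camera frame; the corner coordinates are hypothetical, and the framework's actual projection method may differ.

```python
import numpy as np
import cv2

# Screen corners as detected in one participant's scene-camera frame
# (hypothetical pixel values), ordered to match the reference corners.
scene_corners = np.array([[212., 148.], [1068., 131.],
                          [1094., 655.], [190., 672.]], dtype=np.float32)

# Corresponding corners of the common coordinate system, here a
# 1920x1080 reference frame shared by all participants.
screen_corners = np.array([[0., 0.], [1920., 0.],
                           [1920., 1080.], [0., 1080.]], dtype=np.float32)

H, _ = cv2.findHomography(scene_corners, screen_corners)

# Map one gaze sample from scene-camera pixels into the shared frame.
gaze_scene = np.array([[[640., 400.]]], dtype=np.float32)  # shape (1, 1, 2)
gaze_screen = cv2.perspectiveTransform(gaze_scene, H)
print(gaze_screen.ravel())  # gaze in common screen coordinates
```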
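The abstract does not name the collective gaze metrics analysed per movement; one plausible example is gaze dispersion, the mean pairwise distance between participants' simultaneous gaze points in the common coordinate system. The sketch below computes it per movement; the column names and data are hypothetical.

```python
import numpy as np
import pandas as pd
from scipy.spatial.distance import pdist

def gaze_dispersion(df):
    """Mean pairwise distance between participants' gaze points in the
    common coordinate system, averaged over time bins per movement."""
    def per_bin(group):
        pts = group[["x", "y"]].to_numpy()
        return pdist(pts).mean() if len(pts) > 1 else np.nan
    per_time_bin = df.groupby(["movement", "time_bin"]).apply(per_bin)
    return per_time_bin.groupby("movement").mean()

# Synthetic example: two movements, two time bins, three participants.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "movement": np.repeat(["I", "II"], 6),
    "time_bin": np.tile(np.repeat([0, 1], 3), 2),
    "participant": np.tile(["p1", "p2", "p3"], 4),
    "x": rng.uniform(0, 1920, 12),
    "y": rng.uniform(0, 1080, 12),
})
print(gaze_dispersion(df))  # one dispersion value per movement
```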

Poster PDF