fNIRS and Neurocinematics

R. Ramchurn, H. A. Maior, M. L. Wilson, S. Martindale, S. Benford
Mixed Reality Lab, University of Nottingham, UK
Corresponding author e-mail address: horia.maior@nottingham.ac.uk

Recent advancements in Brain-Computer Interfaces (BCIs), where sensors have become less invasive, more cost-effective, and increasingly portable, have facilitated the spread of brain sensing into new fields of research, such as Human-Computer Interaction [3]. Our prior work used data from a commercially available EEG headset to continuously monitor a person's attention and meditation levels while they watched a movie. These measurements were then used during the experience to suggest cuts and to exert a level of control over the narrative and scene changes in the movie [2].

In our ongoing work, we propose instead to use a research-grade, fully portable (wireless) fNIRS device (Artinis Octamon¹) as a means to continuously monitor cognitive changes, with the aim of subsequently using fNIRS as a reactive input to an interactive cinematic experience. The first step towards this goal, however, is a better understanding of how fNIRS data is affected by cinema viewing. Although the field of neurocinematics [1] has used fMRI to show that inter-subject correlation exists between viewers watching certain movie segments, work has yet to study such responses under the more natural viewing conditions afforded by fNIRS. For summer 2018, we have designed a study to measure frontal-cortex activity while participants watch classic cinematic vignettes, in which we intend to replicate the results of Hasson et al. [1] with fNIRS. Our study will ask participants to watch the same well-studied movie clips, and will combine analysis of oxygenated and deoxygenated haemoglobin ([de]Oxy) with HCI interview protocol data.

Research Questions:
1. How is fNIRS data affected by different cinematic techniques?
2.
Is it possible to find inter-subject correlation with fNIRS while subjects are watching well-understood cinematic vignettes?
3. How can we use features in fNIRS data to control an adaptive cinematic experience?

Future work: Building on our previous work, subsequent work will create experiences that use fNIRS to drive an interactive movie, and will present them at large-impact public engagements for arts, creative industries, and therapeutic applications. These performance-based research engagements will allow us to extend our existing framework for BCI film interactions, affects, design implications, and algorithms:
1. Adapt an existing, well-studied interactive film to use fNIRS data to allow interaction.
2. Explore neuro-therapeutic applications that engage patients through an element of creativity and entertainment as part of the system (explored collaboratively with neuroscience researchers).

References:
[1] Hasson, U., Landesman, O., Knappmeyer, B., Vallines, I., Rubin, N., & Heeger, D. J. (2008). Neurocinematics: The neuroscience of film. Projections, 2(1), 1-26.
[2] Pike, M., Ramchurn, R., Benford, S., & Wilson, M. L. (2016). #Scanners. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16), 5385-5396. https://doi.org/10.1145/2858036.2858276
[3] Solovey, E. T., Girouard, A., Chauncey, K., Hirshfield, L. M., Sassaroli, A., Zheng, F., ... & Jacob, R. J. (2009). Using fNIRS brain sensing in realistic HCI settings: Experiments and guidelines. In Proc. UIST '09, 157-166. ACM.

¹ https://www.artinis.com/octamon/
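To make the inter-subject correlation question concrete: a common approach (used in the fMRI neurocinematics literature) is leave-one-out ISC, correlating each subject's signal with the mean signal of the remaining subjects. The sketch below illustrates this for a single fNIRS channel's HbO time series; the array shapes, function name, and toy data are our illustrative assumptions, not the study's actual analysis pipeline.

```python
# Leave-one-out inter-subject correlation (ISC) sketch for one fNIRS channel.
# Illustrative only: real data would first need motion/physiology filtering.
import numpy as np

def isc_leave_one_out(hbo):
    """hbo: array of shape (n_subjects, n_timepoints) for one channel.
    Returns each subject's Pearson correlation with the mean of the others."""
    n = hbo.shape[0]
    scores = []
    for i in range(n):
        rest = np.delete(hbo, i, axis=0).mean(axis=0)  # mean of other subjects
        r = np.corrcoef(hbo[i], rest)[0, 1]            # Pearson correlation
        scores.append(r)
    return np.array(scores)

# Toy example: three "subjects" sharing a common stimulus-driven signal plus noise.
rng = np.random.default_rng(0)
common = np.sin(np.linspace(0, 8 * np.pi, 200))
hbo = common + 0.3 * rng.standard_normal((3, 200))
print(isc_leave_one_out(hbo))  # strong positive correlations for all subjects
```

High, consistent scores across subjects for a given vignette would indicate the kind of stimulus-locked shared response that Hasson et al. [1] reported with fMRI.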
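As a sketch of how fNIRS features might eventually drive an adaptive film, consider one minimal reactive scheme (our assumption for illustration, not the mechanism of [2]): compare each incoming HbO sample against a rolling baseline and fire a "cut" event when it deviates by more than a threshold number of standard deviations. The class name, window length, and threshold below are all hypothetical.

```python
# Toy reactive controller: fires when a new sample rises more than
# `threshold` standard deviations above a rolling baseline of recent samples.
import numpy as np
from collections import deque

class CutTrigger:
    def __init__(self, baseline_len=100, threshold=1.5, min_samples=10):
        self.baseline = deque(maxlen=baseline_len)  # rolling window of samples
        self.threshold = threshold
        self.min_samples = min_samples              # warm-up before firing

    def update(self, sample):
        """Add one sample; return True if it exceeds the baseline threshold."""
        fired = False
        if len(self.baseline) >= self.min_samples:
            mu = np.mean(self.baseline)
            sd = np.std(self.baseline)
            if sd > 0 and (sample - mu) / sd > self.threshold:
                fired = True
        self.baseline.append(sample)
        return fired

# Simulated quiet baseline, then a large deviation that should trigger a cut.
rng = np.random.default_rng(1)
trigger = CutTrigger()
for x in 0.1 * rng.standard_normal(50):
    trigger.update(x)
print(trigger.update(5.0))  # prints True
```

In practice, which features to threshold (and whether thresholding is appropriate at all) is exactly what the study's third research question is meant to inform.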