An alternative controller system where brain-computer interface (BCI) data informs generative visuals.
Years Active: 2021–2023
My master's thesis project explores how brain-computer interface (BCI) signals can drive generative visual feedback in interactive experiences. I designed and evaluated a system in which real-time functional near-infrared spectroscopy (fNIRS) data modulates visual elements, letting users influence digital art and game environments with their mental state.
We have all had this experience: your mind starts to wander, and you are no longer focused on the task at hand. You might notice the person or object right in front of you, but everything else fades into a blur. Then, when you snap back into focus, distant objects become crystal clear, and the world seems brighter and more vivid than ever before. With this project, I aim to recreate this visual experience in video games, using brain-sensing techniques to add a new level of immersion and realism by letting the player's mental state shape what they see.
Video Demo
Research Highlights & Insights
BCI-driven Visual Feedback: The system maps fNIRS measures (e.g., oxyhemoglobin and deoxyhemoglobin concentrations) to generative art parameters, creating a closed feedback loop between the user's mental state and the visuals (sketched in code after this list).
Open-sourced Turbo Satori -> Unity Bridge: I developed and released an open-source bridge that streams real-time fNIRS data from Turbo Satori into Unity for interactive applications (see the client sketch below).
Design Exploration: Compared multiple visual feedback strategies, including abstract generative visual effects and game-based feedback, to assess their impact on user experience.
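To illustrate the mapping in the first highlight, here is a minimal Unity C# sketch of the closed loop: a smoothed oxyhemoglobin (HbO) sample drives a blur parameter on a full-screen material, so lower activity produces blurrier vision. The class name, the `_BlurSize` shader property, and the signal range are illustrative assumptions, not the thesis implementation.

```csharp
using UnityEngine;

// Minimal sketch of the BCI-driven feedback loop: a normalized
// oxyhemoglobin (HbO) sample drives a full-screen blur amount.
// The "_BlurSize" shader property and the signal range are
// placeholder assumptions.
public class FnirsBlurFeedback : MonoBehaviour
{
    public Material blurMaterial;   // full-screen blur shader material
    public float minHbO = -1f;      // assumed normalized signal range
    public float maxHbO = 1f;
    public float maxBlur = 5f;      // blur strength at lowest activity
    public float smoothing = 0.1f;  // exponential smoothing factor

    float _smoothedHbO;

    // Called by the data bridge whenever a new fNIRS sample arrives.
    public void OnSample(float hbO)
    {
        _smoothedHbO = Mathf.Lerp(_smoothedHbO, hbO, smoothing);
    }

    void Update()
    {
        // Less brain activity -> stronger blur, recreating the
        // impression of losing focus.
        float t = Mathf.InverseLerp(minHbO, maxHbO, _smoothedHbO);
        blurMaterial.SetFloat("_BlurSize", Mathf.Lerp(maxBlur, 0f, t));
    }
}
```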
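And here is a sketch of how such samples might reach Unity from Turbo Satori over the network. The host, port, and newline-delimited wire format below are placeholders for illustration; the released bridge uses Turbo Satori's actual network interface.

```csharp
using System.IO;
using System.Net.Sockets;
using System.Threading;
using UnityEngine;

// Sketch of a bridge client that receives fNIRS values over TCP and
// forwards them to the visual feedback component. Port and message
// format are placeholder assumptions.
public class TurboSatoriClient : MonoBehaviour
{
    public string host = "127.0.0.1";
    public int port = 55555;            // placeholder port
    public FnirsBlurFeedback feedback;  // consumer of incoming samples

    TcpClient _client;
    Thread _readThread;
    volatile float _latest;
    volatile bool _running;

    void Start()
    {
        _client = new TcpClient(host, port);
        _running = true;
        _readThread = new Thread(ReadLoop) { IsBackground = true };
        _readThread.Start();
    }

    void ReadLoop()
    {
        using (var reader = new StreamReader(_client.GetStream()))
        {
            string line;
            while (_running && (line = reader.ReadLine()) != null)
            {
                // Assumed format: one HbO value per line.
                if (float.TryParse(line, out var value))
                    _latest = value;
            }
        }
    }

    void Update()
    {
        // Hand the most recent sample to the feedback on the main thread.
        if (feedback != null)
            feedback.OnSample(_latest);
    }

    void OnDestroy()
    {
        _running = false;
        _client?.Close();
    }
}
```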
References
2023
Impact of BCI-Informed Visual Effect Adaptation in a Walking Simulator
Max Chen, Erin Solovey, and Gillian Smith
In Proceedings of the 18th International Conference on the Foundations of Digital Games, Lisbon, Portugal, 2023
In this paper, we explore the use of brain-computer interface (BCI)-adapted visual effects to support atmosphere in a walking simulator and investigate their impact on player-reported immersion. While players used a keyboard or joystick controller for basic character movement, their mental state was measured with a non-invasive BCI technique called functional near-infrared spectroscopy (fNIRS) to implicitly adjust the visual effects: when less brain activity is detected, the player's in-game vision becomes blurry and distorted, recreating the impression of losing focus. To analyze the player experience, we conducted a within-subjects study in which participants played both a BCI-controlled and a non-BCI-controlled version of the game and completed a questionnaire after each session. We then conducted semi-structured interviews to investigate player perceptions of the BCI's impact on their experience. The results showed that players had slightly improved immersion in the BCI-adaptive game, with a significant difference in the temporal dissociation score. Players also reported that the BCI-adaptive visual effects were realistic and natural, and that they enjoyed using the BCI as a supplemental control.
@inproceedings{chen2023impact,
  author    = {Chen, Max and Solovey, Erin and Smith, Gillian},
  title     = {Impact of BCI-Informed Visual Effect Adaptation in a Walking Simulator},
  year      = {2023},
  isbn      = {9781450398558},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  url       = {https://doi.org/10.1145/3582437.3582448},
  doi       = {10.1145/3582437.3582448},
  booktitle = {Proceedings of the 18th International Conference on the Foundations of Digital Games},
  articleno = {5},
  numpages  = {8},
  keywords  = {BCI, Immersive Experience, Personalized Experience, Visual Effect, Walking Simulator, fNIRS},
  location  = {Lisbon, Portugal},
  series    = {FDG '23},
}