BCI-informed Visuals

An alternative controller system where brain-computer interface (BCI) data informs generative visuals.

Years Active: 2021–2023

My master's thesis project explores how brain-computer interface (BCI) signals can be used to drive generative visual feedback in interactive experiences. We designed and evaluated a system in which real-time functional near-infrared spectroscopy (fNIRS) data modulates visual elements, enabling users to influence digital art and game environments with their mental state.

Read more about this work: (Chen et al., 2023).

Visuals & Demo


We have all had this experience: your mind starts to wander, and you are no longer focused on the task at hand. You might still notice the person or object directly in front of you, but everything else fades into a blur. Then, when you snap back into focus, distant objects become crystal clear and the world feels brighter and more vivid. With this project, I aim to recreate that experience in video games, using brain sensing to let the player's mental state shape what they see and add a new layer of immersion and realism.
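As a rough illustration of that mechanic (not the project's actual implementation, which runs inside Unity's rendering pipeline), the sketch below shows one way a normalized focus level could control distance-dependent blur; the function name, constants, and the 0–1 focus scale are assumptions for illustration only.

```python
def blur_for_distance(distance_m, focus_level, focus_dist_m=2.0, max_blur_px=12.0):
    """Blur radius (pixels) for an object `distance_m` away, given a
    focus_level in [0, 1] derived from the brain signal.

    At low focus only nearby objects stay sharp; as focus rises,
    distant objects sharpen too."""
    falloff = max(0.0, distance_m - focus_dist_m)        # nothing inside the focus distance blurs
    distance_blur = max_blur_px * falloff / (falloff + 1.0)
    return distance_blur * (1.0 - focus_level)

# A focused player (0.9) sees a building 20 m away almost sharply (~1.1 px),
# while a distracted player (0.2) sees it heavily blurred (~9.1 px).
print(blur_for_distance(20.0, focus_level=0.9))
print(blur_for_distance(20.0, focus_level=0.2))
```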

Video Demo


Research Highlights & Insights


  • BCI-driven Visual Feedback: The system maps fNIRS values (e.g., oxyhemoglobin, deoxyhemoglobin) to generative art parameters, creating a closed feedback loop between user state and visuals; a minimal sketch of this mapping appears after this list.
  • Open-sourced Turbo Satori -> Unity Bridge: I developed and released an open-source bridge that streams real-time fNIRS data from Turbo Satori into Unity for interactive applications.
  • Design Exploration: Compared multiple visual feedback strategies, including abstract generative visual effects and game-based feedback, to assess their impact on user experience.
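
To make the loop concrete, here is a minimal Python sketch of the streaming-and-mapping idea: a synthetic sample source stands in for the real Turbo Satori network stream (whose protocol is not reproduced here), and a smoothing-and-mapping step turns oxy-/deoxyhemoglobin values into a 0–1 parameter that a visual effect could consume. The function names, the simple activation index, and the fixed scaling constants are illustrative assumptions, not the released bridge's code (which targets Unity).

```python
import random
import time

def fake_fnirs_samples(rate_hz=10):
    """Stand-in for the real data source. The released bridge streams samples
    from Turbo Satori's network interface into Unity; here we just generate
    synthetic oxy-/deoxyhemoglobin values so the mapping logic is runnable."""
    while True:
        hbo = random.gauss(0.0, 1.0)   # oxyhemoglobin change (arbitrary units)
        hbr = random.gauss(0.0, 0.5)   # deoxyhemoglobin change (arbitrary units)
        yield hbo, hbr
        time.sleep(1.0 / rate_hz)

def to_visual_parameter(hbo, hbr, state, alpha=0.1):
    """Map one fNIRS sample to a 0..1 parameter (e.g., blur strength or
    color saturation). Exponential smoothing keeps the visuals from
    flickering with sample-to-sample noise."""
    activation = hbo - hbr  # crude activation index; a real pipeline would filter and baseline
    state["smoothed"] = (1 - alpha) * state.get("smoothed", 0.0) + alpha * activation
    # Squash into [0, 1]; per-user calibration would replace these fixed constants.
    return max(0.0, min(1.0, 0.5 + 0.25 * state["smoothed"]))

if __name__ == "__main__":
    state = {}
    for hbo, hbr in fake_fnirs_samples():
        level = to_visual_parameter(hbo, hbr, state)
        print(f"visual parameter: {level:.2f}")  # would drive a shader/effect parameter in Unity
```

The exponential smoothing here is a stand-in for the filtering and per-user baselining a real fNIRS pipeline needs before the signal is stable enough to drive visuals.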

References

2023

  1. Impact of BCI-Informed Visual Effect Adaptation in a Walking Simulator
    Max Chen, Erin Solovey, and Gillian Smith
    In Proceedings of the 18th International Conference on the Foundations of Digital Games, Lisbon, Portugal, 2023