Wearable Brain-Computer Interfaces and VR: Neuroergonomics Meets the Metaverse

Abstract: The so-called “Metaverse” has been defined as a network of virtual worlds focused on social connection, enabled by innovations in virtual and augmented reality, in which users blend the real and the virtual seamlessly. As the worldwide COVID-19 pandemic has shown us, however, existing virtual reality (VR) applications are still far from achieving this goal, and social interactions in VR remain far from realistic. To address this problem, new technologies are emerging that stimulate additional senses, thereby improving the user’s sense of presence and immersion. Representative examples include scent diffusers attached to VR headsets, haptic suits and hands-free ultrasound-based haptics, digital taste stimulators, and multisensory VR pods that can also deliver somatosensory stimuli (e.g., via fans, heaters, and vapours). While multisensory experiences hold promise, it is not yet known what direct impact stimulating these additional senses has on the overall user experience, or whether an improved sense of presence is indeed achieved. To help answer these questions, we have recently embedded a wearable brain-computer interface directly into the head-mounted display, allowing for real-time measurement of electroencephalography (EEG), electrocardiography (ECG), electrooculography (EOG), and facial electromyography (EMG). With these signals, the user’s mental states, eye gaze, facial gestures, and head movements can be measured in real time, even in highly ecological settings. Using machine learning tools, these metrics can then be combined to predict the user’s sense of presence, immersion, engagement, and cybersickness. An application in which gamers’ quality of experience, sense of presence and immersion, and engagement levels were monitored remotely from their homes will be showcased. Moreover, access to such biosignals may allow for the development of bio/neuromarkers to monitor health intervention outcomes, thus opening doors to new clinical applications. In this presentation, I will show two case studies currently underway with our clinical partners: relaxation of nurses in a ward for aggressive patients, and improving cognitive functioning for patients with post-traumatic stress disorder.
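
As a rough illustration of how such biosignal-derived metrics might be combined by a machine-learning model, the following Python sketch trains a classifier to predict a binary sense-of-presence label from a handful of features (EEG band powers, an ECG-derived heart rate, an EOG-derived blink rate). All feature choices, signal parameters, labels, and the classifier are illustrative assumptions run on synthetic data; this is a minimal sketch of the general approach, not the pipeline used in the work described above.

    # Minimal sketch (synthetic data): predicting a sense-of-presence label
    # from multimodal biosignal features. Feature set and classifier are
    # illustrative assumptions, not the authors' actual pipeline.
    import numpy as np
    from scipy.signal import welch
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    FS = 256  # assumed EEG sampling rate (Hz)

    def eeg_band_powers(eeg_window):
        """Mean power in classic EEG bands for one single-channel window."""
        freqs, psd = welch(eeg_window, fs=FS, nperseg=FS)
        bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
        return [psd[(freqs >= lo) & (freqs < hi)].mean()
                for lo, hi in bands.values()]

    # Synthetic stand-ins for 200 ten-second trials of HMD-recorded signals.
    n_trials = 200
    X = []
    for _ in range(n_trials):
        eeg = rng.standard_normal(FS * 10)   # one EEG channel, 10 s
        heart_rate = rng.normal(75, 10)      # from ECG (beats per minute)
        blink_rate = rng.poisson(15)         # from EOG (blinks per minute)
        X.append(eeg_band_powers(eeg) + [heart_rate, blink_rate])
    X = np.array(X)
    y = rng.integers(0, 2, n_trials)         # 1 = high presence (hypothetical)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"Cross-validated accuracy: {scores.mean():.2f}")

With real (non-random) labels, the same structure extends naturally to the other predicted states mentioned above (immersion, engagement, cybersickness), each as its own target variable.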

Bio: Dr. Tiago H. Falk is a Full Professor at the Institut national de la recherche scientifique (INRS), part of the University of Québec network, in Montreal, Canada. He is Co-Chair of the IEEE SMC Technical Committee on Brain-Machine Interface Systems, has served as Co-Chair of the IEEE SMC Brain-Machine Interface Workshop since 2018, and is an Associate Editor of the IEEE Transactions on Human-Machine Systems and of the Frontiers in Neuroergonomics journal. He directs the Multisensory Signal Analysis and Enhancement Lab, where he and his team integrate signal processing and machine learning to make next-generation (multisensory) human-machine interfaces usable in highly ecological settings. His work has been showcased in over 340 journal papers, conference proceedings, books and book chapters, and patents. Dr. Falk received his postdoctoral training in neuroengineering at the University of Toronto, his PhD and MSc degrees in multimedia signal processing and machine learning from Queen’s University in Kingston (Canada), and his BSc in Electrical Engineering from the Federal University of Pernambuco (Brazil).

Tiago H. Falk, Full Professor, Institut national de la recherche scientifique, University of Québec, Canada