
Bridging Neuroscience and XR Insights Session: Your Questions Answered

February 10, 2023
by Conor Russomanno, Co-Founder and CEO, OpenBCI, and Eva Esteban, Embedded Software Engineer, OpenBCI
Medical Research

We got so many questions during our Insight Session on bridging neuroscience and XR that there wasn’t enough time to go through all of them during the live session. Below you can find answers to the remaining questions from OpenBCI’s Co-Founder & CEO Conor Russomanno and Embedded Software Engineer Eva Esteban. Get the full recording of the event here.

Contents:

  1. SENSORS & DATA
  2. MEASURING EMOTIONS
  3. GENERAL INTEREST, ETHICS & REGULATION
  4. USE CASES
  5. GALEA HEADSETS
  6. DEVELOPERS

Q&A

SENSORS & DATA

  • With 8 EEG channels, can you infer the activation of specific brain areas? Could you develop Galea-specific scripts for the EEG analysis?
    Galea has EEG electrodes located along the midline of the head (Fz, Cz), the occipital lobe (Oz, O1, O2), parietal lobe (Pz, P1, P2) and the prefrontal cortex (Fp1, Fp2). We are developing several toolsets that will ship with Galea and can be used for EEG analysis. It is also possible to use the open-source BrainFlow project to interface with the headset.
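For developers who want to try the BrainFlow route, here is a minimal sketch of connecting to a board and pulling a few seconds of data through the Python binding. The board ID and connection parameters below are placeholders (SYNTHETIC_BOARD is BrainFlow's built-in test board); consult the BrainFlow documentation for the identifiers that correspond to your Galea unit.

```python
# Minimal BrainFlow acquisition sketch (Python). Board ID and IP address are
# placeholders; check the BrainFlow docs for your Galea unit's identifiers.
import time

from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds

params = BrainFlowInputParams()
params.ip_address = "192.168.4.1"        # hypothetical headset address
board_id = BoardIds.SYNTHETIC_BOARD      # stand-in test board; swap in your Galea board ID

board = BoardShim(board_id, params)
board.prepare_session()                  # open the connection
board.start_stream()                     # begin buffering samples
time.sleep(5)                            # collect roughly five seconds of data

data = board.get_board_data()            # 2D array: rows are channels, columns are samples
eeg_channels = BoardShim.get_eeg_channels(board_id)
print("EEG channel rows:", eeg_channels, "samples:", data.shape[1])

board.stop_stream()
board.release_session()
```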

 

  • Is a non-invasive BCI “less” powerful than an invasive one? I mean, is the type of data that can be gathered by a non-invasive BCI somewhat limited?
    Invasive BCIs can provide more precise and localized information. This comes with the tradeoff of requiring surgical implantation. There are use cases for both system types. Invasive BCIs are useful in applications where information is needed from a specific location in the brain (e.g. medical applications). Non-invasive BCIs provide data that is not as well localized, but have the advantages of rapid setup, configurability, and generalization to a large population of users. They also do not come with the risks and complexity associated with surgical implantation.

 

  • How many EEG electrodes are there on the Galea beta?
    There are 10 EEG electrodes on the Galea Beta device. These are located at the frontal lobe (F1, F2), central area (Cz, C3, C4), parietal lobe (Pz, P3, P4) and occipital lobe (O1, O2).

 

  • Why isn’t eye tracking used as a primary pointer input mechanism vs. hand tracking and controllers?
    Eye tracking can be a useful pointer system, but it is not always the case that a player will want to interact with the object they are looking at. For example, imagine a use case where the player is watching a scene and waiting for an action to occur before they press a button. If eye tracking were the only available pointer interface, they would be unable to perform both actions at once. A combination of multiple modalities is probably best – just as in real life!

 

  • Does the system apply filters or other pre-processing steps to the EEG data?
    Yes, it does. These filters can be configured through the Galea GUI utility or at the SDK level, depending on the task and environmental requirements.
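As an illustration of the kind of pre-processing involved (this is not the Galea SDK's own filter implementation), a band-pass plus mains-notch filter applied to a single EEG channel might look like the sketch below. The 250 Hz sampling rate, cut-off frequencies, and 60 Hz mains frequency are assumptions for the example.

```python
# Illustrative EEG pre-processing with SciPy (not the Galea SDK's built-in filters).
# Sampling rate, cut-offs, and mains frequency are assumptions for the example.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

fs = 250.0                                    # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)  # toy signal

# 1-40 Hz band-pass to keep the typical EEG band
b, a = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)
bandpassed = filtfilt(b, a, raw)

# 60 Hz notch to suppress mains interference (use 50 Hz in many regions)
b_notch, a_notch = iirnotch(w0=60.0, Q=30.0, fs=fs)
clean = filtfilt(b_notch, a_notch, bandpassed)
```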

 

  • Can you record and download raw signals?
    Yes, you can record and save the raw data. You can also import it into your applications in real time.
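If you go through BrainFlow, its file helpers are one way to save and reload a raw recording. A minimal sketch follows; the file name is arbitrary, and the random array stands in for real board data.

```python
# Saving and reloading a raw recording with BrainFlow's file helpers (Python).
# The random array stands in for the output of board.get_board_data().
import numpy as np
from brainflow.data_filter import DataFilter

data = np.random.randn(8, 1000)                         # channels x samples stand-in
DataFilter.write_file(data, "galea_session.csv", "w")   # 'w' overwrites, 'a' appends
restored = DataFilter.read_file("galea_session.csv")    # same channels-by-samples layout
```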

 

  • How is the signal quality of Galea?
    Early comparisons vs. Biosemi’s ActiveTwo system were extremely favorable, and we are working with external partners on publications comparing Galea with other systems.

 

  • How much preprocessing is already done? In other words: how much of the EEG data is noise?
    Several provided filters can be configured through the Galea GUI utility or at the SDK level, depending on the task and environmental requirements.

 

  • Are you looking into ways to optimize the data in real-world scenarios? Traditionally, the signals we receive in real-world use cases produce very messy data: easy to use in a laboratory setting but often unusable when the user is physically moving in a real environment, etc.
    We are implementing features on both the hardware and software sides of the platform to optionally remove, reduce, and label motion artifacts when they occur.

 

  • Is there some sort of a measurement threshold for head movements in the Galea? If so, how much does it compensate for? Does the useful signal get retained during movement noise?
    The degree to which a signal of interest can be retained after removing a motion artifact is dependent on the artifact in question. Galea Beta devices will include features that optionally remove, reduce, and label motion artifacts (depending on user need and the artifact in question) when they occur.

 

  • Are you working on collecting data with subjects in physical motion, and are the signals clean enough to interpret them?
    There will be cases where motion artifacts will appear in the data, but early Galea Beta prototypes are demonstrating robustness to motion artifacts in a variety of VR environments. We will also provide tools to optionally remove, reduce, and label noise artifacts.

 

  • I’m interested in the integration of heart rate and EEG in the VR environment. Since the heart rate is recorded with a PPG sensor that uses an optical sensor, what is the time resolution of the heart rate? In other words, how often is the heart rate sampled?
    The sampling rate of the PPG sensor is 50 Hz.
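At 50 Hz the pulse waveform is well resolved, so instantaneous heart rate can be estimated from the spacing of systolic peaks. A rough sketch follows; the synthetic signal and peak-detection settings are illustrative and would need tuning for real PPG data.

```python
# Estimating heart rate from a 50 Hz PPG trace via peak-to-peak intervals.
# The synthetic signal and peak-detection settings are illustrative only.
import numpy as np
from scipy.signal import find_peaks

fs = 50.0                                        # PPG sampling rate in Hz
t = np.arange(0, 30, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t)                # toy pulse wave, ~72 beats per minute

peaks, _ = find_peaks(ppg, distance=fs * 0.4)    # at most one peak every 0.4 s
ibi = np.diff(peaks) / fs                        # inter-beat intervals in seconds
heart_rate_bpm = 60.0 / ibi.mean()
print(f"Estimated heart rate: {heart_rate_bpm:.1f} bpm")
```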

 

  • Is there connectivity for integrating into other systems, such as a manufacturing environment? Co-bot or other device control or manipulation? Recording and reporting on metrics?
    The Galea GUI application provides the ability to stream data via Serial, UDP, LSL, and OSC protocols. There are also language bindings to Python, C++, Java, C#, R, Matlab, Julia, and Rust via the open-source BrainFlow project.
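As a sketch of the receiving side, the snippet below uses pylsl to pull samples from an LSL outlet. The stream type string "EEG" is an assumption; match it to whatever stream the Galea GUI actually advertises.

```python
# Reading an LSL stream on the consumer side (Python, pylsl).
# The stream type "EEG" is an assumption; match the outlet the GUI creates.
from pylsl import StreamInlet, resolve_stream

streams = resolve_stream("type", "EEG")      # blocks until a matching outlet is found
inlet = StreamInlet(streams[0])

for _ in range(100):
    sample, timestamp = inlet.pull_sample()  # one multichannel sample plus LSL timestamp
    print(timestamp, sample[:4])             # print the first few channels
```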

 

  • Any efforts toward collecting data in real-world scenarios with subjects physically moving (walking, running)? This can often produce messy, unusable data. Any work toward providing cleaner signals in these use cases?
    There will be cases where motion artifacts will appear in the data, but early Galea Beta prototypes are demonstrating robustness to motion artifacts in a variety of VR environments. We will also provide tools to optionally remove, reduce, and label noise artifacts.

 

  • While wearing the XR headset in a living room or somewhere else, how do you differentiate unintentional scalp movement from the brain data?
    There are a variety of methods for detecting problems with signal quality. Motion artifacts in particular are often loud events which clearly contrast with the much quieter EEG brain data. The multimodal nature of the headset, along with the built-in tracking of the XR headset also contribute to identifying motion artifacts.
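As a deliberately naive illustration of the "loud event" idea, the sketch below flags windows whose peak-to-peak amplitude far exceeds typical EEG levels. The threshold, window length, and sampling rate are assumptions; real pipelines combine multiple sensors and the headset's motion tracking.

```python
# Naive motion-artifact flagging: mark windows whose peak-to-peak amplitude is
# far above typical EEG levels. Threshold, window, and sampling rate are assumptions.
import numpy as np

def flag_artifacts(eeg_uv, fs=250, window_s=0.5, threshold_uv=200.0):
    """Return one boolean per window: True where the segment looks like an artifact."""
    win = int(fs * window_s)
    n_windows = len(eeg_uv) // win
    flags = []
    for i in range(n_windows):
        segment = eeg_uv[i * win:(i + 1) * win]
        flags.append(np.ptp(segment) > threshold_uv)   # peak-to-peak amplitude check
    return np.array(flags)

# Toy usage: quiet EEG-like noise with a large artifact injected in the middle
signal = np.random.randn(2500) * 20.0
signal[1000:1100] += 500.0
print(flag_artifacts(signal))
```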

 

  • What are the regions you track from the EEG electrodes?
    Galea Alpha has EEG electrodes located along the midline of the head (Fz, Cz), the occipital lobe (Oz, O1, O2), parietal lobe (Pz, P1, P2) and the prefrontal cortex (Fp1, Fp2). There are 10 EEG electrodes on the Galea Beta device. These are located at the frontal lobe (F1, F2), central area (Cz, C3, C4), parietal lobe (Pz, P3, P4) and occipital lobe (O1, O2).

MEASURING EMOTIONS

  • How is stress measured?
    Stress is measured using a combination of EEG, EDA, pupillometry, and heart rate.

 

  • Are these Stress Increase and Heartrate Increase markers the First Derivative markers Eva mentioned a few minutes ago?
    Not exactly. “First derivative” or “first tier” metrics relate to information that can be determined purely from raw signals. EEG band power, heart rate, and blood oxygenation are good examples of this. Higher-tier metrics use a combination of raw signals and lower-tier metrics to infer more abstract information (e.g., stress, cognitive load, drowsiness).
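For instance, a first-tier metric like alpha band power can be computed directly from a raw EEG channel with a power spectral density estimate. The sketch below uses Welch's method on a synthetic signal; the sampling rate and band edges are assumptions.

```python
# First-tier metric example: alpha (8-12 Hz) band power from one EEG channel
# using Welch's PSD estimate. Sampling rate and band edges are assumptions.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 250.0
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(len(t))  # toy alpha-dominant signal

freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))          # 2-second windows
alpha_band = (freqs >= 8) & (freqs <= 12)
alpha_power = trapezoid(psd[alpha_band], freqs[alpha_band])  # integrate PSD over the band
print(f"Alpha band power: {alpha_power:.3f}")
```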

 

  • Do you offer cognitive state estimation algorithms (e.g. emotions) based on the biosignals?
    We are developing a suite of tools for classifying a set of specific cognitive states that will ship with Galea Beta. There is also promising work related to determining emotional valence and arousal being conducted by academic partners using Galea.

 

  • Do you label the data, indicating when a state of flow is achieved or when someone is bored?
    We are developing a suite of tools for classifying a set of specific cognitive states that will ship with Galea Beta. These tools will include features for labeling when cognitive states are entered and exited.

 

  • How is the parameter of stress quantified? Is it based on any biosensing information, such as eye movement?
    Stress is measured using a combination of EEG, EDA, pupillometry, and heart rate.

 

  • Have you integrated a deep learning solution in your software to identify emotions (e.g., happiness, sadness, fear, disgust, etc.) from Galea’s neural signals?
    We are developing a suite of tools for classifying a set of specific cognitive states that will ship with Galea Beta. We are currently demonstrating preliminary versions of some of these classification solutions.

 

  • Is it possible to read these emotions via the Unreal or Unity SDK?
    It is not yet possible to determine the generalized emotional state of a wide variety of users, but promising work related to determining emotional valence and arousal is being conducted by academic partners using Galea. We are developing a suite of tools for classifying a set of specific cognitive states that will ship with Galea Beta. Galea Alpha and Beta devices both support Unreal Engine and Unity.

 

  • Do you offer cognitive state estimation algorithms (e.g., emotions) based on the biosignals?
    We are developing a suite of tools for classifying a set of specific cognitive states that will ship with Galea Beta.

GENERAL INTEREST, ETHICS & REGULATION

  • How far out do you think you are from being able to send inputs to the brain from the BCI rather than just getting information from the brain?

    We can already send inputs to the brain via the audiovisual XR aspect of the Galea headset. The key enabler of the platform is the ability to tightly synchronize the presentation of complex stimuli to users (graphics, audio), the measurement of physiological responses to those stimuli, and the dynamic adaptation of the environment in response to those reactions.
  • Do you guys plan on ever implementing a way to write data to the brain, similar to Neuralink?
    At this time, we don’t have plans to support magnetic or electrical stimulation with Galea.
  • With ML/AI learning and a lot more users of Galea, how far (how many decades) would you say we are from getting close to the “Ready Player Two” device and capabilities?
    We are at least a couple of decades away from approaching Harman’s “brain in a vat” concept (or any of its philosophical predecessors). Some of the largest obstacles are high-resolution motor imagery (or micro gestures) and high-resolution sensory feedback. One area we are exploring with Galea today is how virtual environments can interact better with users in a social context. As humans, we are very good at reading subtle social and emotional cues. Galea can help applications better understand these contexts and provide developers with tools to create more immersive, responsive, and human experiences.

 

  • How does the user use facial expressions to control the virtual object? Does the user need to learn and practice how to move her cheek muscle, for example? Does it need to build a face muscle memory?
    Currently, we have several face gestures that are classified by generalized machine-learning models and mapped to actions in the XR environment (much like game controller buttons).

 

  • How would you feel, ethically, about using this system to qualify and categorize candidates for employment? Could this create an unintentional bias?
    We haven’t explored using Galea in a recruiting context. Galea is a tool, and if used to evaluate candidates in a recruiting context, it would be important for the interviewer to understand and adjust their evaluations based on the benefits and limitations of the recruitment device and environment (similar to resumes, interviews, technical questions, and personality quizzes).

 

  • Can this device be used to support the metaverse? If possible, how?
    The Galea headset leverages the Varjo Aero VR headset, which includes support for OpenVR and OpenXR-compatible applications like VRChat and Second Life.

 

  • Is there also a way to have something like thinking control? Thinking of opening a door or something like this?
    The neurotechnology community uses OpenBCI products for all sorts of cool applications! (https://openbci.com/community/category/tutorials).

 

USE CASES

  • I have Multiple Sclerosis. Are products being developed specifically for neurological-based illnesses? By the way, I use the Aero to continue flying with a flight simulator, and the experience is amazing!
    We’re currently working with one of our partners, who is severely motor disabled, on a project that will allow him to pilot a drone from VR using Galea. In the future, it might be possible to adapt the interface we’re building for our partner to other individuals.

 

  • Which market do you feel will be the largest initial one to adopt Galea?
    The largest industries that have expressed interest in Galea are training and simulation, entertainment and gaming, healthcare and wellness, and user experience. We plan to explore smaller markets over the next few months as well.

 

 

GALEA HEADSETS

  • Are Galea XR headsets wireless or tethered? And what’s the price point for a beta headset?
    The HMD component of the device is wired. The biosensing component is wireless and uses Wi-Fi for data transmission. The Aero Galea package costs $25,000 and the XR-3 Galea package costs $34,000.

 

  • Will all the data (EMG, HR, EEG, etc.) be available if purchasing the Galea? How heavy is the HMD with all the equipment?
    Our software SDK will give you access to all the data types collected by Galea: EEG, EMG, EOG, EDA, PPG, and image-based eye-tracking. The estimated weight of the device is 800-900 grams.

 

  • Is there a path for companies that already have multiple XR-3s to leverage this and use Galea and the beta program without purchasing another headset, such as buying just the Galea hardware and the software to support the beta?
    This option is not available at the moment. Galea units must be purchased as offered on galea.co.

 

  • What’s that technology called, where the headset tracks the user’s focal spot and reduces resolution or blurs the regions not in the user’s focal spot? Does the XR-3 do this?
    This is called foveated rendering. It is supported by both the Varjo XR-3 and Varjo Aero headsets as well as their integrations with Galea. https://developer.varjo.com/docs/native/foveated-rendering-api

 

  • Is there (or is there planned) any wireless version of Galea? Does it have spatial awareness capabilities, like HoloLens?
    The Galea biosensor data is actually already streamed wirelessly due to the need to electrically isolate the system from ground noise that would otherwise overwhelm EEG and other electrical signals. The Varjo XR-3 (and its Galea integration) currently has beta support for inside-out tracking.

 

  • Do you have a wireless version of the headset?
    At the moment, the HMD component of the device is wired. The biosensing component is wireless and uses Wi-Fi for data transmission.

 

  • Can Galea detect VR motion sickness before it becomes too much for the user?
    This is on our roadmap. Research indicates that there are clear physiological indicators of VR sickness that could be detected by the sensor modalities supported by Galea.

DEVELOPERS

  • Do you think these derivative datasets for stress, attention, etc. will also be open source in the future?
    We are working to find the right balance between the open-source mission at the heart of OpenBCI and the costs associated with curating statistically significant datasets.

 

  • To what extent does Galea provide an API to understand what brain activity implies? Or are there only raw data samples available that require additional processing by an expert?
    We are developing a suite of tools for classifying a set of specific cognitive states that will ship with Galea Beta. We are currently demonstrating preliminary versions of some of these classification solutions. Raw data samples will also be available for manual processing if use-case-specific classifications are required.
  • How do you access this data for an application? Do you need your specific SDK, or is there also planned an OpenXR extension in the future?
    The XR capabilities of the headset are already compatible with OpenXR (Khronos) and OpenVR (Valve). The biosensing capabilities of the headset are currently available via an SDK.

 

 

WATCH THE FULL RECORDING

Read more on Galea.co

