Virtual reality labs reshape how we process information


We live in a time when scientific research is growing at an unprecedented rate, generating equally unprecedented amounts of data. Disciplines such as neuroscience, astronomy and particle physics are accumulating so much information that finding new ways of representing, navigating and manipulating it is rapidly becoming a pressing necessity.

One especially promising approach relies on virtual and mixed reality platforms. What could be more intuitive and useful for, say, a neuroscientist trying to make sense of a huge and seemingly chaotic brain data set than the ability to fly through a gesture-controlled virtual representation of it and experience the properties of the data directly while searching for meaningful patterns?

The eXperience Induction Machine (XIM), built in the SPECS lab at Pompeu Fabra University in Barcelona, is one such immersive space, currently applied to data collected from the human brain. XIM allows researchers to visualize a brain connectome: the network of nodes and connections that defines what is going on in our most vital organ. XIM is now a key part of the Collective Experience of Empathic Data Systems (CEEDs), a European project seeking to develop a whole set of tools that bring big data visualisation to a new level.
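At its simplest, a connectome like the one XIM visualizes can be thought of as a weighted graph: nodes are brain regions, and edge weights are connection strengths. The sketch below is purely illustrative; the region names and weights are invented, not CEEDs or XIM data.

```python
# Toy connectome as a weighted graph: keys are (region, region) pairs,
# values are illustrative connection strengths (not real measurements).
connectome = {
    ("visual_cortex", "thalamus"): 0.8,
    ("thalamus", "prefrontal_cortex"): 0.6,
    ("visual_cortex", "prefrontal_cortex"): 0.3,
}

def strongest_connections(graph, threshold):
    """Return the edges whose connection strength exceeds the threshold."""
    return [edge for edge, weight in graph.items() if weight > threshold]

print(strongest_connections(connectome, 0.5))
# → [('visual_cortex', 'thalamus'), ('thalamus', 'prefrontal_cortex')]
```

Filtering by strength like this is one simple way to reduce a dense network to the patterns a researcher might want to fly through.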

XIM can be hooked up to a series of sensors that measure parameters such as the user's heart rate, skin conductance, eye gaze and brain activity. This lets the system register subconscious patterns in how we perceive and process information and guide the user's attention to areas of potential interest that would otherwise go unnoticed. This feature, along with XIM's high degree of interactivity, is what makes it stand out from other state-of-the-art virtual and mixed reality systems such as the AlloSphere at the California NanoSystems Institute or the CAVE2 at the University of Illinois at Chicago.
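The attention-guidance idea above can be sketched in a few lines: combine gaze dwell time with skin-conductance responses to flag data regions the user reacted to without consciously examining. Everything here is a hypothetical illustration; the function name, thresholds and sample values are assumptions, not the actual CEEDs pipeline.

```python
# Hypothetical sketch of biosensor-driven attention guidance.
# A region is flagged when the gaze only glanced at it (short dwell)
# yet the skin-conductance response (SCR) was strong, suggesting a
# subconscious reaction worth revisiting. Thresholds are illustrative.

def flag_regions(samples, dwell_threshold=0.5, scr_threshold=0.05):
    """samples: list of (region, dwell_seconds, scr_amplitude) tuples."""
    return [region for region, dwell, scr in samples
            if dwell < dwell_threshold and scr > scr_threshold]

samples = [
    ("cluster_A", 2.0, 0.01),  # long look, weak response: consciously examined
    ("cluster_B", 0.2, 0.09),  # brief glance, strong response: flag it
    ("cluster_C", 0.3, 0.02),  # brief glance, weak response: ignore
]
print(flag_regions(samples))  # → ['cluster_B']
```

A real system would of course fuse several signals over time rather than threshold a single reading, but the principle is the same: let the body's responses point the eyes back to what they skipped.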

Earlier this month, SPECS and CEEDs showcased their platform for embodied exploration of neural data at the 16th edition of Laval Virtual, the largest virtual reality technology conference in Europe.
