<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Convergent Science Network &#187; Virtual Reality</title>
	<atom:link href="https://csnblog.specs-lab.com/tag/virtual-reality/feed/" rel="self" type="application/rss+xml" />
	<link>https://csnblog.specs-lab.com</link>
	<description>Blog on Biomimetics and Neurotechnology. With writers Michael Szollosy, Dmitry Malkov, and Michelle Wilson, and editor Anna Mura</description>
	<lastBuildDate>Tue, 27 Sep 2022 14:58:43 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>How to make your brains feel at home</title>
		<link>https://csnblog.specs-lab.com/2014/05/21/how-to-make-your-brains-feel-at-home/</link>
		<comments>https://csnblog.specs-lab.com/2014/05/21/how-to-make-your-brains-feel-at-home/#comments</comments>
		<pubDate>Wed, 21 May 2014 14:45:18 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[architecture]]></category>
		<category><![CDATA[Art and design]]></category>
		<category><![CDATA[CAVE]]></category>
		<category><![CDATA[Intelligent environment]]></category>
		<category><![CDATA[Interactive architecture]]></category>
		<category><![CDATA[neuroarchitecture]]></category>
		<category><![CDATA[Neuroscience]]></category>
		<category><![CDATA[SPECS]]></category>
		<category><![CDATA[Synthetic Oracle]]></category>
		<category><![CDATA[Virtual Reality]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5247</guid>
		<description><![CDATA[The physical spaces we inhabit have a direct influence on how we feel, think and behave. Understanding this implicit dialogue between built environments and our minds continues to open new ways for architects to design physical spaces that better meet &#8230; <a href="https://csnblog.specs-lab.com/2014/05/21/how-to-make-your-brains-feel-at-home/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_5266" style="width: 594px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/Spacemaker_Founder.jpg" rel="attachment wp-att-5266"><img class="size-large wp-image-5266" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/Spacemaker_Founder-1024x575.jpg" alt="Spacemaker VR is an application for the Oculus headset that allows designers to walk through their creations" width="584" height="327" /></a><p class="wp-caption-text">Spacemaker VR is an Oculus-based virtual reality system that allows designers to walk through their designs<br />Source: Digital Physical</p></div>
<p>The physical spaces we inhabit have a direct influence on how we feel, think and behave. Understanding this implicit dialogue between built environments and our minds continues to open new ways for architects to design physical spaces that better meet people’s needs. Neuro-architecture, interactive architecture, intelligent environments and virtual reality technology are among the exciting, partially overlapping disciplines currently at the forefront of this architectural revolution.</p>
<p><span id="more-5247"></span></p>
<p><strong>Neuro-architecture</strong></p>
<p>Where architects previously had to rely on purely anecdotal and intuitive principles, neuro-architecture now promises a truly evidence-based neurobiological rationale for designing architectural spaces, be it your office, school or hospital. It draws on a variety of techniques that allow researchers to quantify and measure human responses to the individual components of an architectural environment, including by measuring activity in the relevant regions of the brain.</p>
<p>The value of understanding the precise effects of each component on our mental and physical health is difficult to overestimate. Imagine being able to design classrooms whose very architectural configuration aids students’ concentration and improves learning, or hospitals that accelerate patients’ recovery. The research can be carried out on existing buildings, on physical models, or in virtual reality simulations before the actual structures are even built.</p>
<p>Virtual reality, in fact, can be extremely helpful in neuro-architecture research for several reasons. It allows researchers to set up virtual environments that participants can navigate under life-like conditions, while giving researchers systematic control over the stimuli introduced. Importantly, responses can be measured at different scales, from an entire building down to a single room or even a single architectural feature, such as the height of the ceiling or the amount and quality of light allowed into the space. Some <a href="http://eaedesign.com/InnovativeDesignScience.com/Research_Activities_-_CAVE_Technology.html">interesting research</a> in this direction was conducted by Professor <a href="http://cala.arizona.edu/users/eve-edelstein-phd?destination=user/1386">Eve Edelstein</a> using the virtual reality platform <a href="http://en.wikipedia.org/wiki/Cave_automatic_virtual_environment">CAVE</a>.</p>
<p>For those who find the CAVE and other virtual reality rooms not immersive enough, consider the possibilities opened up by the much-discussed Oculus headset. Unlike virtual reality rooms, which are not easily accessible to everyone, virtual reality headsets are potentially available to every designer. Instead of fiddling with physical prototypes, designers can now walk through their own creations and actually experience them. <a href="http://digitalphysical.com/spacemaker/">Spacemaker VR</a> from <a href="http://digitalphysical.com/">Digital Physical</a> is one example of how this technology can be put to work for the benefit of architects.</p>
<p>Read <a href="http://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/#more-5239">our previous post</a> to learn about the eXperience Induction Machine, another exciting application of virtual reality.</p>
<p><strong>Interactive architecture and intelligent environments</strong></p>
<p>The two terms are often used interchangeably, perhaps with a slightly more artistic connotation for the former and a more functional one for the latter. Whatever the difference between them, both draw on the increasing penetration of computing into our daily lives to develop dynamic environments that can adapt their physical properties to the behaviour of their inhabitants. The ultimate goal, of course, is to make people feel more at home and in harmony with their physical surroundings.</p>
<p>Many examples of interactive architecture are born from a mixture of artistic thinking and computational engineering. So far, researchers have been experimenting with some of the most fundamental parameters known to alter our state of mind, and, unsurprisingly, light is one of the favourites. Check out the two examples below: the <a href="http://www.iua.upf.edu/syntheticOracle/">Synthetic Oracle</a> (formerly Hello Stranger) from the <a href="http://specs.upf.edu/home">SPECS</a> group at <a href="http://www.upf.edu/es/">Pompeu Fabra University</a>, and <a href="http://www.behance.net/gallery/BIOSTAGOG/7609469">BIOSTAGOG</a>, developed jointly by <a href="http://www.platige.com/">Platige Image</a> and <a href="http://www.brdg.pl/">Bridge</a>.</p>
<p><iframe width="584" height="438" src="https://www.youtube.com/embed/SAeys1fK3Zo?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p><iframe src="//player.vimeo.com/video/66800080" width="584" height="329" frameborder="0" title="INTERACTIVE INSTALLATION BY BRIDGE AND PLATIGE IMAGE." webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/05/21/how-to-make-your-brains-feel-at-home/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Virtual reality labs reshape how we process information</title>
		<link>https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/</link>
		<comments>https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/#comments</comments>
		<pubDate>Wed, 07 May 2014 05:40:39 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[big data]]></category>
		<category><![CDATA[brain]]></category>
		<category><![CDATA[CEEDS]]></category>
		<category><![CDATA[eXperience Induction Machine]]></category>
		<category><![CDATA[Laval Virtual]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[Neuroscience]]></category>
		<category><![CDATA[SPECS]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<category><![CDATA[XIM]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5239</guid>
		<description><![CDATA[We live in a time when the scale of scientific research is undergoing an unprecedented exponential growth, which contributes to the generation of equally unprecedented amounts of data. Disciplines like neuroscience, astronomy or particle physics are piling up so much &#8230; <a href="https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/X31.jpg" rel="attachment wp-att-5257"><img class="aligncenter size-large wp-image-5257" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/X31-1024x501.jpg" alt="X3" width="584" height="285" /></a></p>
<p>We live in a time when the scale of scientific research is undergoing an unprecedented exponential growth, which contributes to the generation of equally unprecedented amounts of data. Disciplines like neuroscience, astronomy or particle physics are piling up so much information that finding and implementing new ways of representing, navigating and manipulating this information is rapidly becoming a pressing necessity.</p>
<p><span id="more-5239"></span></p>
<p>One especially promising approach relies on the use of virtual and mixed reality platforms. What could be more intuitive and useful for, say, a neuroscientist trying to make sense of a huge and seemingly chaotic brain data set than the ability to fly through a virtual, gesture-controlled representation of it and actually experience the properties of the data in search of meaningful patterns?</p>
<p>The <a href="http://specs.upf.edu/research_in_mixed_and_virtual_reality">eXperience Induction Machine</a> (XIM), built in the <a href="http://specs.upf.edu/">SPECS lab</a> at <a href="http://www.upf.edu/en/">Pompeu Fabra University</a> in Barcelona, is one example of such an immersive space, and it is currently being applied precisely to data collected from the human brain. XIM allows researchers to visualize a brain connectome, the network of nodes and connections that describes how our vital organ is wired and what is going on inside it. XIM is now a key part of the <a href="http://ceeds-project.eu/">Collective Experience of Empathic Data Systems</a> (CEEDs) project, a European effort to develop a whole set of tools that bring big-data visualisation to a new level.</p>
<p><iframe width="584" height="329" src="https://www.youtube.com/embed/PRXuMIZDucc?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p>XIM can be hooked up to a series of sensors that measure parameters such as the user’s heart rate, skin conductance, eye gaze and brain activity. This allows the system to register certain subconscious patterns associated with how we perceive and process information, and to guide the user’s attention to areas of potential interest that would otherwise go unnoticed. This feature, along with XIM&#8217;s increased interactivity, is what makes XIM stand out in comparison with other state-of-the-art virtual and mixed reality systems such as the <a href="http://www.allosphere.ucsb.edu/index.php">AlloSphere</a> at the <a href="http://www1.cnsi.ucla.edu/index">California Nanosystems Institute</a> or the <a href="http://www.evl.uic.edu/core.php?mod=4&amp;type=1&amp;indi=424">CAVE2</a> at the <a href="http://www.uic.edu/uic/">University of Illinois at Chicago</a>.</p>
<p>Earlier this month, SPECS and CEEDs showcased their platform for the embodied exploration of neural data at the 16<sup>th</sup> edition of <a href="http://www.laval-virtual.org/en/">Laval Virtual</a>, the largest virtual reality technology conference in Europe. You can see a complete photo report from the event <a href="http://ceeds-project.eu/2014/04/14/ceeds-laval-virtual-2014-in-pictures/">here</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Europe&#039;s Integrative Technology</title>
		<link>https://csnblog.specs-lab.com/2011/12/15/2607/</link>
		<comments>https://csnblog.specs-lab.com/2011/12/15/2607/#comments</comments>
		<pubDate>Thu, 15 Dec 2011 13:00:10 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Europe]]></category>
		<category><![CDATA[Robots and Health]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[EU Research]]></category>
		<category><![CDATA[MINDWALKER]]></category>
		<category><![CDATA[Rehabilitation]]></category>
		<category><![CDATA[RGS]]></category>
		<category><![CDATA[Robot Companions]]></category>
		<category><![CDATA[SPECS]]></category>
		<category><![CDATA[Universidad Pompeu Fabra]]></category>
		<category><![CDATA[University of Twente]]></category>
		<category><![CDATA[Virtual Reality]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=2607</guid>
		<description><![CDATA[Robots for stroke patients and more&#8230; The video above features the LOPES (Lower Extremity-Powered ExoSkeleton) developed by Dr. ir Herman van der Kooij and his team at the University of Twente, Netherlands to assist stroke patients who are learning how &#8230; <a href="https://csnblog.specs-lab.com/2011/12/15/2607/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><strong>Robots for stroke patients and more&#8230;</strong><br />
<iframe src="http://www.youtube.com/embed/ShFvaOuhkF0" frameborder="0" width="560" height="349"></iframe><br />
The video above features the LOPES (Lower Extremity-Powered ExoSkeleton), developed by <a href="http://www.bw.ctw.utwente.nl/organisation/staff/Scient_Staff/vanderKooij/index.html">Dr. ir Herman van der Kooij</a> and his team at the <a title="University of Twente" href="http://www.utwente.nl/en" target="_blank">University of Twente</a> in the Netherlands, to assist stroke patients who are learning how to walk again. It&#8217;s a critical time to invest in projects such as this one: Europeans, along with many other populations around the world, are ageing while the number of caregiving professionals is dwindling.<br />
<span id="more-2607"></span>Technology of this nature can provide important support for rehabilitation specialists, as traditional methods for this type of rehabilitation are very labour-intensive and often put physiotherapists at risk of injury.</p>
<p>The LOPES has already been included in a couple of European projects: formerly in one named Everyon, and currently in the three-year project <a title="mindwalker" href="https://mindwalker-project.eu/" target="_blank">MINDWALKER</a> (Mind-controlled orthosis and VR training environment for walk empowering). This project capitalises on the synergy between diverse research fields in Europe: just as potential robotics applications are becoming more and more concrete and plausible, recent brain research is likewise delivering promising results with new potential applications. Exploiting these concurrent developments, MINDWALKER combines the LOPES, brain-computer interface technologies and virtual reality into a comprehensive rehabilitation system for the lower limbs. Similarly, the <a title="specs upf" href="http://specs.upf.edu/" target="_blank">SPECS Lab</a> at the <a title="UPF" href="http://www.upf.edu/" target="_blank">Universitat Pompeu Fabra</a> has developed <a title="RGS" href="http://rgs-project.eu/" target="_blank">RGS</a> (Rehabilitation Gaming System), a virtual reality tool that exploits what we know about the brain&#8217;s plasticity to rehabilitate arm movement in stroke patients.</p>
<p><a title="robocom" href="http://www.robotcompanions.eu/home" target="_blank">Robot Companions for Citizens</a> is another example of a European-sponsored initiative that integrates advances in neuroscience, robotics and new materials. It aims to create novel benefits for society by fostering the development of new types of robots and, more broadly, a new industry.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2011/12/15/2607/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
