<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Convergent Science Network &#187; Cognitive Sciences</title>
	<atom:link href="https://csnblog.specs-lab.com/category/cognitive-sciences/feed/" rel="self" type="application/rss+xml" />
	<link>https://csnblog.specs-lab.com</link>
	<description>Blog on Biomimetics and Neurotechnology. With writers Michael Szollosy, Dmitry Malkov, and Michelle Wilson, and editor Anna Mura</description>
	<lastBuildDate>Tue, 27 Sep 2022 14:58:43 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>On Anthropomorphisation</title>
		<link>https://csnblog.specs-lab.com/2015/02/23/on-anthropomorphisation/</link>
		<comments>https://csnblog.specs-lab.com/2015/02/23/on-anthropomorphisation/#comments</comments>
		<pubDate>Mon, 23 Feb 2015 15:24:33 +0000</pubDate>
		<dc:creator><![CDATA[Anna Mura]]></dc:creator>
				<category><![CDATA[AI]]></category>
		<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Ethics]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Uncategorized]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5546</guid>
		<description><![CDATA[Article by Michael Szollosy &#8220;The desire to anthropomorphise, the need to connect, is powerful, and that is why this thing is going to sell.&#8221; So says Daniel Graystone, inventor and CEO of Graystone industries in the American network series Caprica. The prequel to &#8230; <a href="https://csnblog.specs-lab.com/2015/02/23/on-anthropomorphisation/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_5558" style="width: 276px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/02/HeadCADRefFrameV2-e1424704957858.jpg"><img class="wp-image-5558 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/02/HeadCADRefFrameV2-266x300.jpg" alt="HeadCADRefFrameV2" width="266" height="300" /></a><p class="wp-caption-text">from icub.org</p></div>
<p><span style="color: #373737;">Article by </span><a style="color: #617c96;" href="https://www.shef.ac.uk/scharr/sections/hsr/mh/sectionstaff/mszollosy">Michael Szollosy</a></p>
<p style="text-align: justify;"><em>&#8220;The desire to anthropomorphise, the need to connect, is powerful, and </em>that <em>is why this thing is going to sell</em>.&#8221;</p>
<p style="text-align: justify;">So says <a href="http://en.battlestarwiki.org/wiki/Daniel_Graystone">Daniel Graystone</a>, inventor and CEO of Graystone Industries in the American network series <a href="http://en.battlestarwiki.org/wiki/Caprica_(series)"><em>Caprica</em></a>. The prequel to the <a href="http://www.imdb.com/title/tt0407362/">2004 remake of <em>Battlestar Galactica</em></a>, <em>Caprica</em> tells the story of how the genocidal <a href="http://en.battlestarwiki.org/wiki/Cylons_(RDM)">Cylons</a> came into existence. Graystone is trying to develop a robot for military use, but realises that his robots will be more successful if they look and act like human beings. First, it needs to be pointed out – evidently with some frequency – that <a href="http://www.techrepublic.com/blog/geekend/sci-fi-rant-why-giant-mecha-robots-are-stupid/">bipedal robot soldiers</a> are probably the most <em>inefficient</em> way to use robots in military combat, and not at all what a truly sophisticated artificial intelligence would use to take over the planet and enslave the human race.</p>
<p style="text-align: justify;">But given that, Graystone makes a very important point: there is a very deeply-rooted impulse to <a href="http://en.wikipedia.org/wiki/Anthropomorphism">anthropomorphise </a>– to attribute human qualities to things that are not human – and this seems to be a big factor in the development of human-robot interactions. <span id="more-5546"></span></p>
<p style="text-align: justify;">This became apparent in a recent post on this blog about the race to create ‘personal robots’: amidst some genuinely beneficial devices and some really exciting innovations, we found some very ambitious promises about just how much these robots will serve not just as useful machines but also as ‘companions’. Sometimes, these robots amount to little more than an animated face on a baby monitor, or <a href="http://myfuro.com/furo-i/service-feature/">a tablet on wheels</a> with <a href="https://www.kickstarter.com/projects/403524037/personal-robot">a soft female voice</a> added. (Why these robots are so often anthropomorphised as female is perhaps something to be addressed in another post.)</p>
<p style="text-align: justify;">But these robots, looking and sounding increasingly human, look as though they are going to sell.</p>
<p style="text-align: justify;">The impulse to anthropomorphise is an irrational drive, sometimes leading us to <a href="https://www.youtube.com/watch?v=n9TWwG4SFWQ">draw some strange conclusions</a>. It has little to do with the instrumental utility of a device.  A good hammer is effective at putting nails into wood. Would a hammer with a personality, with a face, be more effective at that task?</p>
<p style="text-align: justify;">Well, maybe, yes, as it turns out. As we humans are instinctively social animals, perhaps there are some clear benefits to robots with whom we can interact on a social level: robots with <a href="http://venturebeat.com/2014/03/08/how-these-social-robots-are-helping-autistic-kids/">material bodies</a>, instead of disembodied intelligences, and, perhaps best of all, <a href="https://www.plymouth.ac.uk/news/social-robots-helping-young-with-diabetes">robots with faces</a>, with whom we can more naturally interact.</p>
<p style="text-align: justify;">The impulse to anthropomorphise is an important part of human evolution and plays a key role in how we learn. As <a href="https://www.youtube.com/watch?x-yt-cl=85027636&amp;v=8OVInlqTrME&amp;x-yt-ts=1422503916">Tony Belpaeme explains</a>, this impulse can be seized upon to massively improve the effectiveness of technology in applications like learning and caring, especially with children.</p>
<p style="text-align: justify;">However, the impulse to anthropomorphise also leads to some assumptions about robots that are unrealistic and, in some cases, dangerous. If one is presented with a robot that has a face, one automatically, instinctively, makes assumptions about what the robot is capable of. One might expect, for example, that the face staring back at us shares our intellectual capacity, or our ability to empathise. (This may in part be responsible for the phenomenon known as <a href="http://www.strangerdimensions.com/2013/11/25/10-creepy-examples-uncanny-valley/">the uncanny valley</a>, where one experiences a degree of discomfort in the presence of a life-like humanoid robot.)</p>
<p style="text-align: justify;">Anthropomorphisation, more worryingly, might lead us to expect that robots share human abilities to exercise judgement, for example, in combat situations. In a <a href="https://www.youtube.com/watch?v=kjRV9FzdQNk">2013 TED<sup>x</sup> lecture, Noel Sharkey</a> describes how military planners, having seen impressive killing machines, make all sorts of promises about how robot soldiers will be able to autonomously identify and eliminate targets. But these planners have no conception of the serious <a href="https://www.youtube.com/watch?v=GfeqbWxKoTE">perceptual and intellectual limitations of robots</a>, let alone their <a href="http://www.stopkillerrobots.org/the-problem/">complete lack of moral agency</a>, emotional engagement or critical faculties in the exercise of judgement.</p>
<p style="text-align: justify;"><iframe src="https://www.youtube.com/embed/kjRV9FzdQNk" width="560" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe> Noel Sharkey – Toy Soldiers to Killer Robots</p>
<p style="text-align: justify;">Looking at a robot with a cute cartoon face, or even a mean-looking Schwarzenegger look-alike, one might assume – automatically, unconsciously – that the robot is capable of all sorts of human behaviours, feelings and thoughts of which it is simply not capable. And that’s even before the marketing men and overly keen programmers (with an overestimation of their abilities) make their promises and videos that seduce us even further. As ever, what is needed is an informed discussion, and some careful thinking about how to effectively and intelligently use our tendency to anthropomorphise, not exploit it.</p>
<p style="text-align: justify;"><iframe src="https://www.youtube.com/embed/8OVInlqTrME" width="560" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe> Tony Belpaeme – The power of robots with a face</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2015/02/23/on-anthropomorphisation/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Should we be worried about the Technological Singularity?</title>
		<link>https://csnblog.specs-lab.com/2014/09/25/should-we-be-worried-about-the-technological-singularity/</link>
		<comments>https://csnblog.specs-lab.com/2014/09/25/should-we-be-worried-about-the-technological-singularity/#comments</comments>
		<pubDate>Thu, 25 Sep 2014 15:21:00 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Ethics]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Science Fiction]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Alan Winfield]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Technological Singularity]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5432</guid>
		<description><![CDATA[The Technological Singularity is based on the prediction that the development of AI powerful enough to surpass human intelligence will change the world as we know it, leading either to a catastrophic end for humankind or to its miraculous ascent. &#8230; <a href="https://csnblog.specs-lab.com/2014/09/25/should-we-be-worried-about-the-technological-singularity/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/09/esq-hal-9000-xlg.jpg" rel="attachment wp-att-5436"><img class="aligncenter size-full wp-image-5436" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/09/esq-hal-9000-xlg.jpg" alt="esq-hal-9000-xlg" width="614" height="290" /></a></p>
<p>The Technological Singularity is based on the prediction that the development of AI powerful enough to surpass human intelligence will change the world as we know it, leading either to a catastrophic end for humankind or to its miraculous ascent.</p>
<p>In a recent article in the Guardian, <a href="http://www.ias.uwe.ac.uk/~a-winfie/">Alan Winfield</a>, professor of electronic engineering at the <a href="http://www.uwe.ac.uk/">University of the West of England</a>, Bristol, discusses the pitfalls of being overly pessimistic or optimistic about the Technological Singularity.</p>
<p><span id="more-5432"></span></p>
<p>In his judgment, the best way to approach the issue is to be both a little cautious and at the same time a little optimistic. The key, of course, is remaining within reasonable limits. For instance, believe it or not, fear of an apocalyptic event induced by an almighty AI is unreasonable, because it requires a very improbable sequence of events to occur, one of them being the very invention of such an AI, which, according to Winfield, may be as far in the future as the invention of faster-than-light travel.</p>
<p>As for those on the other side of the spectrum, who think that the arrival of ultra-sophisticated AI is inevitable in our lifetime and will solve all our problems: you should probably let it go. Yes, AI systems are all around us today: they can drive cars, recognise speech and do dozens of other useful things, often making humans look silly. However, human intelligence is not about reaching perfection in one task. It is about learning, generalising what has been learned, creating new knowledge, understanding meaning and context, and, of course, being self-aware. These goals are far beyond the current capabilities of AI.</p>
<p>The singularity talk, as Alan Winfield notes, is not completely innocent. Being too pessimistic or optimistic about the Technological Singularity is to indulge in the fallacy of privileging the hypothesis. Focusing on some hypothetical apocalyptic scenario may not be the best of ideas, when we should be focused on combating more pressing and equally apocalyptic scenarios such as climate change.</p>
<p>You can read the full story by Alan Winfield in the Guardian <a href="http://www.theguardian.com/technology/2014/aug/10/artificial-intelligence-will-not-become-a-frankensteins-monster-ian-winfield?CMP=twt_gu">HERE</a>.</p>
<p>Read our <a href="http://csnblog.specs-lab.com/2013/07/26/when-machines-get-super-savvy-will-human-intelligence-become-obsolete/">previous post </a>on singularity to learn about another take on the issue.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/09/25/should-we-be-worried-about-the-technological-singularity/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Don’t be afraid of big data</title>
		<link>https://csnblog.specs-lab.com/2014/08/17/dont-be-afraid-of-big-data/</link>
		<comments>https://csnblog.specs-lab.com/2014/08/17/dont-be-afraid-of-big-data/#comments</comments>
		<pubDate>Sun, 17 Aug 2014 14:44:55 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Europe]]></category>
		<category><![CDATA[BrainX3]]></category>
		<category><![CDATA[CEEDS]]></category>
		<category><![CDATA[European Commiss]]></category>
		<category><![CDATA[eXperience Induction Machine]]></category>
		<category><![CDATA[Jonathan Freeman]]></category>
		<category><![CDATA[Neelie Kroes]]></category>
		<category><![CDATA[Pompeu Fabra University]]></category>
		<category><![CDATA[SPECS]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5402</guid>
		<description><![CDATA[European Commission bets on data-driven economy Information can be scary, and even more so when we find ourselves humbled by its immensity. In a press release issued earlier this week, the European Commission has once again demonstrated that it is not afraid of &#8230; <a href="https://csnblog.specs-lab.com/2014/08/17/dont-be-afraid-of-big-data/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<h2>European Commission bets on data-driven economy</h2>
<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/08/images-Ceeds-image.jpg" rel="attachment wp-att-5406"><img class="alignleft wp-image-5406" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/08/images-Ceeds-image.jpg" alt="images-Ceeds image" width="500" height="352" /></a></p>
<p>Information can be scary, and even more so when we find ourselves humbled by its immensity. <a href="http://europa.eu/rapid/press-release_IP-14-916_en.htm">In a press release</a> issued earlier this week, the European Commission has once again demonstrated that it is not afraid of big data. Quite the opposite: Europe is more ready than ever to embrace it – a stance reflected in Europe&#8217;s strong bet on research projects like <a href="http://ceeds-project.eu/">CEEDs</a>, which uses big data to enhance human cognition and improve problem solving.</p>
<p><span id="more-5402"></span><a href="http://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/">In a previous post</a>, we already discussed CEEDs and the <a href="http://specs.upf.edu/research_in_mixed_and_virtual_reality">eXperience Induction Machine</a> (XIM), the heart of the project, located in the <a href="http://specs.upf.edu/">SPECS lab</a> at <a href="http://www.upf.edu/en/">Pompeu Fabra University</a> in Barcelona. The press release singles out CEEDs as an example of a successful and highly promising big data research initiative.</p>
<p>Although XIM has so far mainly been applied to visualising brain (<a href="http://www.brainx3.com/">BrainX3</a>) and historical (<a href="http://specs.upf.edu/installation/2772">Bergen-Belsen reconstruction</a>) data and will certainly bring about a huge qualitative change in how scientists work with tremendous amounts of information, the integration of this technology into more down-to-earth application fields seems imminent.</p>
<p>The press release reports that early interest in the XIM technology is already coming from several museums in Germany, the Netherlands, the UK and the United States, where it could potentially help with gathering and reacting to feedback from visitors. This naturally applies to many other public spaces such as shops, libraries and concerts. The CEEDs team is also conducting negotiations with several public, charity and commercial organisations to further extend the scope of application of the platform.</p>
<p>The CEEDs project coordinator <a href="http://www.gold.ac.uk/psychology/staff/freeman/">Jonathan Freeman</a>, Professor of Psychology at <a href="http://www.gold.ac.uk/">Goldsmiths</a>, <a href="http://www.lon.ac.uk/">University of London</a>, pointed out that “anywhere where there’s a wealth of data that either requires a lot of time or an incredible effort, there is potential.” In science, whole disciplines, from satellite imagery inspection to oil prospecting and astronomy, could benefit immensely from this novel approach to processing information.</p>
<p>With projects like CEEDs, Europe is working its way towards a new data-driven economy, a long-time goal, which the European Commission is now actively promoting across national governments. The European approach towards big data is perhaps best expressed in the words of the vice-president of the European Commission <a href="http://ec.europa.eu/commission_2010-2014/kroes/">Neelie Kroes</a>: “Big data doesn’t have to be scary. Projects like this enable us to take control of data and deal with it so we can get to solving problems. Leaders need to embrace big data.”</p>
<p>You can also read <a href="http://www.cbronline.com/news/tech/software/businessintelligence/the-5-coolest-eu-big-data-projects-4340683">this article</a> to learn about some other exciting big data projects backed by the European Commission.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/08/17/dont-be-afraid-of-big-data/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Human Brain Project under attack</title>
		<link>https://csnblog.specs-lab.com/2014/07/18/human-brain-project-under-attack/</link>
		<comments>https://csnblog.specs-lab.com/2014/07/18/human-brain-project-under-attack/#comments</comments>
		<pubDate>Fri, 18 Jul 2014 15:42:40 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Europe]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[European Commission]]></category>
		<category><![CDATA[FET Flagship]]></category>
		<category><![CDATA[Human Brain Project]]></category>
		<category><![CDATA[ICT]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5378</guid>
		<description><![CDATA[Last week, the eyes of the scientific community were fixed on the € 1.2 billion Human Brain Project (HBP) as more than 150 European neuroscientists raised concerns over the project&#8217;s management in an open letter to the European Commission. One &#8230; <a href="https://csnblog.specs-lab.com/2014/07/18/human-brain-project-under-attack/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<h1 style="text-align: justify"></h1>
<div id="attachment_5379" style="width: 683px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/human-brain-project.jpg" rel="attachment wp-att-5379"><img class="size-full wp-image-5379" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/human-brain-project.jpg" alt="Credit: Human Brain Project" width="673" height="378" /></a><p class="wp-caption-text">Credit: Human Brain Project</p></div>
<p>Last week, the eyes of the scientific community were fixed on the € 1.2 billion <a href="https://www.humanbrainproject.eu/">Human Brain Project</a> (HBP) as more than 150 European neuroscientists raised concerns over the project&#8217;s management in <a href="http://www.neurofuture.eu/">an open letter</a> to the European Commission.</p>
<p>One of Europe&#8217;s two <a href="http://cordis.europa.eu/fp7/ict/programme/fet/flagship/home_en.html">Flagship Initiatives</a>, the HBP spans 112 research institutions across 24 countries and was launched last year with the grand vision of creating a long-needed ICT infrastructure for future brain research. Not without controversy, the project adopted a bottom-up approach to building a computer simulation of the brain based exclusively on a fundamental understanding of neurons and their interactions.</p>
<p><span id="more-5378"></span></p>
<p>The public outcry is not surprising given that the project has been surrounded by heated discussion from the very beginning, when a number of labs refused to take part because of its narrow focus on ICT and an apparent lack of basic neuroscience. Now many researchers fear that the failure of the project, which they see as inevitable, will cause a wave of adverse reaction to neuroscience, undermining the future of the field.</p>
<p>The letter was largely driven by recent changes to the project plans for the next stage, which limit the role of cognitive scientists, who pursue the difficult task of understanding the brain at the level of thought and behaviour. The labs working in this direction are now to be repositioned from the project’s core to what are known as partnering projects (PPs). The concern is that, while the resulting computer simulations may not be completely useless, without a more pronounced theoretical component they will fail to elucidate brain functions.</p>
<p>A detailed review of the second stage by the European Commission is scheduled for January 2015, and the letter’s authors hope to draw the reviewers’ attention to the flaws in both the science and the management of the project. The second stage is expected to receive € 100 million over the course of 2 to 3 years, with a 50/50 split between the core project (CP) and the PPs.</p>
<p><a href="https://www.humanbrainproject.eu/documents/10180/17646/HBP-Statement.090614.pdf">The official response</a>, released by the HBP two days after the letter, shows signs of goodwill and openness to dialogue. The response states that “the members of the HBP are saddened by the open letter” and invites the signatories to engage in direct discussion with the project leaders. Importantly, the response strongly suggests that cognitive neuroscience and other basic research will have an increasingly crucial role in the project as the required ICT platform comes into place.</p>
<p>Many researchers still firmly stand by the project, arguing that it represents a long-needed change in brain research. You may also be interested in reading <a href="http://www.newscientist.com/article/mg22329784.400-defending-the-grand-vision-of-the-human-brain-project.html#.U8Zj542Szbw">this article</a> defending the project by <a href="http://www.unil.ch/lren/en/home/menuinst/lab-members/honorary-pis/richard-frackowiak.html">Richard Frackowiak</a>, the co-executive director of the HBP.</p>
<p>What is clear is that the HBP has not managed to entirely unite neuroscientists, but when it comes to such grand projects this is not as surprising as it may seem. The management might need to become more consensual, and we can only hope that the HBP will continue its 10-year journey to unravel the universe inside our heads.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/07/18/human-brain-project-under-attack/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Telluride neuromorphic engineering workshop celebrates 20 years</title>
		<link>https://csnblog.specs-lab.com/2014/07/10/telluride-neuromorphic-engineering-workshop-celebrates-20-years/</link>
		<comments>https://csnblog.specs-lab.com/2014/07/10/telluride-neuromorphic-engineering-workshop-celebrates-20-years/#comments</comments>
		<pubDate>Thu, 10 Jul 2014 14:24:30 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Project News]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Convergent Science Network]]></category>
		<category><![CDATA[CSN]]></category>
		<category><![CDATA[neuromorphic engineering]]></category>
		<category><![CDATA[Telluride workshop]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5360</guid>
		<description><![CDATA[Every year, Telluride, a small mountain town in Colorado, attracts an international roster of scientists from several disciplines for three weeks of intensive discussion and exchange of ideas about neuromorphic engineering, a rapidly expanding research field that promises to bridge &#8230; <a href="https://csnblog.specs-lab.com/2014/07/10/telluride-neuromorphic-engineering-workshop-celebrates-20-years/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/Brain_Chip_Wide.jpg" rel="attachment wp-att-5362"><img class="alignleft wp-image-5362" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/Brain_Chip_Wide-300x199.jpg" alt="Brain_Chip_Wide" width="510" height="340" /></a></p>
<p>Every year, Telluride, a small mountain town in Colorado, attracts an international roster of scientists from several disciplines for three weeks of intensive discussion and exchange of ideas about neuromorphic engineering, a rapidly expanding research field that promises to bridge the gap between the lifeless silicon of computer chips and very much alive brain-based biological systems. This year is no exception: the <a href="http://ine-web.org/telluride-conference-2014/telluride-2014/index.html">Telluride workshop</a> is now in full swing and will continue until July 19.</p>
<p><span id="more-5360"></span></p>
<p>What is special about this year&#8217;s edition is that the workshop, organised by the <a href="http://ine-web.org/index.php">Institute of Neuromorphic Engineering</a>, celebrates its 20<sup>th</sup> anniversary, and the sense of historical perspective is more perceptible than ever. The workshop was founded in 1994 by <a href="http://en.wikipedia.org/wiki/Christof_Koch">Christof Koch</a>, <a href="http://en.wikipedia.org/wiki/Terry_Sejnowski">Terry Sejnowski</a>, <a href="http://www.ini.uzh.ch/people/rjd">Rodney Douglas</a> and others. Just five years earlier, Carver Mead had introduced the concept of neuromorphic engineering for the first time, and many discussions at this year&#8217;s workshop revolve around what has been achieved over the past two decades and what contributions we can expect in the next 25 years.</p>
<p>One of the major goals of the workshop is to reduce the distance between senior and junior researchers in the field of neuromorphic engineering, and this year students participating in the workshop have a chance to interact with some of the most important contributors to the field. The workshop includes numerous background lectures on a variety of topics in systems and cognitive neuroscience, practical tutorials, hands-on projects and interest groups. There are six topic areas this year ranging from human auditory cognition and neuromorphic Olympics to embodied neuromorphic architectures of perception, cognition and action.</p>
<p>Other priorities of the workshop include the encouragement of collaborative activities emerging from the workshop and the promotion of neuromorphic engineering as a self-sustaining research field.</p>
<p>The event is sponsored by some of the biggest players in neuromorphic research worldwide including the <a href="http://csnetwork.eu/">Convergent Science Network Project</a>, which, among other things, contributed eight scholarships for European applicants. You can always learn more about the application requirements and other activities supported by CSN <a href="http://csnetwork.eu/activities">HERE</a>.</p>
<p>Neuromorphic engineering was included in this year’s top 10 Breakthrough Technologies report published by <a href="http://www.technologyreview.com/">MIT Technology Review</a>. Read <a href="http://www.technologyreview.com/featuredstory/526506/neuromorphic-chips/">this article</a> to learn why neuromorphic engineering matters and how brain-based computer chips are preparing to revolutionise computing as we know it.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/07/10/telluride-neuromorphic-engineering-workshop-celebrates-20-years/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>What You Say Is What You Did</title>
		<link>https://csnblog.specs-lab.com/2014/07/07/what-you-say-is-what-you-did/</link>
		<comments>https://csnblog.specs-lab.com/2014/07/07/what-you-say-is-what-you-did/#comments</comments>
		<pubDate>Mon, 07 Jul 2014 12:03:16 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[EFAA]]></category>
		<category><![CDATA[human-robot interaction]]></category>
		<category><![CDATA[icub]]></category>
		<category><![CDATA[Italian Institute of Technology]]></category>
		<category><![CDATA[Pompeu Fabra University]]></category>
		<category><![CDATA[SPECS]]></category>
		<category><![CDATA[What You Say Is What You Did]]></category>
		<category><![CDATA[WYSIWYD]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5351</guid>
<description><![CDATA[A new European project hopes to make robots more trustworthy Year by year, robots become better at negotiating ever more complex social interactions with humans. However, much as their social intelligence has improved, these interactions still suffer &#8230; <a href="https://csnblog.specs-lab.com/2014/07/07/what-you-say-is-what-you-did/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<h2><strong>A new European project hopes to make robots more trustworthy</strong></h2>
<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/Home_Slide3.jpg" rel="attachment wp-att-5357"><img class="aligncenter size-full wp-image-5357" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/Home_Slide3.jpg" alt="Home_Slide3" width="1000" height="500" /></a></p>
<p>Year by year, robots become better at negotiating ever more complex social interactions with humans. Yet, much as their social intelligence has improved, these interactions still suffer from a lack of transparency. In other words, unlike humans, robots cannot understand and explain their actions in intentional terms, which prevents them from communicating more effectively with us. To the joy of robots and humans alike, this challenge is now being addressed by the <a href="http://wysiwyd.upf.edu/">What You Say Is What You Did (WYSIWYD) project</a>, launched earlier this year.</p>
<p><span id="more-5351"></span></p>
<p>The project, coordinated by the <a href="http://specs.upf.edu/">SPECS lab</a> at <a href="http://www.upf.edu/en/">Pompeu Fabra University</a> in Barcelona, will develop an autobiographical memory that can store the data streams obtained by the robot in the form of a consistent personal narrative of its interaction history. Furthermore, the researchers intend to devise a mechanism for converting this memory data into meaningful linguistic structures that can subsequently be expressed in speech and communicative actions through a dedicated channel dubbed WYSIWYD Robotese, thus improving mutual understanding between robots and humans.</p>
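<p>As a toy illustration of this memory-to-language idea, one can imagine logging interaction events and later verbalising them in temporal order. The event structure, class names and phrasing below are invented for the sketch and bear no relation to the actual WYSIWYD architecture:</p>

```python
# Toy sketch: log interaction events, then render them as a first-person
# narrative, mimicking the memory-to-language conversion described above.
from dataclasses import dataclass

@dataclass
class Event:
    time: int        # timestamp (arbitrary ticks)
    action: str      # what the robot did
    target: str      # what the action was applied to

class AutobiographicalMemory:
    def __init__(self):
        self.events = []

    def store(self, event):
        self.events.append(event)

    def narrate(self):
        """Turn the stored event stream into simple first-person sentences."""
        ordered = sorted(self.events, key=lambda e: e.time)
        return [f"At t={e.time} I {e.action} the {e.target}." for e in ordered]

memory = AutobiographicalMemory()
memory.store(Event(2, "handed over", "toy"))
memory.store(Event(1, "looked at", "toy"))
for sentence in memory.narrate():
    print(sentence)
```

<p>Sorting by time before verbalising is what turns a raw event log into a consistent personal narrative of the interaction history.</p>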
<p>WYSIWYD is an interdisciplinary effort that will draw on the fields of robotics, cognitive science, psychology and computational neuroscience. The project largely builds on the previous success of the <a href="http://efaa.upf.edu/">EFAA project</a>, also coordinated by SPECS. WYSIWYD is scheduled to run for three years and will, it is hoped, bring about a qualitative change in human-robot interaction and cooperation, as well as unlock new application areas in robotics.</p>
<p>The main research platform for the project is everybody’s favourite robot, the <a href="http://www.icub.org/">iCub</a>, developed by the <a href="http://www.iit.it/">Italian Institute of Technology</a> in Genoa, which is also one of the institutions participating in the collaboration. iCub will be used in combination with another amazing piece of technology, the <a href="http://www.reactable.com/products/live/">Reactable</a>, an interactive tabletop interface.</p>
<p>iCub has recently celebrated its 10<sup>th</sup> anniversary. Watch the video below to see how the robot and its capabilities have evolved over the decade.</p>
<div style="width: 584px; max-width: 100%;" class="wp-video"><video class="wp-video-shortcode" id="video-5351-2" width="584" height="329" preload="metadata" controls="controls"><source type="video/mp4" src="http://www.iit.it/images/images/icub-facility/videos/icub_bday_noaudio.mp4?_=2" /><a href="http://www.iit.it/images/images/icub-facility/videos/icub_bday_noaudio.mp4">http://www.iit.it/images/images/icub-facility/videos/icub_bday_noaudio.mp4</a></video></div>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/07/07/what-you-say-is-what-you-did/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
<enclosure url="http://www.iit.it/images/images/icub-facility/videos/icub_bday_noaudio.mp4" length="62801887" type="video/mp4" />
		</item>
		<item>
		<title>Living Machines 2014</title>
		<link>https://csnblog.specs-lab.com/2014/06/19/living-machines-2014/</link>
		<comments>https://csnblog.specs-lab.com/2014/06/19/living-machines-2014/#comments</comments>
		<pubDate>Thu, 19 Jun 2014 07:00:22 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Project News]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Convergent Science Network]]></category>
		<category><![CDATA[CSN]]></category>
		<category><![CDATA[Da Vinci Museum of Science and Technology]]></category>
		<category><![CDATA[Italian Institute of Technology]]></category>
		<category><![CDATA[Living Machines]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5335</guid>
		<description><![CDATA[The 3rd Conference on Biomimetic and Biohybrid Systems will be held this year from 30 July to 1 August in Milan. As has become a tradition, the three-day event, organised by the Convergent Science Network, will be hosted at a &#8230; <a href="https://csnblog.specs-lab.com/2014/06/19/living-machines-2014/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/Screen-Shot-2014-06-18-at-15.35.48.png" rel="attachment wp-att-5338"><img class="aligncenter wp-image-5338 size-full" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/Screen-Shot-2014-06-18-at-15.35.48.png" alt="Screen Shot 2014-06-18 at 15.35.48" width="1152" height="294" /></a></p>
<p><a href="http://csnetwork.eu/livingmachines/conf2014">The 3<sup>rd</sup> Conference on Biomimetic and Biohybrid Systems</a> will be held this year from 30 July to 1 August in Milan. As has become a tradition, the three-day event, organised by the <a href="http://csnetwork.eu/">Convergent Science Network</a>, will be hosted at a fantastic venue consistent with the spirit of the conference: the <a href="http://www.museoscienza.org/english/">Da Vinci Museum of Science and Technology</a>, one of the largest technology museums in Europe.</p>
<p><span id="more-5335"></span></p>
<p>The conference will be packed with fascinating talks on a variety of topics related to the development of technologies at the intersection of living and artificial systems, including six plenary lectures from some of the most distinguished experts in the field. The plenary lectures will be complemented by nearly 20 short talks on diverse topics such as soft robotics, active sensing, neuromechanics and others.</p>
<p>You can find out more about the plenary speakers <a href="http://csnetwork.eu/livingmachines/conf2014/speakers">HERE</a> and check out the full conference programme <a href="http://csnetwork.eu/livingmachines/conf2014/programme">HERE</a>.</p>
<p>This year, the Living Machines conference will be preceded by a one-day satellite event, hosted by the <a href="http://www.iit.it/">Italian Institute of Technology</a> and consisting of a series of research-oriented workshops. Learn more about the workshops <a href="http://csnetwork.eu/livingmachines/conf2014/workshops">HERE</a>.</p>
<p>We are looking forward to seeing you this year!</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/06/19/living-machines-2014/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Neuroprosthetics: wearable tech inside the brain</title>
		<link>https://csnblog.specs-lab.com/2014/05/25/neuroprosthetics-wearable-tech-inside-the-brain/</link>
		<comments>https://csnblog.specs-lab.com/2014/05/25/neuroprosthetics-wearable-tech-inside-the-brain/#comments</comments>
		<pubDate>Sun, 25 May 2014 15:15:44 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Robots and Health]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Argus]]></category>
		<category><![CDATA[Brain implant]]></category>
		<category><![CDATA[cerebellum chip]]></category>
		<category><![CDATA[Neuroprosthetics]]></category>
		<category><![CDATA[Second Sight]]></category>
		<category><![CDATA[SPECS]]></category>
		<category><![CDATA[Universitat Pompeu Fabra]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5268</guid>
		<description><![CDATA[Wiring electronic devices directly into your brain may not sound like a very pleasant idea, but this is exactly what so many scientists around the world seem to be quite excited about. The reason is that, far from being your &#8230; <a href="https://csnblog.specs-lab.com/2014/05/25/neuroprosthetics-wearable-tech-inside-the-brain/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/IMG_0188.jpg" rel="attachment wp-att-5271"><img class="aligncenter size-large wp-image-5271" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/IMG_0188-1024x682.jpg" alt="IMG_0188" width="584" height="388" /></a></p>
<p>Wiring electronic devices directly into your brain may not sound like a very pleasant idea, but it is exactly what so many scientists around the world are excited about. The reason is that, far from being your worst cyborg nightmare, brain implants – also called neuroprostheses – can work true miracles. Connected to the nervous system, these little chips can make the blind see and the deaf hear, and even allow the paralysed to regain control over the physical world.</p>
<p><span id="more-5268"></span></p>
<p>Most existing neuroprostheses share the same basic principle. An external device captures sensory information no longer obtainable by biological means, converts it into a series of electrical signals interpretable by the brain and sends them to the implant, which in turn passes the information on to the brain. Depending on the design, the implant can be attached either to a nerve – such as the optic or auditory nerve – or directly to the relevant area of the cortex, in which case the signals take a shortcut.</p>
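<p>As a rough sketch of this shared signal chain – capture, encode, stimulate – here is a minimal, purely illustrative example. All names and the naive intensity-to-amplitude mapping are invented for the sketch; real devices use proprietary signal-processing pipelines:</p>

```python
# Conceptual sketch of the neuroprosthesis signal chain described above:
# capture -> encode into electrical pulses -> deliver to the implant.
from dataclasses import dataclass

@dataclass
class Pulse:
    electrode: int      # which electrode on the implant array
    amplitude: float    # stimulation current (arbitrary units)

def capture(sensor_frame):
    """External device: e.g. a camera frame or microphone buffer."""
    return sensor_frame

def encode(frame, n_electrodes=60):
    """Convert raw sensory data into per-electrode stimulation pulses.
    Here: naively average signal intensity onto electrode amplitudes."""
    step = max(1, len(frame) // n_electrodes)
    return [Pulse(electrode=i, amplitude=sum(frame[i*step:(i+1)*step]) / step)
            for i in range(min(n_electrodes, len(frame) // step))]

def stimulate(pulses):
    """Implant side: pass the pulse train on to the nervous system."""
    return [(p.electrode, round(p.amplitude, 2)) for p in pulses]

frame = [0.1, 0.9, 0.8, 0.2, 0.0, 0.7]   # toy 6-sample sensory frame
print(stimulate(encode(capture(frame), n_electrodes=3)))
```

<p>The same three stages apply whether the target is a retinal array, the auditory nerve or the cortex; only the encoding step changes.</p>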
<p><a href="http://www.2-sight.eu/en/product-en">Argus II </a>developed and commercialised by <a href="http://www.2-sight.eu/en/">Second Sight</a> is the only approved visual neuroprosthesis currently available on the market. The device is a <a href="http://en.wikipedia.org/wiki/Retinal_implant">retinal implant</a>, designed to bypass the damaged biological eye photoreceptors in patients suffering from severe consequences of the condition known as <a href="http://en.wikipedia.org/wiki/Retinitis_pigmentosa"><em>retinitis pigmentosa</em></a>. For now, the image reconstructed by Argus is only a low-resolution approximation of the real thing, but as technology continues to advance, the capacity of such implants can improve beyond imaginable.</p>
<p>The system follows the principle described above and consists of a video camera, a video processing unit (VPU), and the implant itself. Watch the animation below to see how it works.</p>
<p><iframe width="584" height="329" src="https://www.youtube.com/embed/ZyVjK7sktvw?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p>While visual neuroprostheses are only beginning to gain momentum, nearly 300,000 people around the world already use brain implants to restore another sense: hearing. The <a href="http://en.wikipedia.org/wiki/Cochlear_implant">cochlear implant</a>, the most widely used neuroprosthesis, is the only hope for thousands of people with severe hearing loss. Below is another video, which shows the reaction of a two-year-old boy hearing his mother’s voice for the first time. For a detailed overview of how the implant works, watch <a href="https://www.youtube.com/watch?v=zeg4qTnYOpw">this video</a>.</p>
<p><iframe width="584" height="329" src="https://www.youtube.com/embed/o_M28C-U9G0?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p>Another application of neuroprosthetics promises to one day restore lost learning functions in humans. A <a href="http://journal.frontiersin.org/Journal/10.3389/fbioe.2014.00014/full">study</a> published recently in <a href="http://www.frontiersin.org/bioengineering_and_biotechnology"><em>Frontiers in Bioengineering and Biotechnology</em></a> by a group of researchers led by the <a href="http://specs.upf.edu/">SPECS group</a> at <a href="http://www.upf.edu/">Pompeu Fabra University</a> in Barcelona demonstrates how a chip implanted into the brain of a living rat can restore a disabled function of the cerebellum – the part of the brain heavily involved in the acquisition of motor memories. Specifically, with its cerebellum anaesthetised, the rat could still acquire a conditioned eye-blink response, successfully relying on the neuroprosthetic chip to regain a disabled learning function.</p>
<p>Today, brain implants are still in their infancy. However, this does not prevent scientists from envisioning implants that could give us perfect memory, night vision and instant thought access to information. A whole set of bioengineering obstacles must still be overcome (<a href="http://www.engadget.com/2014/05/19/wireless-implant-charging/">HERE is one that has just been addressed</a>) before brain implants become safe and accepted in society, but our future already seems inevitably cybernetic.</p>
<p>Read <a href="http://online.wsj.com/news/articles/SB10001424052702304914904579435592981780528">this article</a> to learn more about how neuroprosthetics will change the world.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/05/25/neuroprosthetics-wearable-tech-inside-the-brain/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>How to make your brains feel at home</title>
		<link>https://csnblog.specs-lab.com/2014/05/21/how-to-make-your-brains-feel-at-home/</link>
		<comments>https://csnblog.specs-lab.com/2014/05/21/how-to-make-your-brains-feel-at-home/#comments</comments>
		<pubDate>Wed, 21 May 2014 14:45:18 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[architecture]]></category>
		<category><![CDATA[Art and design]]></category>
		<category><![CDATA[CAVE]]></category>
		<category><![CDATA[Intelligent environment]]></category>
		<category><![CDATA[Interactive architecture]]></category>
		<category><![CDATA[neuroarchitecture]]></category>
		<category><![CDATA[Neuroscience]]></category>
		<category><![CDATA[SPECS]]></category>
		<category><![CDATA[Synthetic Oracle]]></category>
		<category><![CDATA[Virtual Reality]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5247</guid>
		<description><![CDATA[The physical spaces we inhabit have a direct influence on how we feel, think and behave. Understanding this implicit dialogue between built environments and our minds continues to open new ways for architects to design physical spaces that better meet &#8230; <a href="https://csnblog.specs-lab.com/2014/05/21/how-to-make-your-brains-feel-at-home/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_5266" style="width: 594px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/Spacemaker_Founder.jpg" rel="attachment wp-att-5266"><img class="size-large wp-image-5266" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/Spacemaker_Founder-1024x575.jpg" alt="Spacemaker VR is an application for Oculus headset that allows designers to walk through their creations Source: Digital Physical" width="584" height="327" /></a><p class="wp-caption-text">Spacemaker VR is an Oculus-based virtual reality system that allows designers to walk through their designs<br />Source: Digital Physical</p></div>
<p>The physical spaces we inhabit have a direct influence on how we feel, think and behave. Understanding this implicit dialogue between built environments and our minds continues to open new ways for architects to design physical spaces that better meet people’s needs. Neuro-architecture, interactive architecture, intelligent environments and virtual reality technology are among those exciting and partially overlapping disciplines that are currently on the frontline of the ongoing architectural revolution.</p>
<p><span id="more-5247"></span></p>
<p><strong>Neuro-architecture</strong></p>
<p>Where architects previously had to rely on purely anecdotal and intuitive principles, neuro-architecture now promises to provide a truly evidence-based neurobiological rationale for designing architectural spaces, be it your office, school or hospital. This is achieved thanks to a variety of techniques that allow researchers to quantify and measure human responses to the different components that constitute a particular architectural environment, including by measuring activity in the relevant regions of the brain.</p>
<p>The value of understanding the precise effects of each component on our mental and physical health is difficult to overestimate. Imagine being able to design classrooms whose very architectural configuration aids students’ concentration and improves learning, or hospitals that accelerate patients’ recovery. The research can be carried out on existing buildings, on models or in virtual reality simulations, before the actual structures are even built.</p>
<p>Virtual reality, in fact, can be extremely helpful in neuro-architecture research, for several reasons. It allows researchers to set up virtual environments where participants navigate in life-like conditions, while the researchers retain systematic control over the stimuli introduced. Importantly, responses can be measured at different scales, from an entire building down to a single room or a single architectural feature, such as the height of the ceiling or the amount and quality of light admitted into the space. Some <a href="http://eaedesign.com/InnovativeDesignScience.com/Research_Activities_-_CAVE_Technology.html">interesting research</a> in this direction was conducted by Professor <a href="http://cala.arizona.edu/users/eve-edelstein-phd?destination=user/1386">Eve Edelstein</a> using the virtual reality platform <a href="http://en.wikipedia.org/wiki/Cave_automatic_virtual_environment">CAVE</a>.</p>
<p>If you think the CAVE and other virtual reality rooms are not immersive enough, consider, for instance, the possibilities opened up by the much-talked-about Oculus headset. Unlike virtual reality rooms, which are not easily accessible to everyone, virtual reality headsets are potentially available to every designer. Instead of fiddling around with physical prototypes, designers can now walk through their own creations and actually experience them. <a href="http://digitalphysical.com/spacemaker/">Spacemaker VR</a> from <a href="http://digitalphysical.com/">Digital Physical</a> is one example of how this technology can be used for the benefit of architects.</p>
<p>Read <a href="http://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/#more-5239">our previous post</a> to learn about the eXperience Induction Machine, another exciting application of virtual reality.</p>
<p><strong>Interactive architecture and intelligent environments:</strong></p>
<p>The two terms are interchangeable in many contexts, perhaps with a slightly more artistic connotation for the former and a more functional one for the latter. Whatever the difference between them, both exploit the increasing penetration of computing into our daily lives to develop dynamic environments that can adapt their physical properties to the behaviour of their inhabitants. The ultimate goal, of course, is to make people feel more at home and in harmony with their physical surroundings.</p>
<p>Many examples of interactive architecture are born from a mixture of artistic thinking and computational engineering. So far, researchers have been experimenting with some of the most fundamental parameters known to alter our state of mind. Unsurprisingly, light is one of the favourites when it comes to interactive architecture. Check out the two examples below: the <a href="http://www.iua.upf.edu/syntheticOracle/">Synthetic Oracle</a> (formerly Hello Stranger) from the <a href="http://specs.upf.edu/home">SPECS</a> group at <a href="http://www.upf.edu/es/">Pompeu Fabra University</a>, and <a href="http://www.behance.net/gallery/BIOSTAGOG/7609469">BIOSTAGOG</a>, developed jointly by <a href="http://www.platige.com/">Platige Image</a> and <a href="http://www.brdg.pl/">Bridge</a>.</p>
<p><iframe width="584" height="438" src="https://www.youtube.com/embed/SAeys1fK3Zo?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p><iframe src="//player.vimeo.com/video/66800080" width="584" height="329" frameborder="0" title="INTERACTIVE INSTALLATION BY BRIDGE AND PLATIGE IMAGE." webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/05/21/how-to-make-your-brains-feel-at-home/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Virtual reality labs reshape how we process information</title>
		<link>https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/</link>
		<comments>https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/#comments</comments>
		<pubDate>Wed, 07 May 2014 05:40:39 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[big data]]></category>
		<category><![CDATA[brain]]></category>
		<category><![CDATA[CEEDS]]></category>
		<category><![CDATA[eXperience Induction Machine]]></category>
		<category><![CDATA[Laval Virtual]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[Neuroscience]]></category>
		<category><![CDATA[SPECS]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<category><![CDATA[XIM]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5239</guid>
		<description><![CDATA[We live in a time when the scale of scientific research is undergoing an unprecedented exponential growth, which contributes to the generation of equally unprecedented amounts of data. Disciplines like neuroscience, astronomy or particle physics are piling up so much &#8230; <a href="https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/X31.jpg" rel="attachment wp-att-5257"><img class="aligncenter size-large wp-image-5257" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/X31-1024x501.jpg" alt="X3" width="584" height="285" /></a></p>
<p>We live in a time when the scale of scientific research is undergoing unprecedented exponential growth, generating equally unprecedented amounts of data. Disciplines like neuroscience, astronomy and particle physics are piling up so much information that finding and implementing new ways of representing, navigating and manipulating it is rapidly becoming a pressing necessity.</p>
<p><span id="more-5239"></span></p>
<p>One especially promising approach relies on the use of virtual and mixed reality platforms. What could be more intuitive and useful for, say, a neuroscientist trying to make sense of a huge and seemingly chaotic brain data set than the ability to fly through a gesture-controlled virtual representation of it and actually experience the properties of the data in search of meaningful patterns?</p>
<p>The <a href="http://specs.upf.edu/research_in_mixed_and_virtual_reality">eXperience Induction Machine</a> (XIM), built in the <a href="http://specs.upf.edu/">SPECS lab</a> at <a href="http://www.upf.edu/en/">Pompeu Fabra University</a> in Barcelona, is one example of such immersive spaces, which is currently applied to work precisely with data collected from the human brain. XIM allows researchers to visualize a brain connectome, the network of nodes and connections that defines what is going on in our vital organ. XIM is now a key part of the <a href="http://ceeds-project.eu/">Collective Experience of Emphatic Data Systems</a> (CEEDs), a European project seeking to develop a whole set of tools to bring big data visualisation to a new level.</p>
<p><iframe width="584" height="329" src="https://www.youtube.com/embed/PRXuMIZDucc?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p>XIM can be hooked up to a series of sensors that measure such parameters as the user’s heart rate, skin conductance, eye gaze and brain activity. This allows the system to register certain subconscious patterns, associated with how we perceive and process information, and guide the user’s attention to areas of potential interest that would otherwise remain unnoticed. This feature, along with XIM&#8217;s increased interactivity, is what really makes XIM stand out in comparison with some other state-of-the-art virtual and mixed reality systems such as the <a href="http://www.allosphere.ucsb.edu/index.php">AlloSphere</a> at the <a href="http://www1.cnsi.ucla.edu/index">California Nanosystems Institute</a> or the <a href="http://www.evl.uic.edu/core.php?mod=4&amp;type=1&amp;indi=424">CAVE2</a> at the <a href="http://www.uic.edu/uic/">University of Illinois at Chicago. </a></p>
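<p>The attention-guiding idea can be sketched in a few lines of Python. This is purely illustrative, not the actual XIM/CEEDs software: the function, the single-reading-per-region simplification and the threshold rule are all assumptions made for the example:</p>

```python
# Illustrative sketch: flag data regions where a physiological arousal
# signal (e.g. skin conductance) rises well above its mean level,
# so the system can guide the user's attention there.
def flag_salient(regions, arousal, threshold=1.5):
    """regions: labels visited in order; arousal: one reading per region.
    Returns region labels whose reading exceeds threshold x the mean."""
    baseline = sum(arousal) / len(arousal)
    return [r for r, a in zip(regions, arousal) if a > threshold * baseline]

visits  = ["cluster A", "cluster B", "cluster C", "cluster D"]
arousal = [0.4, 0.5, 1.9, 0.3]          # toy skin-conductance readings
print(flag_salient(visits, arousal))     # -> ['cluster C']
```

<p>A real system would combine several signals (heart rate, eye gaze, brain activity) and use a far more sophisticated model of the user’s state, but the principle of letting subconscious responses highlight regions of interest is the same.</p>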
<p>Earlier this month, SPECS and CEEDs showcased their platform for embodied exploration of neural data at the 16<sup>th</sup> edition of <a href="http://www.laval-virtual.org/en/">Laval Virtual, </a>the largest virtual technology conference in Europe. You can see a complete photo report from the event <a href="http://ceeds-project.eu/2014/04/14/ceeds-laval-virtual-2014-in-pictures/">HERE</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
	</channel>
</rss>
