<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Convergent Science Network &#187; Events</title>
	<atom:link href="https://csnblog.specs-lab.com/category/events/feed/" rel="self" type="application/rss+xml" />
	<link>https://csnblog.specs-lab.com</link>
	<description>Blog on Biomimetics and Neurotechnology. Written by Michael Szollosy, Dmitry Malkov and Michelle Wilson; edited by Anna Mura.</description>
	<lastBuildDate>Tue, 27 Sep 2022 14:58:43 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Here Space of Memory: Conserving, Presenting and Elaborating the Memory of the Holocaust</title>
		<link>https://csnblog.specs-lab.com/2016/03/23/here-space-of-memory-conserving-presenting-and-elaborating-the-memory-of-the-holocaust/</link>
		<comments>https://csnblog.specs-lab.com/2016/03/23/here-space-of-memory-conserving-presenting-and-elaborating-the-memory-of-the-holocaust/#comments</comments>
		<pubDate>Wed, 23 Mar 2016 13:27:47 +0000</pubDate>
		<dc:creator><![CDATA[Anna Mura]]></dc:creator>
				<category><![CDATA[Cultural Heritage]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Project News]]></category>
		<category><![CDATA[science]]></category>
		<category><![CDATA[society]]></category>
		<category><![CDATA[Digital heritage]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5884</guid>
		<description><![CDATA[blog by Paul Verschure [@Paul.Verschure] “Wir wissen nur dass wenn wir hier rauskommen, das wir alles dass wir hier erlebt haben in die Welt hinaus schreien müssen, anders kann man nicht leben” “We only know that when we get out &#8230; <a href="https://csnblog.specs-lab.com/2016/03/23/here-space-of-memory-conserving-presenting-and-elaborating-the-memory-of-the-holocaust/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p>blog by Paul Verschure [@Paul.Verschure]</p>
<div id="attachment_5889" style="width: 310px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2016/03/77197.jpg" rel="attachment wp-att-5889"><img class="wp-image-5889 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2016/03/77197-300x232.jpg" alt="Soon after liberation, camp survivors await their ration of potato soup. Bergen-Belsen, Germany, April 28, 1945. — US Holocaust Memorial Museum" width="300" height="232" /></a><p class="wp-caption-text">Soon after liberation, camp survivors await their ration of potato soup. Bergen-Belsen, Germany, April 28, 1945.<br /> — US Holocaust Memorial Museum</p></div>
<p>“Wir wissen nur dass wenn wir hier rauskommen, das wir alles dass wir hier erlebt haben in die Welt hinaus schreien müssen, anders kann man nicht leben”</p>
<p>“We only know that when we get out of here, we must shout out into the world about everything that we have experienced here. Otherwise one cannot live.”</p>
<p style="color: #555555;">These are the words of Charlotte Grunow <a style="color: #0065a2;" href="http://www.bbc.co.uk/archive/holocaust/5111.shtml">recorded</a> on April 20, 1945 by BBC reporter <a style="color: #0065a2;" href="https://en.wikipedia.org/wiki/Patrick_Gordon_Walker">Patrick Gordon Walker</a>.</p>
<p style="color: #555555;"><span id="more-5884"></span>Charlotte Grunow was arrested in Berlin in April 1943, transported to <a style="color: #0065a2;" href="https://www.ushmm.org/wlc/en/article.php?ModuleId=10005189">Auschwitz-Birkenau</a> and moved with a large group of female prisoners in November 1944 to <a style="color: #0065a2;" href="https://www.ushmm.org/wlc/en/article.php?ModuleId=10005224">Bergen-Belsen</a>. She was liberated on April 15, 1945, together with about 55,000 other prisoners; 10,000 of them were already dead, and a further 15,000 would die after the liberation from disease and starvation caused by a deliberate SS policy of neglect. The gruesome reality that the Charlotte Grunow of April 1945 wanted us to know about was largely unknown and unimagined by the liberating countries, and was <a style="color: #0065a2;" href="http://www.bbc.co.uk/archive/holocaust/5115.shtml">described</a> by the BBC reporter <a style="color: #0065a2;" href="https://en.wikipedia.org/wiki/Richard_Dimbleby">Richard Dimbleby</a> as &#8220;the world of a nightmare&#8221;. This reality, at the collapse of the Third Reich, could be found in the <a style="color: #0065a2;" href="https://www.ushmm.org/research/publications/encyclopedia-camps-ghettos">over 42,000 identified</a> collection, concentration and killing centers and <a style="color: #0065a2;" href="http://www.yahadinunum.org/">sites</a>, transports and <a style="color: #0065a2;" href="https://www.ushmm.org/wlc/en/article.php?ModuleId=10005162">death marches</a> across Europe created by the Nazis. But is Charlotte being heard: then, after the liberation; now, seventy years later; and in the future?</p>
<p style="color: #555555;">We are facing a transition in the commemoration of the Holocaust. The authentic voices reporting on the horrendous crimes humans are capable of will soon fall silent. For Bergen-Belsen alone, key witnesses such as ex-prisoners <a style="color: #0065a2;" href="http://www.abendblatt.de/kultur-live/article107721249/Holocaust-Ueberlebender-Ich-komme-mit-offenem-Herzen.html">Gyorgy Denes</a> and <a style="color: #0065a2;" href="http://collections.ushmm.org/search/catalog/irn502750">Arieh Koretz</a> and liberators <a style="color: #0065a2;" href="http://www.bbc.co.uk/history/ww2peopleswar/user/83/u747283.shtml">Maj. Leonard Berney</a> and <a style="color: #0065a2;" href="https://en.wikipedia.org/wiki/Eric_Brown_(pilot)">Captain Eric “Winkle” Brown</a> have all died in the last year. How do we deliver on the solemn pledge we have repeated for the last 70 years, that “we must never forget”? Have we succeeded in transforming these testimonies into understanding, into meaning, or into the society the victims hoped for? The answer, unfortunately, is “No”. For instance, although few systematic surveys exist, <a style="color: #0065a2;" href="http://www.holocausteducation.org.uk/research/young-people-understand-holocaust/">a recent UK survey</a> of 8,000 high-school students showed that the majority has only a cursory understanding of the Holocaust. The same holds for the rest of Europe. Hence, as the period of the witness comes to an end, we face a memory crisis in the conservation and presentation of the events and experiences at the heart of European history and identity.</p>
<p style="color: #555555;">We have developed a novel approach to answering the memory crisis: the <a style="color: #0065a2;" href="http://www.futurememoryfoundation.org/">Future Memory</a> project. At the start of Future Memory stands a personal experience: when I visited the Bergen-Belsen campsite where my grandfather <a style="color: #0065a2;" href="https://nl.wikipedia.org/wiki/Jan_Verschure">Jan Verschure</a>, a Dutch resistance fighter, died, I found an empty landscape. The chirping birds provided a score to this peaceful and well-kept heath park, which had integrated the elevated tops of the known mass graves. Note that in 1945 the birds avoided the place, and the mass graves containing the remains of about 20,000 victims have still not been localized. However, behind this pastoral façade, with no intrinsic footholds to assist in understanding and commemorating, resides the ultimate “witness”: space itself. Future Memory aims at reclaiming this space in the service of the preservation of history and the shaping of collective memory, now and in the future. Future Memory <a style="color: #0065a2;" href="http://www.belsen-project.specs-lab.com/">digitally enhances space</a> so that it becomes a medium through which historical sources and narratives can be discovered. The Future Memory project started in 2010, in collaboration with the <a style="color: #0065a2;" href="http://bergen-belsen.stiftung-ng.de/en/home.html">Bergen Belsen memorial site</a>, and was partially supported through the FET project <a style="color: #0065a2;" href="http://ceeds-project.eu/">CEEDS</a>.</p>
<p><iframe src="http://www.euronews.com/embed/327089/" width="640" height="360" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p style="color: #555555;">Future Memory builds historical learning on a twofold use of physical space. First, it acknowledges <a style="color: #0065a2;" href="http://www.cell.com/neuron/abstract/S0896-6273%2810%2900940-2?_returnURL=http%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2FS0896627310009402%3Fshowall%3Dtrue">the fundamental role</a> that space and action play in the formation of memory and experience, a scientific discovery recognized with a <a style="color: #0065a2;" href="http://www.nobelprize.org/nobel_prizes/medicine/laureates/2014/">2014 Nobel Prize</a>. We have built on this link in our <a style="color: #0065a2;" href="http://specs.upf.edu/installations">exhibitions and performances</a> and in the <a style="color: #0065a2;" href="https://ec.europa.eu/digital-single-market/en/blog/rehabilitation-gaming-system-healing-brain-interactive-virtual-reality-systems">advanced neurorehabilitation technologies</a> <a style="color: #0065a2;" href="http://specs.upf.edu/">we</a> have developed and deployed. Secondly, physical space is a permanent source for the authentication of historical knowledge: “this happened here”. Through the right use of technologies, spaces can be physically and virtually explored and discovered, now and in the future, because they are laden with historical sources and reflections of the experiences of those who were there. We have installed a number of integrated systems at the Bergen-Belsen memorial site, under the name “Here: Space of Memory”, that implement these considerations. At the heart of this approach stands a 3D reconstruction of the former camp, together with a database of geo-localized source material including diary fragments, images, drawings, and video and audio clips.
Visitors can access this physical/virtual space through an <a style="color: #0065a2;" href="http://specs.upf.edu/XIM">immersive virtual reality environment</a> or by walking the terrain itself using <a style="color: #0065a2;" href="http://www.belsen-project.specs-lab.com/summers-fruits-a-new-app-version/">an augmented reality tablet App</a>. By wandering among the reconstructed buildings, visitors explore historical sources in situ. Lastly, we have installed a sound installation that presents visitors with voices, including that of Charlotte Grunow, as they walk from the museum to the former campsite, creating a personal encounter with the fleeting past. The effectiveness of Future Memory is evidenced by the associated educational program, which is used intensively by visiting school classes and is booked out for many months to come. After this important validation of the Future Memory approach, our goal is to digitally reconstruct, enhance and link together at least 100 sites across Europe, to show the system-level organization of the murder machine created by the Nazis. We have started the <a style="color: #0065a2;" href="http://futurememoryfoundation.org/">Future Memory Foundation</a> to provide a neutral ground from which we can support this objective through both private and public support.</p>
<p style="color: #555555;">How close are we to our target? The UK Holocaust education survey makes it painfully clear that the impact has been modest at best, and we have to ask why the approaches followed over the last 70 years, such as professionalizing commemoration, archiving and researching historical sources, monumentalizing historical sites and offering museums, have not translated into more societal impact. Possibly we have still not identified an effective way to link historical information to understanding. The Future Memory project builds a bridge between history, experience and meaning by advancing an integrated approach comprising science, technology, the humanities and the arts, one that not only investigates and presents “what happened here” but also asks how we can narrate this central chapter of European history to its citizens, now and in the future, as a source for continuous learning and reflection. This is a new approach, complementary to existing ones, that can assist us in overcoming the memory crisis.</p>
<p style="color: #555555;">Today, as the age of the witness comes to a close, an enormous amount of work still needs to be done. We only have a few years left to <a style="color: #0065a2;" href="http://www.belsen-project.specs-lab.com/interviews-for-reconstruction/">conserve the living memory</a> of the sites of the Holocaust, while for some sites it is <a style="color: #0065a2;" href="http://www.theguardian.com/world/2016/feb/20/samuel-willenberg-survivor-of-nazi-death-camp-treblinka-dies-aged-93">already too late</a>. It is true that Europe has supported some important initiatives, such as the <a style="color: #0065a2;" href="http://www.ehri-project.eu/">EHRI network</a> and <a style="color: #0065a2;" href="http://www.europeana.eu/">Europeana</a>. But what has been done so far has not been enough, as the current state of Europe’s response to global humanitarian crises and rising anti-Semitism shows. There is a belief that enough is being done, but this is not supported by fact. In our case too, despite the great interest that our project inspires, including at the level of the European Commission and its staff members, requests for information have translated not into action but into “I have no time”. However, urgent action is required, and a large-scale, no-holds-barred European initiative must be undertaken, circumventing old habits and inertia, in order to salvage the past and help us shape our European future.</p>
<p style="color: #555555;">Future Memory answers and propagates Charlotte Grunow’s rallying cry. We, the descendants of the victims, perpetrators, traitors, bystanders, survivors and resisters, have an obligation to conserve the memory we risk losing through the mortality of the survivors: to honor the victims, to safeguard and elaborate our European identity, and to reflect on the darkest crevasses of the human soul, so that we may transcend them and find meaning and virtue in a deep understanding of who we have been, are, and can become.</p>
<p style="color: #555555;"><span style="color: #333333;">Get in touch with me <span style="color: #0066cc;"><a class="ProfileHeaderCard-screennameLink u-linkComplex js-nav" style="color: #0065a2;" href="https://twitter.com/PaulVerschure" target="_blank">@<span class="u-linkComplex-target">PaulVerschure</span></a></span></span></p>
<p style="color: #555555;"><span style="color: #333333;">This blog was originally </span><span style="font-weight: normal; color: #777777;">published in </span><a href="https://ec.europa.eu/digital-single-market/en/blog_home">DAE blog</a><span style="font-weight: normal; color: #777777;"><a href="https://ec.europa.eu/digital-single-market/en/blog_home"> </a>on 21/03/2016</span></p>
<p>&nbsp;</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2016/03/23/here-space-of-memory-conserving-presenting-and-elaborating-the-memory-of-the-holocaust/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Living Machines 2015</title>
		<link>https://csnblog.specs-lab.com/2015/08/04/living-machines-2015-in-barcelona/</link>
		<comments>https://csnblog.specs-lab.com/2015/08/04/living-machines-2015-in-barcelona/#comments</comments>
		<pubDate>Tue, 04 Aug 2015 16:54:45 +0000</pubDate>
		<dc:creator><![CDATA[Anna Mura]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[science]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5747</guid>
		<description><![CDATA[Article by Michael Szollosy Just last week, La Pedrera, Barcelona, has hosted the Living Machines 2015, the 4th International conference on biomimetics and biohybrid systems. Running from the 28th – 31st of July, Living Machines 2015 is sponsored by the Convergent Science Network &#8230; <a href="https://csnblog.specs-lab.com/2015/08/04/living-machines-2015-in-barcelona/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p style="text-align: justify;"><span style="color: #373737;"><br />
Article by </span><a style="color: #617c96;" href="https://www.shef.ac.uk/scharr/sections/hsr/mh/sectionstaff/mszollosy">Michael Szollosy</a></p>
<p style="text-align: justify;"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/header_v1_big_full.png" rel="attachment wp-att-5749"><img class="alignright wp-image-5749 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/header_v1_big_full-300x127.png" alt="header_v1_big_full" width="300" height="127" /></a>Just last week, La Pedrera in Barcelona hosted <a href="http://csnetwork.eu/livingmachines/conf2015"><em>Living Machines 2015</em></a>, the 4<sup>th</sup> International Conference on Biomimetics and Biohybrid Systems.</p>
<p style="text-align: justify;"><span id="more-5747"></span><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/La-Pedrera-Vista-general.jpg" rel="attachment wp-att-5750"><img class="alignright wp-image-5750 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/La-Pedrera-Vista-general-300x225.jpg" alt="La Pedrera Vista general" width="300" height="225" /></a>Running from the 28<sup>th</sup> to the 31<sup>st</sup> of July, <em>Living Machines </em>2015 was sponsored by the <a href="http://www.csnetwork.eu/">Convergent Science Network</a> and featured plenary talks by internationally renowned robotics researchers, as well as <a href="http://csnetwork.eu/livingmachines/conf2015/workshops">workshops</a> examining the intersection of living and artificial systems. There were also <a href="http://csnetwork.eu/livingmachines/conf2015/spotlights">poster spotlights</a>, poster <a href="http://csnetwork.eu/livingmachines/conf2015/posters">sessions</a>, and robot and media demonstrations. A <a href="http://csnetwork.eu/livingmachines/conf2015/programme">full programme of the events can be found here</a>.<!--more--></p>
<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/mantisbot_whole.jpg" rel="attachment wp-att-5748"><img class="alignright wp-image-5748 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/mantisbot_whole-300x200.jpg" alt="mantisbot_whole" width="300" height="200" /></a></p>
<p style="text-align: justify;"><em>Biomimetic</em> systems are technologies that draw their inspiration from biological systems; these can be used to improve artificial systems and offer solutions to technological and engineering problems, and can also be used to explore natural systems themselves in greater depth. <em>Biohybridity</em> refers to the merging of living and artificial systems to create new entities, and is used, for example, in robotics, materials, computing, brain-machine interfaces (e.g. neural implants), and artificial organs and body parts.</p>
<p style="text-align: justify;"><a href="http://csnetwork.eu/livingmachines/conf2015/plenaryspeakers">Plenary speakers</a> at this year’s conference included:</p>
<ul>
<li style="text-align: justify;"><a href="http://biorobots.case.edu/personne/roger-quinn/">Roger Quinn</a>: Director of the Centre for <a href="http://biorobots.cwru.edu/">Biologically Inspired Robotics Research</a> at Case Western Reserve University in Cleveland, Ohio. Professor Quinn spoke on ‘Animals as models for robot mobility and autonomy: Crawling, walking, running, climbing, and flying’.</li>
<li style="text-align: justify;"><a href="http://mbr.iit.it/people/barbara-mazzolai.html">Barbara Mazzolai</a>: Director of the <a href="http://mbr.iit.it/">Centre for Micro-BioRobotics (CMBR) of the Istituto Italiano di Tecnologia (IIT)</a> of Genoa, Italy, and Deputy Director for Supervision and Organization of the IIT Centres Network. Professor Mazzolai gave a talk entitled ‘From plants and animals to robots: movement, sensing and control’.</li>
<li style="text-align: justify;"><a href="http://www.researchgate.net/profile/Ryad_Benosman">Ryad Benosman</a>: Professor at the University Pierre and Marie Curie, Paris, France, leading the <a href="http://www.institut-vision.org/index.php?option=com_content&amp;view=article&amp;id=283%3Aequipe-de-r-benosman&amp;catid=17%3Afiches&amp;Itemid=15&amp;lang=en">Natural Computation and Neuromorphic Vision Laboratory</a>, Vision Institute, Paris. Professor Benosman gave a talk entitled ‘Neuromorphic Event-based time oriented vision: A framework to unify computational and biological vision’.</li>
<li><a href="http://www.personal.leeds.ac.uk/~menrcr/">Robert Richardson</a>: Director of the Institute of <a href="https://www.engineering.leeds.ac.uk/idro/">Design, Robotics and Optimisation at the University of Leeds</a>. Professor Richardson spoke about using robots for safety and security, surgical technologies for health and well-being, and rehabilitation and prosthetics.</li>
<li><a href="http://www.lied-pieri.univ-paris-diderot.fr/spip.php?article93">José Halloy</a>: Professor of <a href="http://www.univ-paris-diderot.fr/english/sc/site.php?bc=formations&amp;np=ficheufr&amp;n=13&amp;g=sm">Physics at Université Paris Diderot</a>. Professor Halloy spoke about collective intelligence in natural and artificial systems.</li>
</ul>
<p>And <a href="http://csnetwork.eu/livingmachines/conf2015/workshops">workshops</a> included discussion on topics such as:</p>
<ul>
<li><a href="http://csnetwork.eu/system/files/living-machines-files/robot_self_call_for_participation_1.pdf">The robot self</a></li>
<li><a href="http://csnetwork.eu/system/files/living-machines-files/nature_inspired_manufacturing_workshop_programme.pdf">Nature-inspired manufacturing</a></li>
<li><a href="http://csnetwork.eu/system/files/living-machines-files/bcn_flyer_28jul15_oneday.pdf">Bio-inspired design</a></li>
</ul>
<p style="text-align: justify;"><em>Living Machines</em> is one of the foremost conferences on robotics in the world, and is not to be missed. If you could not attend, the <a href="http://link.springer.com/book/10.1007/978-3-319-22979-9">proceedings are already available here</a> – do explore and have a look at some of the terrific ideas and developments being discussed. (Proceedings from previous years’ conferences can be found <a href="http://www.csnetwork.eu/livingmachines">here</a>.)</p>
<p style="text-align: justify;">For all inquiries contact <a href="mailto:info.csnetwork%40upf.edu">info.csnetwork@upf.edu</a></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2015/08/04/living-machines-2015-in-barcelona/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Robot Saviours</title>
		<link>https://csnblog.specs-lab.com/2015/04/23/robot-saviours/</link>
		<comments>https://csnblog.specs-lab.com/2015/04/23/robot-saviours/#comments</comments>
		<pubDate>Thu, 23 Apr 2015 09:32:50 +0000</pubDate>
		<dc:creator><![CDATA[Anna Mura]]></dc:creator>
				<category><![CDATA[AI]]></category>
		<category><![CDATA[Art]]></category>
		<category><![CDATA[Ethics]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Science Fiction]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5662</guid>
		<description><![CDATA[Article by Michael Szollosy We’ve all seen the terrifying headlines: ‘Rise of the Cybermen: The Terminator-style bionic ear that could give people “superman” hearing’ ‘Terminator is nigh: Shape-shifting material that instantly switches from solid to liquid could lead to a new &#8230; <a href="https://csnblog.specs-lab.com/2015/04/23/robot-saviours/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p>Article by <a href="https://www.shef.ac.uk/scharr/sections/hsr/mh/sectionstaff/mszollosy">Michael Szollosy</a></p>
<p style="text-align: justify;"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/04/Tscc_3.jpg" rel="attachment wp-att-5674"><img class="alignleft wp-image-5674 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/04/Tscc_3-200x300.jpg" alt="Tscc_3" width="200" height="300" /></a>We’ve all seen the terrifying headlines:</p>
<p style="padding-left: 30px; text-align: justify;"><strong>‘Rise of the Cybermen: </strong>The Terminator-style bionic ear that could give people “superman” hearing<strong>’</strong><br />
<strong> ‘Terminator is nigh: </strong>Shape-shifting material that instantly switches from solid to liquid could lead to a new generation of robots<strong>’</strong></p>
<p style="text-align: justify;">And the rest.<br />
Undoubtedly, there is a great deal of anxiety out there about the development of robots and artificial intelligence. Some of these fears are well-founded, of course, and some less so. We have been presented in the popular media so often – <a href="http://tvtropes.org/pmwiki/pmwiki.php/Main/UnnecessarilyCreepyRobot">in films</a>, <a href="http://en.wikipedia.org/wiki/Escape_from_the_Planet_of_the_Robot_Monsters">video games</a> and in the <a href="http://www.dailymail.co.uk/home/search.html?offset=0&amp;size=50&amp;sel=site&amp;searchPhrase=terminator&amp;sort=recent&amp;channel=sciencetech&amp;type=article&amp;type=video&amp;days=all">popular press</a> – with the image of robotic monsters and genocidal AI that it’s a wonder the public have not demanded that these dangerous toys be taken from scientists and locked away, their development forever prohibited for the good of all life on earth as we know it. (A similar public backlash is underway regarding <a href="http://www.nongmoproject.org/">GMOs</a>, for example; again, many of these fears are well-founded and some are not.)<span id="more-5662"></span></p>
<p style="text-align: justify;">However, we are increasingly seeing another side to our imaginings of what robots can do, and will do, to us and for us. No longer are they simply the laser-gun-wielding psychopaths, or the disembodied masterminds orchestrating the end of the human race. Robots and AI have now also become our carers (e.g. <a href="http://www.imdb.com/title/tt1990314/"><em>Robot and Frank</em></a>), our lovers (<a href="http://www.imdb.com/title/tt1798709/?ref_=fn_al_tt_1"><em>Her</em></a>) and even our children (<a href="http://www.imdb.com/title/tt0212720/?ref_=fn_al_tt_1"><em>A.I.</em></a>, <a href="http://www.imdb.com/title/tt1823672/?ref_=nv_sr_1"><em>Chappie</em></a>).</p>
<p style="text-align: justify;"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/04/new-poster-for-chappie-humanitys-last-hope-isnt-human.jpg" rel="attachment wp-att-5676"><img class="alignleft wp-image-5676 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/04/new-poster-for-chappie-humanitys-last-hope-isnt-human-202x300.jpg" alt="new-poster-for-chappie-humanitys-last-hope-isnt-human" width="202" height="300" /></a>And now, even more optimistically, they have become our saviours, the final great hope for humanity.</p>
<p style="text-align: justify;">This is quite a turnaround, in terms of public relations.</p>
<p style="text-align: justify;">Consider, for example, the tagline on the posters for <a href="http://thepsychologist.bps.org.uk/chappie-blomkamps-fabulous-robot"><em>Chappie</em></a>, Neill Blomkamp’s take on the birth of sentient AI and the Singularity:</p>
<p style="text-align: justify; padding-left: 30px;"><strong>Humanity’s Last Hope Isn’t Human</strong></p>
<p style="text-align: justify;">Or consider Daniel H. Wilson’s 2011 novel, <a href="http://en.wikipedia.org/wiki/Robopocalypse"><em>Robopocalypse</em></a>: we are presented with a story about the rise of AI and robots and the destruction of humanity. But absolutely essential in humanity’s fight back are not only technologically-enhanced humans (armed with prosthetics and neural implants), but our new robot allies, good robots that help us battle the bad robots.</p>
<p style="text-align: justify;">Or, going further back, consider more widely the <a href="http://en.wikipedia.org/wiki/Terminator_(franchise)"><em>Terminator </em>series</a>: in <a href="http://www.imdb.com/title/tt0088247/?ref_=fn_al_tt_1">the first movie</a>, from 1984, Arnold Schwarzenegger is most certainly, unambiguously the Bad Guy, sent by a future AI to ensure that human resistance against machine-rule dies in its (or his) infancy. But by <a href="http://www.imdb.com/title/tt0103064/?ref_=nv_sr_2">the second film</a>, in 1991, Arnie is already the Good Guy, protecting humanity from the next robot threat. And by the fourth in the series, in 2009, it is impossible to avoid a certain degree of spoiling just by mentioning that the title is <a href="http://www.imdb.com/title/tt0438488/"><em>Terminator: Salvation</em></a>. (And 2015’s <em><a href="http://www.imdb.com/title/tt1340138/?ref_=nv_sr_1">Terminator: Genisys</a> </em>[sic] promises more of the same.)</p>
<div id="attachment_5680" style="width: 294px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/04/imgres.jpg" rel="attachment wp-att-5680"><img class="wp-image-5680 size-full" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/04/imgres.jpg" alt="Terminator" width="284" height="177" /></a><p class="wp-caption-text">Terminator</p></div>
<p style="text-align: justify;">All of this might seem like a positive step in the right direction for those whose work is dedicated to building useful machines that help humanity, as the bad PR of snarling chrome skulls (and THAT picture) is replaced with more wholesome and realistic ideas of robots caring for the elderly and helping the sick and disabled. On some levels this absolutely needs to be applauded. But there is also the worry that this new conception of robots is really just the other side of the very same coin: the idealisation of robots and AI as humanity’s last great hope is not actually that much different from the demonisation of robots that preceded it.</p>
<p style="text-align: justify;"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/04/Terminator-The-Sarah-Connor-Chronicles-Season-1.jpg" rel="attachment wp-att-5675"><img class="alignleft wp-image-5675 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/04/Terminator-The-Sarah-Connor-Chronicles-Season-1-300x300.jpg" alt="Terminator-The-Sarah-Connor-Chronicles-Season-1" width="300" height="300" /></a>Of course, the idea that robots, and technology more generally, will be humanity’s salvation is not a terribly new one, and it has been around as long as – or perhaps even longer than – the technological monsters that have come to dominate the popular media. <a href="http://en.wikipedia.org/wiki/Frankenstein">Frankenstein’s monster</a>, for example, was conceived as a warning of what could go wrong with humanity’s new technological prowess, despite our noblest intentions (and is itself a post-Enlightenment version of the classic <a href="http://www.faust.com/"><em>Faust</em></a> myth).</p>
<div id="attachment_5679" style="width: 310px" class="wp-caption alignright"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/04/Vision-and-Ultron.jpg" rel="attachment wp-att-5679"><img class="wp-image-5679 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/04/Vision-and-Ultron-300x154.jpg" alt="Android Superhero and Arch-villain – get ready to see more of The Vision and Ultron in the coming weeks." width="300" height="154" /></a><p class="wp-caption-text">Android Superhero and Arch-villain – get ready to see more of The Vision and Ultron in the coming weeks.</p></div>
<p style="text-align: justify;">And conceptions of the future have always been Manichean: utopian visions have always competed with dystopian ones, and though the nightmare images are more often (and more popularly) the stuff of our fictions, there have always been groups, from the Futurists to the posthumanists, ready to embrace the brave new world.</p>
<p style="text-align: justify;">But uncritical optimism is frequently driven by the same sort of (often unconscious) anxieties and fears that give rise to the images of robotic monsters; likewise, misinformation and unrealistic expectations are the source of both unrealistically positive and unrealistically negative beliefs.</p>
<p style="text-align: justify;">So while the robo-enthusiast and AI-champion might welcome this cultural shift towards more positive social attitudes towards technology, it might not be all good news. We have to resist the vicissitudes of love and hate, demonisation and idealisation, and approach these questions &#8211; as always &#8211; with rational discussion and education.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2015/04/23/robot-saviours/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Scientists set robots against Ebola</title>
		<link>https://csnblog.specs-lab.com/2014/10/24/scientists-set-robots-against-ebola/</link>
		<comments>https://csnblog.specs-lab.com/2014/10/24/scientists-set-robots-against-ebola/#comments</comments>
		<pubDate>Fri, 24 Oct 2014 16:54:44 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots and Health]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Center for Robot-Assisted Search and Rescue]]></category>
		<category><![CDATA[CRASAR]]></category>
		<category><![CDATA[Ebola]]></category>
		<category><![CDATA[germ zapping robot]]></category>
		<category><![CDATA[Medical Robots]]></category>
		<category><![CDATA[Texas A&M University]]></category>
		<category><![CDATA[Xenex]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5468</guid>
		<description><![CDATA[With the official Ebola death toll approaching 5,000, scientists are increasingly concerned with exploiting all possible ways of fighting this deadly disease. While the biggest labs around the world are working on a vaccine that will hopefully exterminate Ebola once &#8230; <a href="https://csnblog.specs-lab.com/2014/10/24/scientists-set-robots-against-ebola/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_5471" style="width: 644px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/10/1412677632849_wps_6_devicewithlight_726x345_j.jpg" rel="attachment wp-att-5471"><img class="size-full wp-image-5471" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/10/1412677632849_wps_6_devicewithlight_726x345_j.jpg" alt="Xenex's germ-zapping robot Credit: Xenex" width="634" height="389" /></a><p class="wp-caption-text">Xenex&#8217;s germ-zapping robot<br />Credit: Xenex</p></div>
<p>With the official Ebola death toll approaching 5,000, scientists are increasingly keen to explore all possible ways of fighting this deadly disease. While the biggest labs around the world are working on a vaccine that will hopefully exterminate Ebola once and for all, roboticists are developing more unconventional ways of preventing the spread of the disease.</p>
<p><span id="more-5468"></span> Recently, a lot of media attention has focused on <a href="http://www.xenex.com/">Xenex</a>, a San Antonio-based company that has developed a robotic assistant to help medical professionals remove traces of infectious diseases, such as Ebola, from hospital premises. Even better, the robot can keep infections at bay 24/7 with 99.9% efficiency, preventing any potential delays in the operation of a hospital.</p>
<p>The robot does this by firing powerful ultraviolet pulses that wipe out the nasty viruses and bacteria lurking in the corners of hospital rooms. And while the technology of scrambling viral DNA with ultraviolet light is not particularly new, the idea of a roboticized Ebola killer is certainly to everybody’s liking.</p>
<p>But here is the catch: it does not take a genius to realize that Xenex’s machine has no more right to be called a robot than any other piece of medical equipment. What Xenex has developed is not an autonomous Roomba-like Ebola hunter. Essentially, it is a wheeled cart with a programmable ultraviolet lamp, and, although there is no doubt about its effectiveness in killing Ebola and other germs, we should choose our words carefully.</p>
<p>Does this mean, however, that robotics has nothing to offer in the biggest recorded outbreak of the virus? Fortunately, the answer is no. Even existing medical robots have huge potential for fighting diseases like Ebola, but deciding how to use them effectively in harsh conditions, such as those in West Africa, is a complicated issue.</p>
<p>In an attempt to clarify how robots can contribute to the ongoing battle, the <a href="http://crasar.org/">Center for Robot-Assisted Search and Rescue (CRASAR)</a> at <a href="https://www.tamu.edu/">Texas A&amp;M University</a> is organizing a policy workshop on Safety Robotics for Ebola Workers. The workshop will help identify what robots can do to minimize human contact with the virus, detect it, and provide expert consultation to those who have contracted it. You can learn more about the upcoming workshop <a href="http://crasar.org/2014/10/24/more-about-our-workshop-on-safety-robotics-for-ebola-workers-nov-7-8/">HERE</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/10/24/scientists-set-robots-against-ebola/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Telluride neuromorphic engineering workshop celebrates 20 years</title>
		<link>https://csnblog.specs-lab.com/2014/07/10/telluride-neuromorphic-engineering-workshop-celebrates-20-years/</link>
		<comments>https://csnblog.specs-lab.com/2014/07/10/telluride-neuromorphic-engineering-workshop-celebrates-20-years/#comments</comments>
		<pubDate>Thu, 10 Jul 2014 14:24:30 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Project News]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Convergent Science Network]]></category>
		<category><![CDATA[CSN]]></category>
		<category><![CDATA[neuromorphic engineering]]></category>
		<category><![CDATA[Telluride workshop]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5360</guid>
		<description><![CDATA[Every year, Telluride, a small mountain town in Colorado, attracts an international roster of scientists from several disciplines for three weeks of intensive discussion and exchange of ideas about neuromorphic engineering, a rapidly expanding research field that promises to bridge &#8230; <a href="https://csnblog.specs-lab.com/2014/07/10/telluride-neuromorphic-engineering-workshop-celebrates-20-years/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/Brain_Chip_Wide.jpg" rel="attachment wp-att-5362"><img class="alignleft wp-image-5362" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/Brain_Chip_Wide-300x199.jpg" alt="Brain_Chip_Wide" width="510" height="340" /></a></p>
<p>Every year, Telluride, a small mountain town in Colorado, attracts an international roster of scientists from several disciplines for three weeks of intensive discussion and exchange of ideas about neuromorphic engineering, a rapidly expanding research field that promises to bridge the gap between the lifeless silicon of computer chips and the very much alive biological systems of the brain. This year is no exception: the <a href="http://ine-web.org/telluride-conference-2014/telluride-2014/index.html">Telluride workshop</a> is now in full swing and will continue until July 19.</p>
<p><span id="more-5360"></span></p>
<p>What is special about this year&#8217;s edition is that the workshop, organised by the <a href="http://ine-web.org/index.php">Institute of Neuromorphic Engineering</a>, celebrates its 20<sup>th</sup> anniversary, and the sense of historical perspective is more perceptible than ever. The workshop was founded in 1994 by <a href="http://en.wikipedia.org/wiki/Christof_Koch">Christof Koch</a>, <a href="http://en.wikipedia.org/wiki/Terry_Sejnowski">Terry Sejnowski</a>, <a href="http://www.ini.uzh.ch/people/rjd">Rodney Douglas</a> and others. Just five years before that, Carver Mead had introduced the concept of neuromorphic engineering for the first time, and many discussions at this year&#8217;s workshop revolve around what has been achieved in the years since and what contributions we can expect in the next 25 years.</p>
<p>One of the major goals of the workshop is to reduce the distance between senior and junior researchers in the field of neuromorphic engineering, and this year students participating in the workshop have a chance to interact with some of the most important contributors to the field. The workshop includes numerous background lectures on a variety of topics in systems and cognitive neuroscience, practical tutorials, hands-on projects and interest groups. There are six topic areas this year ranging from human auditory cognition and neuromorphic Olympics to embodied neuromorphic architectures of perception, cognition and action.</p>
<p>Other priorities of the workshop include the encouragement of collaborative activities emerging from the workshop and the promotion of neuromorphic engineering as a self-sustaining research field.</p>
<p>The event is sponsored by some of the biggest players in neuromorphic research worldwide including the <a href="http://csnetwork.eu/">Convergent Science Network Project</a>, which, among other things, contributed eight scholarships for European applicants. You can always learn more about the application requirements and other activities supported by CSN <a href="http://csnetwork.eu/activities">HERE</a>.</p>
<p>Neuromorphic engineering was included in this year’s top 10 Breakthrough Technologies report published by <a href="http://www.technologyreview.com/">MIT Technology Review</a>. Read <a href="http://www.technologyreview.com/featuredstory/526506/neuromorphic-chips/">this article</a> to learn why neuromorphic engineering matters and how brain-based computer chips are preparing to revolutionise computing as we know it.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/07/10/telluride-neuromorphic-engineering-workshop-celebrates-20-years/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Living Machines 2014</title>
		<link>https://csnblog.specs-lab.com/2014/06/19/living-machines-2014/</link>
		<comments>https://csnblog.specs-lab.com/2014/06/19/living-machines-2014/#comments</comments>
		<pubDate>Thu, 19 Jun 2014 07:00:22 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Project News]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Convergent Science Network]]></category>
		<category><![CDATA[CSN]]></category>
		<category><![CDATA[Da Vinci Museum of Science and Technology]]></category>
		<category><![CDATA[Italian Institute of Technology]]></category>
		<category><![CDATA[Living Machines]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5335</guid>
		<description><![CDATA[The 3rd Conference on Biomimetic and Biohybrid Systems will be held this year from 30 July to 1 August in Milan. As has become a tradition, the three-day event, organised by the Convergent Science Network, will be hosted at a &#8230; <a href="https://csnblog.specs-lab.com/2014/06/19/living-machines-2014/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/Screen-Shot-2014-06-18-at-15.35.48.png" rel="attachment wp-att-5338"><img class="aligncenter wp-image-5338 size-full" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/Screen-Shot-2014-06-18-at-15.35.48.png" alt="Screen Shot 2014-06-18 at 15.35.48" width="1152" height="294" /></a></p>
<p><a href="http://csnetwork.eu/livingmachines/conf2014">The 3<sup>rd</sup> Conference on Biomimetic and Biohybrid Systems</a> will be held this year from 30 July to 1 August in Milan. As has become a tradition, the three-day event, organised by the <a href="http://csnetwork.eu/">Convergent Science Network</a>, will be hosted at a fantastic venue consistent with the spirit of the conference: the <a href="http://www.museoscienza.org/english/">Da Vinci Museum of Science and Technology</a>, one of the largest technology museums in Europe.</p>
<p><span id="more-5335"></span></p>
<p>The conference will be packed with fascinating talks on a variety of topics related to the development of technologies at the intersection of living and artificial systems, including six plenary lectures from some of the most distinguished experts in the field. The plenary lectures will be complemented by nearly 20 short talks on diverse topics such as soft robotics, active sensing, neuromechanics and others.</p>
<p>You can find out more about the plenary speakers <a href="http://csnetwork.eu/livingmachines/conf2014/speakers">HERE</a> and check out the full conference programme <a href="http://csnetwork.eu/livingmachines/conf2014/programme">HERE</a>.</p>
<p>This year, the Living Machines conference will be preceded by a one-day satellite event, hosted by the <a href="http://www.iit.it/">Italian Institute of Technology</a> and consisting of a series of research-oriented workshops. Learn more about the workshops <a href="http://csnetwork.eu/livingmachines/conf2014/workshops">HERE</a>.</p>
<p>We are looking forward to seeing you this year!</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/06/19/living-machines-2014/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Is Pepper the world&#8217;s hottest personal robot yet?</title>
		<link>https://csnblog.specs-lab.com/2014/06/18/is-pepper-the-worlds-hottest-personal-robot-yet/</link>
		<comments>https://csnblog.specs-lab.com/2014/06/18/is-pepper-the-worlds-hottest-personal-robot-yet/#comments</comments>
		<pubDate>Wed, 18 Jun 2014 21:01:48 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Asia]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Aldebaran Robotics]]></category>
		<category><![CDATA[emotional intelligence]]></category>
		<category><![CDATA[Nao]]></category>
		<category><![CDATA[Pepper]]></category>
		<category><![CDATA[Robots and emotions]]></category>
		<category><![CDATA[Romeo]]></category>
		<category><![CDATA[SoftBank]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5329</guid>
		<description><![CDATA[Pepper, a new humanoid robot introduced earlier this month in Japan, may herald the beginning of a new era in personal robotics. Unlike its ancestors, such as Mitsubishi’s Wakamaru and Sony’s QRIO, who had to join the halls of robot &#8230; <a href="https://csnblog.specs-lab.com/2014/06/18/is-pepper-the-worlds-hottest-personal-robot-yet/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_5332" style="width: 690px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/pepper-680x365.jpg" rel="attachment wp-att-5332"><img class="size-full wp-image-5332" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/pepper-680x365.jpg" alt="Credit: Aldebaran Robotics" width="680" height="365" /></a><p class="wp-caption-text">Credit: Aldebaran Robotics</p></div>
<p>Pepper, a new humanoid robot introduced earlier this month in Japan, may herald the beginning of a new era in personal robotics. Unlike its ancestors, such as Mitsubishi’s <a href="http://en.wikipedia.org/wiki/Wakamaru">Wakamaru</a> and Sony’s <a href="http://en.wikipedia.org/wiki/QRIO">QRIO</a>, who had to join the halls of robot extinction, Pepper, developed jointly by the French robotics company <a href="http://www.aldebaran.com/en">Aldebaran</a> and the Japanese telecom giant <a href="http://www.softbank.jp/en/mobile/">SoftBank</a>, is here to stay.</p>
<p><span id="more-5329"></span></p>
<p>Although the robot aims at possibly the most unreachable market in the robotics industry, that of personal household robots, several major factors could play a decisive role in Pepper’s future: its advanced emotional intelligence, its surprisingly low price and, of course, its looks – let’s not forget that looks matter, and Pepper’s design is simply gorgeous.</p>
<p>SoftBank plans to start selling the robots next year in Japan for about $1,900. Until then, people can get acquainted with Pepper at certain SoftBank stores in Japan.</p>
<p>Although Pepper might initially seem rather impractical – it will not clean your house and may not even be able to fetch things effectively – the robot’s strong suit lies in its ability to be good company.</p>
<div id="attachment_5333" style="width: 594px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/pepper_ld.jpg" rel="attachment wp-att-5333"><img class="size-large wp-image-5333" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/pepper_ld-1024x682.jpg" alt="Credit: Aldebaran Robotics" width="584" height="388" /></a><p class="wp-caption-text">Credit: Aldebaran Robotics</p></div>
<p>Pepper’s communication skills are the result of special software that allows it to analyze human emotions by combining information about voice tone, facial expressions and body language. In this way, Pepper will tailor each individual conversation to how its interlocutors feel and behave. While by no means the first robot to do so, Pepper may well be the first commercially available robot with such advanced emotion-reading capabilities.</p>
<p>The cutting-edge emotion engine will be supported by a cloud-based “collective wisdom”, to which all Pepper robots will be able to upload valuable information about their interactions with humans. Taken together, this data will allow them to evolve and polish their communication skills. For example, hundreds of robots could store information about whether a particular joke makes people laugh, and then decide whether the same joke is appropriate in other situations.</p>
<p>Pepper’s emotional intelligence is a logical progression of Aldebaran’s pursuit of companion robots capable of living with humans and responding to their constantly changing moods and feelings. The robot is strongly reminiscent of Aldebaran’s previous hit <a href="http://www.aldebaran.com/en/humanoid-robot/nao-robot">Nao</a> but, unlike its little brother, uses wheels instead of legs to move around – a choice dictated by power-efficiency requirements.</p>
<p>A legged version of Pepper, however, might also see the light of day: Aldebaran’s legged <a href="http://www.aldebaran.com/en/robotics-company/projects">Romeo</a> robot, which remains in development, could eventually serve as a foundation for one. You can read a <a href="http://csnblog.specs-lab.com/2014/04/01/meet-romeo-a-new-rising-star-of-humanoid-robotics/">previous post</a> to learn more about the ongoing Romeo project.</p>
<p>Allowing robots to understand human emotions and express their own is a critical step towards improving human-robot interaction in all settings. Read <a href="http://csnblog.specs-lab.com/2014/02/27/children-will-learn-from-robots/">this post</a> to learn about some ongoing European projects that aim to improve emotional intelligence in robots.</p>
<p><iframe width="584" height="329" src="https://www.youtube.com/embed/8HXhsKpETXE?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/06/18/is-pepper-the-worlds-hottest-personal-robot-yet/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Virtual reality labs reshape how we process information</title>
		<link>https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/</link>
		<comments>https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/#comments</comments>
		<pubDate>Wed, 07 May 2014 05:40:39 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[big data]]></category>
		<category><![CDATA[brain]]></category>
		<category><![CDATA[CEEDS]]></category>
		<category><![CDATA[eXperience Induction Machine]]></category>
		<category><![CDATA[Laval Virtual]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[Neuroscience]]></category>
		<category><![CDATA[SPECS]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<category><![CDATA[XIM]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5239</guid>
		<description><![CDATA[We live in a time when the scale of scientific research is undergoing an unprecedented exponential growth, which contributes to the generation of equally unprecedented amounts of data. Disciplines like neuroscience, astronomy or particle physics are piling up so much &#8230; <a href="https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/X31.jpg" rel="attachment wp-att-5257"><img class="aligncenter size-large wp-image-5257" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/05/X31-1024x501.jpg" alt="X3" width="584" height="285" /></a></p>
<p>We live in a time when the scale of scientific research is undergoing unprecedented exponential growth, which generates equally unprecedented amounts of data. Disciplines like neuroscience, astronomy and particle physics are piling up so much information that finding and implementing new ways of representing, navigating and manipulating it is rapidly becoming a pressing necessity.</p>
<p><span id="more-5239"></span></p>
<p>One especially promising approach relies on the use of virtual and mixed reality platforms. What could be more intuitive and useful for, say, a neuroscientist trying to make sense of a huge and seemingly chaotic brain data set than the ability to fly through a virtual, gesture-controlled representation of it and actually experience the properties of the data in search of meaningful patterns?</p>
<p>The <a href="http://specs.upf.edu/research_in_mixed_and_virtual_reality">eXperience Induction Machine</a> (XIM), built in the <a href="http://specs.upf.edu/">SPECS lab</a> at <a href="http://www.upf.edu/en/">Pompeu Fabra University</a> in Barcelona, is one such immersive space, currently applied to work with data collected from the human brain. XIM allows researchers to visualize a brain connectome, the network of nodes and connections that defines what is going on in our most vital organ. XIM is now a key part of the <a href="http://ceeds-project.eu/">Collective Experience of Empathic Data Systems</a> (CEEDs), a European project seeking to develop a whole set of tools to bring big data visualisation to a new level.</p>
<p><iframe width="584" height="329" src="https://www.youtube.com/embed/PRXuMIZDucc?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p>XIM can be hooked up to a series of sensors that measure parameters such as the user’s heart rate, skin conductance, eye gaze and brain activity. This allows the system to register certain subconscious patterns associated with how we perceive and process information, and to guide the user’s attention to areas of potential interest that would otherwise remain unnoticed. This feature, along with its increased interactivity, is what really makes XIM stand out in comparison with other state-of-the-art virtual and mixed reality systems such as the <a href="http://www.allosphere.ucsb.edu/index.php">AlloSphere</a> at the <a href="http://www1.cnsi.ucla.edu/index">California Nanosystems Institute</a> or the <a href="http://www.evl.uic.edu/core.php?mod=4&amp;type=1&amp;indi=424">CAVE2</a> at the <a href="http://www.uic.edu/uic/">University of Illinois at Chicago</a>.</p>
<p>Last month, SPECS and CEEDs showcased their platform for embodied exploration of neural data at the 16<sup>th</sup> edition of <a href="http://www.laval-virtual.org/en/">Laval Virtual</a>, the largest virtual technology conference in Europe. You can see a complete photo report from the event <a href="http://ceeds-project.eu/2014/04/14/ceeds-laval-virtual-2014-in-pictures/">HERE</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Festo unveils a jumping kangaroo robot</title>
		<link>https://csnblog.specs-lab.com/2014/04/08/festo-unveils-a-jumping-kangaroo-robot/</link>
		<comments>https://csnblog.specs-lab.com/2014/04/08/festo-unveils-a-jumping-kangaroo-robot/#comments</comments>
		<pubDate>Tue, 08 Apr 2014 14:24:38 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[AquaJelly]]></category>
		<category><![CDATA[BionicKangaroo]]></category>
		<category><![CDATA[BioniCopter]]></category>
		<category><![CDATA[Festo]]></category>
		<category><![CDATA[Hannover Messe]]></category>
		<category><![CDATA[SmartBird]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5159</guid>
		<description><![CDATA[The German automation company has once again secured its place at the cutting edge of bionic technology. This time Festo came up with a life-like kangaroo robot that realistically emulates the jumping dynamics of a natural kangaroo. The robot is &#8230; <a href="https://csnblog.specs-lab.com/2014/04/08/festo-unveils-a-jumping-kangaroo-robot/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_5161" style="width: 310px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/04/kangaroo1-1396398139335.jpg" rel="attachment wp-att-5161"><img class="size-medium wp-image-5161 " alt="Source: Festo " src="http://csnblog.specs-lab.com/wp-content/uploads/2014/04/kangaroo1-1396398139335-300x231.jpg" width="300" height="231" /></a><p class="wp-caption-text">Source: Festo</p></div>
<p dir="ltr">The German automation company has once again secured its place at the cutting edge of bionic technology.</p>
<p dir="ltr">This time <a href="https://www.festo.com/net/startpage/">Festo</a> came up with a life-like <a href="http://www.festo.com/cms/en_corp/13704.htm">kangaroo robot</a> that realistically emulates the jumping dynamics of a natural kangaroo. The robot is expected to be officially unveiled this week at <a href="http://www.hannovermesse.de/home">Hannover Messe</a>.</p>
<p dir="ltr"><span id="more-5159"></span></p>
<p dir="ltr">The most characteristic feature of kangaroos, when it comes to jumping, is their ability to recover and store energy from one jump and then release it to produce the next. Without this ability, kangaroos would tire extremely quickly while hopping.</p>
<p dir="ltr">Researchers at Festo spent almost two years figuring out how to reproduce this natural mechanism in a robotic platform. The result is an artificial kangaroo that can jump 0.4 meters vertically and 0.8 meters horizontally – while being one meter tall and weighing just 7 kilograms. To achieve this, the BionicKangaroo makes use of an elastic rubber spring that emulates the Achilles tendon, which is exceptionally developed in natural kangaroos and is, in fact, what allows them to store energy between jumps.</p>
<p><iframe width="584" height="329" src="http://www.youtube.com/embed/mWiNlWk1Muw?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p dir="ltr">The BionicKangaroo is part of the company’s <a href="http://www.festo.com/cms/en_corp/9617.htm">Bionic Learning Network</a>, a cooperative initiative aimed at studying principles perfected by nature and transforming them into fresh and innovative industrial applications. Over the past few years, Festo has developed a whole range of absolutely amazing bionic robots, such as <a href="http://www.festo.com/cms/en_corp/13165.htm">BioniCopter</a>, <a href="http://www.festo.com/cms/en_corp/13611.htm">AquaJelly</a> and <a href="http://www.festo.com/cms/en_corp/11369.htm">SmartBird</a> – to name just a few. You can see the complete chronological list of Festo’s bionic projects <a href="http://www.festo.com/cms/en_corp/10924.htm">HERE</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/04/08/festo-unveils-a-jumping-kangaroo-robot/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Meet Romeo, a new rising star of humanoid robotics</title>
		<link>https://csnblog.specs-lab.com/2014/04/01/meet-romeo-a-new-rising-star-of-humanoid-robotics/</link>
		<comments>https://csnblog.specs-lab.com/2014/04/01/meet-romeo-a-new-rising-star-of-humanoid-robotics/#comments</comments>
		<pubDate>Tue, 01 Apr 2014 14:51:24 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Europe]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Aldebaran Robotics]]></category>
		<category><![CDATA[Convergent Science Network]]></category>
		<category><![CDATA[Humanoid robots]]></category>
		<category><![CDATA[Innorobo]]></category>
		<category><![CDATA[Nao]]></category>
		<category><![CDATA[Romeo]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5124</guid>
		<description><![CDATA[&#160; Five years have passed since Aldebaran Robotics announced an ambitious joint project with over a dozen leading French research centres to make France one of the few countries to have developed an advanced humanoid robot. Finally, the robot, named &#8230; <a href="https://csnblog.specs-lab.com/2014/04/01/meet-romeo-a-new-rising-star-of-humanoid-robotics/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[
<div id="attachment_5126" style="width: 310px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/04/PHO0f89b22c-ae7c-11e3-953c-c7c798c3042f-805x453.jpg" rel="attachment wp-att-5126"><img class="wp-image-5126 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/04/PHO0f89b22c-ae7c-11e3-953c-c7c798c3042f-805x453-300x168.jpg" alt="Source: Aldebaran Robotics" width="300" height="168" /></a><p class="wp-caption-text">Source: Aldebaran Robotics</p></div>
<p>Five years have passed since <a href="http://www.aldebaran.com/en">Aldebaran Robotics</a> announced an ambitious <a href="http://projetromeo.com/">joint project</a> with over a dozen leading French <a href="http://projetromeo.com/partenaires">research centres</a> to make France one of the few countries to have developed an advanced humanoid robot. The robot, named <a href="http://projetromeo.com/">Romeo</a>, finally made its long-awaited debut last month at the <a href="http://www.innorobo.com/en/">Innorobo</a> robotics fair in Lyon.</p>
<p><span id="more-5124"></span></p>
<p>Aldebaran Robotics hit the big time with its famed <a href="http://www.aldebaran.com/en/humanoid-robot/nao-robot">Nao</a> robot, which quickly won the hearts of the robotics community, and it was only a matter of time before Aldebaran would take on the challenge of creating a larger and more capable robot. Romeo, who stands 1.4 meters tall and weighs around 40 kg, is not simply an enlarged version of Nao: although there is some synergy between the two projects, the researchers had to develop in many ways a very different humanoid.</p>
<p>Romeo was conceived as a personal assistant for elderly and disabled people. He will have to move around an everyday environment and, in theory, perform tasks such as fetching objects, taking out the trash and monitoring the owner’s health, mood and behaviour.</p>
<p>Safety was a major concern when designing the physical platform, for a bigger robot implies bigger risks, and so Aldebaran set about developing a robot that neither looks dangerous, nor is a danger. In this regard, the project has made some important advances: unlike most humanoids, which rely on gears to power their joints, Romeo’s joints – and most importantly leg joints – are based on a very light and low-friction <a href="http://www.barrett.com/robot/glossary.htm">backdrivable mechanism</a> consisting of screws and cables, which offers more control over the robot and is considerably safer and cheaper. A good example of a backdrivable robot is the <a href="https://www.youtube.com/watch?v=oAjfjU7yxoY">WAM </a>arm from<a href="http://www.barrett.com/robot/index.htm"> Barrett Technology</a>.</p>
<p>Romeo, of course, is still in development, which was fairly obvious at Innorobo. So far the robot has limited mobility and cognitive capabilities, and it may be too early to judge how Romeo measures up to what was promised at the start of the project. Some parts of the robot will almost certainly be improved along the way: Romeo’s hands, for instance, currently have four fingers each but just one degree of freedom, which allows only a basic grasping motion, clearly not enough for most of the tasks envisioned for the robot.</p>
<div id="attachment_5127" style="width: 310px" class="wp-caption alignright"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/04/romeo-a-humanoid-robot-from-aldebaran-139467065606302301.jpg" rel="attachment wp-att-5127"><img class="size-medium wp-image-5127" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/04/romeo-a-humanoid-robot-from-aldebaran-139467065606302301-300x200.jpg" alt="Source: Aldebaran Robotics" width="300" height="200" /></a><p class="wp-caption-text">Source: Aldebaran Robotics</p></div>
<p>Aldebaran hopes that Romeo will start working at aged-care facilities by 2017 or, at the latest, by 2019. And although that may seem unrealistic, given Romeo’s rumoured cost of around $330,000, Aldebaran plans to commercialise the robot by offering it to hospitals and nursing homes and eventually to individuals.</p>
<p>For more information on Romeo, you can read <a href="http://spectrum.ieee.org/automaton/robotics/humanoids/france-developing-advanced-humanoid-robot-romeo">this article</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/04/01/meet-romeo-a-new-rising-star-of-humanoid-robotics/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
	</channel>
</rss>
