<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Convergent Science Network &#187; Robot Companions</title>
	<atom:link href="https://csnblog.specs-lab.com/tag/robot-companions/feed/" rel="self" type="application/rss+xml" />
	<link>https://csnblog.specs-lab.com</link>
	<description>Blog on Biomimetics and Neurotechnology, with writers Michael Szollosy, Dmitry Malkov, and Michelle Wilson, and editor Anna Mura</description>
	<lastBuildDate>Tue, 27 Sep 2022 14:58:43 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>This Bot Doesn&#039;t Bite&#8230;</title>
		<link>https://csnblog.specs-lab.com/2012/12/03/this-bot-doesnt-bite/</link>
		<comments>https://csnblog.specs-lab.com/2012/12/03/this-bot-doesnt-bite/#comments</comments>
		<pubDate>Mon, 03 Dec 2012 07:29:03 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Asia]]></category>
		<category><![CDATA[Biology]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Flea Bot]]></category>
		<category><![CDATA[Jumping robot]]></category>
		<category><![CDATA[Minkyun Noh]]></category>
		<category><![CDATA[Nitinol]]></category>
		<category><![CDATA[Resilin]]></category>
		<category><![CDATA[Robot Companions]]></category>
		<category><![CDATA[Seoul National University]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=4381</guid>
		<description><![CDATA[Check out this robot inspired by fleas! Scientists at Seoul National University (SNU) have recently created a robot inspired by tiny blood-sucking bugs: fleas! Pesky as these little insects may be, they&#8217;ve got an incredible physical ability that not even an &#8230; <a href="https://csnblog.specs-lab.com/2012/12/03/this-bot-doesnt-bite/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><strong>Check out this robot inspired by fleas!</strong><br />
<object id="flashObj" width="560" height="349" classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0"><param name="flashVars" value="videoId=1923385709001&amp;playerID=2227271001&amp;playerKey=AQ~~,AAAAADqBmN8~,Yo4S_rZKGX0rYg6XsV7i3F9IB8jNBoiY&amp;domain=embed&amp;dynamicStreaming=true" /><param name="base" value="http://admin.brightcove.com" /><param name="seamlesstabbing" value="false" /><param name="allowFullScreen" value="true" /><param name="swLiveConnect" value="true" /><param name="allowScriptAccess" value="always" /><param name="src" value="http://c.brightcove.com/services/viewer/federated_f9?isVid=1" /><param name="flashvars" value="videoId=1923385709001&amp;playerID=2227271001&amp;playerKey=AQ~~,AAAAADqBmN8~,Yo4S_rZKGX0rYg6XsV7i3F9IB8jNBoiY&amp;domain=embed&amp;dynamicStreaming=true" /><param name="allowfullscreen" value="true" /><param name="swliveconnect" value="true" /><param name="allowscriptaccess" value="always" /><param name="pluginspage" value="http://www.macromedia.com/shockwave/download/index.cgi?P1_Prod_Version=ShockwaveFlash" /><embed id="flashObj" width="560" height="349" type="application/x-shockwave-flash" src="http://c.brightcove.com/services/viewer/federated_f9?isVid=1" flashVars="videoId=1923385709001&amp;playerID=2227271001&amp;playerKey=AQ~~,AAAAADqBmN8~,Yo4S_rZKGX0rYg6XsV7i3F9IB8jNBoiY&amp;domain=embed&amp;dynamicStreaming=true" base="http://admin.brightcove.com" seamlesstabbing="false" allowFullScreen="true" swLiveConnect="true" allowScriptAccess="always" flashvars="videoId=1923385709001&amp;playerID=2227271001&amp;playerKey=AQ~~,AAAAADqBmN8~,Yo4S_rZKGX0rYg6XsV7i3F9IB8jNBoiY&amp;domain=embed&amp;dynamicStreaming=true" allowfullscreen="true" swliveconnect="true" allowscriptaccess="always" pluginspage="http://www.macromedia.com/shockwave/download/index.cgi?P1_Prod_Version=ShockwaveFlash" /></object></p>
<p>Scientists at <a title="Seoul National University" href="http://www.useoul.edu/">Seoul National University</a> (SNU) have recently created a robot inspired by tiny blood-sucking bugs: fleas! Pesky as these little insects may be, they&#8217;ve got an incredible physical ability that not even an Olympic high-jumper could compete with — these guys can jump over 200 times their own body length! See for yourself in New Scientist&#8217;s video above.<br />
<span id="more-4381"></span></p>
<p>Not every insect is capable of such an extraordinary feat, so what exactly puts that special spring in every little flea&#8217;s step? The muscle in the flea&#8217;s upper leg is endowed with a special protein called resilin. Nerve impulses stimulate the compression and decompression of the stretchy resilin, and in coordination with tissue that acts a bit like a latch, the flea&#8217;s jump mechanism operates much the way a spring does.</p>
<p>Using a special alloy called nitinol, made of nickel and titanium, Minkyun Noh and his team at SNU constructed three tiny springs that function much like the flea&#8217;s. Embedded in a tiny 2 cm robot, the springs allow the insect-inspired machine to leap about 30 times its own body length.</p>
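The spring-and-latch mechanism lends itself to a quick sanity check: stored elastic energy becomes kinetic energy at take-off, then plain projectile motion takes over. Here is a minimal Python sketch of that back-of-the-envelope model; the spring stiffness, compression, mass and launch angle are illustrative guesses, not figures from the SNU paper.

```python
import math

def jump_distance(n_springs, k, compression, mass, angle_deg, g=9.81):
    """Ideal-spring, no-drag estimate of a spring-launched jump range.

    All stored elastic energy is assumed to become kinetic energy at
    take-off, followed by simple projectile motion on level ground.
    """
    energy = n_springs * 0.5 * k * compression**2       # joules stored
    v0 = math.sqrt(2 * energy / mass)                   # take-off speed (m/s)
    theta = math.radians(angle_deg)
    return v0**2 * math.sin(2 * theta) / g              # range (m)

# Hypothetical numbers for a 2 g, 2 cm robot with three springs:
body_length = 0.02                                      # metres
d = jump_distance(n_springs=3, k=300.0, compression=0.005,
                  mass=0.002, angle_deg=45)
print(f"estimated jump: {d / body_length:.0f} body lengths")
```

Even with rough numbers, the model shows why centimetre-scale jumpers can cover tens of body lengths: stored energy grows with spring stiffness and the square of the compression, while the mass being launched stays tiny.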
<p>Currently, this bot relies on an external power source, but scientists are trying to figure out a feasible way to get some nano batteries on board. While applications for the bot have yet to be specified, researchers believe this kind of technology could be used in a wide range of fields — from medicine to environmental monitoring.</p>
<p>For more information on the design of this robot, you can access the paper <a title="A Miniature Jumping Robot  with Flea-inspired Catapult System: Active Latch and Trigger" href="http://www.emn.fr/z-dre/bionic-robots-workshop/uploads/Abstracts%20BRW%202011/53.pdf" target="_blank">HERE</a></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2012/12/03/this-bot-doesnt-bite/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Cognitive Skills for Rehabilitation Robots</title>
		<link>https://csnblog.specs-lab.com/2012/10/16/rehabilitation-robot-requires-cognitive-skills/</link>
		<comments>https://csnblog.specs-lab.com/2012/10/16/rehabilitation-robot-requires-cognitive-skills/#comments</comments>
		<pubDate>Tue, 16 Oct 2012 08:00:53 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Robots and Health]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Cognitive skills]]></category>
		<category><![CDATA[collision proof wheelchairs]]></category>
		<category><![CDATA[CORBYS]]></category>
		<category><![CDATA[Dr Farshid Amirabadollohian]]></category>
		<category><![CDATA[Dr Polani]]></category>
		<category><![CDATA[European projects]]></category>
		<category><![CDATA[GeckoSystems]]></category>
		<category><![CDATA[rehabilitation robots]]></category>
		<category><![CDATA[Robot Companions]]></category>
		<category><![CDATA[Smart wheelchairs]]></category>
		<category><![CDATA[University of Hertfordshire]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=2280</guid>
		<description><![CDATA[Europe invests in cognitive skills for rehabilitation robots The European Commission has provided a grant of €780,800 to develop cognitive skills for rehabilitation robots being developed by CORBYS (Control Framework for Robotic Systems), a four-year European project which began &#8230; <a href="https://csnblog.specs-lab.com/2012/10/16/rehabilitation-robot-requires-cognitive-skills/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.robotcompanions.eu/blog/2012/10/rehabilitation-robot-requires-cognitive-skills/walking-around-testimonial/" rel="attachment wp-att-2281"><img class="alignleft size-medium wp-image-2281" title="walking-around-testimonial" alt="" src="http://csnblog.specs-lab.com/wp-content/uploads/2012/10/walking-around-testimonial-300x199.jpg" width="300" height="199" /></a><strong>Europe invests in cognitive skills for rehabilitation robots</strong></p>
<p>The <a title="EC" href="http://ec.europa.eu/index_en.htm" target="_blank">European Commission</a> has provided a grant of €780,800 to develop cognitive skills for rehabilitation robots being developed by <a title="corbys" href="http://corbys.eu/index.php/Home.html" target="_blank">CORBYS (Control Framework for Robotic Systems)</a>, a four-year European project which began in February 2011.<br />
<span id="more-2280"></span></p>
<p>These robots will be used for the rehabilitation of people with damaged limbs who are working towards walking again. Robots created for this purpose already exist; however, “the issue is that they need constant attention and monitoring by therapists and they cannot effectively monitor the human,” explains <a title="Polani" href="http://homepages.feis.herts.ac.uk/%7Ecomqdp1/" target="_blank">Dr. Polani</a>, who is developing methods to endow these kinds of robots with cognitive skills, along with <a href="http://web-apps.herts.ac.uk/uhweb/about-us/profiles/profiles_home.cfm?profile=93E98391-B756-B738-BFCA0E7A647DD0C8&amp;view=publications" target="_blank">Dr. Farshid Amirabadollohian</a> and a team at the <a href="http://www.herts.ac.uk/home-page.cfm" target="_blank">University of Hertfordshire</a>.</p>
<p>Specifically, researchers are working on a perception system that will allow the robot to assess the physical and mental state of its user and of the environment it’s being deployed in. These refinements aim to create a rehabilitative machine that will understand what its user needs and act autonomously in response to those needs. To accomplish this, researchers are turning to Mother Nature for some ideas: “we believe that all organisms optimise information and organise it efficiently in their niche and that this shapes their behaviour – in a way, it tells them to some extent what to do. We believe it will help our system to take decisions similar to organisms and to better &#8216;read&#8217; the intentions of the human it supports,” said Dr Polani. “Furthermore, we will use these techniques to balance the lead-taking between robot and human.”</p>
<p>Partnerships of this nature are beginning to emerge around the world; the American company <a title="GeckoSystems" href="http://www.geckosystems.com/" target="_blank">GeckoSystems</a>, which specializes in mobile service robots, is about to enter a <a title="joint venture" href="http://www.prnewswire.com/news-releases/geckosystems-actively-negotiating-joint-venture-with-chinese-wheelchair-manufacturer-for-international-marketing-of-robotic-wheelchair-safety-technology-133609658.html" target="_blank">joint venture</a> with an undisclosed Chinese wheelchair manufacturing company to produce a line of “collision-proof” wheelchairs. As is the case in many other nations, the Chinese population is rapidly ageing and the government is putting forth extensive resources in order to achieve its goal of providing quality universal healthcare by 2020.</p>
<p><a title="robocom" href="http://www.robotcompanions.eu/home" target="_blank">The coordination action of Robot Companions for Citizens</a> is another example of a European-sponsored initiative which integrates advances in neuroscience, robotics and new materials. This initiative aims to create novel benefits for society by fostering the development of new types of robots and a new industry at large.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2012/10/16/rehabilitation-robot-requires-cognitive-skills/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>The 2012 Living Machines Conference</title>
		<link>https://csnblog.specs-lab.com/2012/07/23/the-2012-living-machines-conference/</link>
		<comments>https://csnblog.specs-lab.com/2012/07/23/the-2012-living-machines-conference/#comments</comments>
		<pubDate>Mon, 23 Jul 2012 08:00:29 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Asia]]></category>
		<category><![CDATA[Biology]]></category>
		<category><![CDATA[Europe]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots and Health]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[USA]]></category>
		<category><![CDATA[Barcelona]]></category>
		<category><![CDATA[biohybrid technology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[La Pedrera]]></category>
		<category><![CDATA[New technology]]></category>
		<category><![CDATA[Pompeu Fabra University]]></category>
		<category><![CDATA[Robot Companions]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[The Convergent Science Network]]></category>
		<category><![CDATA[University of Sheffield]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=4262</guid>
		<description><![CDATA[Here&#8217;s a taste of what went on over the 3-day event organized by the Convergent Science Network Electro sensors inspired by fish that navigate their way through murky waters, robots that dance with the honeybees, and artificial muscles and &#8230; <a href="https://csnblog.specs-lab.com/2012/07/23/the-2012-living-machines-conference/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.robotcompanions.eu/blog/2012/07/the-2012-living-machines-conference/living-machines_la-pedrera-3/" rel="attachment wp-att-4278"><img class="alignleft size-full wp-image-4278" title="Living Machines_La Pedrera" src="http://www.robotcompanions.eu/blog/wp-content/uploads/2012/07/Living-Machines_La-Pedrera2.bmp" alt="" width="350" height="262" /></a><strong>Here&#8217;s a taste of what went on over the 3-day event organized by the <a title="CSN" href="http://www.csnetwork.eu/" target="_blank">Convergent Science Network</a></strong></p>
<p>Electro sensors inspired by fish that navigate their way through murky waters, robots that dance with the honeybees, and artificial muscles and blood vessels making their way into modern medicine. These are just a few of the research topics that were <span id="more-4262"></span>discussed at this year&#8217;s Living Machines Conference, which took place from the 9th to the 12th of July in Barcelona, Spain.</p>
<p>Session chairs Paul Verschure, from <a title="P. Verschure, Pompeu Fabra University" href="http://specs.upf.edu/people/331" target="_blank">Pompeu Fabra University</a>, and Tony Prescott, from the <a title="T. Prescott_ University of Sheffield" href="http://www.shef.ac.uk/psychology/staff/academic/tony-prescott" target="_blank">University of Sheffield</a>, welcomed delegates to one of Barcelona&#8217;s architectural gems: Antoni Gaudí&#8217;s <em>La Pedrera</em> building.</p>
<p>Over four consecutive days, leading scientists in the fields of Biomimetics and Biohybrid systems gathered for pre-conference workshops, lectures, poster sessions, exhibitions and open panel sessions to present their work and discuss issues related to the development of real-world technologies inspired by biological systems.</p>
<p>The first day finished off with a panel-led discussion centred on the question: why study nature? Co-chair Tony Prescott got dialogue flowing by providing two general reasons: to build technologies that could be useful in solving current challenges, and to better understand nature itself. While the speakers and audience engaged in the discussion agreed that these are likely the main motives, other interesting opinions surfaced throughout the conversation.<a href="http://www.robotcompanions.eu/blog/2012/07/the-2012-living-machines-conference/panel-discussion_lm/" rel="attachment wp-att-4294"><img class="alignright size-medium wp-image-4294" title="Panel discussion_LM" src="http://www.robotcompanions.eu/blog/wp-content/uploads/2012/07/Panel-discussion_LM-300x225.jpg" alt="" width="251" height="188" /></a></p>
<p>According to Barry Trimmer, who specializes in neurobiology at <a title="B. Trimmer_University of Tufts" href="http://ase.tufts.edu/biology/faculty/trimmer/" target="_blank">Tufts University</a>, by attempting to understand nature&#8217;s complexity, a biomimetic approach may allow us to bypass the limits of human creativity.</p>
<p>Toshio Fukuda who specializes in Micro-Nano Systems Engineering at <a title="T.Okuda_ Nagoya University" href="http://www.mein.nagoya-u.ac.jp/staff/fukuda-e.html" target="_blank">Nagoya University</a> is often inspired by particular functions or geometric shapes found in nature to help make devices such as the artificial blood vessels he works on more efficient.</p>
<p>Conversely, as a mechanical engineer specialized in aerodynamics, <a title="D.Lentink_Stanford University" href="http://www.dejongeakademie.nl/smartsite.dws?ch=DJA&amp;lang=EN&amp;id=25477" target="_blank">David Lentink</a> is not so much interested in biomimetics as an outfit for a design, but rather in specific principles which might make sense from an engineering point of view: “We don’t want to look at the final detail of a bird wing to make an aircraft because it’s simply too complex, but some of the principles are extremely useful and they allow scientists to really think outside the box.”</p>
<p>While a biomimetic approach often involves studying some of the most puzzling aspects of nature scientists have yet to wrap their heads around, there are still many things nature can&#8217;t do. “Biological systems satisfy many constraints at one time so they may not be optimal for any one function that we may want to imitate. Flight is a great example because we can do things by optimizing that birds just can’t do and we can exceed the capabilities of birds with jets and planes that we build,” explained Frank Grasso, director of the <a title="Biomimetic and Cognitive Robotics Lab" href="http://academic.brooklyn.cuny.edu/userhome/psych/fgrasso/" target="_blank">Biomimetic and Cognitive Robotics Lab</a> at Brooklyn College, New York.</p>
<p><a href="http://www.robotcompanions.eu/blog/2012/07/the-2012-living-machines-conference/lm_38/" rel="attachment wp-att-4311"><img class="size-medium wp-image-4311 alignleft" title="LM_38" src="http://www.robotcompanions.eu/blog/wp-content/uploads/2012/07/LM_38-300x181.jpg" alt="" width="300" height="181" /></a>However, Dieter Braun, who specializes in systems biophysics at <a href="http://www.biosystems.physik.uni-muenchen.de/">Ludwig Maximilians University</a>, pointed out that it&#8217;s really a two-way learning stream: even though “evolution did not invent the bicycle,” nature still has plenty of tricks to teach us and we need not be afraid of its complexity.</p>
<p>Check back to find out more about what was shared during the 2012 Living Machines Conference; proceedings from the conference will be published in <a title="Springer Lecture Notes in Computer Science" href="http://www.springer.com/computer/lncs?SGWID=0-164-0-0-0" target="_blank"><em>Springer Lecture Notes in Computer Science (LNAI/LNCS)</em></a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2012/07/23/the-2012-living-machines-conference/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Robots That Smell</title>
		<link>https://csnblog.specs-lab.com/2012/07/19/robots-that-smell/</link>
		<comments>https://csnblog.specs-lab.com/2012/07/19/robots-that-smell/#comments</comments>
		<pubDate>Thu, 19 Jul 2012 09:00:29 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[NEUROChem]]></category>
		<category><![CDATA[olfaction]]></category>
		<category><![CDATA[Robot Companions]]></category>
		<category><![CDATA[Robots that smell]]></category>
		<category><![CDATA[silk moths]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=4226</guid>
		<description><![CDATA[Biomimetic Robot from Vicky Vouloutsi on Vimeo. As humans, we may take our sense of smell for granted but for many of the other members of the animal kingdom, either land-roaming or water-dwelling, a keen sense of smell serves as &#8230; <a href="https://csnblog.specs-lab.com/2012/07/19/robots-that-smell/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><iframe src="http://player.vimeo.com/video/30943298" frameborder="0" width="560" height="349"></iframe></p>
<p><a href="http://vimeo.com/30943298">Biomimetic Robot</a> from <a href="http://vimeo.com/user5796460">Vicky Vouloutsi</a> on <a href="http://vimeo.com">Vimeo</a>.</p>
<p>As humans, we may take our sense of smell for granted but for many of the other members of the animal kingdom, either land-roaming or water-dwelling, a keen sense of smell serves as an invaluable tool! While cetaceans like dolphins have no sense of smell, some species of fish, such as the salmon, use theirs to guide them back to their native streams or to assist them in maintaining social hierarchy. Most of us are well aware that a dog’s nose could out-sniff<span id="more-4226"></span> a human’s any day— particular breeds such as the bloodhound may have a sense of smell that’s more than 10 million times more powerful than ours. A Grizzly bear’s nose is also hard to beat— forget being able to smell the bakery around the corner, these guys can smell food up to 30 km away! And while that’s pretty impressive, imagine catching a whiff of true love in a single gust of wind. Well, we don’t know whether the male silk moth has any interest in true love, but they are, in fact, extremely sensitive to the pheromone Bombykol which is released by potential female mates.</p>
<p>The video above features a robot that was developed as part of the European project <a title="NEUROCHEM" href="http://neurochem.sisbio.recerca.upc.edu/" target="_blank">NEUROChem</a>, which has successfully modelled the behaviour of the silk moth in a robot. The robot above is pictured searching for the source of an odour plume much the way real moths do. Researchers are working towards applying this type of technology to humanitarian demining, environmental monitoring and search and rescue operations.</p>
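Moth-style plume tracking is often described as a "cast and surge" strategy: surge upwind while the odour is being detected, then sweep crosswind to reacquire the plume once it is lost. The actual NEUROChem controller is far more elaborate, but a minimal sketch of that strategy (the function, state layout, and constant below are all invented for illustration) looks like this:

```python
SURGE_STEPS = 5  # how many odour-free steps one hit keeps the agent surging

def moth_step(odor_detected, state):
    """One control step of a minimal cast-and-surge plume tracker.

    Returns (heading, new_state); headings are 'upwind', 'left', 'right'.
    """
    if odor_detected:
        # Fresh hit: surge straight upwind and reset the countdown.
        return "upwind", {"countdown": SURGE_STEPS, "cast_dir": state["cast_dir"]}
    if state["countdown"] > 0:
        # Recent hit: keep surging while the countdown runs down.
        return "upwind", {"countdown": state["countdown"] - 1,
                          "cast_dir": state["cast_dir"]}
    # Plume lost: cast crosswind, alternating direction each step.
    new_dir = "left" if state["cast_dir"] == "right" else "right"
    return new_dir, {"countdown": 0, "cast_dir": new_dir}
```

The appeal of the strategy is that it needs only a binary odour sensor and a wind-direction estimate, which is one reason it transfers so readily from moths to small robots.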
<p>At the <a title="AASS" href="http://www.oru.se/aass" target="_blank">Centre for Applied Autonomous Sensor Systems (AASS)</a> at <a title="Orebro University" href="http://www.oru.se/English/" target="_blank">Örebro University</a> in Sweden, professor Achim Lilienthal heads the <a title="Mobile Robotics and Olfaction lab" href="http://www.aass.oru.se/Research/mro/index.html" target="_blank">Mobile Robotics and Olfaction Lab</a>. The work at the lab aims to tackle one of the big problems in this area of robotics— getting scent detecting robots  out of the lab and into the real world, where variables such as wind and an unlimited area for gas distribution are difficult to control.</p>
<p>The video below features a robot that was developed a couple of years ago at <a title="The University of Tokyo" href="http://www.u-tokyo.ac.jp/en/" target="_blank">The University of Tokyo</a>. As shown in the video by New Scientist, the mannequin&#8217;s head turns upon the detection of specific chemicals. Believe it or not, this robotic head is endowed with an organic sensor made of frog eggs, which act as powerful smell receptors!</p>
<p><iframe src="http://www.youtube.com/embed/JZfUAcKlTRE" frameborder="0" width="560" height="349"></iframe></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2012/07/19/robots-that-smell/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Exploring the Red Planet</title>
		<link>https://csnblog.specs-lab.com/2012/07/16/exploring-the-red-planet/</link>
		<comments>https://csnblog.specs-lab.com/2012/07/16/exploring-the-red-planet/#comments</comments>
		<pubDate>Mon, 16 Jul 2012 08:00:45 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Curiosity]]></category>
		<category><![CDATA[Gale Crater]]></category>
		<category><![CDATA[Mars mission]]></category>
		<category><![CDATA[NASA]]></category>
		<category><![CDATA[Robot Companions]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=4163</guid>
		<description><![CDATA[NASA&#8217;s Curiosity gets close to landing on Mars It takes a while to get over 90 million km away from earth. Although NASA&#8217;s rover Curiosity hit the road for Mars at the end of November, 2011, the robot isn&#8217;t expected &#8230; <a href="https://csnblog.specs-lab.com/2012/07/16/exploring-the-red-planet/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><strong>NASA&#8217;s Curiosity gets close to landing on Mars</strong><br />
<script type="text/javascript" src="http://cdn-akm.vmixcore.com/vmixcore/js?auto_play=0&amp;cc_default_off=1&amp;player_name=uvp&amp;width=560&amp;height=349&amp;player_id=1aa0b90d7d31305a75d7fa03bc403f5a&amp;t=V0AfeCvPCIxvdOCQPjasYBXh-UNDYFtNZG"></script></p>
<p>It takes a while to get over 90 million km away from Earth. Although NASA&#8217;s rover Curiosity hit the road for Mars at the end of November 2011, the robot isn&#8217;t expected to land on the planet until August 6th of this year. In the past, rovers like this one had set out to look for evidence of water on Mars, but this time <a title="Cusiosity" href="http://www.nasa.gov/mission_pages/msl/index.html" target="_blank">Curiosity</a> will be keeping a lookout for any spots on the planet that may have been particularly hospitable for life. Curiosity is scheduled to be<span id="more-4163"></span> stationed up there for the next 2 years, where it will spend the majority of its time exploring <a title="Gale Crater" href="http://www.nasa.gov/multimedia/imagegallery/image_feature_2023.html" target="_blank">Gale Crater</a>, which scientists believe was formed about 3 and a half billion years ago during a particularly tumultuous time for our solar system.</p>
<p>The crater&#8217;s most impressive feature is Mount Sharp, a 5 km high pile of debris that rises from its centre. While scientists aren&#8217;t exactly sure how this structure was formed, they suspect it&#8217;s composed of the sediment that used to fill the crater. The area is a particularly special geological hotspot. As NASA scientist John P. Grotzinger explains, &#8220;There is no place on Earth you can go to get the whole history at once&#8230; at Gale you don&#8217;t need to reconstruct the layers. You can see how they go from older to younger. You&#8217;ve got time&#8217;s arrow always pointed in the right direction. It&#8217;s all laid out very simply&#8221;.</p>
<p>One thing that might not be so simple is the actual landing of the robot on Mars. While NASA will attempt to land Curiosity on a flat surface as close as possible to Mount Sharp, the exact landing spot will only be determined by the bot&#8217;s final steer towards Mars. Check out the video above to get a sense of the challenge Curiosity&#8217;s got ahead of it!</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2012/07/16/exploring-the-red-planet/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Personalized Assistance from Robots</title>
		<link>https://csnblog.specs-lab.com/2012/07/11/4135/</link>
		<comments>https://csnblog.specs-lab.com/2012/07/11/4135/#comments</comments>
		<pubDate>Wed, 11 Jul 2012 07:39:06 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[CSAIL]]></category>
		<category><![CDATA[Julie Shah]]></category>
		<category><![CDATA[learning algorithms]]></category>
		<category><![CDATA[MIT]]></category>
		<category><![CDATA[Robot Companions]]></category>
		<category><![CDATA[working with robots]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=4135</guid>
		<description><![CDATA[Robots help out—the way we want them to! Both humans and robots work in manufacturing plants however, they don&#8217;t usually work alongside one another. Robots are most often used in repetitive, exhausting or hazardous work, while people are needed for &#8230; <a href="https://csnblog.specs-lab.com/2012/07/11/4135/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><strong>Robots help out—the way we want them to!</strong><br />
<iframe src="http://www.youtube.com/embed/BDplbkg0fd0" frameborder="0" width="560" height="349"></iframe></p>
<p>Both humans and robots work in manufacturing plants; however, they don&#8217;t usually work alongside one another. Robots are most often used in repetitive, exhausting or hazardous work, while people are needed for tasks that require finer skill and detail. An <a title="Working With Robots" href="http://www.robotcompanions.eu/blog/2011/08/working-with-robots/" target="_blank">earlier post</a> on this blog discussed some of the ways robots have started working alongside humans, but there are of course some important issues to consider if we really want to make this kind of collaboration work.<br />
<span id="more-4135"></span><br />
Humans are individuals and we like to do things our own way. According to Julie Shah, an Assistant Professor of Aeronautics and Astronautics at MIT, robots need to display an almost seamless understanding of how they can help people. To give robots this ability, Shah and her team at the MIT <a title="MIT CSAIL" href="http://www.csail.mit.edu/" target="_blank">Computer Science and Artificial Intelligence Laboratory (CSAIL)</a> have developed a new algorithm that allows a robot to quickly learn how people prefer to do particular tasks so that it can adapt accordingly to give them a hand. Shah stresses that without this ability, people are likely to get frustrated and forgo any assistance from a robot that could potentially increase their efficiency on the task.</p>
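The post doesn't detail how Shah's algorithm works, but the core idea it describes (learning a worker's preferred task order from a handful of demonstrations) can be illustrated with a deliberately simple frequency-count sketch. The task names and the first-order transition model below are invented for illustration; Shah's actual method is more sophisticated.

```python
from collections import Counter, defaultdict

def learn_preferences(demonstrations):
    """Count which task the worker performs after each other task."""
    model = defaultdict(Counter)
    for seq in demonstrations:
        for prev, nxt in zip(seq, seq[1:]):
            model[prev][nxt] += 1
    return model

def predict_next(model, current_task):
    """Most frequently observed follow-up to the current task, or None."""
    followers = model.get(current_task)
    return followers.most_common(1)[0][0] if followers else None

# Three demonstrated assembly sequences from the same (hypothetical) worker:
demos = [["clamp", "drill", "rivet"],
         ["clamp", "drill", "inspect"],
         ["clamp", "drill", "rivet"]]
model = learn_preferences(demos)
print(predict_next(model, "drill"))   # "rivet" follows "drill" most often
```

After observing a few sequences, a robot using even this crude model could anticipate the next step and fetch the right part, which is the kind of seamless assistance the article describes.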
<p>Watch the video above to see how the algorithm works in a test case involving spar assembly, a specific part of airplane manufacturing. The algorithm is also being used in simulations to train robots and humans to work together; findings will be presented at the <a title="Robotics science and systems" href="http://roboticsconference.org/pmwiki/" target="_blank">Robotics: Science and Systems Conference</a> in Sydney, Australia this July.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2012/07/11/4135/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>The Last Moment Robot</title>
		<link>https://csnblog.specs-lab.com/2012/07/09/4147/</link>
		<comments>https://csnblog.specs-lab.com/2012/07/09/4147/#comments</comments>
		<pubDate>Mon, 09 Jul 2012 08:00:00 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Brown University Science Center]]></category>
		<category><![CDATA[Dan Chen]]></category>
		<category><![CDATA[human-robot interaction]]></category>
		<category><![CDATA[Last moment robot]]></category>
		<category><![CDATA[Rhode Island School of Design]]></category>
		<category><![CDATA[Robot Companions]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=4147</guid>
		<description><![CDATA[It&#8217;s OK if this gives you the creeps If you think this kind of robot may be taking things a step too far, its creator Dan Chen would be pleased he&#8217;s gotten his point across. For starters, this robot isn&#8217;t &#8230; <a href="https://csnblog.specs-lab.com/2012/07/09/4147/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><strong>It&#8217;s OK if this gives you the creeps</strong><br />
<iframe src="http://www.youtube.com/embed/T8PNzA2S6EY" frameborder="0" width="560" height="349"></iframe></p>
<p>If you think this kind of robot may be taking things a step too far, its creator <a title="Dan Chen" href="http://www.pixedge.com/lastmoment" target="_blank">Dan Chen </a>would be pleased he&#8217;s gotten his point across. For starters, this robot isn&#8217;t actually being used for the application shown in the video above. In fact, the bed and fluorescent lit room are nothing more than props used to create a hospital-like environment within this interactive installation.<br />
<span id="more-4147"></span><br />
Accompanied by someone dressed as a doctor, viewers of the installation are able to enter the room one at a time, each taking a turn to lie in the hospital bed. At this point, the pseudo-doctor asks for their permission to place their arm under the Last Moment Robot&#8217;s mechanical caress. The &#8220;doctor&#8221; then leaves the room and the robot begins to gently stroke the &#8220;patient&#8217;s&#8221; arm as the LED screen reads &#8220;end of life detected&#8221; and a soothing script of comforting words ensues.</p>
<p>While the use of robotic pets in hospital care has been shown to help some people cope with stress and isolation, Chen cautions against trying to fool people into a false experience: &#8220;With my own robots, I use generic patterns of behavior to suggest at our desire for comfort and highlight the human need for intimacy. The design of my robots is honest with its function. Using no fancy adornments, I do not attempt to disguise the robots or portray them as anything but what they are.&#8221;</p>
<p>Chen further states that he thinks his devices could &#8220;serve as a stepping stone or learning tool to create deeper and more meaningful human to human relationships and build a stronger and more supportive community. Because my robots look more like appliances, the user must jump a mental gap in order to feel intimacy with the device. In the process of making this jump, I want the user to realize that the possibility of a real, deep relationship is not fully reproducible through imagination or even robotics. These are only temporary solutions.&#8221;</p>
<p>Nevertheless, Chen is a robot-lover and while the video above may make some of us feel uncomfortable, he maintains that the idea is not meant to be negative. To explain his point, he includes an excerpt from Anthony Dunne and Fiona Raby&#8217;s <em>Design Noir: The Secret Life of Electronic Objects</em> in his recently published Masters thesis: “The idea is not to be negative, but to stimulate discussion and debate amongst designers, industry and the public about electronic technology and everyday life. This is done by developing alternative and often gently provocative artifacts which set out to engage people through humor, insight, surprise and wonder.”</p>
<p>The interactive installation has been running at the <a title="Brown University Science Centre" href="http://brown.edu/academics/science-center/" target="_blank">Brown University Science Center</a> as well as the <a title="RISD" href="http://www.risd.edu/" target="_blank">Rhode Island School of Design</a>. For more information on some of Chen&#8217;s fascinating work, check out the thesis he wrote for his Masters in Fine Art in Digital + Media, titled <a title="Dan Chen_thesis" href="http://www.pixedge.com/download/dan_thesis.pdf" target="_blank">File &gt; Save As &gt; Intimacy</a>, which examines the question: what is intimacy without humanity?</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2012/07/09/4147/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Robots Get Social&#8230;</title>
		<link>https://csnblog.specs-lab.com/2012/07/05/robots-that-love-twitter/</link>
		<comments>https://csnblog.specs-lab.com/2012/07/05/robots-that-love-twitter/#comments</comments>
		<pubDate>Thu, 05 Jul 2012 07:55:35 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Cloud Robotics]]></category>
		<category><![CDATA[Facebook for robots]]></category>
		<category><![CDATA[IULM University of Milan]]></category>
		<category><![CDATA[Marco Camisani Calzolari]]></category>
		<category><![CDATA[MyRobots]]></category>
		<category><![CDATA[Robot Companions]]></category>
		<category><![CDATA[Robots on Twitter]]></category>
		<category><![CDATA[Smart homes]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=4086</guid>
		<description><![CDATA[Creating hype on Twitter and other networks Twitter has reached over 500 million users, but of those, apparently only 140 million actually use it regularly. Now it turns out that some of these users may not even be real people, &#8230; <a href="https://csnblog.specs-lab.com/2012/07/05/robots-that-love-twitter/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><strong><a href="http://www.robotcompanions.eu/blog/2012/07/robots-that-love-twitter/social-robot_mw/" rel="attachment wp-att-4103"><img class="alignleft size-medium wp-image-4103" title="Social Robot_mwilson" src="http://www.robotcompanions.eu/blog/wp-content/uploads/2012/06/Social-Robot_mw-300x211.jpg" alt="" width="305" height="214" /></a>Creating hype on Twitter and other networks</strong></p>
<p>Twitter has reached over 500 million users, but of those, apparently only 140 million actually use it regularly. Now it turns out that some of these users may not even be real people, but robots instead!</p>
<p><span id="more-4086"></span><br />
Marco Camisani Calzolari, a professor in Corporate Communication and Digital Languages at the <a title="IULM University of Milan" href="http://www.iulm.it/" target="_blank">IULM University of Milan</a>, recently carried out <a title="Calzolari study" href="http://www.camisanicalzolari.com/MCC-Twitter-ENG.pdf" target="_blank">a study</a> that analysed the profiles of users who followed companies with a strong presence on Twitter.</p>
<p>A forensic software program was used to identify profiles which diverged from normal human behaviour on social networks, but when it comes to use of social networks, it&#8217;s hard to say what should be considered &#8220;normal&#8221; or human.</p>
<p>For starters, not every Twitter user who failed to post an update every 5 minutes was automatically deemed a robot. Instead, two of the most important elements in gauging a user&#8217;s humanness were whether or not any of the user&#8217;s tweets had ever been re-tweeted, and whether or not the user had logged in through more than one client or through a mobile device. Apparently robots don&#8217;t often say things worth repeating, nor are they likely to own an iPhone.</p>
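As a rough illustration of how such heuristics might combine (the study&#8217;s actual forensic software is not public, and the profile fields and thresholds below are invented for the example), a scorer based on the two signals mentioned above could look like this:

```python
# Illustrative sketch only: scores a hypothetical profile dict on the two
# "humanness" signals highlighted in the article.

def humanness_score(profile):
    """Return a rough human-likelihood score (0-2) for a profile."""
    score = 0
    # Humans tend to say things worth repeating, at least occasionally.
    if profile.get("times_retweeted", 0) > 0:
        score += 1
    # Humans tend to log in from more than one client, or from a phone.
    if len(profile.get("clients_used", [])) > 1 or profile.get("uses_mobile", False):
        score += 1
    return score

# A profile scoring zero on both signals would be flagged as a likely bot.
suspected_bot = humanness_score({"times_retweeted": 0, "clients_used": ["web"]}) == 0
```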
<p>Calzolari&#8217;s report concludes that among some of the companies he analysed, nearly half their Twitter followers are actually robots! However, as Calzolari explains in a <a title="Perth News" href="http://www.perthnow.com.au/business/media-marketing/lies-damn-lies-and-statistics-half-of-companies-twitter-followers-are-robots/story-e6frg2rc-1226391354112" target="_blank">recent article</a> the companies themselves may not be directly at fault for their phony followers: &#8220;In some cases, the web agency or media centre executives have chosen to take short cuts in order to demonstrate to companies, who are oblivious, that their activities have been successful by generating lots of new users&#8221;.</p>
<p>Now these days, robots aren&#8217;t just on Twitter. In fact, a new social network is aiming to be the &#8220;Facebook for robots,&#8221; because robots should be able to post photos of their latest vacations too. Alright, we&#8217;re kidding, but the company Robot Shop is dead serious about their robot connecting network, <a title="MyRobot" href="http://www.myrobots.com/" target="_blank">MyRobots</a>.</p>
<p>The network aims to connect all robots and intelligent devices to the internet, enabling them to do more by allowing them to be remotely monitored and controlled. As explained on the MyRobots website, &#8220;In the same way humans benefit from socializing, collaborating and sharing, robots can benefit from those interactions too by sharing their sensor information giving insight on their perspective of their current state.&#8221; To help describe their concept, MyRobots has posted a link to the video below produced by <a title="Ericsson" href="http://www.ericsson.com/" target="_blank">Ericsson</a>.</p>
<p><iframe src="http://www.youtube.com/embed/i5AuzQXBsG4" frameborder="0" width="560" height="349"></iframe></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2012/07/05/robots-that-love-twitter/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Racing with Robots</title>
		<link>https://csnblog.specs-lab.com/2012/07/03/racing-with-robots/</link>
		<comments>https://csnblog.specs-lab.com/2012/07/03/racing-with-robots/#comments</comments>
		<pubDate>Tue, 03 Jul 2012 06:30:23 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Robots and Health]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Exertion Games Lab]]></category>
		<category><![CDATA[Joggobot]]></category>
		<category><![CDATA[RMIT University]]></category>
		<category><![CDATA[Robot Companions]]></category>
		<category><![CDATA[Running Apps]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=4175</guid>
		<description><![CDATA[Ready, Set, Joggobot! Running can be a challenging activity to get into and keep up. Whether it&#8217;s a friend or canine pal acting as a running buddy, the company often provides us with the motivation needed to push ourselves further &#8230; <a href="https://csnblog.specs-lab.com/2012/07/03/racing-with-robots/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><strong>Ready, Set, Joggobot!</strong><br />
<iframe src="http://www.youtube.com/embed/4x4d8IX_0kI" frameborder="0" width="560" height="349"></iframe></p>
<p>Running can be a challenging activity to get into and keep up. Whether it&#8217;s a friend or a canine pal acting as a running buddy, their company often provides the motivation needed to push ourselves further or to get us out there on days when we&#8217;d rather not lace up those trainers. But since not everyone has a dog or an active friend, researchers at the Exertion Games Laboratory at <a title="RMIT" href="http://www.rmit.edu.au/" target="_blank">RMIT University</a> in Melbourne, Australia have recently created the Joggobot, so you won&#8217;t have to be the lone ranger out there on the track!<br />
<span id="more-4175"></span></p>
<p>The idea behind the Joggobot is relatively simple. The researchers took a commercially available flying quadrotor robot and fitted it with a camera and marker-tracking software. They then made special T-shirts to be worn by joggers so that the robot can track them.</p>
<p>The Joggobot takes off from the ground once the marker on the user’s T-shirt has been identified, automatically rising to the height of the marker. The Joggobot then positions itself about 3 m in front of the jogger, keeping ahead of them according to their pace. If the Joggobot is ever unable to detect the jogger, it will land immediately, so there&#8217;s no need to worry about rogue robots in the sky!</p>
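The behaviour described above amounts to a simple control rule. Here is a minimal sketch; the real Joggobot&#8217;s code is not public, only the roughly 3 m following distance and the land-when-the-marker-is-lost rule come from the description above, and all names and values are invented:

```python
# Decide the drone's next action from the tracked T-shirt marker (or None).
FOLLOW_DISTANCE_M = 3.0  # keep about 3 m ahead of the jogger

def joggobot_step(marker):
    """One decision step: land if the jogger is lost, otherwise fly ahead."""
    if marker is None:
        return "land"                      # fail-safe: jogger lost, land now
    target_height = marker["height_m"]     # rise level with the marker
    target_pos = marker["position_m"] + FOLLOW_DISTANCE_M  # stay ahead
    return ("fly", target_pos, target_height)
```

Running a decision step like this on every camera frame yields the take-off, follow, and fail-safe landing behaviour the researchers describe.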
<p>Over the past few years, several companies have come out with apps for mobile phones and other electronic devices that are conducive to jogging; however, the team at the Exertion Games Lab stresses the importance of embodying this kind of technology appropriately.</p>
<p>Preliminary research on the Joggobot has found that &#8220;People were positive about the idea of having a flying robot accompanying them while jogging, distracting them from their exhaustion and challenging them to increase their effort. In particular, users appreciated that the system had a ‘body’, which seemed to match the embodied activity of jogging. This becomes particularly evident when compared to virtual jogging support systems such as those available on watches and mobile phones: reading the information during running is often difficult, but with Joggobot, participants thought interactions could be easier to comprehend&#8221;.</p>
<p>Researchers hope to continue using the Joggobot to explore the idea of using robots to support us when exercising&#8230; would you jog with the Joggobot?<br />
Click <a title="Exertion Games Lab" href="http://exertiongameslab.org/projects/joggobot" target="_blank">HERE</a> to read more about the Joggobot on the Exertion Games Lab website.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2012/07/03/racing-with-robots/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Hiroshi Ishiguro&#039;s Huggable Robot</title>
		<link>https://csnblog.specs-lab.com/2012/06/18/hiroshi-ishiguros-huggable-robot/</link>
		<comments>https://csnblog.specs-lab.com/2012/06/18/hiroshi-ishiguros-huggable-robot/#comments</comments>
		<pubDate>Mon, 18 Jun 2012 08:00:21 +0000</pubDate>
		<dc:creator><![CDATA[Michelle Wilson]]></dc:creator>
				<category><![CDATA[Robots and Health]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Actroid]]></category>
		<category><![CDATA[Geminoid]]></category>
		<category><![CDATA[Hiroshi Ishiguro]]></category>
		<category><![CDATA[Huggable robot]]></category>
		<category><![CDATA[Hugvie]]></category>
		<category><![CDATA[Intelligent Robotics Laboratory]]></category>
		<category><![CDATA[Osaka University]]></category>
		<category><![CDATA[Robot Companions]]></category>

		<guid isPermaLink="false">http://www.robotcompanions.eu/blog/?p=3990</guid>
		<description><![CDATA[After a bad day, there&#8217;s nothing like a Hugvie If you&#8217;re a fan of bizarre robots, you&#8217;ve got to be familiar with some of Hiroshi Ishiguro&#8217;s work. As the director of the Intelligent Robotics Laboratory at Osaka University in Japan, &#8230; <a href="https://csnblog.specs-lab.com/2012/06/18/hiroshi-ishiguros-huggable-robot/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><strong>After a bad day, there&#8217;s nothing like a <em>Hugvie</em></strong><br />
<iframe src="http://www.youtube.com/embed/nJXkL7bcQR0?rel=0" frameborder="0" width="560" height="349"></iframe><br />
If you&#8217;re a fan of bizarre robots, you&#8217;ve got to be familiar with some of Hiroshi Ishiguro&#8217;s work. As the director of the <a title="Intelligent Robotics Lab, Osaka University" href="http://top.irl.sys.es.osaka-u.ac.jp/" target="_blank">Intelligent Robotics Laboratory </a>at Osaka University in Japan, Ishiguro&#8217;s gained a lot of fame through many of his pseudo-human creations.<br />
<span id="more-3990"></span><br />
The media took quite a fancy to <a title="Ishiguro's Geminoid" href="http://spectrum.ieee.org/robotics/humanoids/hiroshi-ishiguro-the-man-who-made-a-copy-of-himself/0" target="_blank">his fellow <em>Geminoid</em></a>, a humanoid designed to be the roboticist&#8217;s very own robotic twin. Particularly focused on the notion of creating robots that are as life-like as possible, some of his other robots, such as the <a title="Actroid" href="http://en.wikipedia.org/wiki/Actroid" target="_blank">Actroid</a>, have even been regarded as eerily realistic. Speaking of eerily realistic, you may want to check out the <em>Geminoid</em> he recently <a title="Geminoid" href="http://www.youtube.com/watch?v=eZlLNVmaPbM&amp;feature=player_embedded" target="_blank">modeled after Danish professor Henrik Scharfe</a>.</p>
<p>In light of his many creations, Ishiguro&#8217;s new development should come as no surprise. The fact that the <em>Hugvie</em> is just one of many huggable robots out there raises the question: why would anyone want a hug from a robot? Well, according to Ishiguro, these robots could give great comfort to elderly people living far away from family and loved ones. Furthermore, other scientists speculate that the use of technology like this could help in the prevention and treatment of neurological conditions like dementia and Alzheimer&#8217;s disease. While it&#8217;s too soon to tell if that could really be the case, we do know that touch affects the brain in many ways.</p>
<p>You may have heard of psychologist <a title="Harry Harlow" href="http://en.wikipedia.org/wiki/Harry_Harlow" target="_blank">Harry Harlow&#8217;s</a> experiments in the late 1950s. While his experiments with young Rhesus monkeys are now outdated and even considered cruel, the pioneering psychologist did manage to show that comforting physical contact promotes healthy cognitive development, while isolation and touch deprivation lead to the contrary. Similarly, more <a title="PubMedThe effect of therapeutic touch on behavioral symptoms and cortisol in persons with dementia" href="http://www.ncbi.nlm.nih.gov/pubmed/19657203" target="_blank">recent studies</a> have shown that positive physical contact lowers levels of stress-producing hormones like cortisol.</p>
<p>Ishiguro sees other applications for these soft, blob-like bots: if your partner happens to be far away, you can hug the robot while talking to them over the phone. The voice on the phone gets converted into vibrations that the hugger can feel, while another vibrating device within the robot produces a constant heartbeat. This may seem like a strange thing to do, but could adding in this extra modality help bridge the distance between people? After all, many people feel that simply being able to see their loved one while hearing their voice via programmes like Skype makes them feel significantly closer to them.</p>
<p>If you&#8217;re interested in the relationships between man and machine, check out a <a title="Mechanical Love" href="http://www.youtube.com/watch?v=F-tTS7Ze85o&amp;feature=endscreen&amp;NR=1" target="_blank">clip</a> of the 2009 documentary <em>Mechanical Love</em>, which features some of Ishiguro&#8217;s insights on the concept of &#8220;Sonzai-Kan,&#8221; or the feeling of human presence conveyed by means of the internet.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2012/06/18/hiroshi-ishiguros-huggable-robot/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
