<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Convergent Science Network &#187; Robots and emotions</title>
	<atom:link href="https://csnblog.specs-lab.com/tag/robots-and-emotions/feed/" rel="self" type="application/rss+xml" />
	<link>https://csnblog.specs-lab.com</link>
	<description>Blog on biomimetics and neurotechnology, written by Michael Szollosy, Dmitry Malkov, and Michelle Wilson, and edited by Anna Mura.</description>
	<lastBuildDate>Tue, 27 Sep 2022 14:58:43 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Is Pepper the world&#8217;s hottest personal robot yet?</title>
		<link>https://csnblog.specs-lab.com/2014/06/18/is-pepper-the-worlds-hottest-personal-robot-yet/</link>
		<comments>https://csnblog.specs-lab.com/2014/06/18/is-pepper-the-worlds-hottest-personal-robot-yet/#comments</comments>
		<pubDate>Wed, 18 Jun 2014 21:01:48 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Asia]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Aldebaran Robotics]]></category>
		<category><![CDATA[emotional intelligence]]></category>
		<category><![CDATA[Nao]]></category>
		<category><![CDATA[Pepper]]></category>
		<category><![CDATA[Robots and emotions]]></category>
		<category><![CDATA[Romeo]]></category>
		<category><![CDATA[SoftBank]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5329</guid>
		<description><![CDATA[Pepper, a new humanoid robot introduced earlier this month in Japan, may herald the beginning of a new era in personal robotics. Unlike its ancestors, such as Mitsubishi’s Wakamaru and Sony’s QRIO, who had to join the halls of robot &#8230; <a href="https://csnblog.specs-lab.com/2014/06/18/is-pepper-the-worlds-hottest-personal-robot-yet/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_5332" style="width: 690px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/pepper-680x365.jpg" rel="attachment wp-att-5332"><img class="size-full wp-image-5332" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/pepper-680x365.jpg" alt="Credit: Aldebaran Robotics" width="680" height="365" /></a><p class="wp-caption-text">Credit: Aldebaran Robotics</p></div>
<p>Pepper, a new humanoid robot introduced earlier this month in Japan, may herald the beginning of a new era in personal robotics. Unlike its ancestors, such as Mitsubishi’s <a href="http://en.wikipedia.org/wiki/Wakamaru">Wakamaru</a> and Sony’s <a href="http://en.wikipedia.org/wiki/QRIO">QRIO</a>, which had to join the halls of robot extinction, Pepper, developed jointly by the French robotics company <a href="http://www.aldebaran.com/en">Aldebaran</a> and the Japanese telecom giant <a href="http://www.softbank.jp/en/mobile/">SoftBank</a>, is here to stay.</p>
<p><span id="more-5329"></span></p>
<p>Although the robot is aimed at possibly the most elusive market in the robotics industry – that of personal household robots – several major factors could play a decisive role in Pepper’s future: its advanced emotional intelligence, its surprisingly low price and, of course, its looks – Pepper’s design is simply gorgeous.</p>
<p>SoftBank plans to start selling the robot next year in Japan for about $1,900. Until then, people can get acquainted with Pepper at selected SoftBank stores in Japan.</p>
<p>Although Pepper might initially seem quite impractical – it will not clean your house and may not even be able to fetch things effectively – the robot’s strong suit lies in its ability to be good company.</p>
<div id="attachment_5333" style="width: 594px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/pepper_ld.jpg" rel="attachment wp-att-5333"><img class="size-large wp-image-5333" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/06/pepper_ld-1024x682.jpg" alt="Credit: Aldebaran Robotics" width="584" height="388" /></a><p class="wp-caption-text">Credit: Aldebaran Robotics</p></div>
<p>Pepper’s communication skills are the result of special software that allows it to analyze human emotions effectively by combining information about voice tone, facial expressions and body language. In this way Pepper will tailor each individual conversation to how its interlocutors feel and behave. While by no means the first robot to do so, Pepper may well be the first commercially available robot with such advanced emotion-reading capabilities.</p>
<p>The cutting-edge emotion engine will be supported by a cloud-based “collective wisdom”, where all Pepper robots will be able to upload valuable information about their interactions with humans. Taken together, this data will allow them to evolve and polish their communication skills. As an example, hundreds of robots could store information about whether a particular joke makes people laugh, and then decide whether the same joke would be appropriate in other situations.</p>
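<p>As a rough illustration, the “collective wisdom” idea described above could work along these lines (a hypothetical sketch only – the class, names and thresholds below are illustrative assumptions, not Aldebaran’s actual design):</p>

```python
# Hypothetical sketch of fleet-wide "collective wisdom": each robot reports
# whether a joke got a laugh, and the pooled success rate decides whether
# the joke should be reused elsewhere.
from collections import defaultdict

class JokePool:
    def __init__(self, threshold=0.6, min_reports=5):
        self.stats = defaultdict(lambda: [0, 0])  # joke -> [laughs, tells]
        self.threshold = threshold
        self.min_reports = min_reports

    def report(self, joke, got_laugh):
        laughs, tells = self.stats[joke]
        self.stats[joke] = [laughs + int(got_laugh), tells + 1]

    def should_tell(self, joke):
        laughs, tells = self.stats[joke]
        if tells < self.min_reports:
            return True  # too little data yet: keep experimenting
        return laughs / tells >= self.threshold

pool = JokePool()
for outcome in [True, True, False, True, True]:
    pool.report("knock-knock", outcome)
print(pool.should_tell("knock-knock"))  # 4/5 = 0.8 >= 0.6 -> True
```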
<p>Pepper’s emotional intelligence is a logical progression of Aldebaran’s pursuit of companion robots capable of living with humans and responding to their constantly changing moods and feelings. The robot is strongly reminiscent of Aldebaran’s previous hit <a href="http://www.aldebaran.com/en/humanoid-robot/nao-robot">Nao</a>, but, unlike his little brother, uses wheels instead of legs to move around – a choice dictated by power efficiency requirements.</p>
<p>A legged version of Pepper, however, might also see the light of day: Aldebaran’s legged <a href="http://www.aldebaran.com/en/robotics-company/projects">Romeo</a> robot, which is still in development, could in the future serve as a foundation for a legged version of Pepper. You can read a <a href="http://csnblog.specs-lab.com/2014/04/01/meet-romeo-a-new-rising-star-of-humanoid-robotics/">previous post</a> to learn more about the ongoing Romeo project.</p>
<p>Allowing robots to understand human emotions and express their own is a critical step towards improving human-robot interaction in all settings. Read <a href="http://csnblog.specs-lab.com/2014/02/27/children-will-learn-from-robots/">this post</a> to learn about some ongoing European projects that aim to improve emotional intelligence in robots.</p>
<p><iframe width="584" height="329" src="https://www.youtube.com/embed/8HXhsKpETXE?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/06/18/is-pepper-the-worlds-hottest-personal-robot-yet/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>What robotics learned from Pixar</title>
		<link>https://csnblog.specs-lab.com/2014/03/24/what-robotics-learned-from-pixar/</link>
		<comments>https://csnblog.specs-lab.com/2014/03/24/what-robotics-learned-from-pixar/#comments</comments>
		<pubDate>Mon, 24 Mar 2014 16:08:15 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[AUR]]></category>
		<category><![CDATA[Guy Hoffman]]></category>
		<category><![CDATA[human-robot interaction]]></category>
		<category><![CDATA[Pixar]]></category>
		<category><![CDATA[Robots and emotions]]></category>
		<category><![CDATA[Shimon]]></category>
		<category><![CDATA[Travis]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5095</guid>
		<description><![CDATA[Each year brings us closer to the day when robotic companions will become an integral part of our homes, schools, hospitals and offices. However, for robots to be truly accepted in our personal space, their social interactions with us must &#8230; <a href="https://csnblog.specs-lab.com/2014/03/24/what-robotics-learned-from-pixar/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><iframe width="584" height="329" src="http://www.youtube.com/embed/-dT6meyruxQ?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p>Each year brings us closer to the day when robotic companions will become an integral part of our homes, schools, hospitals and offices. However, for robots to be truly accepted in our personal space, their social interactions with us must acquire the kind of fluency and coordination that humans expect from each other. This is one of the challenges addressed by <a href="http://guyhoffman.com/">Guy Hoffman</a>, the co-director of the <a href="http://milab.idc.ac.il/">Media Innovation Lab</a> at <a href="http://portal.idc.ac.il/en/main/homepage/pages/homepage.aspx">IDC Herzliya</a> in Israel and possibly one of the most original thinkers in robotics today.</p>
<p><span id="more-5095"></span></p>
<p>Collaborative fluency implies a coordinated and synchronised meshing of joint activities between several participants. Among the most significant parameters that affect the level of fluency and coordination are the anticipation and timing of robotic movements. The problem is that most existing robots are programmed to first analyse human movements, calculate an appropriate response and only then act, which delays their movements and makes them jerky and unnatural.</p>
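<p>To make the contrast concrete, an anticipatory robot can commit to an action before its partner’s motion is complete, for instance by extrapolating the partner’s recent trajectory (a purely illustrative sketch – the function and values below are assumptions, not Hoffman’s actual algorithm):</p>

```python
# Hypothetical sketch of anticipatory motion: instead of waiting for a
# gesture to finish, the robot linearly extrapolates the partner's recent
# hand positions and starts moving toward the predicted target early.
def predict_next(positions, lookahead=3):
    """Extrapolate a future position from the last two samples."""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    vx, vy = x1 - x0, y1 - y0          # most recent velocity estimate
    return (x1 + vx * lookahead, y1 + vy * lookahead)

# A hand moving steadily to the right: the robot can pick an intercept
# point three time steps ahead, well before the gesture ends.
samples = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0)]
print(predict_next(samples))  # -> (5.0, 2.5)
```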
<p>Guy Hoffman was one of those researchers who realised that eliciting an emotional response has more to do with how a robot moves than with how it looks. Hoffman was initially inspired by <a href="http://www.pixar.com/">Pixar’s</a> animated <a href="http://en.wikipedia.org/wiki/Luxo_Jr.">short film</a> featuring a pair of desk lamps that, despite their non-anthropomorphic appearance, managed to provoke a strong emotional response purely through the right timing and sound effects.</p>
<p>His subsequent experience with computer animation, combined with his enthusiasm for robotics, led him to <a href="http://web.mit.edu/">MIT</a>, where he created <a href="http://alumni.media.mit.edu/~guy/aur/">AUR</a>, a real-world robotic counterpart of Pixar’s lamp, capable of quietly assisting a human by anticipating their movements rather than providing a straightforward calculated response. Thanks to AUR’s smooth and obedient behaviour, people who interacted with the lamp had a more positive and fulfilling emotional experience.</p>
<div id="attachment_5098" style="width: 305px" class="wp-caption alignright"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/03/Shimon.jpg" rel="attachment wp-att-5098"><img class="wp-image-5098     " alt="Shimon robot " src="http://csnblog.specs-lab.com/wp-content/uploads/2014/03/Shimon.jpg" width="295" height="168" /></a><p class="wp-caption-text">Shimon can improvise music together with human musicians</p></div>
<p>According to Hoffman, robotic intelligence can essentially be classified either as a traditional “calculated” intelligence that works in a chess-like manner or as a more intuitive “adventurous” intelligence that tries to anticipate its partner’s movements. Anticipating the full range of movement, however, is tricky, and Hoffman’s robots still tend to make more mistakes along the way. Even so, studies demonstrate that people prefer such less perfect robots to their more accurate but less understanding twins.</p>
<div id="attachment_5099" style="width: 239px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/03/31-Travis.jpg" rel="attachment wp-att-5099"><img class=" wp-image-5099    " alt="Travis, a robotic speaker dock released in 2012" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/03/31-Travis.jpg" width="229" height="143" /></a><p class="wp-caption-text">Travis, a robotic speaker dock released in 2012</p></div>
<p>With one of his latest robotic creations, <a href="http://www.gtcmt.gatech.edu/research-projects/shimon">Shimon</a>, Hoffman ventured into the world of music improvisation, where he tried to apply the same principles of fluent collaboration. Why music improvisation? Because it is a time-critical interaction that Hoffman saw as an ideal testing ground for his ideas. Shimon is basically a robotic <a href="http://en.wikipedia.org/wiki/Marimba">marimba</a> virtuoso that can jam with human musicians in real time.</p>
<p>&nbsp;</p>
<p>You can also check out <a href="http://www.gtcmt.gatech.edu/research-projects/travis">Travis</a> (also known as Shimi), a cute speaker dock released by Hoffman in 2012, which not only plays music, but also enjoys it himself.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/03/24/what-robotics-learned-from-pixar/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Children will learn from robots</title>
		<link>https://csnblog.specs-lab.com/2014/02/27/children-will-learn-from-robots/</link>
		<comments>https://csnblog.specs-lab.com/2014/02/27/children-will-learn-from-robots/#comments</comments>
		<pubDate>Thu, 27 Feb 2014 06:57:31 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Aldebaran Robotics]]></category>
		<category><![CDATA[ALIZ-E]]></category>
		<category><![CDATA[EASEL]]></category>
		<category><![CDATA[EEFA]]></category>
		<category><![CDATA[EMOTE]]></category>
		<category><![CDATA[emotional intelligence]]></category>
		<category><![CDATA[Nao]]></category>
		<category><![CDATA[Robot tutors]]></category>
		<category><![CDATA[Robots and emotions]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=4958</guid>
		<description><![CDATA[We featured a previous post on one of the most emotionally literate robots in the world, Nao, which was developed by Aldebaran Robotics and is currently being used by the ALIZ-E project, scheduled to end this year. This cute robot has been tested at aged care &#8230; <a href="https://csnblog.specs-lab.com/2014/02/27/children-will-learn-from-robots/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_4962" style="width: 237px" class="wp-caption alignleft"><a href="http://www.flickr.com/photos/ajourneyroundmyskull/4205226788/"><img class="size-medium wp-image-4962" alt="4205226788_4f49a3940c_b" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/02/4205226788_4f49a3940c_b-227x300.jpg" width="227" height="300" /></a><p class="wp-caption-text"><a href="http://www.flickr.com/photos/ajourneyroundmyskull/4205226788/">Illus. by E. Benyaminson for &#8220;Hello, I&#8217;m Robot!&#8221; by Stanislav Zigunenko</a></p></div>
<p>We featured a <a href="http://csnblog.specs-lab.com/2011/12/19/2217/">previous post</a> on one of the most emotionally literate robots in the world, <a href="http://en.wikipedia.org/wiki/Nao_(robot)">Nao</a>, which was developed by <a href="http://www.aldebaran-robotics.com/en/">Aldebaran Robotics</a> and is currently being used by the <a href="http://www.aliz-e.org/">ALIZ-E project</a>, scheduled to end this year. This cute robot has been tested at aged care facilities and has proved helpful for tasks such as monitoring and reducing people’s anxiety levels by engaging with them emotionally.</p>
<p>Now, another European project is testing Nao in a slightly different role – that of a tutor. Needless to say, <a href="http://www.emote-project.eu/">EMOTE</a>, a three-year research project launched in 2012, also picked up Nao for his ability to empathise.</p>
<p><span id="more-4958"></span></p>
<p>The project aims to develop and evaluate a new generation of artificial tutors with sufficient perceptive capability to engage in emotional interactions with students in a physical space. Nao’s tutoring skills have already been put to the test in <a href="http://www.emote-project.eu/schools/">a number of schools</a> in Portugal, the UK and Sweden. The robot uses his particular abilities to track and respond to students’ emotions, which allows him to adapt his teaching style to the ever-changing environment of a classroom.</p>
<p>However, for robots to be truly effective as tutors, they must adapt their behaviour and teaching style not only within single encounters, but also across longer sequences of them. This challenge is addressed by the <a href="http://easel.upf.edu/">EASEL project</a>. Launched in December 2013, EASEL revolves around the study of human-robot symbiotic interaction, which among other things requires the robot to influence and be influenced by humans (including on the emotional level), to store the information it acquires, and then to use it to extract the new knowledge necessary for successful long-term interactions with students.</p>
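<p>A long-term adaptation loop of this kind can be sketched very simply: a per-student record persists across sessions and nudges the lesson difficulty up or down (a hypothetical illustration only – the class, signals and thresholds below are our assumptions, not the EASEL design):</p>

```python
# Hypothetical sketch of cross-session tutor adaptation: the tutor keeps a
# persistent model of each student and adjusts difficulty based on the
# engagement and success it observed in the previous encounter.
class StudentModel:
    def __init__(self):
        self.difficulty = 0.5          # 0 = easiest, 1 = hardest
        self.sessions = []             # history of past encounters

    def end_session(self, engagement, success_rate):
        """Store the encounter and adapt for the next one."""
        self.sessions.append((engagement, success_rate))
        if success_rate > 0.8 and engagement > 0.6:
            self.difficulty = min(1.0, self.difficulty + 0.1)  # stretch them
        elif success_rate < 0.4 or engagement < 0.3:
            self.difficulty = max(0.0, self.difficulty - 0.1)  # ease off

student = StudentModel()
student.end_session(engagement=0.9, success_rate=0.9)  # thriving: harder
student.end_session(engagement=0.2, success_rate=0.3)  # struggling: easier
print(round(student.difficulty, 1))  # 0.5 + 0.1 - 0.1 -> 0.5
```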
<p>It is easy to envision how empathetic robots could revolutionise the tutoring process and perhaps even outperform human tutors in some respects, such as tracking the engagement and progress of every student simultaneously and over long periods of time. Making robots even more engaging by enabling them to express emotions is a different part of the story, and we will need a better understanding of how emotions work in ourselves before we can successfully teach them to robots. <a href="http://efaa.upf.edu/">EFAA</a>, for instance, is another European project that aims to enhance our social interactions with robots, including by equipping them with just such an ability to express emotions.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/02/27/children-will-learn-from-robots/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
