<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Convergent Science Network &#187; Biology</title>
	<atom:link href="https://csnblog.specs-lab.com/category/robots-and-research/biology/feed/" rel="self" type="application/rss+xml" />
	<link>https://csnblog.specs-lab.com</link>
	<description>Blog on Biomimetics and Neurotechnology. With writers Michael Szollosy, Dmitry Malkov, and Michelle Wilson, and editor Anna Mura</description>
	<lastBuildDate>Tue, 27 Sep 2022 14:58:43 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Living Machines 2016</title>
		<link>https://csnblog.specs-lab.com/2016/03/23/living-machines-2016/</link>
		<comments>https://csnblog.specs-lab.com/2016/03/23/living-machines-2016/#comments</comments>
		<pubDate>Wed, 23 Mar 2016 14:50:58 +0000</pubDate>
		<dc:creator><![CDATA[Anna Mura]]></dc:creator>
				<category><![CDATA[AI]]></category>
		<category><![CDATA[Biohybrid]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[bionics]]></category>
		<category><![CDATA[brain research]]></category>
		<category><![CDATA[mechatronics]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[science]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5895</guid>
		<description><![CDATA[The 5th International Conference on Biomimetic and Biohybrid Systems will be held this year in beautiful Edinburgh, Scotland, 18&#8211;22 July. The three-day event, organised by the Convergent Science Network, will be hosted at a fantastic venue consistent with the spirit of the conference, the Dynamic Earth: a &#8230; <a href="https://csnblog.specs-lab.com/2016/03/23/living-machines-2016/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2016/03/POSTER-LM2016_OKcut.png" rel="attachment wp-att-5904"><img class="alignnone wp-image-5904 size-large" src="http://csnblog.specs-lab.com/wp-content/uploads/2016/03/POSTER-LM2016_OKcut-e1458745483622-1024x518.png" alt="POSTER LM2016_OKcut" width="584" height="295" /></a></p>
<p><a href="http://csnetwork.eu/livingmachines/conf2016">The 5th International Conference on Biomimetic and Biohybrid Systems</a> will be held this year in beautiful Edinburgh, Scotland, 18&#8211;22 July. The three-day event, organised by the <a href="http://csnetwork.eu/">Convergent Science Network</a>, will be hosted at a fantastic venue consistent with the spirit of the conference, the <a href="http://www.dynamicearth.co.uk/visit/what-is-dynamic-earth">Dynamic Earth, a 5-star visitor experience</a> with incredible interactive technology for learning about natural events and much more&#8230;</p>
<p><span id="more-5895"></span></p>
<p>The conference will offer amazing talks on a variety of topics related to the<span style="font-weight: normal; color: #022b38;"> fields of biomimetics and biohybrid systems</span> and technologies at the intersection of living and artificial systems. The programme includes five plenary lectures from leading experts in the field. The plenary lectures will be complemented by short talks on diverse topics, including robotics, active sensing, navigation and locomotion.</p>
<p>You can find out more about the plenary speakers <a href="http://csnetwork.eu/livingmachines/conf2016/plenary" target="_blank">HERE</a>; the full conference programme will be published shortly!</p>
<p>The Living Machines conference will be preceded by a one-day satellite event, hosted by the <strong><a style="font-weight: normal; color: #39bbda !important;" href="http://www.ed.ac.uk/informatics">University of Edinburgh Department of Informatics</a></strong><span style="font-weight: normal; color: #022b38;">, </span>and consisting of a series of research-oriented workshops. You can submit your workshops <a href="http://csnetwork.eu/livingmachines/conf2014/workshops">HERE</a>.</p>
<p>We are looking forward to seeing you this year in Edinburgh!</p>
<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2016/03/POSTER-LM2016_OK-e1458745269808.png"><img class="alignnone wp-image-5902 size-large" src="http://csnblog.specs-lab.com/wp-content/uploads/2016/03/POSTER-LM2016_OK-724x1024.png" alt="POSTER LM2016_OK" width="584" height="825" /></a></p>
<p>&nbsp;</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2016/03/23/living-machines-2016/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>An ecology of robots built using principles of biomimetics</title>
		<link>https://csnblog.specs-lab.com/2015/10/22/an-ecology-of-robots-through-biomimetics/</link>
		<comments>https://csnblog.specs-lab.com/2015/10/22/an-ecology-of-robots-through-biomimetics/#comments</comments>
		<pubDate>Thu, 22 Oct 2015 08:00:33 +0000</pubDate>
		<dc:creator><![CDATA[Anna Mura]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[science]]></category>
		<category><![CDATA[society]]></category>
		<category><![CDATA[Uncategorized]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5807</guid>
		<description><![CDATA[More than ever, scientists are using a nature-inspired approach to build biomimetic robots. Developed after thorough investigation of biological systems, these robots are a wonder of engineering and artificial intelligence research. Here are some examples of small biomimetic robots, inspired by sea creatures &#8230; <a href="https://csnblog.specs-lab.com/2015/10/22/an-ecology-of-robots-through-biomimetics/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p>More than ever, scientists are using a nature-inspired approach to build biomimetic robots. Developed after thorough investigation of biological systems, these robots are a wonder of engineering and artificial intelligence research.</p>
<div id="attachment_5832" style="width: 1610px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/10/lobot133.jpg" rel="attachment wp-att-5832"><img class="wp-image-5832 size-full" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/10/lobot133.jpg" alt="RoboLobster" width="1600" height="1200" /></a><p class="wp-caption-text">Robotic Lobster by Prof. Josef Ayers at Northeastern University. Photography Jan Witting</p></div>
<p><span id="more-5807"></span>Here are some examples of small biomimetic robots, inspired by sea creatures and insects, developed by scientists around the world.</p>
<p><strong>The RoboClam</strong></p>
<div id="attachment_5813" style="width: 310px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2016/01/roboclam_web.jpg" rel="attachment wp-att-5813"><img class="wp-image-5813 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2016/01/roboclam_web-300x199.jpg" alt="roboclam_web" width="300" height="199" /></a><p class="wp-caption-text">RoboClam MIT</p></div>
<p>Inspired by the Atlantic razor clam, this small, energy-efficient robot, <a href="http://www.techtimes.com/articles/4748/20140325/roboclam-mimics-digging-ability-of-real-one-could-seek-out-underwater-mines.htm">developed by Amos Winter at MIT</a>, can dig holes into the sand like a razor clam. This was possible because the researchers understood the principle behind the clam&#8217;s ability — <em>localized fluidization</em> — and were able to give a robotic digging clam similar abilities. The RoboClam may be useful for monitoring biological conditions underwater, or for burying anchors and detonating underwater mines. &#8220;<em>And the study of the robot gives deeper insight into the important mechanics behind burrowing through localized fluidization</em>&#8221;, says <span style="color: #222222;">Amos Winter</span>. https://youtu.be/bztw9PUiRss</p>
<p><strong><span style="color: #565656;">Row-bot</span></strong></p>
<div id="attachment_5812" style="width: 310px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2016/01/Row-bot-Hemma-Philamore-BRL.jpg" rel="attachment wp-att-5812"><img class="wp-image-5812 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2016/01/Row-bot-Hemma-Philamore-BRL-300x199.jpg" alt="Row-bot Hemma Philamore, BRL" width="300" height="199" /></a><p class="wp-caption-text">Row-bot with its mouth open. Hemma Philamore, Univ. Bristol/BRL</p></div>
<p>Inspired by the water beetle, a group of scientists at the <a href="http://www.bristol.ac.uk/news/2015/november/row-bot.html">Bristol Robotics Laboratory</a> has been developing a robot called <strong>Row-bot</strong> that can swim in remote locations by harvesting energy directly from the water, using a microbial fuel cell as an artificial stomach.</p>
<p>&#8220;<em>When it is hungry the Row-bot opens its soft robotic mouth and rows forward to fill its microbial fuel cell (MFC) stomach with nutrient-rich dirty water. It then closes its mouth and slowly digests the nutrients&#8221;. </em>The Row-bot may be useful for environmental clean-up of contaminants in natural and man-made disasters.</p>
<p><strong>3D-printed soft robotic tentacles</strong></p>
<div id="attachment_5821" style="width: 289px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2016/01/3d-printed-robotic-tentacle.jpg" rel="attachment wp-att-5821"><img class="wp-image-5821 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2016/01/3d-printed-robotic-tentacle-279x300.jpg" alt="3d-printed-robotic-tentacle" width="279" height="300" /></a><p class="wp-caption-text">3D-printed robotic tentacle. Cornell University</p></div>
<p>Using an elastomer and a 3D printing technique, engineers at <a href="http://mediarelations.cornell.edu/2015/10/14/video-3d-printed-soft-robotic-tentacle-displays-new-level-of-agility/">Cornell University</a> have developed a method to create soft actuators. Using their new technique, a digital mask projection stereolithography system, they have produced pairs of actuators that mimic the function of octopus tentacles.</p>
<p>As reported in a paper published in the journal <a href="https://cornell.app.box.com/softactuators/1/4929651481/40142266489/1">Bioinspiration &amp; Biomimetics</a>, the researchers believe that &#8220;<em>this nascent printing process for soft actuators is a promising route to sophisticated, biomimetic systems</em>&#8221; https://youtu.be/BZ5W7LyyKL0</p>
<p><strong>The RoboBee</strong></p>
<div id="attachment_5827" style="width: 310px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/10/RoboticInsectPhoto02.jpg" rel="attachment wp-att-5827"><img class="wp-image-5827 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/10/RoboticInsectPhoto02-300x200.jpg" alt="RoboticInsectPhoto02" width="300" height="200" /></a><p class="wp-caption-text">RoboBee. Wyss Institute</p></div>
<p>This very small flying robot, inspired by the biology of a bee, was initially developed by researchers from the <a href="http://wyss.harvard.edu/viewpage/457">Wyss Institute</a> at <a href="http://robobees.seas.harvard.edu/">Harvard University</a> in 2004. The RoboBee, designed at Robert J Wood’s lab, is a micro-robot, smaller than a fingernail, that flies and hovers like an insect, flapping its transparent wings 120 times per second. The research effort around the RoboBee project is believed to &#8220;<em>foster novel methods for designing and building an electronic surrogate nervous system able to deftly sense and adapt to changing environments; and advance work on the construction of small-scale flying mechanical devices&#8221;</em>. Scientists anticipate that these devices may have an impact in advancing fields ranging from entomology and developmental biology to amorphous computing and electrical engineering. http://wyss.harvard.edu/viewpage/428/</p>
<p><strong>The Tabbot</strong></p>
<div id="attachment_5815" style="width: 310px" class="wp-caption alignleft"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2016/01/spider-inspired-robot.png" rel="attachment wp-att-5815"><img class="wp-image-5815 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2016/01/spider-inspired-robot-300x232.png" alt="spider-inspired-robot" width="300" height="232" /></a><p class="wp-caption-text">Tabbot. by Ingo Rechenberg</p></div>
<p>The robot Tabbot looks like a cartwheeling desert-dwelling spider and is named after tabacha, which means spider in the local Berber language of northern Africa. According to its developer, engineer Ingo Rechenberg, &#8220;&#8230;s<em>uch a means of locomotion would be an advantage in a device meant to navigate the rough surface condition on Mars</em>&#8221;. Rechenberg, who teaches biomimetics at the Technical University of Berlin, believes that this kind of tumbling robot can be used in agriculture as well as on the ocean floor. https://youtu.be/OHo32JrkDRk For more biomimetic robots, see our previous posts <a href="http://csnblog.specs-lab.com/2013/12/08/biomimetic-robots-presented-at-robot-safari-in-london/">Biomimetic robots at Robot SafariEU in London</a> and <a href="http://csnblog.specs-lab.com/2013/07/12/biomimetics-wheres-it-at/">Biomimetics: Where’s it at?</a></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2015/10/22/an-ecology-of-robots-through-biomimetics/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Living Machines 2015</title>
		<link>https://csnblog.specs-lab.com/2015/08/04/living-machines-2015-in-barcelona/</link>
		<comments>https://csnblog.specs-lab.com/2015/08/04/living-machines-2015-in-barcelona/#comments</comments>
		<pubDate>Tue, 04 Aug 2015 16:54:45 +0000</pubDate>
		<dc:creator><![CDATA[Anna Mura]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[science]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5747</guid>
		<description><![CDATA[Article by Michael Szollosy Just last week, La Pedrera in Barcelona hosted Living Machines 2015, the 4th International Conference on Biomimetic and Biohybrid Systems. Running from the 28th &#8211; 31st of July, Living Machines 2015 was sponsored by the Convergent Science Network &#8230; <a href="https://csnblog.specs-lab.com/2015/08/04/living-machines-2015-in-barcelona/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p style="text-align: justify;"><span style="color: #373737;"><br />
Article by </span><a style="color: #617c96;" href="https://www.shef.ac.uk/scharr/sections/hsr/mh/sectionstaff/mszollosy">Michael Szollosy</a></p>
<p style="text-align: justify;"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/header_v1_big_full.png" rel="attachment wp-att-5749"><img class="alignright wp-image-5749 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/header_v1_big_full-300x127.png" alt="header_v1_big_full" width="300" height="127" /></a>Just last week, La Pedrera in Barcelona hosted <a href="http://csnetwork.eu/livingmachines/conf2015"><em>Living Machines 2015</em></a>, the 4<sup>th</sup> International Conference on Biomimetic and Biohybrid Systems.</p>
<p style="text-align: justify;"><span id="more-5747"></span><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/La-Pedrera-Vista-general.jpg" rel="attachment wp-att-5750"><img class="alignright wp-image-5750 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/La-Pedrera-Vista-general-300x225.jpg" alt="La Pedrera Vista general" width="300" height="225" /></a>Running from the 28<sup>th</sup> to the 31<sup>st</sup> of July, <em>Living Machines </em>2015 was sponsored by the <a href="http://www.csnetwork.eu/">Convergent Science Network</a> and featured plenary talks by internationally renowned robotics researchers, as well as <a href="http://csnetwork.eu/livingmachines/conf2015/workshops">workshops</a> examining the intersection of living and artificial systems. There were also <a href="http://csnetwork.eu/livingmachines/conf2015/spotlights">poster spotlights</a>, poster <a href="http://csnetwork.eu/livingmachines/conf2015/posters">sessions</a>, and robot and media demonstrations. A <a href="http://csnetwork.eu/livingmachines/conf2015/programme">full programme of the events can be found here</a>.<!--more--></p>
<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/mantisbot_whole.jpg" rel="attachment wp-att-5748"><img class="alignright wp-image-5748 size-medium" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/08/mantisbot_whole-300x200.jpg" alt="mantisbot_whole" width="300" height="200" /></a></p>
<p style="text-align: justify;"><em>Biomimetic</em> systems are technologies that draw their inspiration from biological systems; they can be used to improve artificial systems and offer solutions to technological and engineering problems, and can also be used to explore natural systems themselves in greater depth. <em>Biohybridity</em> refers to the merging of living and artificial systems to create new entities; such hybrids are used, for example, in robotics, materials, computing, brain-machine interfaces (e.g. neural implants), and artificial organs and body parts.</p>
<p style="text-align: justify;"><a href="http://csnetwork.eu/livingmachines/conf2015/plenaryspeakers">Plenary speakers</a> at this year’s conference included:</p>
<ul>
<li style="text-align: justify;"><a href="http://biorobots.case.edu/personne/roger-quinn/">Roger Quinn</a>: Director of the Centre for <a href="http://biorobots.cwru.edu/">Biologically Inspired Robotics Research</a> at Case Western Reserve University in Cleveland, Ohio. Professor Quinn will talk on ‘Animals as models for robot mobility and autonomy: Crawling, walking, running, climbing, and flying’</li>
<li style="text-align: justify;"><a href="http://mbr.iit.it/people/barbara-mazzolai.html">Barbara Mazzolai</a>: Director of the <a href="http://mbr.iit.it/">Centre for Micro-BioRobotics (CMBR) of the Istituto Italiano di Tecnologia (IIT) </a>of Genoa, Italy, and Deputy Director for Supervision and Organization of IIT Centres Network. Professor Mazzolai will be giving a talk entitled ‘From plants and animals to robots: movement, sensing and control’</li>
<li style="text-align: justify;"><a href="http://www.researchgate.net/profile/Ryad_Benosman">Ryad Benosman</a>: Professor at the University Pierre and Marie Curie, Paris, France, leading the <a href="http://www.institut-vision.org/index.php?option=com_content&amp;view=article&amp;id=283%3Aequipe-de-r-benosman&amp;catid=17%3Afiches&amp;Itemid=15&amp;lang=en">Natural Computation and Neuromorphic Vision Laboratory</a>, Vision Institute, Paris. Professor Benosman will be giving a talk entitled ‘Neuromorphic Event-based time oriented vision: A framework to unify computational and biological vision.’</li>
<li><a href="http://www.personal.leeds.ac.uk/~menrcr/">Robert Richardson</a>: Director of the Institute of <a href="https://www.engineering.leeds.ac.uk/idro/">Design, Robotics and Optimisation at the University of Leeds</a>. Professor Richardson will be talking about using robots for safety and security, surgical technologies for health and well-being, and rehabilitation and prosthetics.</li>
<li><a href="http://www.lied-pieri.univ-paris-diderot.fr/spip.php?article93">José Halloy</a>: Professor of <a href="http://www.univ-paris-diderot.fr/english/sc/site.php?bc=formations&amp;np=ficheufr&amp;n=13&amp;g=sm">Physics at Université Paris Diderot</a>. Professor Halloy will be speaking about collective intelligence in natural and artificial systems.</li>
</ul>
<p>And <a href="http://csnetwork.eu/livingmachines/conf2015/workshops">workshops</a> included discussion on topics such as:</p>
<ul>
<li><a href="http://csnetwork.eu/system/files/living-machines-files/robot_self_call_for_participation_1.pdf">The robot self</a></li>
<li><a href="http://csnetwork.eu/system/files/living-machines-files/nature_inspired_manufacturing_workshop_programme.pdf">Nature-inspired manufacturing</a></li>
<li><a href="http://csnetwork.eu/system/files/living-machines-files/bcn_flyer_28jul15_oneday.pdf">Bio-inspired design</a></li>
</ul>
<p style="text-align: justify;"><em>Living Machines</em> is one of the foremost conferences on robotics in the world, and is not to be missed. If you could not attend, the <a href="http://link.springer.com/book/10.1007/978-3-319-22979-9">proceedings are already available here</a> – do explore and have a look at some of the terrific ideas and developments being discussed. (Proceedings from previous years’ conferences can be found <a href="http://www.csnetwork.eu/livingmachines">here</a>.)</p>
<p style="text-align: justify;">For all inquiries contact <a href="mailto:info.csnetwork%40upf.edu">info.csnetwork@upf.edu</a></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2015/08/04/living-machines-2015-in-barcelona/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>2015 the year of Personal Robots?</title>
		<link>https://csnblog.specs-lab.com/2015/01/25/2015-the-year-of-personal-robots/</link>
		<comments>https://csnblog.specs-lab.com/2015/01/25/2015-the-year-of-personal-robots/#comments</comments>
		<pubDate>Sun, 25 Jan 2015 13:01:54 +0000</pubDate>
		<dc:creator><![CDATA[Anna Mura]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[personal robots]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5503</guid>
		<description><![CDATA[Article by Michael Szollosy Amidst all the talk about the Big Trends in tech for 2015 – driverless cars apparently on the horizon, and of course the VR revolution will arrive just in time for next Christmas – is talk of personal &#8230; <a href="https://csnblog.specs-lab.com/2015/01/25/2015-the-year-of-personal-robots/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/01/shutterstock_ROBOTS.copy_1.jpg" rel="attachment wp-att-5528"><img class="alignnone wp-image-5528 size-large" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/01/shutterstock_ROBOTS.copy_1-1024x682.jpg" alt="shutterstock_ROBOTS.copy" width="584" height="388" /></a></p>
<p>Article by <a href="https://www.shef.ac.uk/scharr/sections/hsr/mh/sectionstaff/mszollosy">Michael Szollosy</a></p>
<p style="text-align: justify;">Amidst all the talk about the Big Trends in tech for 2015 – <a href="http://www.theguardian.com/technology/2015/jan/09/ces-roundup-superchips-driverless-cars-drones">driverless cars apparently on the horizon</a>, and of course the VR revolution will <a href="http://www.vice.com/read/the-future-of-video-games-in-2015-905">arrive just in time for next Christmas</a> – is talk of <em>personal robotics</em>: more than simple machines, these are robots that promise to organise our lives. Through the power of ‘emotional engines’ and other advances in Artificial Intelligence (some genuine, some less revolutionary than marketing agents would have us believe), these are robots that will become our companions, or perhaps even trustworthy friends.</p>
<p style="text-align: justify;">The key, of course, to the up-take of any new technology – beyond the tech-enthusiasts that gobble up anything new and innovative (e.g. <a href="http://www.theguardian.com/technology/2014/feb/19/google-glass-advice-smartglasses-glasshole">Glassholes</a>) – is how useful a product will be to the wider consumer market.</p>
<p style="text-align: justify;"><span id="more-5503"></span><br />
<iframe src="https://www.youtube.com/embed/8pSkPgBrcTA" width="560" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p style="text-align: justify;">The excitement over these personal robots, and perhaps certain problems, lie in our expectation of how ‘personal’ they can be. According to their websites, robots like <a href="http://www.myjibo.com/">JIBO</a> and <a href="https://www.kickstarter.com/projects/403524037/personal-robot">this personal robot recently launched on Kickstarter</a> can wake us up, remind us of our appointments, organise our offices, take pictures of us and watch over our homes. These functions, for most of us, though, are all more than adequately performed by existing technologies, such as our phones and wearables. Another, <a href="https://www.aldebaran.com/en/a-robots/who-is-pepper">Pepper</a>, declares that it can be instructed to stack coloured blocks, and is shown performing art-house techno music. We have to ask if the ‘value added’ – what personal robots can offer that these existing technologies do not – is really something that we want robots or AI to be doing for us. A personal stylist? Someone to let our children watch TV in bed? Are these functions we need or want fulfilled by a new machine?</p>
<p style="text-align: justify;"><iframe src="https://www.youtube.com/embed/XcJccQqTM6Q" width="560" height="315" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p style="text-align: justify;">These robots are also described as ‘companions’, but with such a limited intelligence and matrix of responses, they aren’t exactly very… chummy.</p>
<p style="text-align: justify;">The promise of convergence – an all-in-one tech solution – has a certain appeal. But your phone will fit in your pocket; your iPad in a bag slung over your shoulder. This generation of personal robots will need to be carried about from table top to table top. Or they can follow you about, but they’ll find it difficult to climb on the bus behind you.</p>
<p style="text-align: justify;">From one promotional video, the poor robot seems fated to wander about after its master, looking like a dejected, Disneyfied Dalek, desperate just to hang out with the gang. Rarely have robots looked more frightening than when a cartoon-faced princess insists that <em>you wear</em> <em>the blue tie. </em>If the worst did happen and she transformed into a HAL 9000 on wheels, careening madly around your living room and mumbling about what you <em>should</em> <em>be eating for breakfast</em>, you could always take solace in the knowledge that you could make a quick escape upstairs.</p>
<p style="text-align: justify;">The possibilities these robots offer for <a href="http://en.wikipedia.org/wiki/Telepresence"><em>telepresence</em> </a>– the ability to be in one place and have a functional, material presence somewhere else – have tremendous potential, particularly for industry and specialist functions (dangerous work, health care, etc.). For many of these robots aimed at the mass-consumer market, however, the best that the marketing personnel seem to be able to imagine is a sort of <a href="http://www.roboticstrends.com/article/hands_on_furo_i_home_personal_robot/CES">very expensive tablet case</a>, or a really big baby monitor.</p>
<p style="text-align: justify;">There is undoubtedly an important role to play for such robots in the care of the disabled or elderly. Robots like <a href="http://5elementsrobotics.com/">Budgee</a>, or a smart table that we saw at the recent launch of <a href="http://www.sheffieldrobotics.ac.uk/">Sheffield Robotics</a>, which could arrange itself around its user to perform a variety of functions (writing, eating, assisting with movement), look more promising. Rather than trying to create demand, these devices offer solutions to a particular set of existing problems. (For example, <a href="http://grillbots.com/">cleaning the grill of your oven</a>.)</p>
<p style="text-align: justify;">And, perhaps most importantly, these more task-specific robots don’t have faces. Because the anthropomorphisation of machines carries with it all sorts of complications.</p>
<p style="text-align: justify;">First, simply put: we aren’t capable yet of creating robots with a personality that is compelling, or recognisable as a ‘personality’, to most people. This creates unrealistic expectations. And it is a problem because it helps fuel the fear of intelligent robots, and the feeling that the <a href="https://www.goodreads.com/book/show/9634967-robopocalypse">robopocalypse </a>is just around the corner.</p>
<p style="text-align: justify;">On a cultural level, we could say that the promise of such robots is ‘the stuff of science fiction’; though whether it is fiction or fact, we are still faced with some intriguing questions: what is it we are hoping to achieve by trying to create machines that are not only useful and intelligent, but are also emotionally engaging? Are we expecting something beyond <em>instrumentalisation</em>, that is, something more than a tool for particular jobs?</p>
<p style="text-align: justify;">Whether any of these robots live up to the promise and hype we’ve seen advertised remains to be seen. But, as with any technological innovation, until designers answer the question – <em>what is this for?</em> – such devices may struggle to succeed in the consumer market.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2015/01/25/2015-the-year-of-personal-robots/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>This cuttlefish robot is actually better than cuttlefish</title>
		<link>https://csnblog.specs-lab.com/2015/01/18/this-cuttlefish-robot-is-actually-better-than-cuttlefish/</link>
		<comments>https://csnblog.specs-lab.com/2015/01/18/this-cuttlefish-robot-is-actually-better-than-cuttlefish/#comments</comments>
		<pubDate>Sun, 18 Jan 2015 12:35:30 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Biomimetic Robots]]></category>
		<category><![CDATA[cuttlefish robot]]></category>
		<category><![CDATA[ETH Zurich]]></category>
		<category><![CDATA[Sepios]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5492</guid>
		<description><![CDATA[A new marine robot, called Sepios, has recently joined the ever-growing robotic animal kingdom. Built by a group of students from Switzerland’s ETH Zurich, this biomimetic robot was inspired by yet another marine creature, namely a cuttlefish. The interesting thing &#8230; <a href="https://csnblog.specs-lab.com/2015/01/18/this-cuttlefish-robot-is-actually-better-than-cuttlefish/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_5494" style="width: 630px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2015/01/sepios-1419935157964.jpg" rel="attachment wp-att-5494"><img class="size-full wp-image-5494" src="http://csnblog.specs-lab.com/wp-content/uploads/2015/01/sepios-1419935157964.jpg" alt="Sepios robot Credit: ETH Zurich" width="620" height="465" /></a><p class="wp-caption-text">Sepios robot<br />Credit: ETH Zurich</p></div>
<p>A new marine robot, called Sepios, has recently joined the ever-growing robotic animal kingdom. Built by a group of students from Switzerland’s ETH Zurich, this biomimetic robot was inspired by yet another marine creature, namely a cuttlefish. The interesting thing is that Sepios can actually do better than the creature that inspired it.</p>
<p><span id="more-5492"></span></p>
<p>Cuttlefish have two elongated fins that produce a beautiful undulating motion and allow these animals to move forward and backward, turn on the spot, or hover. Sepios has four such fins. The extra pair makes it possible for the robot to propel itself in any direction, including straight up and down, and to rotate on any axis. Simply put, Sepios is omnidirectional, which cannot be said of the cuttlefish.</p>
<p>The fins are driven by a total of 36 servo motors and propel the robot to a maximum speed of 1.8 km/h.</p>
<p>Perhaps the robot’s biggest advantage is that its fins cause very little turbulence and allow for greater control than many other underwater vehicles offer. Sepios, for example, can easily navigate through patches of sea grass without leaving a mess behind.</p>
<p>Such properties suggest that Sepios will come in handy for marine life observation. The video below certainly proves the point, as Sepios seems to get on quite well with real fish.</p>
<p>Sepios is not the first biomimetic robot to use undulating propulsion. Its predecessors include <a href="http://www.northwestern.edu/newscenter/stories/2011/01/robotic-ghost-knifefish.html">this knifefish robot</a> developed by researchers at Northwestern University and <a href="http://www.youtube.com/watch?v=mejYGMuv_1A">another cuttlefish robot</a> from <a href="https://www.nextgenaero.com/index.html">NextGen Aeronautics</a>. Still, Sepios is the first to feature four undulating fins, making it the only truly omnidirectional vehicle that uses this kind of propulsion.</p>
<p><iframe width="584" height="329" src="http://www.youtube.com/embed/GeCLL2RWV1c?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p>The number of robots inspired by marine critters has increased tremendously in recent years. Various types of fish, molluscs and even jellyfish consistently provide scientists with new ideas.</p>
<p>You may also be interested in <a href="http://news.nus.edu.sg/press-releases/8450-nus-researchers-develop-new-generation-thinking-biomimetic-robots-as-ocean-engineering-solutions">this recent announcement</a> from the National University of Singapore, which is developing a whole range of bio-inspired marine robots, including a smart robotic sea turtle.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2015/01/18/this-cuttlefish-robot-is-actually-better-than-cuttlefish/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Europe’s largest robot fleet observation mission is underway</title>
		<link>https://csnblog.specs-lab.com/2014/11/20/europes-largest-robot-fleet-observation-mission-is-underway/</link>
		<comments>https://csnblog.specs-lab.com/2014/11/20/europes-largest-robot-fleet-observation-mission-is-underway/#comments</comments>
		<pubDate>Thu, 20 Nov 2014 10:52:10 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Europe]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and the Environment]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Exploring Ocean Fronts]]></category>
		<category><![CDATA[National Oceanography Centre]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5481</guid>
		<description><![CDATA[Several decades ago, Earth observation satellites transformed how we keep track of changes on our planet. Now we are rapidly crossing a new technological threshold that will allow us to pick up even the most subtle variations in the environment. Imagine swarms &#8230; <a href="https://csnblog.specs-lab.com/2014/11/20/europes-largest-robot-fleet-observation-mission-is-underway/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_5483" style="width: 610px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/11/eof-C-Enduro-launched-img3.jpg" rel="attachment wp-att-5483"><img class="size-full wp-image-5483" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/11/eof-C-Enduro-launched-img3.jpg" alt="C-Enduro vehicle sets off Credit: National Oceanography Centre" width="600" height="400" /></a><p class="wp-caption-text">C-Enduro vehicle sets off<br />Credit: National Oceanography Centre</p></div>
<p>Several decades ago, Earth observation satellites transformed how we keep track of changes on our planet. Now we are rapidly crossing a new technological threshold that will allow us to pick up even the most subtle variations in the environment.</p>
<p>Imagine swarms of autonomous robots roaming the globe by land, sea and air, together producing the ultimate picture of what is happening on our planet. This grand vision is already becoming a reality, at least at sea.</p>
<p><span id="more-5481"></span></p>
<p>Recently, UK scientists unleashed an entire fleet of autonomous marine robots to travel about 500 km across an area off the southwestern UK. The fleet comprises <a href="http://projects.noc.ac.uk/exploring-ocean-fronts/vehicles">four types of vehicles</a>, including both underwater and surface craft. Notably, all the vehicles rely on renewable energy sources, which allows them to spend months offshore without any human intervention.</p>
<p>Instruments on board the vehicles record key parameters of the ocean, ranging from the temperature of the water to the density of plankton populations. Equipped with GoPro cameras, the robots are also expected to take some spectacular shots of marine life.</p>
<p><a href="http://projects.noc.ac.uk/exploring-ocean-fronts/">The Exploring Ocean Fronts project</a> is led by the <a href="http://noc.ac.uk/">National Oceanography Centre</a> and is already referred to as the most ambitious of its kind in Europe. The project is now in phase two, in which several vehicles are attempting to track acoustically tagged fish. The goal is to gain insight into the daily habits of marine life, about which, believe it or not, we still know very little. The data obtained will inform future decisions regarding ocean management, including those aimed at achieving sustainable fisheries.</p>
<p>Potential benefits of such massive robot observation missions, of course, go way beyond that. For instance, a better understanding of how the ocean varies over time and space can immensely benefit climate and weather research.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/11/20/europes-largest-robot-fleet-observation-mission-is-underway/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Wearable robots will take the burden from workers&#8217; shoulders</title>
		<link>https://csnblog.specs-lab.com/2014/09/19/wearable-robots-will-take-the-burden-from-workers-shoulders/</link>
		<comments>https://csnblog.specs-lab.com/2014/09/19/wearable-robots-will-take-the-burden-from-workers-shoulders/#comments</comments>
		<pubDate>Fri, 19 Sep 2014 15:13:13 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Robots and Health]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Daewoo Shipbuilding and Marine Engineering]]></category>
		<category><![CDATA[d’Arbeloff Laboratory for Information Systems and Technology.]]></category>
		<category><![CDATA[exoskeleton]]></category>
		<category><![CDATA[MIT]]></category>
		<category><![CDATA[SRL]]></category>
		<category><![CDATA[Supernumerary Robot Limbs]]></category>
		<category><![CDATA[Wearable robotics]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5414</guid>
		<description><![CDATA[Everybody has been in a situation when we wish we had stronger arms or, even better, an extra pair of them. Whether it is attaching something large overhead or manipulating something heavy, we all know we are bound to run into &#8230; <a href="https://csnblog.specs-lab.com/2014/09/19/wearable-robots-will-take-the-burden-from-workers-shoulders/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_5421" style="width: 287px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/09/mg22329803.900-1_300.jpg" rel="attachment wp-att-5421"><img class="wp-image-5421" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/09/mg22329803.900-1_300.jpg" alt="Credit: Daewoo" width="277" height="370" /></a><p class="wp-caption-text">Credit: Daewoo</p></div>
<p>We have all been in situations where we wished we had stronger arms or, even better, an extra pair of them. Whether it is attaching something large overhead or manipulating something heavy, we are bound to run into the limitations of our own anatomical design. In some professions, such as construction work, these difficulties surface practically every day. To make physical drudgery less stressful and traumatic, researchers around the globe are now developing a new kind of robot that is worn on the body, much like a regular backpack.</p>
<p><span id="more-5414"></span></p>
<p>Wearable robotics thrives on collaboration between human and machine and has huge potential in all kinds of physically challenging work. The idea has already been put to the test by <a href="http://www.dsme.co.kr/epub/main/index.do">Daewoo Shipbuilding &amp; Marine Engineering</a>, one of the biggest shipbuilders in the world. Korean shipyards have long been known for their high degree of automation, and now it appears the company has decided to go a step further.</p>
<p>The company has developed a wearable exoskeleton that allows workers to carry huge pieces of metal and other heavy components with little or no effort. The exoskeleton weighs around 30 kg, none of which, however, is felt by the wearer, since the suit is designed to support its own weight and follow the wearer’s movements.</p>
<div id="attachment_5426" style="width: 332px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/09/wearable-robot-from-DSME-2.jpg" rel="attachment wp-att-5426"><img class=" wp-image-5426" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/09/wearable-robot-from-DSME-2.jpg" alt="Credit: Daewoo" width="322" height="429" /></a><p class="wp-caption-text">Credit: Daewoo</p></div>
<p>The prototype can lift and precisely manipulate objects with a mass of up to 30 kg. Tests have demonstrated that the technology can indeed help workers with their daily tasks, although those who took part in the test run said they would like to move faster and lift even heavier weights. The research team is already working towards that goal: the current target is an exoskeleton that can lift up to 100 kg and be used on a daily basis at shipyard facilities.</p>
<p>Another example of how wearable robots can literally give future workers a hand comes from <a href="http://darbelofflab.mit.edu/">MIT’s d’Arbeloff Laboratory for Information Systems and Technology</a>. The lab is working on a pair of lightweight robotic arms, attached to a backpack, that are envisioned to assist people with tasks where our two arms are just not enough.</p>
<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/09/ojkcq3bpls4u01q72iwi.gif" rel="attachment wp-att-5425"><img class="aligncenter size-full wp-image-5425" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/09/ojkcq3bpls4u01q72iwi.gif" alt="ojkcq3bpls4u01q72iwi" width="635" height="288" /></a></p>
<p>The project, called <a href="http://darbelofflab.mit.edu/?q=node/22">SRL (Supernumerary Robot Limbs)</a>, is supported by Boeing and was recently used in a demo that involved installing ceiling panels in an airplane, a highly repetitive task that is difficult to perform alone. By pushing the panels against the ceiling, the device relieves the worker of the need to simultaneously hold the panel, insert the screws and drive them in with the screwdriver.</p>
<p>Watch the video below to see the prototype in action.</p>
<p><iframe width="584" height="329" src="http://www.youtube.com/embed/LkXpldrhRm4?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/09/19/wearable-robots-will-take-the-burden-from-workers-shoulders/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Don’t be afraid of big data</title>
		<link>https://csnblog.specs-lab.com/2014/08/17/dont-be-afraid-of-big-data/</link>
		<comments>https://csnblog.specs-lab.com/2014/08/17/dont-be-afraid-of-big-data/#comments</comments>
		<pubDate>Sun, 17 Aug 2014 14:44:55 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Europe]]></category>
		<category><![CDATA[BrainX3]]></category>
		<category><![CDATA[CEEDS]]></category>
		<category><![CDATA[European Commiss]]></category>
		<category><![CDATA[eXperience Induction Machine]]></category>
		<category><![CDATA[Jonathan Freeman]]></category>
		<category><![CDATA[Neelie Kroes]]></category>
		<category><![CDATA[Pompeu Fabra University]]></category>
		<category><![CDATA[SPECS]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5402</guid>
		<description><![CDATA[European Commission bets on data-driven economy Information can be scary, and even more so when we find ourselves humbled by its immensity. In a press release issued earlier this week, the European Commission has once again demonstrated that it is not afraid of &#8230; <a href="https://csnblog.specs-lab.com/2014/08/17/dont-be-afraid-of-big-data/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<h2>European Commission bets on data-driven economy</h2>
<p><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/08/images-Ceeds-image.jpg" rel="attachment wp-att-5406"><img class="alignleft wp-image-5406" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/08/images-Ceeds-image.jpg" alt="images-Ceeds image" width="500" height="352" /></a></p>
<p>Information can be scary, and even more so when we find ourselves humbled by its immensity. In <a href="http://europa.eu/rapid/press-release_IP-14-916_en.htm">a press release</a> issued earlier this week, the European Commission once again demonstrated that it is not afraid of big data. Quite the opposite: Europe is readier than ever to embrace it, a commitment reflected in its strong bet on research projects like <a href="http://ceeds-project.eu/">CEEDs</a>, which uses big data to enhance human cognition and improve problem solving.</p>
<p><span id="more-5402"></span><a href="http://csnblog.specs-lab.com/2014/05/07/virtual-reality-labs-reshape-how-we-process-information/">In a previous post</a>, we discussed CEEDs and the <a href="http://specs.upf.edu/research_in_mixed_and_virtual_reality">eXperience Induction Machine</a> (XIM), the heart of the project, located in the <a href="http://specs.upf.edu/">SPECS lab</a> at <a href="http://www.upf.edu/en/">Pompeu Fabra University</a> in Barcelona. The press release singles out CEEDs as an example of a successful and highly promising big data research initiative.</p>
<p>So far, XIM has mainly been applied to visualising brain data (<a href="http://www.brainx3.com/">BrainX3</a>) and historical data (the <a href="http://specs.upf.edu/installation/2772">Bergen-Belsen reconstruction</a>). While it will certainly bring about a huge qualitative change in how scientists work with tremendous amounts of information, the integration of this technology into more down-to-earth application fields also seems imminent.</p>
<p>The press release reports that early interest in the XIM technology is already coming from several museums in Germany, the Netherlands, the UK and the United States, where it could potentially help with gathering and reacting to feedback from visitors. This naturally applies to many other public spaces such as shops, libraries and concert venues. The CEEDs team is also conducting negotiations with several public, charity and commercial organisations to further extend the scope of application of the platform.</p>
<p>The CEEDs project coordinator <a href="http://www.gold.ac.uk/psychology/staff/freeman/">Jonathan Freeman</a>, Professor of Psychology at <a href="http://www.gold.ac.uk/">Goldsmiths</a>, <a href="http://www.lon.ac.uk/">University of London</a>, pointed out that “anywhere where there’s a wealth of data that either requires a lot of time or an incredible effort, there is potential.” Whole fields, from satellite imagery inspection to oil prospecting and astronomy, could benefit immensely from this novel approach to processing information.</p>
<p>With projects like CEEDs, Europe is working its way towards a new data-driven economy, a long-standing goal that the European Commission is now actively promoting across national governments. The European approach towards big data is perhaps best expressed in the words of the vice-president of the European Commission <a href="http://ec.europa.eu/commission_2010-2014/kroes/">Neelie Kroes</a>: “Big data doesn’t have to be scary. Projects like this enable us to take control of data and deal with it so we can get to solving problems. Leaders need to embrace big data.”</p>
<p>You can also read <a href="http://www.cbronline.com/news/tech/software/businessintelligence/the-5-coolest-eu-big-data-projects-4340683">this article</a> to learn about some other exciting big data projects backed by the European Commission.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/08/17/dont-be-afraid-of-big-data/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Human or machine?</title>
		<link>https://csnblog.specs-lab.com/2014/08/02/human-or-machine/</link>
		<comments>https://csnblog.specs-lab.com/2014/08/02/human-or-machine/#comments</comments>
		<pubDate>Sat, 02 Aug 2014 07:04:48 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Ethics]]></category>
		<category><![CDATA[Robots and Research]]></category>
		<category><![CDATA[Robots and Society]]></category>
		<category><![CDATA[Robots Around the World]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[Science Fiction]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Hiroshi Ishiguro]]></category>
		<category><![CDATA[Humanoid robots]]></category>
		<category><![CDATA[Kodomoroid]]></category>
		<category><![CDATA[Lars Lundstroem]]></category>
		<category><![CDATA[Otonaroid]]></category>
		<category><![CDATA[Real Humans]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5390</guid>
		<description><![CDATA[Should we make robots more human-like? A hit Swedish TV show has a say Although we may be decades away from building truly life-like humanoid robots, it is never too early to start questioning the legal and ethical implications of creating &#8230; <a href="https://csnblog.specs-lab.com/2014/08/02/human-or-machine/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<h1>Should we make robots more human-like? A hit Swedish TV show has a say</h1>
<div id="attachment_5391" style="width: 630px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/pg-42-real-humans-1.jpg" rel="attachment wp-att-5391"><img class="wp-image-5391 size-full" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/pg-42-real-humans-1.jpg" alt="pg-42-real-humans (1)" width="620" height="465" /></a><p class="wp-caption-text">Credit: Real Humans</p></div>
<p>Although we may be decades away from building truly life-like humanoid robots, it is never too early to start questioning the legal and ethical implications of creating machines that are hard to tell apart from ourselves. In a brave leap of imagination, <a href="http://en.wikipedia.org/wiki/Real_Humans"><em>Real Humans</em></a>, a popular Swedish TV show, written by Lars Lundstroem, deliberately blurs the line between humans and robots to explore what it means to be human.</p>
<p><span id="more-5390"></span></p>
<p>The show, which could have been inspired by <a href="http://en.wikipedia.org/wiki/Hiroshi_Ishiguro">Hiroshi Ishiguro’s</a> weirdest dream, is set in an alternative present-day Sweden, where extremely life-like androids with perfect looks, called “hubots”, are commercialised to take care of all domestic and workplace drudgery.</p>
<p>With time, some hubots are programmed to acquire free will and become capable of entering into social and even intimate relations with humans. The story follows the emotional effects on two families in possession of hubots as well as the trials and tribulations of a group of hubots that decide to fight for their rights.</p>
<p><em>Real Humans</em> has received critical acclaim, even though it has repeatedly been characterised as creepy and disturbing, which in fact seems to be part of the writers’ intention. The premise of the show allows the creators to explore diverse philosophical questions as well as contemporary social issues.</p>
<p>Although the creators of the show claim that there was no science to rely on in the making of <em>Real Humans</em>, it is remarkable how the show&#8217;s vision resonates with statements made recently by the aforementioned Japanese roboticist Hiroshi Ishiguro during the presentation of his latest creations, two eerily human-looking robot newscasters, Kodomoroid and Otonaroid. “Making androids is about exploring what it means to be human,” Ishiguro explained to reporters, “examining the question of what emotion is, what awareness is, what thinking is.”</p>
<div id="attachment_5396" style="width: 650px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/Kodomoroid-presentatore-televisivo-robot1-640x359.jpg" rel="attachment wp-att-5396"><img class="wp-image-5396 size-full" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/Kodomoroid-presentatore-televisivo-robot1-640x359.jpg" alt="Kodomoroid and Otonaroid during the demonstration last month" width="640" height="359" /></a><p class="wp-caption-text">Hiroshi Ishiguro&#8217;s latest androids Kodomoroid and Otonaroid during the demonstration last month</p></div>
<p>Just like the creators of <em>Real Humans</em>, robotics researchers such as Ishiguro warn of possible legal and ethical complications that may arise when humans and robots form stronger bonds, especially if the latter look and behave like us. Most present-day robots look deliberately artificial, but there is no reason why this should remain so once the technology to make them look human becomes available.</p>
<p><em>Real Humans</em> was released in 2012 and has since been screened in at least 50 other countries with great success. The show is now being remade in English, with the premiere scheduled for 2015.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/08/02/human-or-machine/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Human Brain Project under attack</title>
		<link>https://csnblog.specs-lab.com/2014/07/18/human-brain-project-under-attack/</link>
		<comments>https://csnblog.specs-lab.com/2014/07/18/human-brain-project-under-attack/#comments</comments>
		<pubDate>Fri, 18 Jul 2014 15:42:40 +0000</pubDate>
		<dc:creator><![CDATA[Dmitry Malkov]]></dc:creator>
				<category><![CDATA[Biology]]></category>
		<category><![CDATA[Biomimetics]]></category>
		<category><![CDATA[Cognitive Sciences]]></category>
		<category><![CDATA[Computer Science]]></category>
		<category><![CDATA[Europe]]></category>
		<category><![CDATA[Robots, Brain, Mind and Behaviour]]></category>
		<category><![CDATA[European Commission]]></category>
		<category><![CDATA[FET Flagship]]></category>
		<category><![CDATA[Human Brain Project]]></category>
		<category><![CDATA[ICT]]></category>

		<guid isPermaLink="false">http://csnblog.specs-lab.com/?p=5378</guid>
		<description><![CDATA[Last week, the eyes of the scientific community were fixed on the € 1.2 billion Human Brain Project (HBP) as more than 150 European neuroscientists raised concerns over the project&#8217;s management in an open letter to the European Commission. One &#8230; <a href="https://csnblog.specs-lab.com/2014/07/18/human-brain-project-under-attack/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[
<div id="attachment_5379" style="width: 683px" class="wp-caption aligncenter"><a href="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/human-brain-project.jpg" rel="attachment wp-att-5379"><img class="size-full wp-image-5379" src="http://csnblog.specs-lab.com/wp-content/uploads/2014/07/human-brain-project.jpg" alt="Credit: Human Brain Project" width="673" height="378" /></a><p class="wp-caption-text">Credit: Human Brain Project</p></div>
<p>Last week, the eyes of the scientific community were fixed on the € 1.2 billion <a href="https://www.humanbrainproject.eu/">Human Brain Project</a> (HBP) as more than 150 European neuroscientists raised concerns over the project&#8217;s management in <a href="http://www.neurofuture.eu/">an open letter</a> to the European Commission.</p>
<p>One of Europe’s two <a href="http://cordis.europa.eu/fp7/ict/programme/fet/flagship/home_en.html">Flagship Initiatives</a>, the HBP spans 112 research institutions across 24 countries and was launched last year with the grand vision of creating a long-needed ICT infrastructure for future brain research. Not without controversy, the project adopted a bottom-up approach, aiming to build a computer simulation of the brain based exclusively on a fundamental understanding of neurons and their interactions.</p>
<p><span id="more-5378"></span></p>
<p>The public outcry is not surprising: the project has been surrounded by heated discussion from the very beginning, when a number of labs refused to take part because of its narrow focus on ICT and an apparent lack of basic neuroscience. Now many researchers fear that the project&#8217;s failure, which they consider inevitable, will provoke a wave of adverse reaction to neuroscience, undermining the future of the field.</p>
<p>The letter was largely driven by recent changes to the project plans for the next stage, which limit the role of cognitive scientists pursuing the difficult task of understanding the brain at the level of thought and behaviour. The labs working in this direction are now to be repositioned from the project’s core to what are known as partnering projects (PPs). The concern is that, while the resulting computer simulations may not be completely useless, without a more pronounced theoretical component they will fail to elucidate brain function.</p>
<p>A detailed review of the second stage by the European Commission is scheduled for January 2015, and the letter’s authors hope to draw the reviewers’ attention to flaws in both the science and the management of the project. The second stage is expected to receive € 100 million over the course of two to three years, with a 50/50 split between the core project and the PPs.</p>
<p><a href="https://www.humanbrainproject.eu/documents/10180/17646/HBP-Statement.090614.pdf">The official response</a>, released by the HBP two days after the letter, shows signs of conciliation and openness to dialogue. It states that “the members of the HBP are saddened by the open letter” and invites the signatories to engage in direct discussion with the project leaders. Importantly, the response strongly suggests that cognitive neuroscience and other basic research will play an increasingly crucial role in the project as the required ICT platform comes into place.</p>
<p>Many researchers still firmly stand by the project, arguing that it represents a long-needed change in brain research. You may also be interested in reading <a href="http://www.newscientist.com/article/mg22329784.400-defending-the-grand-vision-of-the-human-brain-project.html#.U8Zj542Szbw">this article</a> in defence of the project by <a href="http://www.unil.ch/lren/en/home/menuinst/lab-members/honorary-pis/richard-frackowiak.html">Richard Frackowiak</a>, the co-executive director of the HBP.</p>
<p>What is clear is that the HBP has not managed to fully unite neuroscientists, but for a project of this scale that is hardly surprising. The management may need to become more consensual, and we can only hope that the HBP will continue its 10-year journey to unravel the universe inside our heads.</p>
]]></content:encoded>
			<wfw:commentRss>https://csnblog.specs-lab.com/2014/07/18/human-brain-project-under-attack/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
