Call for Papers: Special Issue on Unconventional Sensors in Robotics
Endowing robots with range, visual and proprioceptive sensing capabilities has enabled huge scientific progress in many of their fundamental skills, such as mapping, navigation and interaction. These conventional modalities, however, are limited in many respects and do not cover the full spectrum of potential robots, tasks and scenarios. Passive visual sensing, for instance, requires significant computational resources that might not be available on all platforms. Active range sensing has a similarly high computational footprint and also significant power consumption. None of these sensing technologies is a good fit for underwater, micro/nano- or soft robots, to name a few. Moreover, these perceptual abilities might be insufficient or inadequate for some robotic challenges, such as fine-grained/high-speed/accurate manipulation, chemical source localisation, high-speed tracking, avoidance of highly dynamic obstacles, autonomy in extreme/hazardous environments, or natural human-robot interaction.
The limitations of conventional sensors stem from several factors: the capabilities of existing technologies, their inability to measure certain variables of interest, and insufficient accuracy or robustness under certain conditions. For instance, visual sensors are strongly constrained by the illumination and the transmission medium and, in particular, are severely limited in the dark or in turbid waters. In other cases, the response time of the sensors places strict limits on the dynamics of the robot and the scene. In these cases and many others, research on unconventional sensing modalities can equip robots with new capabilities and improve their fundamental ones, contributing to addressing many relevant challenges and broadening the tasks and scenarios where robots can be deployed.
This special issue will focus on the use of unconventional sensors to address robotic challenges where standard vision, range and proprioception are not sufficient, covering both the scientific foundations of novel unconventional sensors and their use and application in real robots.
Topics of interest include, but are not necessarily limited to:
- Unconventional visual sensors.
- Sound perception.
- Tactile sensing.
- Olfactory sensing.
- Applications of unconventional sensors in robotics.
- Learning and cognition using unconventional sensing.
- Semantic perception using unconventional sensors.
- Localization and mapping using unconventional sensors.
- Sensor fusion using unconventional sensors for navigation, mapping, obstacle avoidance and object manipulation.
- Sensory feedback in prosthetic devices using unconventional sensors.
- Unconventional sensing for feedback control.
- Motion planning and re-planning using unconventional sensor measurements.
- Design and development of novel unconventional sensors.
- Deep learning-based approaches to unconventional sensor data.
Guest Editors:
- Augusto Gomez Eguiluz, University of Seville, Spain (email@example.com)
- Javier Civera, University of Zaragoza, Spain (firstname.lastname@example.org)
- Inaki Rano, University of Southern Denmark, Denmark (email@example.com)
Contributors are welcome to contact the guest editors of the special issue with any questions.
Important Dates:
- June 1st 2021 - Submission deadline for manuscripts.
- July 31st 2021 - Initial reviews completed.
- August 31st 2021 - First-round decisions and author notification.
- September 30th 2021 - Second-round submission deadline for conditionally accepted papers.
- November 15th 2021 - Final decisions and author notification.
- December 31st 2021 - Final manuscripts due for publication.
Submit manuscripts to: http://AURO.edmgr.com
Please choose “SI 205: Unconventional Sensors in Robotics” as the Article Type.
Authors are encouraged to submit high-quality, original work that has neither appeared in, nor is under consideration by other journals.
Papers must be prepared in accordance with the Journal guidelines: www.springer.com/10514
Springer provides a wealth of information about publishing in a Springer journal on its Journal Author Resources page, including FAQs and tutorials, along with Help and Support.