Robotic rodents that can "feel" objects could soon become a reality thanks to European scientists who are working hard to create artificial mice.
As part of the AMouse project, researchers from Italy, Germany and Switzerland have created a "feeling" robot by developing a new type of sensor modelled on hypersensitive mouse whiskers.
The researchers developed a series of sensors based on a variety of 'whisker' types, which were then mounted on mostly standard research robots.
The project, funded by the Future and Emerging Technologies initiative of the EU's Information Society Technologies programme, has given the researchers enhanced insights into how mouse whiskers do their job, and enabled them to develop prototypes that can be used to distinguish between different textures or objects.
Researchers used a variety of materials for the whiskers, from plastic to human hair attached to the condenser plate of a microphone. As the whisker encountered an object or surface, it deformed the microphone diaphragm in a measurable way, allowing researchers to track characteristic signals from particular surfaces.
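The principle described above can be illustrated with a small sketch. This is not the AMouse code: it is a hypothetical simulation, in which a surface's roughness excites the whisker (and hence the microphone diaphragm) at a characteristic frequency, and a naive DFT scan picks out that frequency to tell surfaces apart. All names, sample rates and texture frequencies are illustrative assumptions.

```python
import math

SAMPLE_RATE = 8000  # samples per second (assumed value)

def whisker_signal(texture_freq_hz, duration_s=0.1):
    """Simulate the microphone output as the whisker sweeps a surface:
    the surface excites the whisker at a characteristic frequency."""
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * texture_freq_hz * t / SAMPLE_RATE)
            for t in range(n)]

def dominant_frequency(signal):
    """Naive DFT scan: return the frequency whose bin has the most energy."""
    n = len(signal)
    best_freq, best_power = 0.0, 0.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(signal))
        im = sum(s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(signal))
        power = re * re + im * im
        if power > best_power:
            best_freq, best_power = k * SAMPLE_RATE / n, power
    return best_freq

def classify(signal, textures):
    """Match the signal's dominant frequency to the nearest known texture.
    `textures` maps a texture name to its characteristic frequency in Hz."""
    f = dominant_frequency(signal)
    return min(textures, key=lambda name: abs(textures[name] - f))
```

For example, with `textures = {"smooth": 50, "rough": 400}`, a signal generated at 400 Hz is classified as "rough". A real whisker signal would of course be noisier, and the project's multimodal experiments combined such touch data with vision rather than using it alone.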
The scientists then experimented with various whisker arrays and designs to discover the optimal profile.
"Even more exciting, however, were the results from 'multimodal' sensor experiments. These use a combination of vision and touch through whisker and light or camera sensors," the scientists stated in a newly published report.
"The mix of sensory inputs revealed how different data sources affect each other and how they combine to provide a clearer perception of any particular object. Some robots even manifested emergent behaviour."
In one startling outcome, an AMouse robot demonstrated what appeared to be emergent behaviour: it developed a homing instinct without any pre-programming.
"Essentially we put in the sensors and then wire them up through the robot's 'brain', its CPU. We just switch it on without giving it instructions of any kind," said Simon Bovet, a Ph.D. student at the University of Zurich.
When he threw the switch, his robot moved about the room but always returned to the spot where it began.
"I think emergent behaviour like this will be a major area in neuroscience and robotics research in the future," added Dr Andreas K. Engel, professor at the University Medical Centre in Hamburg and co-ordinator of the AMouse project.
"We can study neural pathways and neural coding in a machine, in a way that's currently impossible in humans. In a robot we can isolate a particular neural pathway to see what happens to other neurons when we trigger a specific one.
"In humans, if we stimulate one neuron it will influence a large number of other neurons, so it's impossible to track what's going on."
Dr Engel believes that robotic models will offer many further insights into human cognition in the future.