Robots of the future will also benefit from brain-computer interfaces (BCI). A BCI forges a direct, online communication channel between brain and machine and can augment human capabilities [SM18]. It allows brain activity to control a robot without the mediation of the peripheral somatomotor nervous system. BCI has major applications in enabling paralysed patients to communicate and control robotic prostheses, and in rehabilitation to restore neural function [SM18]. BCIs use machine learning to translate user intentions into outputs or actions. Recent advances in BCIs have been accelerated by progress in allied fields, including neuroscience, sensor technology, component miniaturisation, biocompatible materials, and embedded computing [SM18].
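The decoding step described above can be illustrated with a minimal sketch. This is not any particular BCI system's pipeline: the data is synthetic, and the nearest-centroid classifier and the command names (`turn_left`, `turn_right`) are illustrative assumptions standing in for a real feature-extraction and machine-learning stage.

```python
import numpy as np

# Hedged sketch: a BCI decoder maps brain-signal features (e.g. EEG band
# power) to discrete robot commands. All data here is synthetic; a real
# system would extract features from recorded neural activity.
rng = np.random.default_rng(0)

# Two imagined-movement classes, each a cloud of 8-dim feature vectors.
left = rng.normal(loc=-1.0, scale=0.5, size=(50, 8))
right = rng.normal(loc=+1.0, scale=0.5, size=(50, 8))

# "Training": a nearest-centroid classifier, one centroid per class.
centroids = {"left": left.mean(axis=0), "right": right.mean(axis=0)}
commands = {"left": "turn_left", "right": "turn_right"}  # hypothetical names

def decode(features):
    """Translate one feature vector into a robot command."""
    label = min(centroids, key=lambda c: np.linalg.norm(features - centroids[c]))
    return commands[label]

sample = rng.normal(loc=+1.0, scale=0.5, size=8)  # new trial of class "right"
print(decode(sample))  # prints "turn_right" for this synthetic draw
```

Real decoders replace the centroid rule with stronger classifiers (e.g. linear discriminants or neural networks), but the structure is the same: signals in, features, a learned mapping, commands out.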
For BCI to gain broader traction, it will require low-cost implantable sensing based on new microfabrication, packaging, and flexible electronics, combined with ultra-low-power local processing and wireless data paths. BCI currently faces several challenges: cumbersome size, high cost, artefacts of non-cerebral origin that complicate data processing, competition from simpler techniques such as eye tracking or muscle-based devices, and tasks with high degrees of freedom. Exciting new research opportunities will accompany the development of BCIs in robot control, functional rehabilitation, and knowledge exchange with neuroscience [SM18].
Robots of the future will also be able to learn from the behaviour of natural systems such as humans, primates, other vertebrates, and even insects [AAS18]. Creatures with relatively small, energy-efficient, and ‘simple’ nervous systems can demonstrate impressive levels of cognition that robots can aspire to. Such learning includes inferring the intentions of other animals, problem solving and tool use, observation and imitation, and learning motor actions via mirror neurons. These behaviours may form the basis of learning by imitation in animals, provide useful clues for the design of algorithms that enable robots to learn through observation and imitation, and inspire new robot learning strategies [AAS18].
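One common algorithmic reading of learning through observation and imitation is behavioural cloning: fit a policy to observed state-action pairs from a demonstrator. The sketch below is a deliberately minimal version under stated assumptions: the demonstrator is a synthetic linear controller, and the learned policy is a least-squares fit, standing in for the richer learning strategies discussed in [AAS18].

```python
import numpy as np

# Hedged sketch of learning by imitation (behavioural cloning): a learner
# observes demonstrator state-action pairs and fits a policy that imitates
# them. The "demonstrator" here is an assumed synthetic linear controller.
rng = np.random.default_rng(1)

# Demonstrations: 4-dim states and the demonstrator's 2-dim actions,
# generated by a policy unknown to the learner, plus small noise.
true_policy = np.array([[0.5, -0.2],
                        [0.1,  0.8],
                        [-0.3, 0.4],
                        [0.7,  0.0]])
states = rng.normal(size=(200, 4))
actions = states @ true_policy + 0.01 * rng.normal(size=(200, 2))

# Imitation as least squares: find W minimising ||states @ W - actions||.
learned_policy, *_ = np.linalg.lstsq(states, actions, rcond=None)

# The imitator now reproduces the demonstrator's behaviour on a new state.
new_state = rng.normal(size=4)
imitated = new_state @ learned_policy
demonstrated = new_state @ true_policy
```

In practice the linear fit would be replaced by a nonlinear function approximator, and the observed states would come from perception of the demonstrator rather than direct access, but the principle of fitting observed behaviour is the same.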