Presentation - Towards a Robot Web of Interoperable Spatial AI
Safe and useful robots for complex environments must use their on-board sensors and computation to map, understand and localise within their surroundings, and we can envision a future where many such devices, with different functions and made by different companies, will operate in the same space. Is there a more modular way for this to work than requiring all devices to use the same unified cloud-based "maps" system?
I will present and demonstrate our Robot Web proposal for distributed solutions to multi-robot localisation and planning, based on per-device local computation and storage, and peer-to-peer communication between heterogeneous devices via standardised open protocols. Our method uses Gaussian Belief Propagation-based distributed inference on the full non-linear factor graph, and is highly robust and scalable while remaining simple and modular.
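To give a flavour of the inference machinery involved, the sketch below shows Gaussian Belief Propagation on a toy scalar chain factor graph: three poses, a Gaussian prior on the first, and "odometry" factors between neighbours, with messages passed in information form. This is an illustrative example written for this summary, not the talk's actual code; all function names and parameter values are invented, and a real Robot Web system would run such message passing across separate devices over a network rather than in one process.

```python
import numpy as np

# Toy synchronous Gaussian Belief Propagation on a scalar chain factor
# graph: poses x0..x_{n-1}, a Gaussian prior on x0, and between ("odometry")
# factors x_{i+1} - x_i ~ N(z_i, 1/w). Everything is kept in information
# (canonical) form: eta = Lambda * mu.

def between_factor(z, w):
    """Information form of the quadratic factor (w/2) * (x_b - x_a - z)^2."""
    return w * np.array([-z, z]), w * np.array([[1.0, -1.0], [-1.0, 1.0]])

def gbp_chain(prior_mu, prior_var, measurements, w, iters=20):
    n = len(measurements) + 1
    # Factor-to-variable messages, each stored as an (eta, lam) pair.
    right = [(0.0, 0.0)] * (n - 1)  # message from factor i to x_{i+1}
    left = [(0.0, 0.0)] * (n - 1)   # message from factor i to x_i
    p_eta, p_lam = prior_mu / prior_var, 1.0 / prior_var

    def belief(i):
        # Belief at x_i = prior (if any) plus all incoming factor messages.
        eta = p_eta if i == 0 else 0.0
        lam = p_lam if i == 0 else 0.0
        if i > 0:
            eta, lam = eta + right[i - 1][0], lam + right[i - 1][1]
        if i < n - 1:
            eta, lam = eta + left[i][0], lam + left[i][1]
        return eta, lam

    for _ in range(iters):
        new_right, new_left = [], []
        for i, z in enumerate(measurements):
            (fe_a, fe_b), fl = between_factor(z, w)
            # Message to x_{i+1}: absorb x_i's incoming information
            # (its belief minus this factor's own message), then
            # marginalise x_i out of the joint factor potential.
            be, bl = belief(i)
            ie, il = be - left[i][0], bl - left[i][1]
            ea, la = fe_a + ie, fl[0, 0] + il
            new_right.append((fe_b - fl[1, 0] / la * ea,
                              fl[1, 1] - fl[1, 0] ** 2 / la))
            # Message to x_i: same procedure, marginalising x_{i+1}.
            be, bl = belief(i + 1)
            ie, il = be - right[i][0], bl - right[i][1]
            eb, lb = fe_b + ie, fl[1, 1] + il
            new_left.append((fe_a - fl[0, 1] / lb * eb,
                             fl[0, 0] - fl[0, 1] ** 2 / lb))
        right, left = new_right, new_left  # synchronous update

    # Recover posterior means mu = eta / lam from the final beliefs.
    return [belief(i)[0] / belief(i)[1] for i in range(n)]

# Prior x0 ~ N(0, 1), two odometry steps of +1 with precision w = 10;
# on a chain (a tree) GBP converges to the exact marginal means.
means = gbp_chain(0.0, 1.0, [1.0, 1.0], w=10.0)
# means is approximately [0.0, 1.0, 2.0]
```

Because each message depends only on local factors and the beliefs of immediate neighbours, the same update rule can run asynchronously on separate devices that only exchange small (eta, lam) messages, which is the property that makes this style of inference attractive for a distributed, peer-to-peer setting.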
Andrew Davison is Professor of Robot Vision and Director of the Dyson Robotics Laboratory at Imperial College London. His long-term research focus is on SLAM (Simultaneous Localisation and Mapping) and its evolution towards general 'Spatial AI': computer vision algorithms which enable robots and other artificial devices to map, localise within and ultimately understand and interact with the 3D spaces around them. With his research group and collaborators he has consistently developed and demonstrated breakthrough systems, including MonoSLAM, KinectFusion, SLAM++ and CodeSLAM, and recent prizes include Best Paper at ECCV 2016, Best Paper Honourable Mention at CVPR 2018 and the Helmholtz Prize at ICCV 2021. He has also had strong involvement in taking this technology into real applications, in particular through his work with Dyson on the design of the visual mapping system inside the Dyson 360 Eye robot vacuum cleaner and as co-founder of applied SLAM start-up SLAMcore. He was elected Fellow of the Royal Academy of Engineering in 2017.