Visual SLAM combines AI and 3D vision technologies to deliver superior performance compared with other guidance techniques for autonomous mobile robots (AMRs). Because it offers significant advantages over navigation methods such as magnetic tape, QR codes, and traditional 2D SLAM, which require additional infrastructure to function, Visual SLAM AMRs are being adopted by companies to handle an expanding range of production and distribution tasks.
By eliminating the need to change the environment, stop production, or add infrastructure, Visual SLAM technology cuts commissioning time by up to 20 percent compared with 2D SLAM, significantly shortening the time needed to introduce a new AMR into an existing fleet. The technology can be used at scale, with fleets updated remotely. It is also secure: only raw data is analyzed, and no visual images are saved on either the AMR or a server.
ABB developed its Visual SLAM AMRs in collaboration with partner Sevensense Robotics, an ETH Zurich spin-off founded in 2018 whose technology builds on more than 15 years of research. Visual SLAM is already deployed in industrial projects for customers in automotive and retail, with the potential to replace conventional production lines with intelligent, modular production cells served by AMRs.
The technology will be incorporated into ABB’s latest-generation AMRs, the AMR T702V from Q3 2023 and the AMR P604V from Q4 2023, followed by further AMR products incorporating Visual SLAM through 2025. “Our introduction of Visual SLAM AMRs radically enhances companies’ operations, making them faster, more efficient and more flexible, while freeing up employees to take on more rewarding work,” said Marc Segura, President of ABB Robotics Division.
Visual SLAM uses cameras mounted on the AMR to create a real-time 3D map of all objects in the surrounding area. The system can differentiate between fixed navigation references, such as floors, ceilings and walls, which need to be added to the map, and objects such as people or vehicles that move or change position. The cameras detect and track natural features in the environment, enabling the AMR to dynamically adapt to its surroundings and determine the safest and most efficient route to its destination. Unlike 2D SLAM, Visual SLAM requires no additional references such as reflectors or markers, saving cost and space, and it offers positioning accurate to within three millimeters.
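To illustrate the general principle described above, the following is a minimal sketch of a visual front end, not ABB's or Sevensense's actual implementation. It assumes OpenCV and NumPy, uses ORB to detect natural features in camera frames, Lucas-Kanade optical flow to track them, and a RANSAC fundamental-matrix check to separate points that behave like fixed scene structure (candidates for the map) from points on independently moving objects such as people or vehicles.

```python
import cv2
import numpy as np

def track_and_classify(prev_gray, gray, prev_pts):
    """Track feature points between frames and label each tracked point
    as static (fits the dominant camera motion) or dynamic (moves
    independently, e.g. a person or vehicle)."""
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
    ok = status.ravel() == 1
    p0, p1 = prev_pts[ok], next_pts[ok]
    if len(p0) < 8:  # fundamental-matrix estimation needs at least 8 points
        return p1, np.zeros(len(p1), dtype=bool)
    # Points consistent with a single epipolar geometry are treated as fixed
    # scene structure; outliers are treated as independently moving objects.
    _, inliers = cv2.findFundamentalMat(p0, p1, cv2.FM_RANSAC, 1.0, 0.99)
    if inliers is None:
        return p1, np.zeros(len(p1), dtype=bool)
    return p1, inliers.ravel().astype(bool)

cap = cv2.VideoCapture(0)                 # any camera index or video file path
orb = cv2.ORB_create(nfeatures=1000)      # detects natural features, no markers needed

ret, frame = cap.read()
if not ret:
    raise RuntimeError("no camera/video input available")
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
prev_pts = np.float32([k.pt for k in orb.detect(prev_gray, None)]).reshape(-1, 1, 2)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts, static = track_and_classify(prev_gray, gray, prev_pts)
    # In a full system, the static points would feed map building and pose
    # estimation, while dynamic points are ignored for localization.
    prev_gray = gray
    prev_pts = pts.reshape(-1, 1, 2)
    if len(prev_pts) < 200:               # replenish features as tracks are lost
        prev_pts = np.float32([k.pt for k in orb.detect(gray, None)]).reshape(-1, 1, 2)
```

This sketch covers only feature tracking and static/dynamic separation; a complete Visual SLAM pipeline would additionally estimate camera poses, build and maintain the 3D map, and close loops when revisiting known areas.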
(Press release / SK)