We increasingly use robots for logistics and delivery, manufacturing, cleaning and disinfection, and various other services. A significant reason for their expanded use is their capacity for autonomous navigation, object detection, force measurement, and collision detection. At the heart of these capabilities are sensors that enable robots to perceive their environments, measuring parameters such as angular velocity and temperature, and to adjust their behavior when necessary.
A recent IDTechEx research report, “Sensors for Robotics 2023-2043: Technologies, Markets, and Forecasts,” predicts annual revenue for robotics sensors will exceed US$80 billion by 2043.
The report categorizes robotic navigation sensors as either proprioceptive or exteroceptive. Proprioceptive sensors measure internal data such as joint speed, torque, position, and force. Exteroceptive sensors collect parameters of the surrounding environment, such as light intensity, chemicals, and distance to an object. Robots rely on multiple sensors to handle complicated tasks, and sensor fusion algorithms merge data across modalities such as sound, vision, haptics, and force to produce a robust output.
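As a minimal illustration of sensor fusion, the sketch below fuses a proprioceptive reading (gyroscope angular velocity) with an exteroceptive, gravity-referenced reading (accelerometer tilt) using a complementary filter. This is a generic textbook technique, not one named in the report; the function name, parameters, and data shapes here are illustrative assumptions.

```python
import math

def complementary_filter(gyro_rates, accel_samples, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate.

    gyro_rates: angular velocity about the pitch axis (rad/s), one per step
    accel_samples: (ax, az) accelerometer readings (m/s^2), one per step
    dt: time step in seconds
    alpha: weight given to the integrated gyro vs. the accelerometer
    (All names and shapes are illustrative, not from the report.)
    """
    angle = 0.0
    for rate, (ax, az) in zip(gyro_rates, accel_samples):
        gyro_angle = angle + rate * dt     # integrate angular velocity (drifts)
        accel_angle = math.atan2(ax, az)   # gravity gives an absolute tilt (noisy)
        # Blend: the gyro tracks fast motion, the accelerometer corrects drift
        angle = alpha * gyro_angle + (1 - alpha) * accel_angle
    return angle

# Example: a stationary robot tilted 0.1 rad; the gyro reads zero while the
# accelerometer senses the gravity components of the tilt.
g = 9.81
tilt = 0.1
rates = [0.0] * 200
accels = [(g * math.sin(tilt), g * math.cos(tilt))] * 200
estimate = complementary_filter(rates, accels, dt=0.01)
```

The blend weight `alpha` reflects the usual trade-off: the gyro is accurate over short intervals but drifts, while the accelerometer is noisy but drift-free, so each compensates for the other's weakness.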
Fig. 1. Commonly used sensors by type and application. Source: IDTechEx
IDTechEx reports that autonomous mobility requires robots to have a suite of functions, including navigation and mapping, object detection, proximity detection, posture control, and collision detection, with navigation and safety as priorities. The sensors used for navigation and safety include cameras, radar, and LiDAR. The choice of sensors naturally depends heavily on the robot's working environment.