Abstract
Although Global Navigation Satellite Systems (GNSSs) generally provide adequate accuracy for outdoor localization, this is not the case in indoor environments, where signals are obstructed. A self-contained localization scheme is therefore beneficial under such circumstances. Modern sensors and algorithms endow mobile robots with the capability to perceive their environment and enable the deployment of novel localization schemes, such as odometry and Simultaneous Localization and Mapping (SLAM). The former focuses on incremental localization, while the latter concurrently builds an interpretable map of the environment. In this context, this paper conducts a comprehensive review of sensor modalities for indoor odometry, including Inertial Measurement Units (IMUs), Light Detection and Ranging (LiDAR), radio detection and ranging (radar), and cameras, as well as applications of polymers in these sensors. Furthermore, the algorithms and fusion frameworks for pose estimation and odometry with these sensors are analyzed and discussed. This paper thereby traces the pathway of indoor odometry from principle to application. Finally, some future prospects are discussed.
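As a concrete illustration of the incremental nature of odometry (in contrast to SLAM, which also maintains a map), the following minimal Python sketch integrates planar velocity and yaw-rate measurements, e.g. from wheel encoders or a gyroscope, into a pose estimate. The unicycle motion model and the function name are illustrative assumptions for this sketch, not the method of the reviewed paper.

```python
import math

def integrate_odometry(pose, v, omega, dt):
    """Dead-reckon a planar pose (x, y, theta) forward by one time step.

    pose  : (x, y, theta) current estimate in the odometry frame
    v     : forward speed [m/s], e.g. from wheel encoders
    omega : yaw rate [rad/s], e.g. from a gyroscope
    dt    : time step [s]
    """
    x, y, theta = pose
    # Simple unicycle model: position advances along the current heading,
    # and the heading changes with the measured yaw rate.
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Example: drive forward while turning slowly. Measurement errors accumulate
# over time, which is why pure odometry drifts and is typically fused with
# other sensor modalities (IMU, LiDAR, radar, camera).
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_odometry(pose, v=0.5, omega=0.1, dt=0.1)
print(pose)
```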
Original language | English
---|---
Article number | 2019
Number of pages | 34
Journal | Polymers
Volume | 14
Issue number | 10
DOIs |
Publication status | Published - 15 May 2022
Keywords
- Camera
- IMU
- LiDAR
- Odometry
- Polymeric sensor
- Radar
- SLAM
- Self-contained localization
- Sensor fusion
- State estimation
ASJC Scopus subject areas
- General Chemistry
- Polymers and Plastics