TY - JOUR
T1 - Sensors and Sensor Fusion Methodologies for Indoor Odometry
T2 - A Review
AU - Yang, Mengshen
AU - Sun, Xu
AU - Jia, Fuhua
AU - Rushworth, Adam
AU - Dong, Xin
AU - Zhang, Sheng
AU - Fang, Zaojun
AU - Yang, Guilin
AU - Liu, Bingjian
N1 - Publisher Copyright:
© 2022 by the authors. Licensee MDPI, Basel, Switzerland.
PY - 2022/5/1
Y1 - 2022/5/1
N2 - Although Global Navigation Satellite Systems (GNSSs) generally provide adequate accuracy for outdoor localization, this is not the case in indoor environments due to signal obstruction. A self-contained localization scheme is therefore beneficial in such circumstances. Modern sensors and algorithms endow mobile robots with the capability to perceive their environment and enable the deployment of novel localization schemes, such as odometry and Simultaneous Localization and Mapping (SLAM). The former focuses on incremental localization, while the latter concurrently builds an interpretable map of the environment. In this context, this paper conducts a comprehensive review of sensor modalities for indoor odometry, including Inertial Measurement Units (IMUs), Light Detection and Ranging (LiDAR), radio detection and ranging (radar), and cameras, as well as the applications of polymers in these sensors. Furthermore, the algorithms and fusion frameworks for pose estimation and odometry with these sensors are analyzed and discussed. This paper thus charts the pathway of indoor odometry from principle to application. Finally, future prospects are discussed.
AB - Although Global Navigation Satellite Systems (GNSSs) generally provide adequate accuracy for outdoor localization, this is not the case in indoor environments due to signal obstruction. A self-contained localization scheme is therefore beneficial in such circumstances. Modern sensors and algorithms endow mobile robots with the capability to perceive their environment and enable the deployment of novel localization schemes, such as odometry and Simultaneous Localization and Mapping (SLAM). The former focuses on incremental localization, while the latter concurrently builds an interpretable map of the environment. In this context, this paper conducts a comprehensive review of sensor modalities for indoor odometry, including Inertial Measurement Units (IMUs), Light Detection and Ranging (LiDAR), radio detection and ranging (radar), and cameras, as well as the applications of polymers in these sensors. Furthermore, the algorithms and fusion frameworks for pose estimation and odometry with these sensors are analyzed and discussed. This paper thus charts the pathway of indoor odometry from principle to application. Finally, future prospects are discussed.
KW - Camera
KW - IMU
KW - LiDAR
KW - Odometry
KW - Polymeric sensor
KW - Radar
KW - SLAM
KW - Self-contained localization
KW - Sensor fusion
KW - State estimation
UR - http://www.scopus.com/inward/record.url?scp=85132549993&partnerID=8YFLogxK
U2 - 10.3390/polym14102019
DO - 10.3390/polym14102019
M3 - Review article
AN - SCOPUS:85132549993
SN - 2073-4360
VL - 14
JO - Polymers
JF - Polymers
IS - 10
M1 - 2019
ER -