TY - GEN
T1 - Indoor pedestrian dead reckoning calibration by visual tracking and map information
AU - Yan, Jingjing
AU - He, Gengen
AU - Basiri, Anahid
AU - Hancock, Craig
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/12/4
Y1 - 2018/12/4
N2 - Currently, Pedestrian Dead Reckoning (PDR) systems are becoming an important tool in indoor navigation. This is mainly due to the development of affordable and portable Micro Electro-Mechanical Systems (MEMS) on smartphones and the reduced need for additional infrastructure in indoor areas. The main drawbacks of this technology remain drift accumulation and the need for support from external positioning systems. Vision-aided inertial navigation is one possible solution to these problems and has become increasingly popular in indoor localization, offering improved performance over a standalone PDR system. Previous studies have used fixed platforms, and their visual tracking has relied on feature-extraction-based methods. This paper proposes a distributed implementation of the positioning system and uses deep learning for visual tracking. Meanwhile, as both inertial navigation and optical systems can provide only relative positioning information, this paper also proposes a method to integrate digital maps with real geographical coordinates to supply absolute locations. The hybrid system has been tested on the two most common smartphone operating systems, iOS and Android, using corresponding data collection apps, in order to assess the robustness of the method. Two different calibration methods are used: time synchronization of positions and heading calibration based on time steps. Results demonstrate that the localization information collected on both operating systems can be significantly improved after integration with the visual tracking data.
AB - Currently, Pedestrian Dead Reckoning (PDR) systems are becoming an important tool in indoor navigation. This is mainly due to the development of affordable and portable Micro Electro-Mechanical Systems (MEMS) on smartphones and the reduced need for additional infrastructure in indoor areas. The main drawbacks of this technology remain drift accumulation and the need for support from external positioning systems. Vision-aided inertial navigation is one possible solution to these problems and has become increasingly popular in indoor localization, offering improved performance over a standalone PDR system. Previous studies have used fixed platforms, and their visual tracking has relied on feature-extraction-based methods. This paper proposes a distributed implementation of the positioning system and uses deep learning for visual tracking. Meanwhile, as both inertial navigation and optical systems can provide only relative positioning information, this paper also proposes a method to integrate digital maps with real geographical coordinates to supply absolute locations. The hybrid system has been tested on the two most common smartphone operating systems, iOS and Android, using corresponding data collection apps, in order to assess the robustness of the method. Two different calibration methods are used: time synchronization of positions and heading calibration based on time steps. Results demonstrate that the localization information collected on both operating systems can be significantly improved after integration with the visual tracking data.
KW - indoor navigation
KW - pedestrian dead reckoning
KW - sensor fusion
KW - smartphone positioning
KW - visual tracking
UR - http://www.scopus.com/inward/record.url?scp=85060214356&partnerID=8YFLogxK
U2 - 10.1109/UPINLBS.2018.8559925
DO - 10.1109/UPINLBS.2018.8559925
M3 - Conference contribution
AN - SCOPUS:85060214356
T3 - Proceedings of 5th IEEE Conference on Ubiquitous Positioning, Indoor Navigation and Location-Based Services, UPINLBS 2018
BT - Proceedings of 5th IEEE Conference on Ubiquitous Positioning, Indoor Navigation and Location-Based Services, UPINLBS 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 5th IEEE Conference on Ubiquitous Positioning, Indoor Navigation and Location-Based Services, UPINLBS 2018
Y2 - 22 March 2018 through 23 March 2018
ER -