TY - GEN
T1 - Moving Object Tracking based on Kernel and Random-coupled Neural Network
AU - Chen, Yiran
AU - Liu, Haoran
AU - Liu, Mingzhe
AU - Liu, Yanhua
AU - Wang, Ruili
AU - Li, Peng
N1 - Publisher Copyright:
© 2024 Copyright held by the owner/author(s). Publication rights licensed to ACM.
PY - 2024/12/28
Y1 - 2024/12/28
AB - Moving object tracking on cost-effective hardware is a crucial need in numerous research and industrial applications. However, current deep learning-based tracking algorithms usually prioritize exceptional performance at the expense of increased computational load. Because expensive GPUs are unavailable for many tracking tasks, these popular trackers often fail to provide robust tracking with affordable computational resources. This study introduces RCNNshift, a kernel-based tracker that relies on feature extraction by a random-coupled neural network. This visual-cortex-inspired neural model extracts image features without requiring cumbersome pre-training or deep neural connections. Using an enhanced one-dimensional feature representation, RCNNshift outperforms other kernel-based object tracking methods, even those employing higher-dimensional feature spaces. Its improvement in the precision and success plots of one-pass evaluation (OPE), compared with Meanshift and Camshift in the HSV and RGB color spaces, exceeds 160% and 190%, respectively. Comparative experiments validate the robustness of RCNNshift, showing superior performance over various kernel-based and particle filter trackers. Its combination of robustness and computational efficiency makes RCNNshift an ideal choice for mid- to low-end object tracking tasks such as surveillance and underwater tracking. The source code is available at https://github.com/HaoranLiu507/RCNNshift.
KW - Kernel-based tracking
KW - Moving object tracking
KW - Random-coupled neural network
KW - Spiking neural networks
UR - http://www.scopus.com/inward/record.url?scp=85216222293&partnerID=8YFLogxK
U2 - 10.1145/3696409.3700168
DO - 10.1145/3696409.3700168
M3 - Conference contribution
AN - SCOPUS:85216222293
T3 - Proceedings of the 6th ACM International Conference on Multimedia in Asia, MMAsia 2024
BT - Proceedings of the 6th ACM International Conference on Multimedia in Asia, MMAsia 2024
PB - Association for Computing Machinery, Inc.
T2 - 6th ACM International Conference on Multimedia in Asia, MMAsia 2024
Y2 - 3 December 2024 through 6 December 2024
ER -