Real-Time Tracking and Distance Measurement of OpenCV ArUco Marker Using Webcam


Ali Shuja Sardar

Object tracking and distance measurement play a vital role in robotics and drones. It is often challenging to measure the distance to a target object using only a single-vision camera. This paper discusses the development of a fiducial-marker-based object tracking and distance measurement system. Marker detection uses the ArUco method from the OpenCV library with Python 3.x. The hardware consists of an Arduino, a single-vision camera, and two servos acting as actuators for tracking. A mathematical equation is derived to measure the real-time distance of the marker using a single camera, and the frame size and camera output colors are adjusted to improve the detection method's performance. OpenCV is used to find the center coordinates of the marker's bounding box, and a tracking algorithm supplies pan/tilt angles to the servos. Finally, an acceptable error is defined to stabilize the tracking mechanism. The system's accuracy is evaluated over 100 trials, and the results show good accuracy when tracking the ArUco marker. The system is highly beneficial for indoor mobile robot navigation and drone applications.
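The single-camera distance measurement and pan/tilt tracking described in the abstract can be sketched roughly as follows. This is a minimal illustration assuming a simple pinhole-camera model (distance = real width × focal length in pixels ÷ pixel width) and a fixed-step servo update with a pixel dead-zone; all function names, constants, and the servo step logic are illustrative assumptions, not the paper's exact derivation.

```python
# Hypothetical sketch: single-camera marker distance + pan/tilt tracking.
# Assumes the pinhole model; names and constants are illustrative only.

def calibrate_focal_px(pixel_width, known_distance_cm, real_width_cm):
    """Estimate focal length in pixels from one reference measurement of a
    marker of known physical size at a known distance."""
    return (pixel_width * known_distance_cm) / real_width_cm

def estimate_distance_cm(pixel_width, real_width_cm, focal_px):
    """Invert the pinhole relation to recover the marker's distance from
    its apparent width in the frame."""
    return (real_width_cm * focal_px) / pixel_width

def pan_tilt_step(cx, cy, frame_w, frame_h, pan, tilt,
                  step=1.0, dead_zone=15):
    """Nudge servo angles so the marker's bounding-box center (cx, cy)
    approaches the frame center; dead_zone plays the role of the
    'acceptable error' that stabilizes the mechanism."""
    err_x = cx - frame_w / 2
    err_y = cy - frame_h / 2
    if abs(err_x) > dead_zone:
        pan += step if err_x > 0 else -step
    if abs(err_y) > dead_zone:
        tilt += step if err_y > 0 else -step
    # Clamp to a typical hobby-servo range of 0..180 degrees.
    clamp = lambda a: max(0.0, min(180.0, a))
    return clamp(pan), clamp(tilt)

if __name__ == "__main__":
    # Calibration: a 10 cm marker appears 100 px wide at 100 cm.
    f = calibrate_focal_px(100, 100, 10)              # -> 1000.0 px
    print(estimate_distance_cm(50, 10, f))            # marker 50 px wide -> 200.0 cm
    print(pan_tilt_step(400, 240, 640, 480, 90, 90))  # marker right of center
```

In a real pipeline the bounding-box center and pixel width would come from `cv2.aruco` marker detection on each webcam frame, and the returned angles would be written to the servos via the Arduino.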

Real-Time Tracking and Distance Measurement of OpenCV ArUco Marker Using Webcam. (2025). International Journal of Latest Technology in Engineering Management & Applied Science, 14(1), 313-321. https://doi.org/10.51583/IJLTEMAS.2025.1401034


