⚠ Official Notice: www.ijisrt.com is the official website of the International Journal of Innovative Science and Research Technology (IJISRT) Journal for research paper submission and publication. Please beware of fake or duplicate websites using the IJISRT name.



Visual Intelligence in Resource-Constrained Edge Devices: A Review of Raspberry Pi


Authors : Archana D.; Radhika V.

Volume/Issue : Volume 11 - 2026, Issue 3 - March


Google Scholar : https://tinyurl.com/24y8eywe

Scribd : https://tinyurl.com/2hax2u86

DOI : https://doi.org/10.38124/ijisrt/26mar114



Abstract : Advances in computer vision have given smart systems substantial capability to perceive their surroundings and make autonomous decisions across a broad spectrum of applications. Nonetheless, executing vision-based algorithms on resource-constrained edge devices is difficult, since such devices are severely limited in computational power, memory, and energy supply. This review surveys computer vision algorithms suited to low-cost, low-power embedded systems, with a specific focus on the Raspberry Pi family. Object detection and multi-object tracking methods, spanning classical vision algorithms as well as deep learning models such as YOLO, SSD, and MobileNet, are reviewed and compared on the evaluation metrics most commonly reported, including inference speed, latency, model size, parameter count, and detection accuracy (mAP), where available. Optimisation techniques such as quantization, pruning, and lightweight backbone design, and the use of edge AI accelerators (e.g., Google Coral TPU and Intel Neural Compute Stick 2), are also discussed. The survey summarizes results from benchmark datasets and real-world systems to highlight the major trade-offs in embedded vision deployment, and suggests practical detector-tracker combinations for applications in surveillance, mobile robotics, and smart transportation.
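The detector-tracker pairing the abstract refers to typically works by associating each tracked box with the current frame's detections. As a generic illustration (not code from any surveyed system), the following sketch shows greedy IoU-based association, the core of lightweight trackers often run alongside a detector on a Raspberry Pi; the function names and threshold are illustrative assumptions.

```python
def iou(a, b):
    """Intersection-over-union of two boxes in (x1, y1, x2, y2) format."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def associate(tracks, detections, thresh=0.3):
    """Greedy matching: each track claims its best unmatched detection
    whose IoU exceeds the threshold; unclaimed detections can later
    spawn new tracks, unmatched tracks age out."""
    matches, used = {}, set()
    for tid, tbox in tracks.items():
        best, best_iou = None, thresh
        for i, dbox in enumerate(detections):
            if i in used:
                continue
            v = iou(tbox, dbox)
            if v > best_iou:
                best, best_iou = i, v
        if best is not None:
            matches[tid] = best
            used.add(best)
    return matches

tracks = {0: (10, 10, 50, 50), 1: (100, 100, 140, 140)}
dets = [(12, 11, 52, 49), (200, 200, 240, 240)]
print(associate(tracks, dets))  # track 0 matches detection 0; track 1 is unmatched
```

On constrained hardware this association step is essentially free; the trade-off discussed in the review is how often the (expensive) detector must run to keep such a simple tracker from drifting.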

Keywords : Edge Computing, Raspberry Pi, Object Detection, Object Tracking, Resource-Constrained Devices, Lightweight Vision Algorithms.
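Of the optimisation techniques the abstract lists, post-training quantization is the most widely used on the Raspberry Pi: float32 weights are mapped to int8 with a per-tensor scale and zero point, shrinking the model roughly 4x. The following is a minimal NumPy sketch of affine 8-bit quantization to make the idea concrete; it is an illustrative assumption, not the pipeline of any particular framework.

```python
import numpy as np

def quantize_int8(weights):
    """Affine (asymmetric) 8-bit quantization: map float weights onto
    int8 via a per-tensor scale and integer zero point."""
    w_min, w_max = float(weights.min()), float(weights.max())
    qmin, qmax = -128, 127
    scale = (w_max - w_min) / (qmax - qmin)
    zero_point = int(round(qmin - w_min / scale))
    q = np.clip(np.round(weights / scale) + zero_point, qmin, qmax)
    return q.astype(np.int8), scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float weights for accuracy comparison."""
    return (q.astype(np.float32) - zero_point) * scale

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(64, 64)).astype(np.float32)
q, s, zp = quantize_int8(w)
w_hat = dequantize(q, s, zp)
print("int8 bytes vs float32 bytes:", q.nbytes, "vs", w.nbytes)
print("max abs reconstruction error:", float(np.abs(w - w_hat).max()))
```

Frameworks such as TensorFlow Lite apply the same idea per layer (often per channel) and pair it with int8 kernels, which is where the latency gains on ARM CPUs and edge TPUs come from.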


