Authors :
Aishwarya I. Kovalli; Dr. Girish Kumar D.
Volume/Issue :
Volume 11 - 2026, Issue 4 - April
Google Scholar :
https://tinyurl.com/3um96wwh
Scribd :
https://tinyurl.com/2kremaw4
DOI :
https://doi.org/10.38124/ijisrt/26apr992
Note : A published paper may take 4-5 working days from the publication date to appear in PlumX Metrics, Semantic Scholar, and ResearchGate.
Abstract :
The growing need for flexible and personalized fitness support has led to the adoption of artificial intelligence in
modern digital health systems. Most existing fitness applications depend on pre-recorded content and user self-evaluation,
which limits real-time posture guidance and customized feedback. This paper proposes an AI-enabled, web-based fitness
and yoga coaching platform that performs real-time body movement analysis using a standard webcam. The system applies
MediaPipe-based pose estimation to identify key skeletal landmarks and calculate joint angles for assessing pose correctness
by comparing user movements with predefined standard poses. Immediate feedback is provided through visual and text-based prompts, allowing users to correct their posture during exercise sessions. In addition, the platform offers personalized
workout suggestions and performance monitoring by recording accuracy levels, workout duration, and progress trends over
time. Experimental results show that the system delivers accurate pose detection with minimal latency in real-world
environments, making it suitable for real-time use. Overall, the proposed solution serves as an efficient, scalable, and easily
accessible alternative to traditional face-to-face fitness training.
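The core pipeline described above — extract landmarks, compute joint angles, compare against a reference pose, emit corrective prompts — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the joint names, the reference angles, and the 15-degree tolerance are all assumptions for the example, and in the real system the (x, y) coordinates would come from MediaPipe Pose landmarks rather than hard-coded tuples.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by segments b->a and b->c.
    Each point is an (x, y) pair, e.g. normalized landmark coordinates."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0])
        - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360.0 - ang if ang > 180.0 else ang

def pose_feedback(measured, reference, tolerance=15.0):
    """Compare measured joint angles (deg) against a reference pose.
    Returns a correction hint per joint whose deviation exceeds tolerance."""
    hints = {}
    for joint, ref_angle in reference.items():
        diff = measured.get(joint, 0.0) - ref_angle
        if abs(diff) > tolerance:
            hints[joint] = f"adjust by {-diff:+.0f} deg"
    return hints

# Illustrative use: a right angle at the elbow vs. a 150-degree target.
elbow = joint_angle((0.0, 1.0), (0.0, 0.0), (1.0, 0.0))  # 90 degrees
hints = pose_feedback({"left_elbow": elbow}, {"left_elbow": 150.0})
```

In a live session this comparison would run once per video frame, with the hint strings rendered as the on-screen text prompts the abstract mentions.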