TrajecTrack: Intelligent Trajectory Estimation, Speed Analysis, and Lane Detection for Autonomous Vehicles


Authors : Himesh Chauhan; Varun Choudhary; Syed Faizan Haider; Dr. Gokulnath C

Volume/Issue : Volume 10 - 2025, Issue 5 - May


Google Scholar : https://tinyurl.com/3jxn4vek

DOI : https://doi.org/10.38124/ijisrt/25may495


Note : A published paper may take 4-5 working days from the publication date to appear in PlumX Metrics, Semantic Scholar, and ResearchGate.

Note : Google Scholar may take 15 to 20 days to display the article.


Abstract : Autonomous vehicles (AVs) require sophisticated perception systems to operate safely in dynamic traffic scenes. We introduce TrajecTrack, a machine learning-based platform that integrates real-time trajectory estimation, speed estimation, and lane detection from LiDAR and camera inputs. Trajectories are predicted with DBSCAN clustering and a constant-velocity motion model, vehicle speed is estimated from YOLOv8 detections tracked with ByteTrack, and lanes are detected with a new module built on edge detection and the Hough transform. Evaluated on the NuScenes dataset and sample video input, TrajecTrack produces accurate visualizations of trajectories, speeds, and road lane markings, improving the situational awareness of AVs. The work contributes a scalable, unified solution for AV perception, with future extensions toward traffic violation detection.
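As a rough illustration of two of the components named in the abstract, the Python sketch below shows one plausible way to implement the constant-velocity trajectory forecast and the Canny-plus-Hough lane-detection step. The function names, region-of-interest geometry, Canny thresholds, and Hough parameters are illustrative assumptions made for this sketch, not values taken from the paper.

    import cv2
    import numpy as np

    def constant_velocity_forecast(prev_centroid, curr_centroid, dt, steps):
        # Velocity from the centroid displacement between two consecutive sweeps
        v = (np.asarray(curr_centroid) - np.asarray(prev_centroid)) / dt
        # Constant-velocity extrapolation: x(t + k*dt) = x(t) + k*dt*v
        return [np.asarray(curr_centroid) + v * dt * k for k in range(1, steps + 1)]

    def detect_lane_lines(frame):
        # Grayscale + Gaussian blur to suppress noise before edge detection
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)
        # Canny edge detection (thresholds are illustrative)
        edges = cv2.Canny(blurred, 50, 150)
        # Keep a trapezoidal region of interest in front of the vehicle
        h, w = edges.shape
        mask = np.zeros_like(edges)
        roi = np.array([[(0, h), (w // 2 - 50, h // 2 + 60),
                         (w // 2 + 50, h // 2 + 60), (w, h)]], dtype=np.int32)
        cv2.fillPoly(mask, roi, 255)
        masked = cv2.bitwise_and(edges, mask)
        # Probabilistic Hough transform returns candidate lane segments (x1, y1, x2, y2)
        lines = cv2.HoughLinesP(masked, rho=2, theta=np.pi / 180, threshold=50,
                                minLineLength=40, maxLineGap=100)
        return [] if lines is None else lines.reshape(-1, 4)

In this sketch, the forecast extrapolates a tracked object's cluster centroid forward in time, while the lane detector returns Hough line segments that can be overlaid on the camera frame for visualization.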

Keywords : Trajectory Estimation, Speed Estimation, Lane Detection, Autonomous Driving, LiDAR, Camera-Based Perception, DBSCAN, YOLOv8, ByteTrack, Hough Transform, NuScenes Dataset, Real-Time Analysis.

References :

  1. Caesar, H., et al., “nuScenes: A Multimodal Dataset for Autonomous Driving,” CVPR, 2020.
  2. Redmon, J., and Farhadi, A., “YOLOv3: An Incremental Improvement,” arXiv:1804.02767, 2018.
  3. Zhang, Y., et al., “ByteTrack: Multi-Object Tracking by Associating Every Detection Box,” arXiv:2110.06864, 2021.
  4. Ester, M., et al., “A Density-Based Algorithm for Discovering Clusters in Large Spatial Databases with Noise,” KDD, 1996.
  5. Kuhn, H. W., “The Hungarian Method for the Assignment Problem,” Naval Research Logistics Quarterly, 1955.

