
Review of Multi-Person Video Tracking Optimizations Using Optical Flow, Convolutional Neural Networks, and Marker-Less Motion Capture

Author(s):

Ashish D. Thete, Prof. Ram Meghe Institute of Technology and Research, Sant Gadge Baba Amravati University, Amravati; Dr. Prashant V. Ingole, Prof. Ram Meghe Institute of Technology and Research, Sant Gadge Baba Amravati University, Amravati

Keywords:

Multi-Person Tracking, Convolutional Neural Networks, Optical Flow, Marker-Less Motion Capture, Deep Learning, Scenario

Abstract

The growing need for accurate and efficient multi-person video tracking has driven extensive research into optimizing tracking models for surveillance, healthcare, sports analytics, and behavioral analysis. Despite impressive progress, existing reviews in the domain rarely provide a structured taxonomy for comparing models across key performance metrics such as accuracy, occlusion handling, real-time efficiency, and adaptability. Previous reviews also lack a systematic analysis of deep-learning-driven approaches, hybrid methodologies, and their integration with decentralized frameworks such as blockchain. This work fills these gaps through a comprehensive, iterative, taxonomy-based review of recent state-of-the-art multi-person tracking models, evaluated under a PRISMA-driven framework. The structured review offers an in-depth comparative analysis, identifying optimal models for real-time tracking, medical diagnostics, and blockchain-based decentralization. It contributes to the development of next-generation tracking solutions by guiding researchers toward hybrid AI models, decentralized computing, and energy-efficient tracking mechanisms that enhance tracking accuracy, security, and scalability in real-world scenarios.
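A recurring building block in the tracking-by-detection models such reviews compare is the association step, where boxes from existing tracks are matched to new detections. The sketch below illustrates one simple variant, greedy intersection-over-union (IoU) matching; it is an illustrative assumption for exposition, not a method taken from the paper, and the function names and the 0.3 threshold are invented for this example.

```python
# Minimal sketch of the association step in tracking-by-detection:
# greedily match existing track boxes to new detections by IoU.
# Names and the 0.3 threshold are illustrative, not from the review.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def associate(tracks, detections, thresh=0.3):
    """Match each track id to its best-overlapping unclaimed detection.

    tracks: {track_id: box}; detections: list of boxes.
    Returns (matches, unmatched_detection_indices).
    """
    matches, unmatched = {}, list(range(len(detections)))
    for tid, tbox in tracks.items():
        best, best_iou = None, thresh
        for d in unmatched:
            score = iou(tbox, detections[d])
            if score > best_iou:
                best, best_iou = d, score
        if best is not None:
            matches[tid] = best
            unmatched.remove(best)
    return matches, unmatched
```

Unmatched detections would typically spawn new tracks, and tracks left unmatched for several frames would be dropped; production trackers replace the greedy loop with Hungarian assignment and add motion or appearance cues, which is where the optical-flow and CNN components surveyed above come in.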

Other Details

Paper ID: IJSRDV13I30024
Published in: Volume : 13, Issue : 3
Publication Date: 01/06/2025
Page(s): 37-42
