Graduate Degree Type
School of Engineering
Dr. Yunju Lee
Dr. Jaerock Kwon
Dr. Sunghwan Joo
Accurately identifying human key points is crucial for various applications, including activity recognition, pose estimation, and gait analysis. This study presents a high-resolution dataset created using the VICON motion capture system and three differently oriented 2D cameras. The dataset can be used to train neural networks to estimate a person's 2D key joint positions from 2D images or videos. The participants in the study were 25 healthy adults (17 males and 8 females) performing normal gait movements for about 2 to 3 seconds. The VICON system captured 3D ground-truth data, while the three 2D cameras collected images from different perspectives (0°, 45°, and 90°). The dataset was used to train the Body Pose Network (BPNET), a popular neural network model from NVIDIA TAO. For comparison, another BPNET model was trained on the COCO 2017 (Common Objects in Context) dataset, a state-of-the-art dataset with more than 118,000 annotated images. Results demonstrate that the proposed dataset achieved significantly higher accuracy than the COCO 2017 dataset, despite containing only one-fourth as many images. This reduction in data size also improved computational efficiency during model training. Moreover, the proposed dataset's unique focus on gait and its precise prediction of key joint positions during normal gait movements set it apart from existing datasets.
Potential applications of this study include person identification based on gait features, non-invasive detection of player concussions through temporal analysis in sports activities, and identification of pathologic gait patterns. The proposed dataset shows promise for further accuracy enhancements with the incorporation of additional data.
Lama, Bibash, "Enhancing Human Key Point Identification: A Comparative Study of High-Resolution VICON Dataset and COCO Dataset Using BPNET" (2023). Masters Theses, 1098.