Development of a Human Activity Recognition System for Ballet Tasks.
Sports Medicine - Open, 2020 February 8
BACKGROUND: Accurate and detailed measurement of a dancer's training volume is a key requirement for understanding the relationship between a dancer's pain and training volume. No system currently exists that can quantify a dancer's training volume with respect to specific movement activities. Machine learning models applied to wearable sensor data have previously been used for human activity recognition in sports such as cricket, tennis, and rugby. Thus, the purpose of this study was to develop a human activity recognition system using wearable sensor data to accurately identify key ballet movements (jumping and lifting the leg). Our primary objective was to determine whether machine learning can accurately identify key ballet movements during dance training. The secondary objective was to determine the influence of the location and number of sensors on accuracy.
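The secondary objective above, assessing the influence of the number and location of sensors, implies evaluating models over every non-empty subset of the six sensors. The sketch below simply enumerates those subsets; the sensor placement names are hypothetical, as the abstract does not specify where the sensors were worn.

```python
from itertools import combinations

# Hypothetical sensor placements; the abstract does not state the
# actual locations used in the study.
sensors = ["sensor_1", "sensor_2", "sensor_3",
           "sensor_4", "sensor_5", "sensor_6"]

# Every non-empty combination of the six sensors (6, 5, 4, 3, ...):
subsets = [c for k in range(1, len(sensors) + 1)
           for c in combinations(sensors, k)]

print(len(subsets))  # 2^6 - 1 = 63 non-empty sensor combinations
```

With two models per combination (with and without transition movements, as the RESULTS section describes), this enumeration implies up to 126 candidate models.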
RESULTS: Convolutional neural networks were applied to develop two models for every combination of the six sensors (6, 5, 4, 3, etc.), with and without the inclusion of transition movements. At the first level of classification, using data from all sensors and excluding transitions, the model achieved 97.8% accuracy. Accuracy decreased at the second (83.0%) and third (75.1%) levels of classification. Accuracy also decreased with the inclusion of transitions, with reductions in the number of sensors, and across the various sensor combinations.
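The classification pipeline described above (a convolutional neural network mapping windows of multi-sensor data to movement classes) can be sketched as follows. This is a minimal, untrained illustration of the data shapes involved, not the authors' architecture: the window length, channel count, filter sizes, and class labels are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

WINDOW = 128       # samples per window (assumed, e.g. ~1 s at 128 Hz)
CHANNELS = 6 * 6   # 6 sensors x 6 axes (assumed 3-axis accel + 3-axis gyro)
CLASSES = 3        # e.g. jump, leg lift, other (hypothetical class set)

def conv1d(x, w, b):
    """Valid-mode 1D convolution with ReLU: x (T, C_in), w (K, C_in, C_out)."""
    k = w.shape[0]
    out = np.stack([x[t:t + k].reshape(-1) @ w.reshape(k * w.shape[1], -1)
                    for t in range(x.shape[0] - k + 1)]) + b
    return np.maximum(out, 0.0)

def classify(window, w1, b1, w2, b2):
    """Conv layer -> global average pooling -> linear layer -> softmax."""
    h = conv1d(window, w1, b1)      # (T - K + 1, filters)
    pooled = h.mean(axis=0)         # (filters,)
    logits = pooled @ w2 + b2       # (CLASSES,)
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Randomly initialised (untrained) weights, for shape illustration only.
w1 = rng.normal(0, 0.1, (9, CHANNELS, 32)); b1 = np.zeros(32)
w2 = rng.normal(0, 0.1, (32, CLASSES));     b2 = np.zeros(CLASSES)

window = rng.normal(size=(WINDOW, CHANNELS))  # one window of sensor data
probs = classify(window, w1, b1, w2, b2)      # class probabilities, sum to 1
print(probs.shape)
```

Restricting `CHANNELS` to the axes of a chosen sensor subset is how the reduced-sensor comparisons in the abstract would be expressed in this sketch.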
CONCLUSION: The models developed were robust enough to identify jumping and leg-lifting tasks in real-world exposures in dancers. The system provides a novel method for measuring dancer training volume through quantification of specific movement tasks. Such a system can be used to further understand the relationship between dancers' pain and training volume, and in athlete monitoring systems. Further, this provides a proof of concept that can be readily translated to other lower-limb-dominant sporting activities.