Video needed to confirm head impact sensor findings: study

Increasing awareness of concussion risks in athletes has prompted the use of head impact sensors to measure the frequency and severity of impacts during sports. But the data from these sensors may be inconclusive, according to a study from Children's Hospital of Philadelphia (CHOP), which shows that these sensors can record a large number of false-positive impacts during real game play.

The CHOP team's study emphasizes that an extra step, video confirmation of the sensor data, is essential both for research and for using these data in injury prevention strategies that protect player safety. The findings were published online this month in the American Journal of Sports Medicine.

Statistics show that 20% of high school athletes who play a contact sport suffer a concussion each year. To understand the frequency, magnitude and direction of head impacts that athletes sustain, various sensors have been developed to collect head impact biomechanics data, including instrumented helmets, skull caps, headbands, mouthguards and skin patches.

However, when data are collected during game play rather than in a controlled laboratory environment, there is potential for false positives and false negatives. In this study, CHOP researchers collected data from headband-based head impact sensors worn by male and female soccer players to determine the proportion of false positives in the data and whether video confirmation improved data quality.

"Head impact sensors are a readily accessible tool for studying the mechanics of head impacts," said Declan Patton, PhD, lead author of the study and a research associate at the Center for Injury Research and Prevention at CHOP. "However, in order for researchers to have reliable data to analyze, they first need to verify whether sensor-recorded events are actual head impacts using either video- or observer-confirmation."

In this study, researchers fitted 72 high school varsity and junior varsity soccer players (23 female and 49 male) with headband-mounted impact sensors during 41 games over two seasons to capture sensor-recorded events during competition. All games were video recorded. The research team analyzed the video to quantify the percentage of sensor-recorded events that corresponded to an observable head impact. In addition, the researchers compared video-verified sensor-recorded events against the manufacturer's filtering algorithm, which was developed to eliminate false positives.

The sensors recorded 9,503 events during scheduled game times, a number that fell to 6,796 once the verified game time was identified on video. Of those 6,796 events, 4,021, or approximately 60%, were removed because they were associated with players who were not on the field at the time of recording and therefore could not have been actual head impacts. This suggests that prior studies, which used head impact sensor data without these methodological steps, likely included a high proportion of non-head-impact events in their datasets.
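To make the scale of this data reduction concrete, the short sketch below recomputes the figures reported in the paragraph above. It is an illustrative calculation using the counts stated in this article; the variable names and script are ours, not the study's actual analysis code.

```python
# Illustrative recomputation of the event-count reductions reported above.
# Counts are taken from the article; this is not the study's analysis code.

scheduled_game_events = 9503   # events recorded by the sensors during scheduled game times
verified_game_events = 6796    # events remaining once verified game time was identified on video
off_field_events = 4021        # events tied to players not on the field (not actual head impacts)

on_field_events = verified_game_events - off_field_events
removed_fraction = off_field_events / verified_game_events

print(f"Removed as off-field: {off_field_events} ({removed_fraction:.1%})")  # 4021 (59.2%), roughly 60%
print(f"Remaining for on-field players: {on_field_events}")                  # 2775
```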

Video footage of the 1,893 sensor-recorded events for players on the field and within the camera frame was reviewed, and the events were categorized into three types: impact events (69.5%), trivial events such as adjusting a headband (20.9%), and non-events such as a player remaining stationary (9.6%). The most common impact event was head-to-ball contact, which accounted for 78.4% of all impact events. Other impact events included player contact (10.9%), falls (9.8%) and ball-to-head contact (0.8%).
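The percentages above can also be turned into approximate event counts. The sketch below does this arithmetic as an illustration; the category labels are paraphrased from this article rather than taken from the study's coding scheme, and rounding makes the counts approximate.

```python
# Approximate event counts implied by the reported percentages (rounded; labels paraphrased).

reviewed_events = 1893  # sensor-recorded events for on-field players captured on camera

category_shares = {"impact event": 0.695, "trivial event": 0.209, "non-event": 0.096}
impact_type_shares = {"head-to-ball": 0.784, "player contact": 0.109, "fall": 0.098, "ball-to-head": 0.008}

impact_events = round(reviewed_events * category_shares["impact event"])  # ~1316 impact events
for impact_type, share in impact_type_shares.items():
    print(f"{impact_type}: ~{round(impact_events * share)} events ({share:.1%} of impact events)")
```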

The study found that female athletes had a lower proportion of impact events (48.7% vs. 78.4%) and a higher proportion of trivial events (36.6% vs. 14.2%), which may reflect more frequent adjustment of the headband. Among actual impact events, however, the breakdown of impact types was similar for female and male athletes.