Time-consistent Ball Tracking and Spin Estimation with Event Camera

Abstract

Ball tracking based on sports videos captured by high-speed cameras has had limited applicability because it requires sufficient lighting and abundant computational resources to process the large amount of captured data. To overcome these problems, this paper addresses ball tracking using an event camera. An event camera outputs luminance changes in a scene as “events” and is characterized by high temporal resolution, high dynamic range, and data efficiency. The proposed method selects the ball position from the bounding boxes output by an object detector, taking into account both the ball motion estimated from the events with Contrast Maximization (CMax) and the past ball positions. Considering past ball positions enables the ball trajectory to be estimated consistently over time and the ball position to be detected more accurately. Additionally, CMax is applied to the events within the tracked ball bounding box to estimate the ball spin. Experimental results in this paper show that the proposed method is quantitatively and qualitatively more accurate than conventional ball tracking methods and demonstrates the potential to estimate ball spin.
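
The sketch below illustrates the Contrast Maximization principle the abstract refers to: events are warped to a common reference time under a candidate image-plane velocity, and the velocity that maximizes the contrast (variance) of the resulting image of warped events is taken as the motion estimate. This is a minimal, self-contained illustration, not the paper's implementation; the function names, the grid-search strategy, and the constant-velocity assumption over a short time window are assumptions made here for clarity.

```python
# Minimal Contrast Maximization (CMax) sketch for a 2D velocity, assuming
# events are given as arrays of pixel coordinates (x, y) and timestamps t (s),
# and that the ball moves with an approximately constant image-plane velocity
# over the window. Names are illustrative, not taken from the paper's code.
import numpy as np

def warp_events(x, y, t, velocity, t_ref=0.0):
    """Warp each event to the reference time t_ref along a candidate velocity."""
    vx, vy = velocity
    return x - vx * (t - t_ref), y - vy * (t - t_ref)

def contrast(x_w, y_w, height, width):
    """Variance of the image of warped events (IWE); higher = sharper alignment."""
    xi = np.round(x_w).astype(int)
    yi = np.round(y_w).astype(int)
    valid = (xi >= 0) & (xi < width) & (yi >= 0) & (yi < height)
    iwe = np.zeros((height, width), dtype=np.float64)
    np.add.at(iwe, (yi[valid], xi[valid]), 1.0)  # accumulate event counts per pixel
    return iwe.var()

def estimate_velocity(x, y, t, height, width, v_range=200.0, steps=41):
    """Grid search over candidate velocities (px/s) for the maximum-contrast warp."""
    candidates = np.linspace(-v_range, v_range, steps)
    best_v, best_c = (0.0, 0.0), -np.inf
    for vx in candidates:
        for vy in candidates:
            c = contrast(*warp_events(x, y, t, (vx, vy)), height, width)
            if c > best_c:
                best_c, best_v = c, (vx, vy)
    return best_v

# Usage with synthetic events from a point moving at (50, -30) px/s.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 0.1, 2000))
x = 120 + 50.0 * t + rng.normal(0, 0.5, t.size)
y = 80 - 30.0 * t + rng.normal(0, 0.5, t.size)
print(estimate_velocity(x, y, t, height=160, width=240))  # approximately (50.0, -30.0)
```

In the paper's setting, the same objective is used to constrain which detector bounding box corresponds to the ball, and applying it to the events inside the tracked box with a rotational (rather than translational) warp model would yield the spin estimate; the translational grid search above is only the simplest instance of the idea.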

Publication
In Proceedings of the 7th ACM International Workshop on Multimedia Content Analysis in Sports
Takuya Nakabayashi
Ph.D. student

My research interests include event-based vision, motion estimation, and edge computing.