This project is a fork of kwyoke/badminton-hit-detection.

We attempt to detect shuttlecock hits in generic monocular badminton videos taken from behind the baseline, while remaining agnostic to camera angle and player skill level. We hope this work will help professional and amateur players alike automatically annotate their badminton rally videos, for easy retrieval of shots of interest.

Detecting shuttlecock hit events in professional and amateur badminton videos

In this project, we develop an algorithm to detect hits in generic monocular videos, agnostic to camera angle and player skill level. We propose to use badminton domain features (court, pose, and shuttlecock coordinates) as input to a GRU-based model. These coordinates are extracted from each frame, and a sequence of coordinates from 14 consecutive frames is fed into the GRU model to predict one of three classes: 'no hit', 'hit by near player', or 'hit by far player'. During training, we consider a 14-frame sequence to contain a hit if a hit occurs in its last six frames.
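As a rough illustration, the model and labeling rule described above can be sketched as follows. The feature dimension, hidden size, and the `HitGRU`/`window_label` names are our own illustrative assumptions, not the exact configuration used in the notebooks:

```python
# Hypothetical sketch of the GRU hit classifier; hyperparameters are
# assumptions, not the authors' exact configuration.
import torch
import torch.nn as nn

SEQ_LEN = 14     # frames per input window
N_CLASSES = 3    # 0 = no hit, 1 = near-player hit, 2 = far-player hit
FEAT_DIM = 32    # per-frame coordinate features (court + pose + shuttle); assumed

class HitGRU(nn.Module):
    def __init__(self, feat_dim=FEAT_DIM, hidden=64):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, N_CLASSES)

    def forward(self, x):        # x: (batch, SEQ_LEN, feat_dim)
        _, h = self.gru(x)       # h: (num_layers, batch, hidden)
        return self.head(h[-1])  # class logits: (batch, N_CLASSES)

def window_label(frame_hits):
    """Label a 14-frame window: it counts as a hit if any of its
    last six frames contains a hit (per-frame classes 0/1/2)."""
    for cls in frame_hits[-6:]:
        if cls != 0:
            return cls
    return 0
```

A sliding 14-frame window of per-frame coordinate vectors would be labeled with `window_label` and fed through `HitGRU` during training.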

Contributions

  1. We find that our proposed method, even though it is only trained with professional broadcast singles videos, is able to generalize to some extent to amateur videos taken from different camera angles, doubles games, and players of different skill levels. This validates the robustness of badminton domain features (coordinate form) over generic RGB features.
  2. We provide annotated datasets of professional singles, amateur singles and amateur doubles matches here.
  3. We provide manual annotation tools to facilitate annotation of custom datasets under the datasets/ directory.
  4. We provide a semi-automatic annotation pipeline in the annotation_pipeline/ directory.

Demo

Below, we show a few demos of our proposed hit detection algorithm on videos with different camera angles, players of different skill levels, and both singles and doubles play.

Hit detection

Professional video:

'pro/test_match1/1_05_02'

Amateur singles video:

'am_singles/match24/1_05_05'

Amateur doubles video:

'am_doubles/match_clementi/doubles5'

Semi-automatic annotation pipeline:

Pose tracking

Annotated pose sample

Shuttle trajectory tracking

Annotated shuttlecock sample

Datasets

We prepare three sets of annotated matches: professional singles, amateur singles, and amateur doubles. They are available in this Google Drive link and should be placed under the datasets/ directory.

We provide ground-truth annotations of shuttlecock coordinates, hit detections, and player bounding boxes. We also provide manual annotation tools under datasets/.

Dataset statistics

Evaluation

We compare our proposed method with two baselines: a ResNet image classifier, and a rule-based method that compares the second derivative of the shuttlecock's x and y coordinates against an empirically determined threshold. We use mAP as the evaluation metric (see the report for more information).
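For concreteness, the rule-based baseline can be sketched like this. The threshold value here is illustrative, not the empirically tuned value from the report:

```python
# Minimal sketch of the rule-based baseline: flag frames where the
# discrete second derivative (acceleration) of the shuttlecock's
# x or y coordinate exceeds a threshold.
import numpy as np

def rule_based_hits(xs, ys, thresh=5.0):
    """Return frame indices whose x- or y-acceleration magnitude
    exceeds `thresh`. xs, ys: per-frame shuttlecock coordinates."""
    ax = np.abs(np.diff(xs, n=2))   # |x[t+2] - 2*x[t+1] + x[t]|
    ay = np.abs(np.diff(ys, n=2))
    # np.diff(n=2) shortens the array by 2; shift indices so a
    # detection lines up with the middle frame of the stencil.
    return np.where((ax > thresh) | (ay > thresh))[0] + 1
```

A hit produces a sharp reversal in the shuttle's motion, so a trajectory like `[0, 10, 20, 30, 20, 10, 0]` yields a large acceleration at the turning point (frame 3), which this rule flags.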

Table of evaluation comparison

Performance at various IoU thresholds

Different methods, same dataset

Graph comparison of different methods on the same dataset

Different datasets, same method

Graph comparison of same method on different datasets

Detecting near player hits vs far player hits

Graph comparison of performance on near vs far player hit detection

Qualitative analysis of strengths and weaknesses

Check out these videos for a demonstration of the method's strengths and weaknesses:

Usage

  1. Rally videos and their ground-truth annotations are provided under datasets/ in pro.zip, am_singles.zip, am_doubles.zip, or can be downloaded here
  2. Manual annotation tools are provided under datasets/ in the scripts label_tool_bbox.py, label_toolV2.py
  3. The semi-automatic annotation pipeline for pose and shuttlecock coordinates is in annotation_pipeline/
  4. Notebooks for organising datasets into input features and observing dataset statistics can be found in annotation_pipeline/organise_input_features.ipynb and annotation_pipeline/dataset_stats.ipynb
  5. Notebooks for training the proposed domain based algorithm and ResNet are found in domain_rnn.ipynb and ResNet_baseline.ipynb respectively. They take in input features from the directory input_features/, which can be downloaded here
  6. Notebooks for processing classification probabilities from the proposed algorithms can be found in hit_detection/process_pred_probs.ipynb. Sample classification probabilities can be found here
  7. The notebook for the rule-based baseline method can be found in hit_detection/rule_baseline.ipynb
  8. Pretrained weights for the proposed domain method can be found in mm_weights/, or downloaded here
  9. Pretrained weights for ResNet can be found in resnet_data, or downloaded here

A small GPU is required for running the semi-automatic annotation pipeline, as well as for training the proposed GRU network and ResNet. The computational load is fairly light; see details in the training notebooks.

References

The following references were immensely useful for this project.

  1. MonoTrack: Shuttle trajectory reconstruction from monocular badminton video, for its use of badminton domain features in hit event detection.
  2. TrackNetV2: Efficient Shuttlecock Tracking Network, for tracking the shuttlecock with deep learning, as well as providing the TrackNetV2 dataset, which formed the basis of our professional dataset.

Full report

The full details are documented in the pdf report.

The full set of code can be found here

Future directions

  1. Domain Adaptation to improve generalisation.
  2. Multimodal feature learning, possibly combining audio and RGB features with domain coordinates.
  3. Larger and more varied training dataset.
  4. Extension to other aspects of badminton video analysis, including stroke classification, strategy analysis etc.

Contributors

kwyoke

Recommend Projects

  • React photo React

    A declarative, efficient, and flexible JavaScript library for building user interfaces.

  • Vue.js photo Vue.js

    ๐Ÿ–– Vue.js is a progressive, incrementally-adoptable JavaScript framework for building UI on the web.

  • Typescript photo Typescript

    TypeScript is a superset of JavaScript that compiles to clean JavaScript output.

  • TensorFlow photo TensorFlow

    An Open Source Machine Learning Framework for Everyone

  • Django photo Django

    The Web framework for perfectionists with deadlines.

  • D3 photo D3

    Bring data to life with SVG, Canvas and HTML. ๐Ÿ“Š๐Ÿ“ˆ๐ŸŽ‰

Recommend Topics

  • javascript

    JavaScript (JS) is a lightweight interpreted programming language with first-class functions.

  • web

    Some thing interesting about web. New door for the world.

  • server

    A server is a program made to process requests and deliver data to clients.

  • Machine learning

    Machine learning is a way of modeling and interpreting data that allows a piece of software to respond intelligently.

  • Game

    Some thing interesting about game, make everyone happy.

Recommend Org

  • Facebook photo Facebook

    We are working to build community through open source technology. NB: members must have two-factor auth.

  • Microsoft photo Microsoft

    Open source projects and samples from Microsoft.

  • Google photo Google

    Google โค๏ธ Open Source for everyone.

  • D3 photo D3

    Data-Driven Documents codes.