Example classifier with a convolutional neural network for the PhysioNet Challenge 2024

License: BSD 3-Clause "New" or "Revised" License


Example code for the George B. Moody PhysioNet Challenge 2024

What's in this repository?

This repository contains a more complex example that illustrates how to format a Python entry for the George B. Moody PhysioNet Challenge 2024. Like the simple example that we provide as a template for your entries, you can remove some of the code, reuse other code, and add new code to create your entry. You do not need to use the models, features, and/or libraries in this example for your entry. We encourage a diversity of approaches for the Challenges.

For this example, we implemented a convolutional neural network (CNN) that is inspired by Zhang et al. You can try it by running the following commands on the Challenge training set. If you are using a relatively recent personal computer, then you should be able to run these commands from start to finish on a very small subset (100 records) of the training data in less than an hour.
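
Once you have downloaded the PTB-XL data as described in the "How do I create data for these scripts?" section below, one rough way to carve out such a subset is sketched here; the records500_subset folder name is only an example, and the commands assume the PTB-XL/WFDB naming scheme in which each record is a .hea header file plus a .dat signal file:

# copy the first 100 records into a smaller folder for a quick end-to-end test
mkdir -p ptb-xl/records500_subset
for hea in $(ls ptb-xl/records500/00000/*.hea | head -n 100); do
    cp "$hea" "${hea%.hea}.dat" ptb-xl/records500_subset/
done

You can then point the data-preparation commands in the sections below at this subset folder instead of ptb-xl/records500/00000.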

How do I run these scripts?

First, you can download and create data for these scripts by following the instructions in the "How do I create data for these scripts?" section below.

Second, you can install the dependencies for these scripts by creating a Docker image (see below) or virtual environment and running

pip install -r requirements.txt
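
For example, one common way to create and activate a virtual environment before installing the dependencies is sketched below; the environment name venv is arbitrary:

# create and activate a virtual environment, then install the dependencies
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt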

You can train your model(s) by running

python train_model.py -d training_data -m model

where

  • training_data (input; required) is a folder with the training data files, including the images and classes (you can use the ptb-xl/records500/00000 folder from the below steps); and
  • model (output; required) is a folder for saving your model(s).

We are asking teams to include working training code and a pre-trained model. Please include your pre-trained model in the model folder so that we can load it with the below command.

You can run your trained model(s) by running

python run_model.py -d test_data -m model -o test_outputs

where

  • test_data (input; required) is a folder with the validation or test data files, including the images but excluding the classes (you can use the ptb-xl/records500_hidden/00000 folder from the below steps, but it would be better to repeat these steps on a new subset of the data that you did not use to train your model);
  • model (input; required) is a folder for loading your model(s); and
  • test_outputs (output; required) is a folder for saving your model outputs.

We are asking teams to include working training code and a pre-trained model. Please include your pre-trained model in the model folder so that we can load it with the above command.

The Challenge website provides a training database with a description of the contents and structure of the data files.

You can evaluate your model by pulling or downloading the evaluation code and running

python evaluate_model.py -d labels -o test_outputs -s scores.csv

where

  • labels is a folder with labels for the data, such as the training database on the PhysioNet webpage (you can use the ptb-xl/records500/00000 folder from the below steps, but it would be better to repeat these steps on a new subset of the data that you did not use to train your model);
  • test_outputs is a folder containing files with your model's outputs for the data; and
  • scores.csv (optional) is a file with a collection of scores for your model.

How do I create data for these scripts?

You can use the following steps to generate synthetic ECG images for the PTB-XL dataset. You will need to generate or otherwise obtain ECG images before running the above steps.

  1. Download (and unzip) the PTB-XL dataset and the PTB-XL+ dataset. These instructions use ptb-xl as the name of the folder that contains the data for these commands (the full folder name for the PTB-XL dataset is currently ptb-xl-a-large-publicly-available-electrocardiography-dataset-1.0.3, and the full folder name for the PTB-XL+ dataset is currently ptb-xl-a-comprehensive-electrocardiographic-feature-dataset-1.0.1), but you can replace it with the absolute or relative path on your machine.

  2. Add information from various spreadsheets from the PTB-XL dataset to the WFDB header files:

     python prepare_ptbxl_data.py \
         -i  ptb-xl/records500/00000 \
         -pd ptb-xl/ptbxl_database.csv \
         -pm ptb-xl/scp_statements.csv \
         -sd ptb-xl/12sl_statements.csv \
         -sm ptb-xl/12slv23ToSNOMED.csv \
         -o  ptb-xl/records500/00000
    
  3. Generate synthetic ECG images on the dataset:

     python gen_ecg_images_from_data_batch.py \
         -i ptb-xl/records500/00000 \
         -o ptb-xl/records500/00000 \
         --print_header
    
  4. Add the file locations and other information for the synthetic ECG images to the WFDB header files. (The expected image filenames for record 12345 are of the form 12345-0.png, 12345-1.png, etc., which should be in the same folder.) You can use the ptb-xl/records500/00000 folder for the train_model step:

     python prepare_image_data.py \
         -i ptb-xl/records500/00000 \
         -o ptb-xl/records500/00000
    
  5. Remove the waveforms, certain information about the waveforms, and the demographics and classes to create a version of the data for inference. You can use the ptb-xl/records500_hidden/00000 folder for the run_model step, but it would be better to repeat the above steps on a new subset of the data that you will not use to train your model:

     python gen_ecg_images_from_data_batch.py \
         -i ptb-xl/records500/00000 \
         -o ptb-xl/records500_hidden/00000 \
         --print_header \
         --mask_unplotted_samples
    
     python prepare_image_data.py \
         -i ptb-xl/records500_hidden/00000 \
         -o ptb-xl/records500_hidden/00000
    
     python remove_hidden_data.py \
         -i ptb-xl/records500_hidden/00000 \
         -o ptb-xl/records500_hidden/00000 \
         --include_images
    

Which scripts can I edit?

Please edit the following script to add your code:

  • team_code.py is a script with functions for training and running your trained model(s).

Please do not edit the following scripts. We will use the unedited versions of these scripts when running your code:

  • train_model.py is a script for training your model(s).
  • run_model.py is a script for running your trained model(s).
  • helper_code.py is a script with helper functions that we used for our code. You are welcome to use them in your code.

These scripts must remain in the root path of your repository, but you can put other scripts and other files elsewhere in your repository.
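
For example, a repository layout along the following lines keeps the required scripts at the root; the extra_code folder is only a placeholder for any additional scripts or files of your own:

    team_code.py
    train_model.py
    run_model.py
    helper_code.py
    Dockerfile
    requirements.txt
    extra_code/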

How do I train, save, load, and run my models?

You can choose to create digitization and/or classification models.

To train and save your model(s), please edit the train_models function in the team_code.py script. Please do not edit the input or output arguments of this function.

To load and run your trained model(s), please edit the load_models and run_models functions in the team_code.py script. Please do not edit the input or output arguments of these functions.
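
As a rough illustration, the skeleton below shows the overall shape of these three functions. The argument names, return values, and the model.pkl filename are assumptions made for this sketch, so keep the signatures that already appear in team_code.py (they are the ones that train_model.py and run_model.py call) and only change the function bodies:

# a minimal, illustrative skeleton of team_code.py; the argument names and
# return values here are assumptions, so keep the signatures that this
# repository's team_code.py already defines
import os
import pickle

def train_models(data_folder, model_folder, verbose):
    # train your digitization and/or classification model(s) on the records in
    # data_folder and save everything you need at inference time to model_folder
    os.makedirs(model_folder, exist_ok=True)
    model = {'example': True}  # replace with your trained model object(s)
    with open(os.path.join(model_folder, 'model.pkl'), 'wb') as f:
        pickle.dump(model, f)

def load_models(model_folder, verbose):
    # load whatever train_models saved; the result is passed to run_models
    with open(os.path.join(model_folder, 'model.pkl'), 'rb') as f:
        return pickle.load(f)

def run_models(record, models, verbose):
    # run the loaded model(s) on a single record and return its outputs
    signal = None  # digitized signal, or None if you do not attempt digitization
    labels = []    # predicted classes, or an empty list
    return signal, labels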

How do I run these scripts in Docker?

Docker and similar platforms allow you to containerize and package your code with specific dependencies so that your code can be reliably run in other computational environments.

To increase the likelihood that we can run your code, please install Docker, build a Docker image from your code, and run it on the training data. To quickly check your code for bugs, you may want to run it on a small subset of the training data, such as 1000 records.

If you have trouble running your code, then please try the following steps to run the example code.

  1. Create a folder example in your home directory with several subfolders.

     user@computer:~$ cd ~/
     user@computer:~$ mkdir example
     user@computer:~$ cd example
     user@computer:~/example$ mkdir training_data test_data model test_outputs
    
  2. Download the training data from the Challenge website. Put some of the training data in training_data and test_data. You can use some of the training data to check your code (and you should perform cross-validation on the training data to evaluate your algorithm).

  3. Download or clone this repository in your terminal.

     user@computer:~/example$ git clone https://github.com/physionetchallenges/python-example-2024.git
    
  4. Choose the Dockerfile for a GPU-attached or CPU-only machine: (a) if you have a GPU, then you can build a Docker image that supports faster computation by renaming Dockerfile_gpu to Dockerfile; (b) alternatively, if you do not have a GPU, then you can build a Docker image that supports your machine by renaming Dockerfile_cpu to Dockerfile.
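
     From inside the cloned repository folder, the rename is a single command in either case (mv is shown here; any equivalent rename works):

     mv Dockerfile_gpu Dockerfile    # (a) GPU machine

     mv Dockerfile_cpu Dockerfile    # (b) CPU-only machine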

  5. Build a Docker image and run the example code in your terminal.

     user@computer:~/example$ ls
     model  python-example-2024  test_data  test_outputs  training_data
    
     user@computer:~/example$ cd python-example-2024/
    
     user@computer:~/example/python-example-2024$ docker build -t image .
    
     Sending build context to Docker daemon  [...]kB
     [...]
     Successfully tagged image:latest
    
     user@computer:~/example/python-example-2024$ docker run -it -v ~/example/model:/challenge/model -v ~/example/test_data:/challenge/test_data -v ~/example/test_outputs:/challenge/test_outputs -v ~/example/training_data:/challenge/training_data image bash
    
     root@[...]:/challenge# ls
         Dockerfile             README.md         test_outputs
         evaluate_model.py      requirements.txt  training_data
         helper_code.py         team_code.py      train_model.py
         LICENSE                run_model.py      [...]
    
     root@[...]:/challenge# python train_model.py -d training_data -m model -v
    
     root@[...]:/challenge# python run_model.py -d test_data -m model -o test_outputs -v
    
     root@[...]:/challenge# python evaluate_model.py -d test_data -o test_outputs
     [...]
    
     root@[...]:/challenge# exit
     Exit
    

What else do I need?

This repository does not include data or the code for generating ECG images. Please see the above instructions for how to download and prepare the data.

This repository does not include code for evaluating your entry. Please see the evaluation code repository for code and instructions for evaluating your entry using the Challenge scoring metric.

How do I learn more? How do I share more?

Please see the Challenge website for more details. Please post questions and concerns on the Challenge discussion forum. Please do not make pull requests, which may share information about your approach.
