Tested with Python 3.10 on Ubuntu 22.04.2 LTS
- Some form of conda is required for this project.
- Create and activate a conda environment.
conda create --name cnsbench python=3.10 -y
conda activate cnsbench
- Install PyTorch according to the official instructions.
# As of 12 Jul 2023
conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia -y
- Install MMSegmentation with the following commands.
pip install -U openmim
mim install mmengine
mim install "mmcv>=2.0.0"
pip install "mmsegmentation>=1.0.0"
- Install the remaining dependencies (OpenSlide, ultralytics, scikit-image)
sudo apt install -y openslide-tools  # Debian/Ubuntu
conda install scikit-image -c conda-forge -y
pip install -r requirements.txt
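Before moving on, it can help to confirm the environment is complete. The snippet below is a quick stdlib-only check (not part of the repository) that the key packages can be imported; note that import names differ from some pip package names (mmsegmentation provides mmseg, scikit-image provides skimage).

```python
# Sanity check for the install steps above: report any key packages
# that cannot be imported. Uses only the standard library.
from importlib.util import find_spec

def missing_packages(names):
    """Return the subset of import names that cannot be found."""
    return [n for n in names if find_spec(n) is None]

if __name__ == "__main__":
    required = ["torch", "torchvision", "mmseg", "mmengine", "skimage"]
    absent = missing_packages(required)
    if absent:
        print("Missing:", ", ".join(absent))
    else:
        print("All key packages found.")
```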
Each script handles a step of the process, beginning with get_datasets.py. The scripts accept the following flags:
- dataset: The name of the dataset to train on
- normalised: Whether the stain-normalised version of the dataset should be used
- batch: The batch size
- lr: The learning rate
- epochs: The number of epochs for training
- iterations: The number of iterations (total batches) for training
- cache: Disables caching when passed (caching is enabled by default)
- model: Choose either unet or deeplabv3plus
- dataset-root: The base directory for downloaded datasets
- *-interval: The number of iterations between recurring events such as validation, logging, and checkpointing
- wandb: Allows training results to be logged to Weights and Biases
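As a rough illustration of how flags like these could be wired up, here is a hypothetical argparse sketch. The flag names follow the list above, but the defaults, types, and choices shown are assumptions for illustration, not the repository's actual code.

```python
# Hypothetical sketch of the shared command-line flags; the real
# scripts may differ in defaults and details.
import argparse

def build_parser():
    p = argparse.ArgumentParser(description="CNSBench script flags (sketch)")
    p.add_argument("--dataset", required=True, help="dataset to train on")
    p.add_argument("--normalised", action="store_true",
                   help="use the stain-normalised images")
    p.add_argument("--batch", type=int, default=8, help="batch size")
    p.add_argument("--lr", type=float, default=0.01, help="learning rate")
    p.add_argument("--epochs", type=int, default=100)
    p.add_argument("--iterations", type=int, default=20000)
    # Passing --cache DISABLES caching, so it is a store_false flag.
    p.add_argument("--cache", action="store_false", dest="cache")
    p.add_argument("--model", choices=["unet", "deeplabv3plus"])
    p.add_argument("--dataset-root", default="datasets")
    p.add_argument("--wandb", action="store_true")
    return p

args = build_parser().parse_args(
    ["--dataset", "MoNuSeg", "--batch", "16", "--cache"]
)
print(args.dataset, args.batch, args.cache)  # MoNuSeg 16 False
```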
To download all datasets, simply run:
python get_datasets.py
To change the root directory for downloaded datasets, use the --dataset-root flag:
python get_datasets.py --dataset-root ~/my_datasets
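For illustration, a --dataset-root value such as ~/my_datasets would typically be expanded and used as the parent of per-dataset directories. The helper below is a hypothetical sketch; the actual directory layout used by the scripts is an assumption here.

```python
# Illustrative only: resolving a --dataset-root value and deriving a
# per-dataset directory (layout assumed, not taken from the repo).
from pathlib import Path

def dataset_dir(root: str, name: str) -> Path:
    """Expand ~ and return the per-dataset directory under the root."""
    return Path(root).expanduser() / name

d = dataset_dir("~/my_datasets", "MoNuSeg")
print(d)  # e.g. /home/<user>/my_datasets/MoNuSeg
```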
There are training scripts for both YOLOv8 and the MMSegmentation models. To run the YOLOv8 training script on MoNuSeg:
python train_yolo.py --dataset MoNuSeg
To run the YOLOv8 training script on Normalised MoNuSeg images:
python train_yolo.py --dataset MoNuSeg --normalised
To change some hyperparameters (note that --cache DISABLES caching):
python train_yolo.py --dataset MoNuSeg --batch 16 --lr 0.001 --epochs 200 --cache
To run the MMSegmentation training script:
python train.py --dataset MoNuSeg --model deeplabv3plus --normalised
Some other settings can be combined, for example:
python train.py --dataset MoNuSAC --model unet --wandb --iterations 25000 --val-interval 10000 --log-interval 10 --checkpoint-interval 5000
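Assuming each --*-interval flag fires its event every N iterations up to --iterations, the command above would run validation twice and save five checkpoints. A quick sketch of that arithmetic:

```python
# Under the assumption that an event with interval N fires at
# iterations N, 2N, ... up to the total iteration count.
def event_iters(total, interval):
    return list(range(interval, total + 1, interval))

total = 25000
print(len(event_iters(total, 10000)))  # validations: 2 (at 10000, 20000)
print(len(event_iters(total, 5000)))   # checkpoints: 5
```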
To visualise model predictions, for YOLOv8:
python visualise_yolo.py --dataset MoNuSeg --model <your_model>.pt
Additional options:
python visualise_yolo.py --dataset MoNuSeg --model <your_model>.pt --max-det 1000 --normalised --binary --outline
For MMSegmentation:
python visualise.py --dataset CryoNuSeg --config <your_config>.py --checkpoint <your ckpt>.pth --binary --outline --normalised
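Conceptually, --binary collapses instance labels to a single foreground mask and --outline keeps only boundary pixels. The pure-Python sketch below illustrates the idea on a toy mask; it is not the repository's implementation, which may define these outputs differently.

```python
# Toy illustration of binary vs outline output from an instance mask,
# where each nucleus has a distinct integer label and 0 is background.
def to_binary(mask):
    """Instance labels -> 0/1 foreground mask."""
    return [[1 if v else 0 for v in row] for row in mask]

def to_outline(mask):
    """Keep only foreground pixels touching background (4-neighbourhood)."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            nbrs = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
            if any(not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]
                   for ny, nx in nbrs):
                out[y][x] = 1
    return out

m = [[0, 0, 0, 0],
     [0, 2, 2, 0],
     [0, 2, 2, 0],
     [0, 0, 0, 0]]
print(to_binary(m))
print(to_outline(m))  # every pixel of this 2x2 nucleus is on the border
```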
To export model predictions, for YOLOv8:
python export_yolo.py --model <yolo_model>.pt --dataset TNBC --normalised
For MMSegmentation:
python export.py --checkpoint <your_ckpt>.pth --dataset TNBC --normalised
To evaluate exported predictions against the ground-truth masks:
python evaluate.py --dataset MoNuSeg --compare-root <path_to_model_predictions>
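The evaluation step compares exported predictions against the ground truth. As a toy illustration of one common overlap metric (intersection-over-union) on flat 0/1 masks — not necessarily the metrics evaluate.py actually reports — consider:

```python
# IoU on two equal-length flat binary masks; an empty union is
# treated as a perfect match by convention.
def iou(pred, gt):
    inter = sum(p and g for p, g in zip(pred, gt))
    union = sum(p or g for p, g in zip(pred, gt))
    return inter / union if union else 1.0

pred = [1, 1, 0, 0]
gt   = [1, 0, 1, 0]
print(iou(pred, gt))  # 1/3
```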
For example, an end-to-end train, export, and evaluate workflow for TNBC:
python train.py --dataset TNBC --model unet --normalised
python export.py --checkpoint "work_dirs/TNBC/unet/<date>/iter_20000*.pth" --dataset TNBC --normalised
python evaluate.py --dataset TNBC --compare-root work_dirs/export/stainnorm/TNBC/unet