# Classifier-Head Informed Feature Masking and Prototype-Based Logit Smoothing for Out-of-Distribution Detection
This is the source code for "Classifier-Head Informed Feature Masking and Prototype-Based Logit Smoothing for Out-of-Distribution Detection".
Please download ImageNet-1k and place the training data and validation data in `./data/ilsvrc2012/train` and `./data/ilsvrc2012/val`, respectively.
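With the paths above, the expected layout looks like the sketch below (the per-class subfolder structure is our assumption, following the standard torchvision `ImageFolder` convention):

```
data/
└── ilsvrc2012/
    ├── train/   # one subfolder per ImageNet-1k class
    └── val/     # validation images, same per-class layout
```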
We use four OOD datasets built from iNaturalist, SUN, Places, and Textures, with concepts that overlap with ImageNet-1k removed.
For iNaturalist, SUN, and Places, we sampled 10,000 images from the selected concepts for each dataset; they can be downloaded via the following links:
```shell
wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/iNaturalist.tar.gz
wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/SUN.tar.gz
wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/Places.tar.gz
```
For Textures, we use the entire dataset, which can be downloaded from its original website.
Please put all downloaded OOD datasets into `./data/ood_data/`.
The download will start automatically when the code is run.
We provide links and instructions to download each dataset:
- SVHN: download it and place it in `./data/ood_data/SVHN`. Then run `python select_svhn_data.py` to generate the test subset.
- Textures: download it and place it in `./data/ood_data/dtd/images`.
- Places365: download it and place it in `./data/ood_data/places365/test_subset`. We randomly sample 10,000 images from the original test dataset.
- LSUN-C: download it and place it in `./data/ood_data/LSUN-C`.
- LSUN-R: download it and place it in `./data/ood_data/LSUN_R`.
- iSUN: download it and place it in `./data/ood_data/iSUN`.
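The Places365 subset above is created by randomly sampling 10,000 images from the original test set. A minimal sketch of such a sampling step (the function name `sample_subset` and the fixed seed are illustrative assumptions, not part of the released code):

```python
import os
import random
import shutil

def sample_subset(src_dir, dst_dir, n, seed=0):
    """Copy a random sample of n files from src_dir into dst_dir."""
    os.makedirs(dst_dir, exist_ok=True)
    files = sorted(os.listdir(src_dir))          # sort for a deterministic base order
    chosen = random.Random(seed).sample(files, n)
    for name in chosen:
        shutil.copy(os.path.join(src_dir, name), os.path.join(dst_dir, name))
    return chosen

# Usage with the paths from the instructions above:
# sample_subset("places365_test", "./data/ood_data/places365/test_subset", 10000)
```

Sorting before sampling makes the subset reproducible for a given seed.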
For example, run the following commands in the root directory to download LSUN-C:

```shell
cd ./data/ood_data
wget https://www.dropbox.com/s/fhtsw1m3qxlwj6h/LSUN.tar.gz
tar -xvzf LSUN.tar.gz
```
For CIFAR, the model we used is in the `checkpoints/network/baseline` folder.
For ImageNet, the models we used in the paper are the pre-trained ResNet-50 and MobileNetV2 provided by PyTorch. The download starts automatically when the code is run.
The code is tested under Python 3.9 and requires some packages to be installed:
We need to precompute the class-mean feature vectors. Run `./scripts/run_precompute.sh`. For example:
CIFAR-10/CIFAR-100 with DenseNet:
```shell
sh scripts/run_precompute.sh CIFAR-10/CIFAR-100 densenet
```
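The precompute step caches one class-mean (prototype) feature vector per in-distribution class, aggregated from training-set features. A minimal NumPy sketch of the aggregation (the feature-extraction pass through the trained network is omitted, and the function name is an illustrative assumption):

```python
import numpy as np

def class_mean_features(features, labels, num_classes):
    """Return a (num_classes, d) array of per-class mean feature vectors.

    features: (N, d) penultimate-layer features; labels: (N,) class ids.
    """
    d = features.shape[1]
    means = np.zeros((num_classes, d))
    for c in range(num_classes):
        means[c] = features[labels == c].mean(axis=0)   # prototype for class c
    return means
```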
Run `./scripts/run_eval.sh`. For example:
CIFAR-10/CIFAR-100 with DenseNet:
```shell
sh scripts/run_eval.sh CIFAR-10/CIFAR-100 densenet HIMPLoS 60 1.6
```
Run `./scripts/run_eval.sh`. For example:
ImageNet with ResNet-50:
```shell
sh scripts/run_eval.sh imagenet resnet50 HIMPLoS 30 0.8
```
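For orientation, the two trailing arguments to `run_eval.sh` are hyperparameters of the HIMPLoS score. The sketch below illustrates the two ingredients named in the title, feature masking informed by the classifier head and prototype-based logit smoothing, in a simplified form. The exact masking rule, the way the two terms are combined, and the meaning of the hyperparameters are defined in the paper, so everything here (the function name, the keep-percentile reading of the first argument, the scaling-factor reading of the second) is our illustrative assumption:

```python
import numpy as np

def himplos_score_sketch(f, W, b, prototypes, keep_pct, scale):
    """Simplified OOD score: mask low-contribution features, then smooth logits.

    f: (d,) feature vector; W: (C, d) classifier-head weights; b: (C,) bias;
    prototypes: (C, d) class-mean feature vectors from the precompute step.
    """
    logits = W @ f + b
    y_hat = int(np.argmax(logits))
    contrib = W[y_hat] * f                       # per-feature contribution to the top logit
    thresh = np.percentile(contrib, 100 - keep_pct)
    f_masked = np.where(contrib >= thresh, f, 0.0)   # keep the top keep_pct% of features
    masked_logits = W @ f_masked + b
    # Prototype-based smoothing: cosine similarity to each class mean.
    sims = prototypes @ f_masked / (
        np.linalg.norm(prototypes, axis=1) * np.linalg.norm(f_masked) + 1e-12)
    smoothed = masked_logits + scale * sims
    # Energy-style confidence: higher for ID, lower for OOD.
    return float(np.log(np.exp(smoothed).sum()))
```

Images whose score falls below a threshold chosen on ID validation data would be flagged as OOD.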