This is the code for reproducing parts of the paper *Pre-Train Your Loss: Easy Bayesian Transfer Learning with Informative Priors*. The code inside the `BayesianTransferLearning` folder is copied from the authors' GitHub repository, with adjustments for replicating the results (THIS CODE BELONGS TO THE AUTHORS).
```shell
# using pip in a virtual environment
pip install -r requirements.txt

# using Conda
conda create --name <env_name> --file requirements.txt
conda activate <env_name>
```
- Run `./BayesianTransferLearning/Prapare Data/oxford-102-flowers.py` to download the Oxford-102-Flowers data.
- Run `downsampled_data/create_downsampled_folders.py` from the `downsampled_data` directory to create the subfolders with smaller data set sizes.
- Download the priors from the original authors and put them into the `priors` folder.
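The downsampling step above can be sketched as follows. This is a minimal illustration, not the actual `create_downsampled_folders.py`: it assumes an ImageFolder-style layout (one subfolder per class) and the function name `create_downsampled_copy` is hypothetical.

```python
import random
import shutil
from pathlib import Path


def create_downsampled_copy(src, dst, n_per_class, seed=0):
    """Copy at most n_per_class files from each class folder in src to dst.

    Hypothetical sketch of the downsampling idea; the real script may
    use different sampling and naming conventions.
    """
    rng = random.Random(seed)  # fixed seed so the subset is reproducible
    for class_dir in sorted(p for p in Path(src).iterdir() if p.is_dir()):
        files = sorted(class_dir.iterdir())
        sample = rng.sample(files, min(n_per_class, len(files)))
        out = Path(dst) / class_dir.name
        out.mkdir(parents=True, exist_ok=True)
        for f in sample:
            shutil.copy(f, out / f.name)
```

Running it once per target size (e.g. 10, 50, 100 images per class) would produce the family of smaller data set folders used in the experiments.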
We ran three experiments, each corresponding to a Python script:

- The influence of low-dimensional rank on performance (`run_experiment_rank.py`)
- The influence of prior scaling on performance (`run_experiment_scale.py`)
- Comparison of Bayesian and non-Bayesian learning (`run_experiment_comparison.py`)
To run an experiment, simply run the corresponding script. The results accumulate in a text file (`results_*.txt`).
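The accumulation pattern can be sketched as below. The line format shown (`rank=... accuracy=...`) and the helper `append_result` are assumptions for illustration; the actual scripts define their own output format.

```python
from pathlib import Path


def append_result(path, setting, accuracy):
    """Append one result line; hypothetical 'key=value' format."""
    with open(path, "a") as f:  # append mode, so repeated runs accumulate
        f.write(f"{setting} accuracy={accuracy:.4f}\n")


results = Path("results_rank_demo.txt")
results.unlink(missing_ok=True)  # start fresh for this demo only
# Dummy (rank, accuracy) pairs standing in for real experiment output.
for rank, acc in [(1, 0.91), (2, 0.93), (5, 0.95)]:
    append_result(results, f"rank={rank}", acc)
print(results.read_text())
```

Because the file is opened in append mode, re-running a script adds new lines rather than overwriting earlier results.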