This repository contains our solution for the second assignment of the Digital Musicology (DH-401) course. The assignment consisted of three tasks: (A) comparing performed and unperformed versions of a piece of our choice, (B) using our observations to make the original MIDI more expressive, and (C) listening to the generated MIDI and evaluating how human-like it sounds. We used Schubert's Impromptu Op. 90 No. 3 in all our experiments.

We used the Aligned Scores and Performances (ASAP) dataset for the assignment.
Follow these steps to reproduce our work:

1. (Optional) Create and activate a new environment using `conda` or `venv` (+ `pyenv`).

   a. `conda` version:

   ```bash
   # create env
   conda create -n project_env python=PYTHON_VERSION
   # activate env
   conda activate project_env
   ```

   b. `venv` (+ `pyenv`) version:

   ```bash
   # create env
   ~/.pyenv/versions/PYTHON_VERSION/bin/python3 -m venv project_env
   # alternatively, using default python version
   python3 -m venv project_env
   # activate env
   source project_env/bin/activate
   ```
2. Install all required packages:

   ```bash
   pip install -r requirements.txt
   ```
3. Install `pre-commit`:

   ```bash
   pre-commit install
   ```
4. Download the dataset:

   ```bash
   mkdir data
   cd data
   git clone https://github.com/fosfrancesco/asap-dataset.git
   ```
5. If you want to convert MIDI to WAV files:

   ```bash
   apt-get install fluidsynth > /dev/null
   cp /usr/share/sounds/sf2/FluidR3_GM.sf2 ./font.sf2
   ```
To transfer MIDI from the unperformed version to a performed one, run the following command:

```bash
python3 run_transfer.py
```

See `run_transfer.py --help` for the command-line arguments.
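As a rough illustration of the kind of transformation such a transfer script can apply, the sketch below accents downbeats and adds slight timing jitter to a list of notes. This is a simplified toy model for illustration only, not the actual algorithm implemented in `src/interpret.py`; the `humanize` function and the plain-tuple note format are hypothetical.

```python
import random

def humanize(notes, beats_per_bar=4, seed=0):
    """Toy expressiveness model (NOT the repository's algorithm).

    Each note is a tuple (onset_beats, duration_beats, pitch, velocity).
    Downbeats are accented, weak beats softened, and onsets get a small
    random deviation to mimic human timing.
    """
    rng = random.Random(seed)
    out = []
    for onset, dur, pitch, vel in notes:
        beat_in_bar = onset % beats_per_bar
        # Emphasize the downbeat, slightly de-emphasize other beats.
        accent = 1.15 if beat_in_bar == 0 else 0.95
        new_vel = max(1, min(127, round(vel * accent)))
        # Small onset deviation (in beats), clipped so it never goes negative.
        new_onset = max(0.0, onset + rng.uniform(-0.02, 0.02))
        out.append((new_onset, dur, pitch, new_vel))
    return out

score = [(0.0, 1.0, 60, 80), (1.0, 1.0, 64, 80), (2.0, 1.0, 67, 80)]
performed = humanize(score)
print(performed[0][3], performed[1][3])  # downbeat ends up louder than beat 2
```

A real system would additionally shape tempo curves and pedal events, which is what distinguishes the `generated_midi_with_pedal.mid` output below.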
The generated MIDI is located in the `results` dir. Our final MIDI is called `generated_midi_with_pedal.mid`. We provide a corresponding audio version; however, it is better to listen via a piano roll (like this one).
The project structure is as follows:

```
├── data                  # dir for all data, including raw and processed datasets
│   └── asap-dataset
├── results               # dir with the original MIDI and its generated versions
│   ├── generated_midi.mid
│   ├── generated_midi.wav
│   ├── generated_midi_with_pedal.mid
│   ├── generated_midi_with_pedal.wav
│   ├── original_midi.mid
│   ├── original_midi.wav
│   ├── xml_midi.mid
│   └── xml_midi.wav
├── experiments.ipynb     # used for experiments
├── run_transfer.py       # core script, transfers MIDI to a performed version
├── README.md             # this file
├── requirements.txt      # list of required packages
├── observations.md       # observations/ideas to implement
└── src                   # package with core implementations
    ├── interpret.py      # core algorithms for MIDI generation (main source)
    ├── data.py           # data loading and processing, used for experiments
    ├── midi_transfer.py  # used for experiments (outdated)
    ├── estimators.py     # used for experiments (outdated)
    ├── __init__.py
    └── plots.py          # used for experiments (outdated)
```
The project was done by:
- Petr Grinberg
- Marco Bondaschi
- Ismaïl Sahbane
- Ben Erik Kriesel