# Music Generation Using Deep Learning

Author: Cezary Pukownik

## Files

  • midi.py - data extraction and MIDI conversion
  • train.py - model definition and training
  • generate.py - model loading, prediction, and saving the output as MIDI
  • settings.py - default settings
  • readme.md - this file

## Directories

  • data/midi - input MIDI files
  • data/models - trained models
  • data/output - generated music
  • data/samples - data extracted from the MIDI files
  • data/samples.npz - deprecated

## How to use

  1. Use midi.py to extract training data from MIDI files:

```
./midi.py [midi_folder_path] [output_path]
```

  2. Use train.py to train a model (this can take a while):

```
./train.py [input_training_data] [model_save_path] [epochs]
```
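A sequence model of this kind is trained to predict the next event from the preceding ones. How training pairs can be sliced out of an encoded token sequence, as a sketch (the helper name and window size are made up for illustration):

```python
def make_training_pairs(tokens, window=3):
    """Slice a token sequence into (context, next-token) training pairs."""
    return [(tokens[i:i + window], tokens[i + window])
            for i in range(len(tokens) - window)]

# Each pair feeds `window` tokens in and asks the model for the following one.
pairs = make_training_pairs([1, 2, 3, 4, 5], window=3)
```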

  3. Use generate.py to generate music from a trained model:

```
./generate.py [trained_model_path] [output_path] [threshold]
```
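The `[threshold]` argument presumably cuts the model's output probabilities into note-on decisions: pitches scoring above the threshold are kept, the rest discarded. A sketch of that idea (the probabilities and function name are invented for illustration):

```python
def apply_threshold(probabilities, threshold):
    """Keep only the pitches whose predicted probability reaches threshold."""
    return [pitch for pitch, p in probabilities.items() if p >= threshold]

probs = {60: 0.91, 64: 0.40, 67: 0.75}  # MIDI pitch -> predicted probability
notes = apply_threshold(probs, threshold=0.5)
```

A lower threshold yields denser, noisier output; a higher one yields sparser, more conservative music.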