### MUSIC GENERATION USING DEEP LEARNING

### AUTHOR: CEZARY PUKOWNIK
# Files:

- midi.py - code for data extraction and MIDI conversion
- train.py - code for model definition and the training session
- generate.py - code for model loading, predicting, and saving to midi_dir
- settings.py - file where the default settings are stored (see the sketch after the Directories section)
- readme.md - this file
# Directories:

- data/midi - directory where the input MIDI files are stored
- data/models - directory where trained models are stored
- data/output - directory where generated music is stored
- data/samples - directory where the data extracted from the MIDI files is stored
- data/samples.npz - deprecated
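For reference, settings.py most likely just centralizes these paths together with a few training defaults. A minimal sketch of what it could contain, assuming the directory layout above (the variable names and default values are illustrative, not taken from the actual file):

```python
# settings.py - hypothetical sketch of centralized defaults (names and values are illustrative).

midi_dir = 'data/midi'        # input MIDI files
samples_dir = 'data/samples'  # extracted training data
model_dir = 'data/models'     # trained models
output_dir = 'data/output'    # generated music

epochs = 100          # assumed default number of training epochs
sequence_length = 32  # assumed length of the input sequences fed to the model
threshold = 0.5       # assumed default probability threshold used by generate.py
```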
# How to use:

1. Use midi.py to export data from MIDI files (a sketch of this extraction step follows the command):

>>> ./midi.py [midi_folder_path] [output_path]
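The exact extraction logic lives in midi.py; purely as an illustration, the step could look like the sketch below, assuming the pretty_midi library and a binary piano-roll representation (the library choice and all names are assumptions, not the project's real code):

```python
# Hypothetical sketch of the extraction step - not the actual midi.py code.
import glob
import numpy as np
import pretty_midi

def extract_piano_rolls(midi_folder_path, fs=4):
    """Convert every MIDI file in a folder into a binary piano-roll array."""
    rolls = []
    for path in glob.glob(f"{midi_folder_path}/*.mid"):
        pm = pretty_midi.PrettyMIDI(path)
        roll = pm.get_piano_roll(fs=fs)                # shape: (128 pitches, time steps), velocities
        rolls.append((roll > 0).T.astype(np.float32))  # binarize and transpose to (time steps, 128)
    return rolls

# Save the extracted data so train.py can load it later.
np.savez('data/samples/samples.npz', *extract_piano_rolls('data/midi'))
```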
2. Use train.py to train a model on the extracted data (this can take a while); a sketch of a possible model definition follows the command:

>>> ./train.py [input_training_data] [model_save_path] [epochs]
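The real architecture is defined in train.py; as a rough illustration only, a small recurrent network of the kind often used for this task could be built with Keras as follows (layer sizes, loss, and names are assumptions, not the project's actual model):

```python
# Hypothetical sketch of a training setup - not the project's actual architecture.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense

def build_model(sequence_length, n_pitches):
    """A small LSTM that predicts which pitches sound in the next time step."""
    model = Sequential([
        LSTM(256, input_shape=(sequence_length, n_pitches), return_sequences=True),
        Dropout(0.3),
        LSTM(256),
        Dense(n_pitches, activation='sigmoid'),  # multi-hot output: several pitches can sound at once
    ])
    model.compile(loss='binary_crossentropy', optimizer='adam')
    return model

# X: (n_samples, sequence_length, n_pitches), y: (n_samples, n_pitches), binary piano-roll slices.
# model = build_model(X.shape[1], X.shape[2])
# model.fit(X, y, epochs=100, batch_size=64)
# model.save('data/models/model.h5')
```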
3. Use generate.py to generate music from a trained model (a sketch of the generation step follows the command):

>>> ./generate.py [trained_model_path] [output_path] [threshold]
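The [threshold] argument suggests that the model's output probabilities are cut off at a fixed value when deciding which notes to emit. A rough sketch of how such a generation loop might work, assuming a Keras model and the piano-roll representation from the earlier sketches (all names here are illustrative):

```python
# Hypothetical sketch of the generation step - not the actual generate.py code.
import numpy as np
from tensorflow.keras.models import load_model

def generate(trained_model_path, seed, steps=100, threshold=0.5):
    """Roll the model forward, keeping every pitch whose probability exceeds the threshold."""
    model = load_model(trained_model_path)
    sequence = seed.copy()                  # shape: (sequence_length, n_pitches)
    generated = []
    for _ in range(steps):
        probs = model.predict(sequence[np.newaxis, ...], verbose=0)[0]
        step = (probs > threshold).astype(np.float32)  # threshold decides which notes sound
        generated.append(step)
        sequence = np.vstack([sequence[1:], step])     # slide the input window forward
    return np.array(generated)              # piano-roll-like array of the generated music
```

Converting the resulting array back into a MIDI file under data/output (handled by the project's own code) is omitted from this sketch.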