Meddleying-MAESTRO


Full-featured Algorithmic Intelligence Music Augmentator (AIMA) with full multi-instrument MIDI output.

NOW with Karaoke support :)


A perfect tool for a musician or a composer to stay competitive and relevant in the era of Artificial Intelligence :)


5 Very Good Reasons to Choose Meddleying MAESTRO (MM) and NOT an AI model/system:

  1. Without a working, functional General Artificial Intelligence, the creation of a proper Music AI is NOT currently possible. All current SOTA Music AI implementations (e.g. MuseNet or Magenta) rely on similar music augmentation algorithms, heavy pre/post music/MIDI processing, and other tricks/hacks to compensate for the shortcomings of regular AI (as opposed to GAI).

  2. No need for 10k USD GPUs to train/run the code/software. All you need is the cheapest computer/CPU to run the MM code. MM is small and fast enough to be deployed and run on a Raspberry Pi (see the MM repo for the RPi code/implementation).

  3. Super fast "training" on a MIDI dataset and super fast music generation. It takes about 10-20 minutes to process/tune an average MIDI dataset with MM, as opposed to hours or days with AI implementations. The same applies to music generation: on the cheapest computer without a GPU, MM takes about 1 minute to generate an original performance.

  4. The code/implementation/ideas used in MM can be adapted for RAW audio/music generation.

  5. MM does NOT have the same ethical and copyright issues as AI models/systems, as it is pure algorithms/code/regular software, while offering similar output/quality of music.


SoundCloud: https://soundcloud.com/aleksandr-sigalov-61/sets/meddleying-maestro

Video: https://youtu.be/46hKTkU7CDU


How to run:

Option 1:

  1. Click on Meddleying_MAESTRO.ipynb above
  2. Click on the blue "Open in Colab" button in the GitHub preview

Option 2:

  1. git clone https://github.com/asigalov61/Meddleying-MAESTRO/
  2. cd Meddleying-MAESTRO
  3. Install all requirements listed in Requirements
  4. Unzip the provided MIDI dataset in Dataset into Dataset, or copy your own MIDIs to Dataset
  5. python MM_MIDI_Processor.py
  6. python MM_Generator.py
  7. If everything worked, a graph of the composition will pop up. Close it to start the fluidsynth player.
  8. Type quit in fluidsynth to return to the command-line prompt.
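
For convenience, here is the Option 2 workflow as a single shell session. This is a minimal sketch: the requirements file name and the dataset archive path are assumptions, so adjust them to match what the repo actually provides.

```bash
# Minimal sketch of the Option 2 workflow (requirements/archive names are assumptions).
git clone https://github.com/asigalov61/Meddleying-MAESTRO/
cd Meddleying-MAESTRO

# Install the dependencies listed in the repo's Requirements
# (adjust the file/folder name to the one in the repo).
pip install -r requirements.txt

# Either unzip the provided MIDI dataset into Dataset,
# or copy your own MIDI files into Dataset instead.
unzip Dataset/*.zip -d Dataset

# Process/tune the MIDI dataset, then generate a composition.
python MM_MIDI_Processor.py
python MM_Generator.py
```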

Enjoy! :)


Project Los Angeles

Tegridy Code 2020