Jazzify: Reharmonising Jazz with Transformers

Jul 4, 2025

Building Jazzify was a weird mix of my two worlds: music and machine learning. The idea was simple — make jazz reharmonisation more accessible by using transformers to automate chord substitutions. But actually making it work? That was another story entirely.

The project started with a bold question: Can a machine learn to reharmonise a melody like a jazz musician? I trained a transformer model on a curated jazz dataset to try and answer that. The goal was to preserve the melody while injecting stylistically accurate harmonic variations — like tritone substitutions, modal interchange, and all that good stuff you only learn after years of theory and listening.
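To make "tritone substitution" concrete: you swap a dominant 7th chord for the dominant 7th whose root sits a tritone (six semitones) away, since the two chords share the same guide tones. Here's a minimal Python sketch of that rule; the chord-symbol format and the `tritone_sub` helper are illustrative, not Jazzify's actual code:

```python
# Illustrative sketch, not Jazzify's actual code: a tritone substitution
# swaps a dominant 7th chord for the dominant 7th six semitones away.
NOTES = ["C", "Db", "D", "Eb", "E", "F", "Gb", "G", "Ab", "A", "Bb", "B"]

def tritone_sub(chord: str) -> str:
    """Substitute plain dominant symbols like 'G7'; leave anything else alone."""
    root = chord[:-1]
    if not chord.endswith("7") or root not in NOTES:
        return chord  # only simple dominant symbols are handled in this sketch
    return NOTES[(NOTES.index(root) + 6) % 12] + "7"

print(tritone_sub("G7"))  # the classic V7 sub in C major: G7 -> Db7
```

Both G7 and Db7 contain the tritone B/F (spelled B and Cb in Db7), which is why the substitution resolves so smoothly to C.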

But here’s what they don’t tell you in papers or GitHub READMEs:

  • Getting quality jazz datasets is brutal. Most were either too simple, poorly formatted, or lacked proper chord annotations. I ended up doing a lot of manual cleaning and annotation — which felt more like archaeology than data science.
  • Evaluating musical output is subjective. You can't just say, “The model got it right.” You have to hear it, and even then, what sounds ‘right’ to one person might sound off to another. So, I ran expert evaluations, collected mean opinion scores, and stared into the void of melodic coherence metrics.
  • Model training was painful. Transformers are powerful, but they're also heavy. I had to juggle Google Colab's usage limits and my underpowered local GPU, tweaking batch sizes and training in FP16 just to get the model through training without crashing.
  • Feature extraction and preprocessing were half the battle. Representing chords and melodies in a way that makes sense to a model took ages to figure out. I ended up using keyword representations and transformer-friendly tokenisation for MusicXML files.
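The mean opinion score part of the evaluation is at least mechanically simple: each listener rates a reharmonised clip on a 1–5 scale and you average the ratings per clip. A trivial sketch, with made-up ratings:

```python
# Trivial sketch: mean opinion score for one reharmonised clip.
# The ratings below are made up for illustration.
from statistics import mean

def mean_opinion_score(ratings):
    """Average listener ratings (1-5 scale) for a single clip."""
    return round(mean(ratings), 2)

print(mean_opinion_score([4, 5, 3, 4, 5]))  # five hypothetical listeners -> 4.2
```

The hard part, of course, isn't the arithmetic but getting enough qualified listeners to make the averages mean anything.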
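To give a flavour of the representation problem from that last point, here's one way to tokenise interleaved chord and melody events for a transformer. This is a hedged sketch: the token names, vocabulary layout, and `encode` helper are hypothetical, and the actual MusicXML parsing (e.g. via a library like music21) is omitted:

```python
# Hypothetical sketch of tokenising a bar of interleaved chord and melody
# events for a transformer. Token names and vocab layout are illustrative.

def build_vocab(sequences):
    """Map every distinct token to an integer id, reserving ids for specials."""
    vocab = {"<pad>": 0, "<bos>": 1, "<eos>": 2}
    for seq in sequences:
        for tok in seq:
            vocab.setdefault(tok, len(vocab))
    return vocab

def encode(seq, vocab):
    """Wrap a token sequence with begin/end markers and map it to ids."""
    return [vocab["<bos>"]] + [vocab[t] for t in seq] + [vocab["<eos>"]]

# One bar: chord symbols interleaved with (pitch, duration) note tokens
bar = ["CHORD_Dm7", "NOTE_D4_q", "NOTE_F4_q", "CHORD_G7", "NOTE_B3_h"]
vocab = build_vocab([bar])
print(encode(bar, vocab))  # -> [1, 3, 4, 5, 6, 7, 2]
```

Interleaving chords with notes is just one of many possible layouts (you could also use separate streams, or bar-level chord conditioning), and choosing between them was exactly the part that took ages.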

Still, it worked. The model could reharmonise minor-key melodies with surprisingly decent results — good enough to impress a few musicians and maybe even pass for creative on a good day.

I learned a lot. Not just about AI, but about how messy and non-linear real-world research can be. There’s a long way to go — real-time reharmonisation, support for major keys, maybe even letting the model improvise. But Jazzify was a step forward — a weird, chaotic, surprisingly musical step forward.

Stay tuned for more experiments in machine learning, music, and everything in between.