Tartan-ify visualises the similarities within a single piece of music in a self-similarity matrix.

Tartan-ify was inspired by Song Sim. I built it at Hackference and have since extended it. You can find the code for Tartan-ify on GitHub.

How it works

Firstly, all of the processing happens in your browser.

Tartan-ify divides the music into segments and compares each segment against every other segment. The comparisons are shown in a matrix with both axes denoting time and the colour denoting the difference between the segments at those times.
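The pairwise comparison can be sketched as a nested map over the segments; the names `selfSimilarityMatrix` and `difference` are illustrative, not taken from the Tartan-ify source:

```javascript
// Build a self-similarity matrix: matrix[i][j] holds the difference
// between segment i and segment j. `difference` is a placeholder for
// whatever per-segment comparison is used (Tartan-ify compares spectra).
function selfSimilarityMatrix(segments, difference) {
  return segments.map((a) => segments.map((b) => difference(a, b)));
}

// Toy usage with numbers standing in for segments.
const matrix = selfSimilarityMatrix([1, 2, 4], (a, b) => Math.abs(a - b));
// matrix[i][i] is always 0: every segment is identical to itself,
// which is why the matrix has a zero diagonal.
```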

Tartan-ify does the comparison by analysing the spectrum of each segment using a Fast Fourier Transform, and then taking the sum of the difference of the power of each frequency between the segments.
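Once each segment's power spectrum has been computed, the comparison reduces to a sum of absolute differences across frequency bins. A minimal sketch, assuming the spectra are plain arrays of power values (the function name and array layout are my assumptions):

```javascript
// Sum the absolute difference in power at each frequency bin.
// `spectrumA` and `spectrumB` are arrays of FFT power values for two
// segments; a smaller result means the segments sound more similar.
function spectralDifference(spectrumA, spectrumB) {
  let total = 0;
  for (let i = 0; i < spectrumA.length; i++) {
    total += Math.abs(spectrumA[i] - spectrumB[i]);
  }
  return total;
}
```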

How to use Tartan-ify

To use Tartan-ify, select a piece of music and then click Tartan-ify. The tool detects the song's BPM, uses this to split the song into segments, and chooses sensible defaults for the visualisation. If the BPM can't be detected, 113 BPM will be used.
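Assuming one beat per segment, the BPM translates directly into a segment duration of 60 / BPM seconds. A sketch of that calculation with the 113 BPM fallback (the names are hypothetical):

```javascript
const FALLBACK_BPM = 113; // used when BPM detection fails

// Convert a detected BPM (or null when detection failed) into a
// segment duration in seconds, assuming one beat per segment.
function segmentDurationSeconds(detectedBpm) {
  const bpm = detectedBpm || FALLBACK_BPM;
  return 60 / bpm;
}
```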

The smaller the segments and the longer the song, the longer it will take to generate the visualisation.

Advanced options



This section defines how the song will be split into segments. The larger the BPM value, the smaller the segments.

Large segments can reveal the large-scale structure of a song, for example verse, chorus and bridge. Smaller segments can reveal finer structures, such as repeated motifs or notes.

Autodetect will try to find the BPM. You can also specify a multiplier for the autodetected BPM, which can be useful, for example, to halve (0.5) or double (2) the detected value.




The scale option sets which scale is used to visualise the data.


Setting a minimum and maximum percentile fine-tunes the colour range. A wide range of different sounds, or long stretches of silence, can otherwise reduce the useful range of colours in the visualisation.
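Percentile clamping can be sketched like this; the helper below is an illustration under my own assumptions, not the Tartan-ify implementation:

```javascript
// Map difference values into the 0..1 colour range, clamping to the
// values found at the given min/max percentiles so that outliers and
// silence don't compress the useful part of the colour scale.
function normaliseWithPercentiles(values, minPercentile, maxPercentile) {
  const sorted = [...values].sort((a, b) => a - b);
  const pick = (p) =>
    sorted[Math.min(sorted.length - 1, Math.floor((p / 100) * sorted.length))];
  const lo = pick(minPercentile);
  const hi = pick(maxPercentile);
  // Values outside [lo, hi] are clamped to the ends of the range.
  return values.map((v) => Math.min(1, Math.max(0, (v - lo) / (hi - lo))));
}
```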


Other modes

Tartan-ify can also generate multiple visualisations in one run.