A proposal for a standard format for representing music in JSON, with the aim of making emerging web apps using the new Web Audio and Web MIDI APIs interoperable.
This document is intended as a discussion starter. Please comment, propose ideas and make pull requests.
Here are the first two bars of Dolphin Dance represented in Music JSON:
```json
{
    "name": "Dolphin Dance",
    "events": [
        [2,   "note", 76, 0.8, 0.5],
        [2.5, "note", 77, 0.6, 0.5],
        [3,   "note", 79, 1, 0.5],
        [3.5, "note", 74, 1, 3.5],
        [10,  "note", 76, 1, 0.5],
        [0,   "chord", "C", "∆", 4],
        [4,   "chord", "G", "-", 4]
    ],
    "interpretation": {
        "time_signature": "4/4",
        "key": "C",
        "transpose": 0
    }
}
```
A sequence is an object with the properties `name` and `events`, where `name` is a string and `events` is an array of events.
```
{
    "name": "My Tune",
    "events": [event1, event2, ...],
    "sequences": [sequence1, sequence2, ...],
    "interpretation": {...}
}
```
A sequence may also optionally contain an array of dependent `sequences`, and an `interpretation` object, which is used to give hints to a renderer.
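To make the shape concrete, here is a rough structural check for a sequence object under the rules above. This is an illustration only, not part of the spec, and `isSequence` is a hypothetical helper name:

```javascript
// Minimal structural check for a sequence object: name is a string,
// events is an array, and the optional sequences and interpretation
// properties have the right shapes when present.
function isSequence(object) {
    return typeof object.name === "string"
        && Array.isArray(object.events)
        && (object.sequences === undefined || Array.isArray(object.sequences))
        && (object.interpretation === undefined || typeof object.interpretation === "object");
}

console.log(isSequence({ name: "My Tune", events: [] })); // true
console.log(isSequence({ name: "My Tune" }));             // false – events is required
```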
An event is an array containing a time, a type, and any data needed to describe the event.

```
[time, type, data ...]
```
An event MUST have a start `time` and a `type`. Any extra data an event carries depends on its type.
- `time` – FLOAT, describes a point in time from the start of the sequence. `time` values are arbitrary – they describe time in beats, rather than in absolute time such as seconds. The absolute time at which an event is played depends upon the rate and the start time of the sequence it inhabits.
- `type` – STRING, the event type. The type determines the structure of the rest of the data in the event array.
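Because `time` is in beats, a player must convert it to absolute time using the start time and rate of the containing sequence. A sketch of that conversion, assuming a rate expressed in beats per second (the function name is an assumption for illustration):

```javascript
// Convert an event's beat time into absolute seconds, given the
// sequence's start time in seconds and its rate in beats per second.
function beatsToSeconds(beatTime, sequenceStart, rate) {
    return sequenceStart + beatTime / rate;
}

// A sequence starting at 2s, playing at 2 beats per second:
console.log(beatsToSeconds(3, 2, 2)); // 3.5
```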
[time, "note", number, velocity, duration]
[time, "param", name, value, curve, duration]
[time, "control", number, value]
[time, "pitch", semitones]
[time, "chord", root, mode, duration]
[time, "sequence", id, address]
[time, "note", number, velocity, duration]
number
– INT [0-127], represents the pitch of a note
velocity
– FLOAT [0-1], represents the force of the note's attack
duration
– FLOAT [0-n], represents the duration at the sequence's current rate
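The note `number` follows MIDI pitch numbering, so a renderer targeting Web Audio would typically convert it to a frequency in Hz using equal temperament. This conversion is standard MIDI practice rather than something mandated by the format:

```javascript
// MIDI note number to frequency in Hz, equal temperament, A4 (69) = 440 Hz.
function numberToFrequency(number) {
    return 440 * Math.pow(2, (number - 69) / 12);
}

console.log(numberToFrequency(69));            // 440
console.log(numberToFrequency(76).toFixed(2)); // "659.26" – the E5 in the example above
```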
We'd welcome feedback on the merits of using a "note" event with a duration over separate "noteon" and "noteoff" events (as in MIDI): github.com/soundio/music-json/issues.
[time, "param", name, value, curve, duration]
name
– STRING, the name of the param to control
value
– FLOAT, the new value of the param
curve
– STRING ["step"|"linear"|"exponential"], represents the type of ramp to use to transition to value
duration
– NUMBER [seconds], where curve
is not "step"
, defines the duration of the ramp
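The three curve types map naturally onto the Web Audio `AudioParam` automation methods. A sketch of that mapping, returning a description of the call rather than invoking the API so it can run outside the browser (the helper name is an assumption):

```javascript
// Map a "param" event's curve to the AudioParam automation call a
// Web Audio renderer might schedule, as a { method, args } description.
function automationFor([time, type, name, value, curve, duration]) {
    switch (curve) {
        case "step":
            return { method: "setValueAtTime", args: [value, time] };
        case "linear":
            return { method: "linearRampToValueAtTime", args: [value, time + duration] };
        case "exponential":
            return { method: "exponentialRampToValueAtTime", args: [value, time + duration] };
        default:
            throw new Error("Unknown curve: " + curve);
    }
}

console.log(automationFor([0, "param", "gain", 0.5, "linear", 1]));
// { method: "linearRampToValueAtTime", args: [0.5, 1] }
```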
"control" events are useful for MIDI apps, but "param" events are preferred where possible.
[time, "control", number, value]
number
– INT [0-127], represents the number of the control
value
– FLOAT [0-1], represents the value of the control
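An incoming MIDI control change carries a 7-bit data byte, so converting it to a "control" event means scaling that byte into the [0-1] range. A sketch of the idea (not Soundio's MIDI library):

```javascript
// Normalise a MIDI control change into a Music JSON "control" event.
// dataByte is the 7-bit MIDI value, 0-127.
function ccToEvent(time, controlNumber, dataByte) {
    return [time, "control", controlNumber, dataByte / 127];
}

console.log(ccToEvent(0, 7, 127)); // [0, "control", 7, 1]
```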
[time, "pitch", semitones]
value
– FLOAT [semitones], represents a pitch shift in semitones
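A shift of `semitones` corresponds to a playback-rate (or detune) ratio of 2^(semitones/12), which is how a sample player might apply it:

```javascript
// Pitch shift in semitones to a playback-rate ratio.
function semitonesToRatio(semitones) {
    return Math.pow(2, semitones / 12);
}

console.log(semitonesToRatio(12)); // 2 – one octave up
console.log(semitonesToRatio(0));  // 1 – no shift
```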
A chord gives information about the current key centre and mode of the music. A chord event could be used by a music renderer to display chord symbols, or could be interpreted by a music generator to improvise music.
[time, "chord", root, mode]
root
– STRING ["A"|"Bb"|"B" ... "F#"|"G"|"G#"], represents the root of the chord
mode
– STRING ["∆"|"-" ... TBD], represents the mode of the chord
[time, "sequence", data, rate]
data
– STRING|OBJECT, the name of a sequence found in this sequence's sequences
array, or a sequence object.
rate
– FLOAT [0-n], the rate at which to play back the sequence relative to the rate of the
current sequence.
Events in the child sequence should be played back on the target(s) of the current sequence.
The sequence event MAY have an optional final parameter, `address`, that defines an alternative target to play the child sequence to.

```
[time, "sequence", data, rate, address]
```
- `address` – NUMBER|STRING, the id or path of an object to play the sequence to

```
// Trigger object id 3
[0.5, "sequence", "groove", 1, 3]
```
The optional `interpretation` object contains meta information not directly needed to render the music as sound, but required to render music as notation. A good renderer should be capable of making intelligent guesses as to how to interpret Music JSON as notation, so none of these properties are required.
```json
{
    "time_signature": "4/4",
    "key": "C",
    "transpose": 0
}
```
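For instance, a notation renderer could honour the `transpose` hint by shifting note numbers before drawing them, leaving other event types alone. A sketch (the helper name is an assumption):

```javascript
// Apply an interpretation's transpose hint (in semitones) to note
// events, passing other event types through untouched.
function transposeEvents(events, transpose) {
    return events.map(event =>
        event[1] === "note"
            ? [event[0], "note", event[2] + transpose, ...event.slice(3)]
            : event
    );
}

console.log(transposeEvents([[0, "note", 60, 1, 1]], 2)); // [[0, "note", 62, 1, 1]]
```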
- sound.io creates and exports Music JSON.
- Soundstage, the JS library that powers sound.io, can be used to edit and play Music JSON in any web page.
- MIDI: Soundio's MIDI library converts MIDI events to Music JSON events with its `normalise` method.
- Scribe is a music notation interpreter and SVG renderer that consumes (an old version of) Music JSON.
- OSC spec: http://opensoundcontrol.org/spec-1_0
- OSC example messages: http://opensoundcontrol.org/files/OSC-Demo.pdf
- Music XML: http://www.musicxml.com/for-developers/
- VexFlow: http://www.vexflow.com/