the touch.
The keyboard, made readable.
MIDI is forty-three years old. It was designed in 1983 to let two synthesisers, made by competitors, agree on what a note was — when it started, when it ended, how hard it had been struck, on which channel, with which programme. A channel message fits in three bytes: a status byte and up to two data bytes, sent over a 5-pin DIN cable at 31,250 baud. That spec has not been broken since.
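Those three bytes can be unpacked in a few lines. A minimal sketch, decoding the channel messages named above; `decodeMidi` is a name invented here, not part of any spec:

```javascript
// Status byte: high nibble is the message type, low nibble the channel (0–15).
// Note-on with velocity 0 is, by long convention, a note-off.
function decodeMidi([status, data1, data2]) {
  const type = status & 0xf0;
  const channel = status & 0x0f;
  if (type === 0x90 && data2 > 0) {
    return { kind: "noteOn", channel, note: data1, velocity: data2 };
  }
  if (type === 0x80 || (type === 0x90 && data2 === 0)) {
    return { kind: "noteOff", channel, note: data1 };
  }
  if (type === 0xc0) {
    // Programme change carries only one data byte.
    return { kind: "programChange", channel, program: data1 };
  }
  return { kind: "other", channel };
}
```

So `[0x90, 60, 100]` is middle C struck on channel 1 at velocity 100 — the entire vocabulary of the piece above.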
WebMIDI is the same protocol, with the wire replaced by an event. Plug a controller into a laptop USB port; navigator.requestMIDIAccess() returns a promise that resolves to an access object holding a map of inputs; each input fires onmidimessage with a three-byte buffer in the same shape your synthesiser expected forty years ago. The browser is now a MIDI host. It can read what you played; nothing else has to know.
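The wiring can be sketched in one function. This assumes a browser with Web MIDI support; `enableMidi` and `onMessage` are names invented here:

```javascript
// Hypothetical sketch: hand every connected input's bytes to one callback.
async function enableMidi(onMessage) {
  // The promise rejects if the user declines the permission prompt.
  const access = await navigator.requestMIDIAccess();
  // access.inputs is a Map of port id → MIDIInput.
  for (const input of access.inputs.values()) {
    input.onmidimessage = (event) => onMessage(event.data); // Uint8Array
  }
  // Devices can be plugged in after the page loads.
  access.onstatechange = (event) => {
    if (event.port.type === "input" && event.port.state === "connected") {
      event.port.onmidimessage = (e) => onMessage(e.data);
    }
  };
}
```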
Three bytes. Forty-three years. Still in use.
What you are looking at.
The piano roll above scrolls right-to-left through a 6-second window. Each note you play paints a phosphor-green bar at its pitch; the bar stretches leftward for as long as you hold the note. The bars below the roll are the keys themselves — they light when struck, regardless of source. Above them, in monospace, are the QWERTY equivalents: the bottom row of your laptop keyboard maps to one octave from C4, the row above that to the next octave up.
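A QWERTY-to-pitch table like that is a few lines. A hypothetical sketch, assuming the bottom letter row covers the white keys of one octave from C4 (MIDI note 60); the demo's actual layout may differ:

```javascript
// Semitone offsets of the white keys C D E F G A B within an octave.
const WHITE = [0, 2, 4, 5, 7, 9, 11];
const bottomRow = ["z", "x", "c", "v", "b", "n", "m"];
// Map each key to a MIDI note number, anchored at C4 = 60.
const keyToNote = new Map(bottomRow.map((k, i) => [k, 60 + WHITE[i]]));
// "z" → 60 (C4) … "m" → 71 (B4); the next row up would anchor at 72.
```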
Three input paths feed the same handler: WebMIDI events from any connected device, key-down events from the keyboard you are typing on, and pointer-down events on the visual keys. They all converge on a single function — noteOn(note, velocity) — that updates the held-notes set, schedules a Web Audio voice, and pushes a draw event into the roll. The page hears you the same way regardless of how you tell it.
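The convergence described above can be sketched directly. `startVoice` and `drawBar` are hypothetical helpers standing in for the Web Audio voice and the roll's draw queue:

```javascript
// One held-notes set, one entry point, three callers.
const held = new Set();

function noteOn(note, velocity) {
  if (held.has(note)) return; // ignore key-repeat and double triggers
  held.add(note);
  // startVoice(note, velocity); // schedule a Web Audio voice
  // drawBar(note);              // push a draw event into the roll
}

function noteOff(note) {
  held.delete(note);
  // stopVoice(note);
}

// All three paths end in the same call:
// midiInput.onmidimessage = (e) => noteOn(e.data[1], e.data[2]);
// window.onkeydown = (e) => noteOn(lookupQwerty(e.key), 96); // hypothetical lookup
// keyElement.onpointerdown = () => noteOn(keyElement.note, 96);
```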
Old protocol, new permission.
WebMIDI requires explicit user consent. The first time you click Enable MIDI, your browser asks whether this page may speak to your music devices — the same prompt it shows for camera and microphone access. The surface is small: list inputs, list outputs, listen for messages, send messages. There is no rendering, no synthesis, no clock; browsers refuse to provide them. You bring the synthesiser.
The triangle-wave voices you can hear are not coming from MIDI; they are coming from Web Audio, which is a separate, wholly modern stack that arrived alongside WebMIDI as part of the same conversation about the browser-as-instrument. MIDI tells you what was played. Audio tells you what it sounds like. The page glues them together, in nine lines of JavaScript, and sounds like a real instrument because the protocols have always been good enough.
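The glue can be sketched in roughly that many lines. A minimal triangle-wave voice, assuming an AudioContext named `ctx`; `playVoice` and `midiToHz` are names invented here:

```javascript
// Equal temperament: note 69 is A4 at 440 Hz, one semitone is a 2^(1/12) ratio.
function midiToHz(note) {
  return 440 * 2 ** ((note - 69) / 12);
}

function playVoice(ctx, note, velocity) {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.type = "triangle";
  osc.frequency.value = midiToHz(note);
  gain.gain.value = velocity / 127; // MIDI velocity → linear gain
  osc.connect(gain).connect(ctx.destination);
  osc.start();
  return { osc, gain }; // the caller stops osc on noteOff
}
```

MIDI supplies the note number and velocity; everything audible comes from the oscillator and gain nodes.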
MIDI tells you what was played. Audio tells you what it sounds like.