An update to a 37-year-old digital protocol could profoundly change the way music sounds

Dan Kopf
A synthesizer

A lot of big things happened in music in 1983. It was the year Michael Jackson’s album Thriller hit number one across the world, compact discs were first released in the US, and the Red Hot Chili Peppers formed. Yet there was one obscure event that was more influential than all of them: MIDI 1.0 was released. MIDI stands for “Musical Instrument Digital Interface” and, after 37 years, it has finally received a major update. MIDI 2.0 is live, and it could mean the end of the keyboard’s dominance over popular music.

Whether you know it or not, MIDI has changed your music listening life. MIDI is the protocol by which a musical performance is encoded as digital information. When a musician plays a MIDI-enabled device, like a synthesizer or drum machine, MIDI is used to digitize the different elements of the music, like the note and the force, or velocity, with which it was played (a softly plucked C, for example, or a full-on fortissimo F-sharp). This allows music producers and technicians to adjust aspects of the music later on. For example, they might choose to change the pitch of certain notes or even switch the sound from a keyboard to a trumpet or guitar. Basically, it is what musicians use to program music. Ikutaro Kakehashi and Dave Smith, the leaders in creating MIDI in the early 1980s, rightfully won a Technical Grammy for their work in 2013.

The digitization of music existed before MIDI, but creating a universal standard was a hugely important step, according to the composer Adam Neely. Instruments and computers made by different companies could easily communicate with each other using this agreed-upon protocol, simplifying the creative process and giving musicians a choice of what instruments to use. If you want to use a Roland or Yamaha keyboard, or an Apple or Microsoft computer, MIDI can work with all of them.

“[MIDI] is now at the core of music making in the entire music industry, with the possible exception of classical music and acoustic based music which doesn’t interface with computers,” says Neely. Most people don’t use MIDI directly, though, Neely explains, but work with it through a digital audio workstation like Ableton Live or Pro Tools.

Though MIDI has done an exceptional job of digitizing music for the last 37 years, it hasn’t been perfect. MIDI quantizes music, meaning it forces continuous musical qualities into a fixed set of discrete values. In MIDI 1.0, all data was stored in 7-bit values, so those qualities were quantized on a scale of 0 to 127. Features like volume, pitch, and how much of the sound should come out of the right or left speaker are all measured on this scale, with just 128 possible points. This is not a lot of resolution: some really sophisticated listeners can clearly hear the steps between points.
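The coarseness of 7-bit control is easy to see in a quick sketch. The conversion function below is hypothetical (MIDI itself defines no such helper); it simply illustrates how a continuous level collapses onto one of 128 steps:

```python
# MIDI 1.0 stores controller values (volume, pan, etc.) in 7 bits: 0..127.
def to_midi1(level: float) -> int:
    """Quantize a normalized level (0.0 to 1.0) to a 7-bit MIDI 1.0 value."""
    return round(level * 127)

# Two audibly different levels can land on the same step:
print(to_midi1(0.500))  # 64
print(to_midi1(0.503))  # 64 -- indistinguishable after quantization
# The smallest representable change is 1/127, roughly 0.8% of full scale.
```

Any nuance finer than that 1/127 step is simply lost in transmission, which is the resolution problem the article describes.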

A world of possibilities

In his influential 2012 book How Music Works, former Talking Heads frontman David Byrne notes that keyboards have become the central instrument in music composition because they translate well to MIDI. MIDI’s low resolution made it better suited to modeling Western music and music played on instruments with discrete tones, like keyboards. Music that relies on notes outside the standard Western scale, and music played on string instruments, are not as well represented. Neely says it is particularly difficult to capture the sounds of Indian and Turkish music. Sophisticated MIDI users could work around these issues, but doing so was challenging, and not all artists have the time or desire to get into the technical minutiae of programming MIDI.

These may now be issues of the past. In early January 2020, the MIDI Manufacturers Association, the nonprofit organization that manages MIDI, announced the release of MIDI 2.0. The new protocol took years of work from the organization’s volunteers, as well as buy-in from companies like Google, Apple, Microsoft, and all of the major music manufacturers.

There are a few major changes in the new version. The biggest development is the expansion from 7-bit values to 32-bit values. Mike Kent, one of the technical leaders in creating MIDI 2.0, says this is like going from the resolution of a 1980s television to the high-def televisions of today. It means that instead of 128 steps for features like volume, there will now be billions. Producers think this might be particularly helpful for allowing subtle “pitch bend” and for controlling how much bass and treble are emphasized in every note.
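The jump in resolution is easy to quantify. The sketch below uses a naive bit-shift to widen an old value into the new range; the actual MIDI 2.0 specification defines a more careful min/center/max scaling, so treat this as an illustration of the step counts rather than the official mapping:

```python
# Number of representable steps at each bit depth.
midi1_steps = 2 ** 7    # 128
midi2_steps = 2 ** 32   # 4,294,967,296 -- over 4 billion

# Each old 7-bit step is subdivided into 2**25 = 33,554,432 new steps.
print(midi2_steps // midi1_steps)  # 33554432

# Naive upscale of a MIDI 1.0 value into the 32-bit range
# (illustrative only; not the spec's exact bit-scaling algorithm):
def upscale_7_to_32(v: int) -> int:
    return v << 25  # place the 7 bits at the top of the 32-bit word

print(upscale_7_to_32(127))  # 4261412864
```

That difference, from 128 steps to more than four billion, is what makes continuous gestures like pitch bends feel smooth instead of stair-stepped.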

Also, with more memory, there are simply many more features that MIDI 2.0 can try to emulate. More memory should also reduce the chance that the timing between playing a MIDI instrument and the digital recording will be slightly off. This should mean music played on MIDI 2.0 instruments will feel more analog, and it should make it possible for non-keyboard instruments to work better with MIDI. Historically, guitar, violin, and trumpet players have had to learn to play keys in order to better translate their work through MIDI. Now, hopefully, they will be able to play their instrument of choice as an input into MIDI-compatible recording software.

“Digital instruments have changed the way we make music and have made completely new music forms, which I love. I am a synthesizer geek,” said Kent. He continued:

“But acoustic instruments—which have been around for hundreds of years—have a different type of expression and sometimes digital instruments have not been able to deliver the same expression that analog instruments have been able to deliver. I play synthesizer and I play the trumpet and the trumpet really feels to me like a part of my body. I think music and it comes out [of my trumpet]. But I believe, because of MIDI 2.0, synthesizers and other electronic instruments will become more expressive. We will have more individual control over each note.”

Another major advancement is that MIDI 2.0 allows for bi-directional communication between devices. In the original MIDI, one device could send information to another, but that device could not communicate back. The fact that MIDI 2.0 is bidirectional has two major effects. First, it means the new protocol is backwards compatible: a MIDI 2.0 device can ask its counterpart what it supports and fall back to MIDI 1.0, so the billions of MIDI 1.0 devices already out in the world won’t become obsolete. Second, MIDI 2.0 devices will be able to communicate with each other about how features should be digitized, making life a lot easier for music makers, because they don’t have to address this later on.
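The negotiation described above can be pictured with a toy model. All class and method names here are invented for illustration; they do not reflect the actual MIDI-CI message formats:

```python
# Toy model of bidirectional capability negotiation: a MIDI 2.0 device
# queries its peer and falls back to MIDI 1.0 when the peer is older.
class Device:
    def __init__(self, name: str, supports_midi2: bool):
        self.name = name
        self.supports_midi2 = supports_midi2

    def reply_capabilities(self) -> dict:
        # In MIDI 1.0 this reply was impossible: data flowed one way only.
        return {"midi2": self.supports_midi2}

def negotiate(initiator: Device, responder: Device) -> str:
    caps = responder.reply_capabilities()  # the responder talks *back*
    return "MIDI 2.0" if caps["midi2"] else "MIDI 1.0 fallback"

new_synth = Device("synth", supports_midi2=True)
old_drum_machine = Device("drums", supports_midi2=False)
print(negotiate(new_synth, old_drum_machine))  # MIDI 1.0 fallback
```

The key point is the direction of the second arrow: because the responder can answer, an old device is simply spoken to in the dialect it understands instead of being orphaned.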

Angelo Duncan, a musician and electronic music production teacher at San Francisco’s Women’s Audio Mission, is excited about the changes. Duncan’s primary issues with MIDI have been latency (the fact that MIDI doesn’t perfectly capture the timing of how the music was played) and that MIDI guitars typically don’t work as well as keyboards. The first problem is likely to be solved by MIDI 2.0, and Duncan is optimistic it might help with the second.

“I’ve always been interested in guitar MIDI controllers. I do play keys, but I am a much more proficient guitarist,” said Duncan, adding:

“I think using a MIDI guitar would change the way I make music. The way our brain orients to making music on a guitar is just different to a keyboard layout. I used to have a MIDI guitar instrument, but I don’t have it anymore because I felt like there was a lot of latency and I didn’t really like the results I got. I am hoping [MIDI 2.0] will solve some of the issues I had before.”

In terms of the overall effect of MIDI 2.0, Duncan believes it will simplify the workflow of a lot of producers, since it is supposed to communicate better with software like Ableton. Duncan also thinks it will have a big impact on the composition of musical scores for movies and television, which are almost always written in MIDI (like the Game of Thrones theme song). Scores often use strings and brass instrumentation, and MIDI 2.0’s higher resolution should better capture the textures, tonality, and range of those instruments, according to Duncan.

Among the producers Quartz spoke with, there were very few concerns about the updated protocol. Of course, with any new technology there will be hiccups, but since MIDI 2.0 doesn’t make MIDI 1.0 obsolete, there is more excitement about the possibilities than concern about drawbacks.

Adam Neely points out that most of the effects won’t be heard for years, or perhaps decades. The amount of choice MIDI 2.0 allows may now seem superfluous, but he thinks it’s hard to know what music will sound like 50 years from now. The recent MIDI update could allow people to build musical worlds we can’t yet imagine.

 
