Today’s moment of science begins with a quiz. Memorize this sequence: one, seven, five, seven, three, nine, eleven, four, six. Got it? Now try this sequence: (Simple melody such as “Happy Birthday”)
Unless you’re really good with numbers, the second sequence is a whole lot easier to follow than the first. But it can still be considered a numerical sequence: just call every note in the octave a number from one to twelve. This ability of the human brain to follow melodic sequences, and to tell when something in them has changed, is being used by computer programmers to catch bugs. Not the real kind of bugs, mind you: software bugs.
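For the curious, the note-to-number idea can be sketched in a few lines of Python. The note names and the "Happy Birthday" opening phrase below are illustrative assumptions for this sketch, not any official encoding:

```python
# Label each of the twelve notes in an octave with a number from
# one to twelve (C = 1 up through B = 12, an assumed convention).
NOTE_NUMBERS = {
    "C": 1, "C#": 2, "D": 3, "D#": 4, "E": 5, "F": 6,
    "F#": 7, "G": 8, "G#": 9, "A": 10, "A#": 11, "B": 12,
}

def melody_to_numbers(notes):
    """Convert a list of note names into a 1-12 numerical sequence."""
    return [NOTE_NUMBERS[n] for n in notes]

# The opening phrase of "Happy Birthday" (G G A G C B) as numbers:
print(melody_to_numbers(["G", "G", "A", "G", "C", "B"]))
```

Run it and the familiar tune comes out as a plain list of integers, which is exactly the sense in which a melody "is" a numerical sequence.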
James Alty of England’s Loughborough University and Paul Vickers of Northumbria University have started converting computer languages into musical sequences. The idea is that software programmers scanning new programs for mistakes can, instead of examining the program language itself, sit back, listen to computer-generated melodies play, and wait for mistakes. A melody in a major key might represent one processing path; the same melody in a minor key, a separate path. Didn’t think “Stairway to Heaven” went like that? Stop the program: you’ve just found an error.
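The major-versus-minor idea can be sketched as well. Everything here, from the scale tables to the toy branch being "tagged" with a mode, is an assumption made for illustration, not the researchers' actual system:

```python
# Semitone offsets for the two modes; the minor scale flattens the
# 3rd, 6th, and 7th degrees, which is what the ear picks up on.
MAJOR = [0, 2, 4, 5, 7, 9, 11]
MINOR = [0, 2, 3, 5, 7, 8, 10]

def render(degrees, mode):
    """Turn scale degrees (1-7) into semitone offsets in a mode."""
    scale = MAJOR if mode == "major" else MINOR
    return [scale[d - 1] for d in degrees]

def traced_path(x):
    # Toy program: each branch plays the same tune in a different key,
    # so a listener can hear which processing path was taken.
    return "major" if x >= 0 else "minor"

motif = [1, 3, 5, 3, 1]  # a simple melody, written as scale degrees
print(render(motif, traced_path(5)))    # the major-key path
print(render(motif, traced_path(-5)))   # the minor-key path
```

The two printed sequences differ only in the third scale degree, which is the flattened third that makes a minor key sound minor; hearing the "wrong" version of the tune is the cue that the program went down an unexpected path.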
This is a creative and, as it turns out, effective way to help bring the next generation of software closer to bug-free. And if rock bugs you more than bugs do, never fear: the programmers are doing the same thing with soothing classical melodies as well.