A couple of weeks ago, my piano teacher and I were bouncing ideas off each other about meta-composing music software. The idea was this:

There is a system that takes MIDI input from a bunch of instruments and pushes output to the speakers and lights. The software running on this system analyzes the MIDI data it's getting and determines which sounds to use, based on triggers set up by the composer (when I play an F7 chord 3 times within 2 seconds, switch from the harpsichord sound to the piano sound), pedals, or actual real-time analysis of the music. It would control the lights based on the performance and the sounds of the instruments in a similar fashion: the musician would only have to vaguely specify what they wanted, and real-time analysis of their playing would do the rest. Procedurally generated music could play along with the musician on the fly as well. Essentially, the software would play along with the performer, with one guiding the other. I imagine it would take some practice to get used to such a system, but that it could have quite incredible results.
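To make the trigger idea concrete, here is a rough Python sketch. All the names are invented, and it assumes chords have already been extracted from the incoming MIDI stream:

```python
import time

# Hypothetical sketch of a composer-defined trigger: fire a callback when a
# target chord is played N times within a rolling time window. Nothing here
# comes from an existing library; it's just the rule spelled out in code.

class ChordTrigger:
    def __init__(self, target_chord, count=3, window=2.0, on_fire=None):
        self.target = frozenset(target_chord)  # e.g. MIDI note numbers of F7
        self.count = count                     # repetitions required
        self.window = window                   # seconds
        self.on_fire = on_fire                 # called when the rule matches
        self.hits = []                         # timestamps of recent matches

    def feed(self, chord):
        """Call with each detected chord (a set of MIDI note numbers)."""
        if frozenset(chord) != self.target:
            return
        now = time.monotonic()
        # keep only hits that are still inside the rolling window
        self.hits = [t for t in self.hits if now - t <= self.window]
        self.hits.append(now)
        if len(self.hits) >= self.count:
            self.hits.clear()
            if self.on_fire:
                self.on_fire()

# F7 rooted at F4 (MIDI 65): F, A, C, E-flat
trigger = ChordTrigger({65, 69, 72, 75}, count=3, window=2.0,
                       on_fire=lambda: print("harpsichord -> piano"))
```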

I'm a big fan of improv jazz. One characteristic of improv that is lacking from other art forms is its transience. A painting can be appreciated 10 or 1000 years after it has been painted, but music (especially extemporized music) is as much about the performance as about the creation. I think the software I described would add a great deal to the performance, since with it, playing the exact same piece would result in a completely different show each time.

So, now for the questions.

Am I crazy?

Does software to do any or all of this exist yet? I've done some research and haven't turned up anything. The key to this system is that it is running during the performance.

Were I to write something like this, would a scripting language such as Python be fast enough to do the computations that I need? Presumably it'd be running on a fairly quick system, and could take advantage of the 2^n core processors Intel keeps releasing.

Can any of you share your experience and advice concerning interfacing with musical instruments and lights and the like?

Have any ideas or suggestions? Cold and harsh criticism?

Thanks for your time in reading this, and for any and all advice! (And sorry for the joke in the tags, I couldn't resist.)

+1  A: 

I have used PyAudio quite extensively for dealing with raw audio input, and found it to be very unpythonic, acting much more like a very thin wrapper over C code. However, if you're dealing with MIDI rather than raw waveforms, then your tasks are quite a bit simpler, and Python should be quite fast enough, unless you play at 10,000 beats per minute :)
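For example, here is a minimal sketch of reading live MIDI events in Python. The mido library is used purely as one possible choice; it's not what the PyAudio work above used:

```python
import mido  # assumes mido (plus a backend such as python-rtmidi) is installed

# Minimal sketch: print note-on events from the default MIDI input port.
with mido.open_input() as port:   # opens the first available input
    for msg in port:              # blocks, yielding messages as they arrive
        if msg.type == 'note_on' and msg.velocity > 0:
            print(f"note {msg.note}, velocity {msg.velocity}")
```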

Some of the issues: detecting simultaneity, and harmonic analysis (i.e., chord structure).
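For the simultaneity issue, one rough approach is to treat note-ons whose onsets fall within a small window as a single chord. The 30 ms threshold below is a guess, since live playing is never perfectly simultaneous:

```python
CHORD_WINDOW = 0.03  # seconds; an assumed threshold, tune to the player

def group_into_chords(events):
    """events: (timestamp, midi_note) pairs in time order.
    Yields sets of notes whose onsets cluster within CHORD_WINDOW."""
    chord, last_t = set(), None
    for t, note in events:
        if last_t is not None and t - last_t > CHORD_WINDOW and chord:
            yield chord          # gap detected: close the current chord
            chord = set()
        chord.add(note)
        last_t = t
    if chord:
        yield chord

# e.g. [(0.00, 60), (0.01, 64), (0.02, 67), (0.50, 62)]
# yields {60, 64, 67} and then {62}
```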

This is also an 80/20 problem: if you restrict the allowed chord progressions, it becomes quite a bit simpler. After all, what does "playing along" mean, anyway, right?

(Also, at electronic music conferences I've been to, there are lots of people doing various real-time accompaniment experiments based on input sound and movement.) Good luck!

Gregg Lind
I think as a first goal, the software would detect one-note triggers and just do ambient sounds, maybe guess at the key signature. What are the names of these electronic music conferences? If other people are doing this, it'd be great to get in touch with them.
mdkess
http://spark.cla.umn.edu/ Spark Festival of Electronic Music... but I think most large electronic music schools might have similar projects.
Gregg Lind
+2  A: 

Look at PureData. It can do extensive MIDI analysis, and folks use it for performance.

Indeed, there's a video that flashes past a PureData screen. It shows someone interacting with a rather complex instrument using PD.

Also, look at CSounds.

S.Lott
+6  A: 

People have used Max/MSP to do this kind of thing with MIDI, creating video accompaniment or just MIDI accompaniment. It's a completely domain-specific app that was probably inspired by Smalltalk or something; barely any real programmer could love it, but musician-programmers do.

Despite the text on the site I just linked to, and the fact that 'everyone' uses the commercial version, it wasn't always a commercial product. IRCAM eventually released its own lineage, called jMax. PureData, mentioned in another answer here, is another rewrite of that lineage.

There's also CSound, which wasn't meant to be real-time, but is likely able to run in real time now that you have a decent computer compared to where CSound started.

Some people have also hacked Macromedia Director extensions to allow doing MIDI stuff in Lingo... That's very outdated, and hence some of them have moved to more modern Adobe environments.

dlamblin
+2  A: 

You might also look at ChucK and SuperCollider, the two most popular 'real' real-time music programming languages.

Also, you might be surprised at how much you can accomplish with Ableton Live racks.

(and it's CSound. No 's' at the end)

msutherl
+1  A: 

see also:

I have no idea if the second one is actually real or worth looking at. KeyKit, however, is.

dreftymac
+1  A: 

You might contact Gary Lee Nelson in the TIMARA department at Oberlin. 20 years ago I did a project that auto-generated the rhythm section for 12-bar blues, and I recall him describing a tool he knew of that did essentially what you're describing.

plinth
+1  A: 

You might be interested in GenJam.

Steve Fallows