
David Cope experimented with Markov-chain-style generation of Bach pieces in the '80s, but he ended up switching to other models, like augmented transition networks (ATNs), because he wanted more global coherence, rather than locally coherent but wandering/aimless pieces.
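
To give a rough idea of what "Markov-chain-style" means here, a toy first-order sketch (not Cope's actual system; the note names and training sequence are made up):

    import random
    from collections import defaultdict

    # Build a first-order transition table: each note maps to the
    # notes that followed it in the training sequence.
    def build_transitions(notes):
        table = defaultdict(list)
        for a, b in zip(notes, notes[1:]):
            table[a].append(b)
        return table

    # Walk the table: each next note depends only on the current one,
    # which is exactly why the output tends to be locally plausible
    # but globally aimless.
    def generate(table, start, length):
        out = [start]
        while len(out) < length and table.get(out[-1]):
            out.append(random.choice(table[out[-1]]))
        return out

    corpus = ["C", "E", "G", "E", "C", "D", "E", "F", "E", "D", "C"]
    print(generate(build_transitions(corpus), "C", 16))

The "locally coherent but wandering" behavior falls straight out of the model: transitions only look one note back, so nothing enforces phrase structure or long-range form.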

He eventually built a more complex system, EMI, that generates pieces in the style of around 100 composers; some of those pieces have passed a "musical Turing test," in that scholars of the composer in question thought they might be genuine works.

There's a lot of interesting experimentation by other researchers with just about every possible generative technique, though, from Markov models to HMMs, context-free grammars, L-systems, cellular automata, etc. This 2009 book has a good overview of what people have done, though it's textbook-priced: http://www.amazon.com/gp/product/3211999159/ref=as_li_ss_tl?...
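
For a flavor of the grammar-ish end of that list, here's a toy L-system sketch (my own illustration, not from the book): rewrite rules repeatedly expand a seed note into a longer note string, which you'd then map onto pitches and durations however you like.

    # Arbitrary rewrite rules over note symbols, purely for illustration.
    rules = {"C": "C G E", "E": "E D", "G": "G C"}

    def expand(seed, generations):
        s = seed
        for _ in range(generations):
            s = " ".join(rules.get(sym, sym) for sym in s.split())
        return s

    print(expand("C", 3))  # "C G E G C E D G C C G E E D D"

Deterministic rules like these give you self-similar, hierarchical structure, which is the appeal of L-systems and grammars over a purely local Markov walk.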

Yeah, and there's a lot more out there:

http://www.amazon.com/Art-Artificial-Evolution-Evolutionary-... (about graphics as well as music)
