Mez is a realtime, 3D music visualization instrument. Lately it's been my primary project. Check this page for more information.
An early version of Mez worked directly from MIDI notes.
Virtual and Augmented Reality
Synthesizing novel environments that blend convincingly with the realism of our surroundings is difficult. I've experimented with many of the frameworks for doing so, which resulted in this crown of colors: it tracks your head, responds to music, and attempts to occlude itself correctly. While silly, it gave me a better sense of how to design around the limitations of future AR systems.
I'm also quite interested in the VR concert field, which has exciting implications for content creation as well. Distributed music-making systems, where the crowd's input is synthesized live into some sort of musical piece, tend not to work well because of how sensitive the auditory system is to latency (not to mention the instrument-design question). However, not every crowdsourced parameter has to generate music. If, instead, the virtual crowd in some way influenced both their own and others' visualization of the music, that would be an experience worthy of one's full attention, and a wonderful thing for the nascent industry. Of course, it would have to be a great deal more than just avatars walking around with Tilt Brush-designed, music-reactive creations on their heads!
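To make the latency argument concrete, here is a minimal sketch of how crowd input could steer a shared visual parameter rather than the audio itself. Everything here is a hypothetical illustration, not part of Mez: late or out-of-order inputs simply overwrite a user's last value, and the renderer eases toward the crowd's consensus each frame, so network jitter produces gentle drift instead of the rhythmic glitches it would cause in a musical signal.

```python
# Hypothetical sketch: crowd inputs drive a visual parameter, not the music,
# so latency tolerance comes for free. All names here are illustrative.
from statistics import median

class CrowdVisualParam:
    def __init__(self, smoothing: float = 0.1):
        self.smoothing = smoothing  # low value = slow, latency-tolerant drift
        self.value = 0.5            # e.g. hue shift, bloom, or particle density
        self.inputs = {}            # latest (possibly stale) value per user

    def submit(self, user_id: str, value: float) -> None:
        # Packets may arrive late or out of order; overwriting is fine
        # because no timing guarantees are needed on the visual side.
        self.inputs[user_id] = min(max(value, 0.0), 1.0)

    def step(self) -> float:
        # Called once per rendered frame: ease toward the crowd median,
        # so a few lagging participants barely perturb the result.
        if self.inputs:
            target = median(self.inputs.values())
            self.value += self.smoothing * (target - self.value)
        return self.value
```

A host would call `submit` whenever a participant's input arrives over the network and `step` once per frame; the choice of median over mean keeps a single outlier (or malicious client) from yanking the whole crowd's visuals.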
If you're interested in discussing that idea further, email me.