Not just a ‘theremin’ - that totally downplays the power of MIDI. Someone else mentioned the Mi.Mu gloves, and I love the idea of a vision-based controller - almost everyone has a phone, tablet, or laptop with a camera, especially if they're making music.
I also love that this could blur the line between playing music and dancing.
Great job OP, thanks for sharing.
The Leap Motion Controller came out back in 2014 (11 years ago, wow!) and isn't very expensive. The SDK was lacking in the beginning if I recall correctly, but a plain webcam still seems inferior to it. Technology hasn't been the limiting factor for quite some time now. I'm sure many projects exist that translate gestures to MIDI, some less polished, some more polished[0][1].
Reminds me... I even used two PlayStation Eyes (EUR 5 each) with OpenCV and the EVM algorithm[2] on a ThinkPad X230 for a dance performance piece back in 2015. Movements rather than gestures and OSC instead of MIDI, but it worked great!
[0]: https://midipaw.com/
[1]: https://uwyn.com/geco/
[2]: https://people.csail.mit.edu/mrub/evm/
From the article: "When using AirBending for pitch control, you can lock your gestures to specific musical scales and keys. This ensures every note you play is perfectly in tune with your composition."
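Scale locking like that is conceptually simple. Here's a minimal Swift sketch of the general idea (my own illustration, not AirBending's actual code): map a normalized hand height onto a MIDI note range, then snap to the nearest note in the chosen scale.

```swift
// Hypothetical sketch of scale locking: quantize a continuous gesture
// value to the nearest note of a chosen scale. Not AirBending's code.
let cMajor: Set<Int> = [0, 2, 4, 5, 7, 9, 11]  // pitch classes of C major

/// Map a normalized hand height (0.0...1.0) onto a MIDI note range,
/// then snap to the closest note whose pitch class is in the scale.
func quantize(handY: Double, lowNote: Int = 48, highNote: Int = 84,
              scale: Set<Int> = cMajor) -> Int {
    let raw = lowNote + Int((Double(highNote - lowNote) * handY).rounded())
    for offset in 0...6 {  // nearest in-scale note is at most 6 semitones away
        if scale.contains((raw + offset) % 12) { return raw + offset }
        if scale.contains((raw - offset) % 12) { return raw - offset }
    }
    return raw  // unreachable for any non-empty scale
}
```

With the snap in place, sweeping a hand up or down can only ever land on scale tones, which is what makes every note "perfectly in tune".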
Reminds me of the Moog Theremini - that was a fun bit of kit.
https://en.wikipedia.org/wiki/Theremini
Imogen Heap demonstrating her Mi.Mu gloves: https://www.youtube.com/watch?v=ci-yB6EgVW4
Using the gloves during an NPR Tiny Desk concert: https://www.youtube.com/watch?v=3QtklTXbKUQ&t=555s
Theremin.
Here's someone reasonably good playing a theremin.[1]
[1] https://www.youtube.com/watch?v=K6KbEnGnymk
That looks great!
Do you have any suggestions on how to hook this up to Logic for anyone who hasn't used MIDI before?
I wonder how this would perform under live stage lighting conditions, i.e. strong coloured lights and high contrast.
Open the app and open Logic Pro. Create a MIDI track in Logic, then try waving at the app; the track should automatically receive MIDI messages from all channels and all MIDI devices.
If you then want the track to receive only a specific MIDI channel from a specific device, for example AirBending channel 2, select it in the dropdown in the MIDI inspector section of that same MIDI track.
Seems like the same software could be used to generate a soundtrack for Tai Chi exercises. Would be pretty neat.
It seems possible, since Apple’s Vision framework can read body pose too. Maybe I can try it for the next update.
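For what it's worth, the body-pose request looks almost identical to the hand-pose one. A minimal sketch, assuming Vision's VNDetectHumanBodyPoseRequest (the function around it is hypothetical):

```swift
import Vision

// Sketch: detect body pose in a single frame with Apple's Vision framework.
// In a real app the pixelBuffer would come from an AVCaptureSession callback.
func wristPoints(in pixelBuffer: CVPixelBuffer) throws -> (left: CGPoint, right: CGPoint)? {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])

    guard let body = request.results?.first else { return nil }
    let left = try body.recognizedPoint(.leftWrist)
    let right = try body.recognizedPoint(.rightWrist)
    // Ignore low-confidence detections.
    guard left.confidence > 0.3, right.confidence > 0.3 else { return nil }
    return (left.location, right.location)  // normalized image coordinates
}
```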
This instrument's timbre and tone are literally dream shit to me, so wavy and, I don't know, unearthly/otherworldly.
Great work, but wouldn't an iPhone with the LiDAR depth sensor be a better device?
It's on the roadmap to expand this app to iPhone, but I haven't tried LiDAR yet, so I decided to release for macOS first.
Also, with iPhone I have to think about how to transmit the MIDI data to a DAW on the laptop. Most likely via USB or network.
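For the network route, iOS already ships RTP-MIDI support, so the iPhone side can be tiny. A sketch, assuming CoreMIDI's MIDINetworkSession (iOS only; the Mac then connects from Audio MIDI Setup's Network window):

```swift
import CoreMIDI

// iOS sketch: advertise a network MIDI (RTP-MIDI) session.
// A Mac can connect via Audio MIDI Setup > MIDI Studio > Network,
// after which the DAW sees the iPhone as a regular MIDI source.
let session = MIDINetworkSession.default()
session.isEnabled = true
session.connectionPolicy = .anyone  // or restrict to stored hosts
```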
Apple devices can do MIDI over Bluetooth. I've used this in the past to send Vision Pro hand tracking data as MIDI.
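And once the app publishes a virtual MIDI source, the transport (Bluetooth, network, USB) is the system's problem rather than the app's. A minimal CoreMIDI sketch of a virtual source sending one CC message; the names here are made up, and the classic packet-list API shown is deprecated but still works:

```swift
import CoreMIDI

// Sketch: publish a virtual MIDI source and emit one CC message through it.
// DAWs (and Bluetooth/network MIDI sessions) see this like any other device.
var client = MIDIClientRef()
MIDIClientCreateWithBlock("GestureSketch" as CFString, &client, nil)

var source = MIDIEndpointRef()
MIDISourceCreate(client, "GestureSketch Out" as CFString, &source)

// CC 74 on channel 1 with value 100 (0xB0 = control change, channel 1).
var packetList = MIDIPacketList()
let packet = MIDIPacketListInit(&packetList)
let bytes: [UInt8] = [0xB0, 74, 100]
_ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                      packet, 0, bytes.count, bytes)
MIDIReceived(source, &packetList)  // hand the packet to connected listeners
```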
I do this with the "Apple Lightning to USB Camera Adapter". The iPad basically talks to the MIDI sequencer over USB.
I would love this as an AUv3 for iPadOS, any plans?
I wonder if a Linux version will be available.
Since I developed it using Apple's Vision framework, the current focus is still on Apple devices. So no Linux or Windows version in the near future.
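For anyone curious what that Vision-based tracking roughly looks like, here is a minimal sketch assuming VNDetectHumanHandPoseRequest (my guess at the approach, not AirBending's actual code):

```swift
import Vision

// Sketch: track one hand's index fingertip with Apple's Vision framework.
// In a real app the pixelBuffer comes from the camera capture callback.
func indexTip(in pixelBuffer: CVPixelBuffer) throws -> CGPoint? {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])

    guard let hand = request.results?.first else { return nil }
    let tip = try hand.recognizedPoint(.indexTip)
    guard tip.confidence > 0.3 else { return nil }
    return tip.location  // normalized (0...1) image coordinates
}
```

A point like that, polled every frame, is all you'd need to drive something like the pitch mapping sketched earlier.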
Thanks for the confirmation!
This is really cool, thanks for sharing.