Leap Motion Synth

I helped a friend develop a simple tone generator as a medium for experiential music learning for kids. He wanted to use the Leap Motion so kids could generate tones with finger gestures while learning the pitch of the notes.

Leap Synth

This was a good experience for me, as I wanted to learn more about designing UIs for gestural input devices such as the Leap Motion. This time, I proposed the following scenario:

  1. Use the right hand's index finger to choose which note to trigger
  2. Use the left hand to start and stop the note: when the palm closes, a note is triggered; when the hand opens, the note stops playing (see the sketch after this list)
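
Here is a minimal sketch of that scenario, assuming the Leap Motion Java SDK (com.leapmotion.leap) is available to the Processing sketch. The eight-note mapping and the triggerNote()/stopNote() helpers are hypothetical stand-ins for the tone generator:

```java
import com.leapmotion.leap.*;

Controller leap;
int currentNote = -1;     // note chosen by the right index finger
boolean playing = false;  // whether the left fist is currently closed

void setup() {
  size(800, 400);
  leap = new Controller();
}

void draw() {
  background(0);
  for (Hand hand : leap.frame().hands()) {
    if (hand.palmPosition().getX() > 0) {
      // Right of the device: map the frontmost fingertip's x position
      // to one of eight notes (the range and note count are made up).
      if (!hand.fingers().isEmpty()) {
        float tipX = hand.fingers().frontmost().tipPosition().getX();
        currentNote = (int) constrain(map(tipX, 0, 200, 0, 7), 0, 7);
      }
    } else {
      // Left of the device: in the v1 API a closed fist hides the
      // fingers, so an empty finger list stands in for "palm closed".
      boolean fist = hand.fingers().isEmpty();
      if (fist && !playing) { playing = true;  /* triggerNote(currentNote); */ }
      if (!fist && playing) { playing = false; /* stopNote(); */ }
    }
  }
}
```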

As with previous projects, I used Processing for development, since I can easily export the result as a Windows application that he could deploy without much hassle. The main challenge was getting Processing to detect which hand was right and which was left. In the end, I decided to classify hands by their position relative to the Leap Motion; once that worked, finger detection and tracking followed. Mind that this was done in May 2014; several months later, Leap Motion released a new API that provides an easier way to detect the left and right hands. Ha!
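
For illustration, the positional heuristic boils down to something like this (a sketch, not the exact code I shipped):

```java
import com.leapmotion.leap.Hand;

// v1-era workaround: the device sits at the origin of Leap's coordinate
// system, so a positive palm x puts the hand on the user's right.
boolean isRightHand(Hand hand) {
  return hand.palmPosition().getX() > 0;
}

// With the later v2 skeletal-tracking API, the SDK answers this directly:
// boolean right = hand.isRight();
```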


I went through several iterations, including one that used threads to keep the experience smooth. In the end, though, I settled on a thread-less solution, since the sketch didn't require hand-position detection at startup. It was a good learning experience, especially for UI design: the solution wasn't really ideal, because both hands end up very busy, but it was accurate enough to implement the choose-and-confirm paradigm that the mouse employs.
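
Roughly, the thread-less version amounts to polling the latest frame from draw(), instead of registering a Listener whose onFrame() callback arrives on a separate SDK thread and then has to be synchronized with Processing's animation thread. A simplified sketch:

```java
import com.leapmotion.leap.*;

Controller leap;

void setup() {
  leap = new Controller();
}

void draw() {
  // frame() always returns the most recent frame the SDK has seen, so
  // polling it at draw rate keeps input and rendering on one thread,
  // with no locks or shared-state handoff from a Listener callback.
  for (Hand hand : leap.frame().hands()) {
    // ...read palms and fingers here and update the synth state...
  }
}
```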

I know that further development of UI paradigms is required to improve applications of the Leap Motion.
