Gestural Robot – Project Log pt. 1

Last week, in the spare time between preparing material for coding classes at Framework, I made a small prototype of a finger-gesture-controlled robot. Gestural control has fascinated me for quite a while now, and since I had a robot kit sitting idle, I thought this would be a good time to use gesture as an input to something other than pixel-based material.

I use a Leap Motion as the sensor and input: Processing reads the number of fingers it detects and transmits it via serial to an Arduino, which in turn decides which wheel to move and in which direction. The robot itself is a 2-wheel platform driven by an L298 motor shield.
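To give a sense of the Processing side, here is a minimal sketch of the finger-count-to-serial link. It assumes the LeapMotionForProcessing library (de.voidplus.leapmotion) and Processing's built-in Serial library; the port index, baud rate, and the one-byte protocol are placeholders for illustration, not necessarily what the prototype actually used.

```
import de.voidplus.leapmotion.*;
import processing.serial.*;

LeapMotion leap;
Serial arduino;
int lastSent = -1;

void setup() {
  size(400, 200);
  leap = new LeapMotion(this);
  // Pick the right port for your machine; Serial.list() shows the options.
  arduino = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  background(0);
  // Count outstretched fingers across all visible hands.
  int fingers = 0;
  for (Hand hand : leap.getHands()) {
    fingers += hand.getOutstretchedFingers().size();
  }
  // Only send when the count changes; the Arduino maps each count
  // to a wheel and direction on the L298 shield.
  if (fingers != lastSent) {
    arduino.write(fingers);
    lastSent = fingers;
  }
  text("fingers: " + fingers, 20, 100);
}
```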


Speaking At GNOME.Asia 2015

This is long overdue, my bad, I should’ve written this months ago. Haha. Anyway, I had the chance to speak at GNOME.Asia 2015, a regional-level conference on GNOME and open source software in general. In case you didn’t know, GNOME is one of the available desktop environments for Linux-based OSes. If you’ve used (or still use) Linux in the past few years, chances are your application windows (among other things) are managed by GNOME. That’s how important GNOME is. So it was such an honour to be able to speak there, even though I didn’t register until the very last day of abstract submission.


Leap Motion Synth

I helped a friend develop a simple tone generator as a medium for experiential music learning for kids. He wanted to use a Leap Motion so kids could use finger gestures to generate tones while learning the pitch of the notes.

Leap Synth

This was a good experience for me, as I wanted to learn more about designing UI for gestural input devices such as the Leap Motion. This time, I proposed the following scenario:

  1. Use the right hand’s index finger to choose which note to trigger
  2. Use the left hand to start and stop the note: when the palm is closed, the note is triggered; when the hand opens, the note stops playing (sketched below)
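A rough reconstruction of that interaction is sketched here. It assumes the LeapMotionForProcessing library (de.voidplus.leapmotion) and Processing's Sound library for the oscillator; the original app may well have used a different synth library, and the C-major note mapping is purely illustrative.

```
import de.voidplus.leapmotion.*;
import processing.sound.*;

LeapMotion leap;
SinOsc osc;
// One octave of a C major scale, purely for illustration.
float[] scale = {261.63, 293.66, 329.63, 349.23, 392.00, 440.00, 493.88, 523.25};
int selected = 0;
boolean playing = false;

void setup() {
  size(800, 300);
  leap = new LeapMotion(this);
  osc = new SinOsc(this);
}

void draw() {
  background(0);
  for (Hand hand : leap.getHands()) {
    if (hand.getPosition().x > width / 2) {
      // Right-side hand: the index finger's horizontal position picks the note.
      Finger index = hand.getIndexFinger();
      if (index != null) {
        float pos = map(index.getPosition().x, width / 2, width, 0, scale.length - 1);
        selected = int(constrain(pos, 0, scale.length - 1));
      }
    } else {
      // Left-side hand: a closed palm shows no outstretched fingers,
      // which triggers the note; opening the hand stops it.
      boolean closed = hand.getOutstretchedFingers().size() == 0;
      if (closed && !playing) {
        osc.play(scale[selected], 0.5);
        playing = true;
      } else if (!closed && playing) {
        osc.stop();
        playing = false;
      }
    }
  }
}
```

Note that the left/right split above is purely positional, which is exactly the workaround described next.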

As with previous projects, I used Processing for development, since I can easily export the result as a Windows application so he could deploy it without much hassle. The main challenge was getting Processing to detect which hand is the right one and which is the left. In the end, I decided to classify hands by their position relative to the Leap Motion itself. Once that was in place, finger detection and tracking followed. Mind you, this was done in May 2014, and several months later Leap Motion released a new API which provides an easier way to detect the left/right hand. Ha!
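For comparison, here is roughly what the positional workaround versus the direct call looks like in the de.voidplus.leapmotion wrapper (handedness helpers such as isLeft() only became available with the newer API mentioned above):

```
for (Hand hand : leap.getHands()) {
  // 2014 workaround: classify the hand by where its palm sits
  // relative to the device (mapped here to the sketch's horizontal centre).
  boolean leftByPosition = hand.getPosition().x < width / 2;

  // Newer API: handedness reported directly by the tracking data.
  boolean leftByApi = hand.isLeft();
}
```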


I went through several iterations, including using threads to keep the experience smooth. In the end, however, I settled on a thread-less solution, since hand-position detection wasn’t required at the start. It was a good learning experience, especially for UI design. I could see that this solution wasn’t really ideal, since both hands became very busy, though it was accurate enough to implement the choose-and-confirm paradigm employed by the mouse.

I know that more work on the UI paradigm is needed to improve applications of the Leap Motion.