Gestural Automated Phone System

Tools: Kinect, Android Phone
Software: custom-made using Processing for both the MacBook and the Android
OS: Mac OS X 10.6
Year: 2012

Now, here’s something I choose not to participate in: fanboyism. You know, these days the internet is filled with fanboys. Everyone freely shares an opinion, and mostly those opinions are directed at disparaging whatever company isn’t their preference. Long story short, you have Apple fanboys, Microsoft fanboys, Google fanboys and Linux fanboys, among others. Funny thing is, not all of them actually know the capabilities of the devices they regard so majestically (the very devices that turned them into fanboys in the first place). Very rarely are they able to code (Linux fanboys excluded). So I find it very funny to worship a company over a device you’ve never really dived into. Anyway, that’s my 2 cents.

So, pardon the long intro. Three days ago, I acquired an Android phone, a Sony Ericsson Xperia Live. Cheap, but with quite a good spec. It’s my first Android phone. That night, I quickly did some research and found that I could actually create a program using Processing and run it straight on my Android phone. My mind was boggled. Imagination ran wild. And after a quick hello world, I decided to combine this with my previous Kinect knowledge to control this wonderful phone. As a response to the rambling in the previous paragraph, I aimed to combine products from Apple (the MacBook), Microsoft (the Kinect) and Google (Android) into one system. As I’ve said before, I’m no fanboy. I admire every good piece of technology, no matter who the vendor is.

So, in general, what I have here is a gestural automated phone system. I named it myself. Sounds horrible. It’s a system that lets me make a phone call without touching the phone, entering the number (or choosing from the address book), or pressing that call button. The call is triggered by a gesture detected by the Kinect. In short, I’m making a system where a movement of my hand makes my phone call another phone, without me touching it. Sound clear?

Under the hood, there are two programs running at the same time. The first runs on the MacBook and handles the Kinect gesture detection. The second runs on the phone; it receives a command from the MacBook and then makes a phone call. So if I move my hand towards the Kinect, the hand gets detected, and moving it into the right corner of the screen makes the MacBook send a command to the phone. The command is sent using the OSC (Open Sound Control) protocol, which requires both the phone and the MacBook to be on the same network. Upon receiving the command, the phone makes the call. Here’s the demo (turn the volume UP!):

That video serves as a proof of concept, a crude demo achieved after a night of rapid prototyping. And yes, this is why I love Processing. It’s a perfect platform for prototyping a rough concept. Of course, I can see many improvements to be made, but for now, here’s what I have.
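
For the curious, here’s roughly what the MacBook side boils down to. This is a minimal sketch, not the exact code from the demo: I’m assuming the SimpleOpenNI library (the 2012-era version with NITE hand tracking) for the Kinect and the oscP5 library for the networking; the /makeCall address pattern, the phone’s IP address, the ports, the phone number and the corner threshold are all made up for illustration.

```java
// MacBook side: track a hand with the Kinect, and when it enters the
// right corner of the screen, send an OSC command to the phone.
// Assumes the SimpleOpenNI and oscP5 libraries (circa-2012 versions).
import SimpleOpenNI.*;
import oscP5.*;
import netP5.*;

SimpleOpenNI kinect;
OscP5 osc;
NetAddress phone;
boolean callSent = false;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.enableGesture();
  kinect.enableHands();
  kinect.addGesture("RaiseHand");              // raise a hand to start tracking it
  osc = new OscP5(this, 9000);
  phone = new NetAddress("192.168.1.101", 12000);  // the phone's IP, made up
}

void draw() {
  kinect.update();
  image(kinect.depthImage(), 0, 0);
}

// NITE recognized the starting gesture: begin tracking the hand from there
void onRecognizeGesture(String gesture, PVector idPos, PVector endPos) {
  kinect.startTrackingHands(endPos);
}

void onUpdateHands(int handId, PVector pos, float time) {
  PVector screenPos = new PVector();
  kinect.convertRealWorldToProjective(pos, screenPos);
  // hand entered the top-right corner: tell the phone to dial
  if (!callSent && screenPos.x > width - 100 && screenPos.y < 100) {
    OscMessage msg = new OscMessage("/makeCall");  // pattern is made up
    msg.add("+1234567890");                        // number to dial, made up
    osc.send(msg, phone);
    callSent = true;  // only fire once
  }
}
```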
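
And on the phone side, the sketch just sits there waiting for that message. Again a rough sketch, assuming the same oscP5 library (which runs fine in Processing’s Android mode) plus Android’s standard ACTION_CALL intent. The sketch needs the CALL_PHONE permission ticked in the Android permissions dialog, and it calls startActivity() directly because a 2012-era Processing Android sketch is itself an Activity.

```java
// Phone side: listen for the OSC command and place the call.
// Assumes oscP5 in Processing's Android mode and the CALL_PHONE permission.
import oscP5.*;
import android.content.Intent;
import android.net.Uri;

OscP5 osc;

void setup() {
  osc = new OscP5(this, 12000);  // same port the MacBook sends to
}

void draw() {
  background(0);  // nothing to show; the work happens in oscEvent()
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/makeCall")) {
    String number = msg.get(0).stringValue();
    // hand the number to Android's dialer and start the call
    Intent intent = new Intent(Intent.ACTION_CALL, Uri.parse("tel:" + number));
    startActivity(intent);
  }
}
```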

I can see this system implemented with the phone attached to its user, who has a hands-free device available. Imagine waving your hand and making a call without having to reach into your pocket first. Hmm. Sounds like a part of Iron Man. 🙂

Apple, Microsoft, Google living in harmony.
