99 Names

99 Names is a web VR experience that exhibits the 99 Names of Allah in an immersive 3D space. This is my first foray into both virtual reality applications and WebGL programming through ThreeJS. It's a simple application: users can access the web page from their phone or desktop/laptop browser and instantly get the experience, surrounded by rotating circles showing the 99 names of Allah.

99 Names

The barebones of the project were built with the Web VR Boilerplate, which ensures that everyone can get a grip on the experience, whether they're on a desktop or a smartphone, with or without a virtual reality head-mounted display such as the Oculus Rift or Google Cardboard. All within the web browser, no application to install. I think this is a good way to evangelize VR, since at this point the VR world really needs a whole lot of applications to expose its features and shortcomings.

I had so much fun making it. The boilerplate makes it really easy to develop a VR experience, so I could focus on the main content, which was all made using ThreeJS. Though I've read about it a lot in the past (it's been on my to-learn list for about 3 years now, haha), this is actually the first time I've thoroughly learned it. I can say that the best way to learn a programming language/library is by making things with it. So far, I've learned a lot about the 3D pipeline. Which makes me wonder: why didn't I do this years ago?
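The layout itself is conceptually simple: each name is a plane placed on a circle around the camera, facing inward, and the whole ring slowly rotates. Here's a rough sketch of how those positions could be computed, in plain JavaScript. This is my own illustration, not the actual project code, and the names (`ringPositions`, `rotationY`) are placeholders:

```javascript
// Place `count` items evenly on a circle of the given radius around the
// origin (where the VR camera sits), each facing back toward the viewer.
function ringPositions(count, radius) {
  const positions = [];
  for (let i = 0; i < count; i++) {
    const angle = (i / count) * Math.PI * 2;
    positions.push({
      x: Math.cos(angle) * radius,
      y: 0,
      z: Math.sin(angle) * radius,
      // Yaw each plane so its face points at the center of the ring.
      rotationY: -angle + Math.PI / 2,
    });
  }
  return positions;
}

// In ThreeJS each entry would then back a textured plane, roughly:
//   const mesh = new THREE.Mesh(planeGeometry, nameMaterial);
//   mesh.position.set(p.x, p.y, p.z);
//   mesh.rotation.y = p.rotationY;
// and the parent group gets a tiny rotation every frame for the spin.
```

With 99 names and a couple of rings at different radii and heights, this is basically the whole scene.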

However, from an interaction design point of view, I realize that catering a VR experience to both kinds of platforms (desktop and smartphone) is tricky. For example, in smartphone-based VR, user input is limited. Not all phones can use the magnetic input from Google Cardboard, something that will hopefully be rectified by Google Cardboard 2. I'm not sure about other HMDs; maybe you, dear reader, have more data?

On the other hand, I can offer a plethora of inputs in the desktop version, since the user can use a keyboard, mouse, joystick, or other devices to send input to the application, things that obviously won't map precisely to the smartphone counterpart. I did run into the vreticle library, which helps build a gaze input system for VR, but I still had some trouble implementing it.
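For the curious, gaze input usually boils down to a dwell timer: whatever sits under the reticle long enough counts as a "click". A toy sketch of that idea in plain JavaScript (this is not vreticle's actual API; the names and the dwell logic are my own illustration):

```javascript
// Dwell-based gaze selection: an object is "selected" once the gaze has
// rested on it continuously for `dwellMs` milliseconds.
function createGazeSelector(dwellMs) {
  let target = null; // id currently under the reticle
  let since = 0;     // when the gaze landed on it
  return {
    // Call every frame with the id under the reticle (or null) and a
    // timestamp in ms. Returns the id when a selection fires, else null.
    update(gazedId, nowMs) {
      if (gazedId !== target) {
        target = gazedId;
        since = nowMs;
        return null;
      }
      if (target !== null && nowMs - since >= dwellMs) {
        since = nowMs; // re-arm so it doesn't fire every frame
        return target;
      }
      return null;
    },
  };
}
```

In a real scene, `gazedId` would come from a raycast straight out of the camera each frame.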

Therefore, at this point, the experience is a passive one; no user input is involved. But I do hope to add some at some point.

99 Names can be accessed at adityo.net/99names. Play with it and let me know what you think.

Meanwhile, here are some screenshots of the steps I took in making it:

99-names-1st-step

99-names-2nd-step

99-names-3rd-step

1903

Client: Abdul Dermawan Habir & Co.
Year: 2013
Location: A Study of Light Exhibition. Exhibited in Dia.lo.gue Artspace, Kemang, Jakarta
Tools: custom software made in C++ (openFrameworks and ofxTimeline addon)
Hardware: Arduino, Arduino Expansion Shield, Relay Module, Transistor, LED lamps
OS: Snow Leopard

1903 is a light installation that combines elements of audio, sculpture, light and fashion into a single piece. The creator, Abdul Dermawan Habir, contacted me several months ago through this very website. He was looking for somebody who could program synchronized lamps and sound. Of course, I was up for it.

The installation tells a murder story through a series of voice overs and mannequins. Its technical scenario is quite simple: every time a voice over plays, some lamps turn on to highlight the mannequin that represents the talking character. In addition, one lamp is dimmed at certain points of the storyline for dramatic effect. The installation is set up in a closed room. To see it, the audience has to peep through a small hole, press a doorbell, and listen through headphones.

As you may expect, this installation uses an Arduino and relay modules to switch a total of nine 220V lamps, plus an NPN transistor and a resistor to dim a 5V LED lamp. To control the Arduino, I used openFrameworks and its Firmata library, plus the very helpful ofxTimeline addon, which provides the GUI. With this, I can easily dictate which Arduino pin should be turned on according to the dialogue. Mind you, this isn't audio reactive, so I had to be very precise about where to put the pin on/off commands.
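At its heart, the control logic is just a cue list: at playback time t, each pin's state is whatever the most recent cue for that pin said. Here's the idea sketched in plain JavaScript rather than the actual openFrameworks/ofxTimeline code, with a made-up cue format:

```javascript
// Given timeline cues ({time, pin, on}) like the keyframes laid out in
// ofxTimeline, compute which pins are high at playback time t: the
// latest cue at or before t wins for each pin.
function pinStates(cues, t) {
  const state = {};
  cues
    .filter((c) => c.time <= t)
    .sort((a, b) => a.time - b.time)
    .forEach((c) => { state[c.pin] = c.on; });
  return state;
}
```

In the installation, the resulting on/off states were pushed to the Arduino over Firmata every frame, with one extra PWM pin for the dimmed LED.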

There's a more detailed story on why I went with this solution, but I'll leave it for a later post. Meanwhile, you can see pictures from the installation below, including behind-the-scenes (literally) images. It was exhibited at Dia.lo.gue artspace in Kemang, Jakarta, from 12-23 January 2013.

More Work for Nike

Here's a quick recap of 3 projects I've done with Nike in the past month, arranged by date.

Interactive display for Nike Malaysia Booth at Stadion Bukit Jalil, Kuala Lumpur, Malaysia.
Date: 23 July 2012

Nike Malaysia wanted the exact same content that we'd previously developed for the Nike Senayan City store for their booth at the Arsenal – Malaysia friendly match. So we flew there with our content, had the Nike Malaysia guys set up the required hardware, and after 2 days of work we had it running properly.

Picture 1-5.

New Interactive Display Content for Nike Senayan City Store.
Date: 5 August 2012
A content update for the Nike Senayan City store. This time we wanted to display not only the triggered video, but also an image of the actual people playing in front of it. Keeping with the whole triangle theme of the triggered video, we decided to show the person in a triangulated form, and in addition, that person can also create triangulated shapes using his/her hands.
This was made using vvvv in Windows 7.

Picture 6-7.

Treadmill Visualization for Nike Run Event at Grand Indonesia
Date: 15 August 2012
For this event, Nike wanted us to deliver 2 things: a displayed output of their Nike Run mobile app, showing how far the runner on the treadmill has gone, and a reactive visualizer that responds to the runner's speed. I took charge of the former and used a Kinect to do frame differencing, which in turn dictates the speed of the displayed grid and particles to create a sort of sci-fi warping effect. This was made using Processing on Mac OS X 10.6.
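Frame differencing just counts how much of the image changed between two consecutive frames: the faster the runner moves, the bigger the count. A minimal sketch of the core operation, in plain JavaScript over grayscale pixel arrays for illustration (the real version ran in Processing on the Kinect feed; the threshold value here is made up):

```javascript
// Compare two grayscale frames (arrays of 0..255 values) and return the
// fraction of pixels that changed by more than `threshold`. The result,
// 0 (still) to 1 (everything moved), can drive the speed of the visuals.
function motionAmount(prev, curr, threshold) {
  let moved = 0;
  for (let i = 0; i < curr.length; i++) {
    if (Math.abs(curr[i] - prev[i]) > threshold) moved++;
  }
  return moved / curr.length;
}
```

Smoothing this value over a few frames keeps the warp effect from jittering.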

Picture 8-14.

Augmented Reality Demo

Every now and then, I get a question like this: "Hey, can you do Augmented Reality stuff?" Whatever I feel about Augmented Reality (AR), it's without a doubt one of the most underutilized-but-famous interactive technologies these days. So, to get my hands dirty with what today's AR technology has to offer, I created two demos using different approaches and tools.

The first demo is a marker-based AR made using Quartz Composer. In this video, I create a piece of software that detects a marker and displays a cube on top of it. To make it a bit more interesting, I set up an audio analyzer so that the cube's size is determined by the volume of the music coming into my laptop. While the tracking system keeps working, the cube dances to the music. This shows that the animation in an AR application doesn't have to be static. I believe interactive animation adds some depth to the AR app itself. Here's the demo:

The second demo is a marker-less AR using Kinect. This time, I wanted to make an AR app without using a marker. I decided to use the Kinect as input, because it can accurately detect my body parts without having to set up things like a static background. In this app, particles are created on top of my detected body parts, sort of like a wizard waving his hands to emit magic power. A simple demo, but it shows that a Kinect can be used to build an AR system without relying on a marker, resulting in a more natural way to interact. Let's face it, nobody walks around carrying a marker 🙂 Here's the demo:

Interactive Wall for Hero

Client: Hero
Year: 2012
Location: Hero Award Night at Sampoerna Strategic Square
Tools: custom software made in C++ and Adobe Flash Builder
Hardware: Kinect
OS: Snow Leopard

An interactive wall made for Hero, a local retail chain. When someone walks in front of it, a picture appears depicting the long history of the company. Different pictures appear as the person walks along the wall display.

This work utilises 2 Kinects, as I had to detect people along a 6m screen in quite a narrow space, probably 2m between the screen and the opposite wall. So I had no choice but to use 2 Kinects, placing them high enough and far enough apart that they'd cover as much space as possible.

I only had a short time to complete this project; luckily, the client was kind enough to pair me with Erin, another developer who had already created the content for the display. So my job was clear: get input from the Kinects and feed it to the content. Erin used Adobe Flash Builder, so I opted to create a program that detects people and sends each detected person's position over TUIO, which I knew already has a ready-made library for Flash.
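One bit of glue worth showing: each Kinect only sees a slice of the 6m wall, so a detection has to be mapped into one shared, normalized coordinate before it goes out over TUIO (which expects positions in the 0..1 range). A toy sketch of that mapping in plain JavaScript; the segment layout here is made up for illustration, not the actual calibration:

```javascript
// Merge side-by-side sensors into one normalized wall coordinate.
// Each sensor reports x in 0..1 across its own field of view; `segment`
// says which slice of the wall (in meters) that sensor covers.
function toWallPosition(sensorX, segment, wallWidth) {
  const meters = segment.start + sensorX * (segment.end - segment.start);
  return Math.min(1, Math.max(0, meters / wallWidth)); // TUIO-style 0..1
}
```

With overlapping segments you'd also deduplicate people seen by both sensors, but the idea is the same.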

The result is as you can see from the above video.

One Code One Week – Week 3: SssssssssS

Title: SssssssssS
Time: 22 January 2012
Place: Home, just in front of TV
Tool: Processing

This is my tribute to the film "Sssssss", a 1973 killer snake movie. I just love the title, so I decided to exploit it and re-imagine it in a different way.

Code starts here:


PFont fontA;

void setup() {
  size(800, 800, P3D);
  smooth();
  colorMode(HSB);
  fontA = loadFont("GloucesterMT-ExtraCondensed-48.vlw");
  textFont(fontA, 190);
}

void draw() {
  background(140);
  translate(width/2, height/2 - 60);
  for (int i = 0; i < 51; i++) {
    textAlign(CENTER);
    pushMatrix();
    fill(100, 250, i*2);
    rotate(PI*i/10);
    text("SssssssssS", 0, 0, i/4);
    popMatrix();
  }
  translate(80, 120);
  for (int i = 0; i < 55; i++) {
    textAlign(CENTER);
    pushMatrix();
    fill(150, 90, i*2);
    rotate(PI*i/10);
    text("SssssssssS", 0, 0, i/4);
    popMatrix();
  }
  translate(-160, 0);
  for (int i = 0; i < 50; i++) {
    textAlign(CENTER);
    pushMatrix();
    fill(250, 100, i*2);
    rotate(PI*i/10);
    text("SssssssssS", 0, 0, i/4);
    popMatrix();
  }
}

One Code One Week – Week 2: Hike

Title: Hike
Time: 15 January 2012
Place: A hotel in Hawaii
Tool: Quartz Composer, v002 Rutt Etra QC plugin


Inspired by the hill we hiked. The ups and downs are represented really well by the Rutt/Etra-style video synthesizer.

Thanks to vade, who made the amazing Rutt/Etra QC plugin. This one is basically my practice with QC, so I just used that plugin and combined it with an audio input to make the video synthesizer reactive. Easy to do, but in case you need the code, just leave a comment.

One Code One Week – Week 1: Sydney Airport

Introduction:
As part of my 2012 resolution, I'd like to start a series I call "One Code One Week". It's self-described, but in case it causes confusion: throughout this series I'll share my visual work, coded in several different languages (according to my mood). The mission is to share my knowledge as well as improve my own skills. So without further ado, here's what I have for the first week.

Title: Sydney Airport
Time: 8 January 2012
Place: International Departure Terminal, Sydney International Airport
Tool: Processing

Inspired by the busy sights and sounds of the airport's temporary inhabitants. Glitchy, minimalistic, yet noisy by the look of it.

Code:
void setup() {
  size(400, 300);
  background(140);
  colorMode(HSB);
  noStroke();
  rectMode(CENTER);
}

void draw() {
  for (int i = 0; i < 250; i++) {
    fill(125, 125, (i*60)%125);
    rect(i*random(width/2), height/3, 20, (i*5)%50);
    fill(250, 125, (i*60)%125);
    rect(i*random(width/4), height/2, 20, (i*5)%50);
    fill(50, 125, (i*60)%125);
    rect(i*random(width/8), 2*height/3, 20, (i*5)%50);
  }
}

Creating My Own Infographic CV

Let me start by saying I've always wanted an amazing, good-looking CV. I was preparing my CV a few months ago, and as someone in transition from a pure engineer/programmer to a designer, this was hard. I couldn't really figure out which parts of my past experience to emphasize. I didn't want a 4-page CV filled with experience building networks for telco clients when applying for a job as a UX designer. Then I reflected a bit and realized, "hey, I can actually make a CV that sells my technical skills in an engaging way that subtly shows my ability as a designer!" And that is by making an infographic CV. Worth noting: I still sell my technical programming skills, but this time I won't leave my design skills unnoticed.

So, after some sketches and preparation, I finished designing my new-look CV using Photoshop. All my skills, experience, and awards on one nifty, good-looking (to me, at least) page. Here you go.

I hope this post inspires you to go and recreate that traditional CV as something different. If you read this and say "hey, this is good" and there's a position at your company, then by all means contact me via the email address at the top right corner of this page. I'm looking for a job now 🙂