Kinect for Children with Special Needs

There are times when something out of nowhere surprises you, positively and swiftly. For me, that happened a couple of days ago.

I was looking at this site’s stats when I noticed a spike in hits. The source was . I had no idea what it was until I paid the site a visit. It turned out to host information about using the Kinect as a tool to teach pupils with special needs. It wasn’t just a plan or a design; it had already been implemented in some schools in Wales and England. And my open-sourced work was there, used as a tool to teach kids with autism. I cried.

You see, that work wasn’t intended to be used as a teaching tool, let alone for kids with special needs. I created it as my first Kinect-based project, to get a basic understanding of what the sensor offers. I then made it open source and put the code on this website, just so people could play with it or pick something up from it. It turns out that people with a nobler purpose than a “marketing gimmick” or an “engaging event” have used it for something far more meaningful and gratifying for the wider community.

Long story short, I contacted Anthony Rhys, the mastermind behind . I asked how the Kinect is utilized in teaching the pupils and what their reactions have been. Here’s what he had to say:

The Kinect helps our pupils in a number of ways. It is mainly used not for any specific maths or English targets but for what we call ‘engagement’, which means the pupils being alert and interested in what they are doing. Usually we use switches for cause-and-effect work, or touch screens, but with the Kinect programs the cause-and-effect aspect is so immediate and instant, and natural for the pupils, as all they have to do is move their bodies however they can to have an effect on the screen. Many of our pupils are learning basic child developmental targets, such as cause and effect and responding to their environment, so this natural immediacy is important to us as it removes another barrier a lot of the time: no need to press a switch to have an effect, just move!

We have been using the Kinect a lot with our physically disabled pupils, as it enables them to interact with their own environment on their own, using whatever movement they have: legs, hands or arms etc. The only downside is that they need good visual skills too, as there are not a lot of musical programs out there yet for the Kinect. We also use it a lot with our low-functioning autistic pupils, who have good physical movement but have massive sensory processing issues and large skill deficits. They are able to explore the effects their own bodies are having ‘live’ in front of them, and I think it has aided their awareness of the outside world, and an awareness of themselves.

Trinity, one of the pupils, playing with Evil Twin

Evil Twin has been useful for two reasons. Firstly, there’s the mirroring effect of seeing themselves on the screen; some pupils seem to respond well to that aspect of the Kinect (others find it too overstimulating and prefer just abstract graphics). Then there is the mirrored graphic element (the evil twin). I think pupils find that intriguing: that the coloured person moves the opposite way to them, and that it is them but not them too! We’ve not had lots of opportunities to use it yet, only a few sessions, but we are hoping to build on this.

The other aspect that works is the body tracking element: it picks up the pupils really easily. Formal skeletal tracking is not much good in our environment, especially for our wheelchair users (no chance at all of the famous ‘Kinect pose’!). The best responsive programs at the moment either use the camera and movement, or whole-body ‘blob’ type tracking linked to movement. It is a shame about the full-screen element, as the larger and more noticeable the better!

Also, closest-point tracking programs seem to be not noticeable enough at the moment; the interactive graphical element has to be quite marked and immediate to ‘draw them in’, and when working with pupils with limited movement, the closest point to the Kinect is not always the bit that they can move!
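(A quick aside for fellow developers: the two tracking styles Anthony contrasts are easy to sketch. Closest-point tracking simply picks the pixel with the smallest depth reading, while “blob”-style tracking reacts to how much of the whole scene changed between frames, regardless of which body part moved. Here’s a minimal NumPy sketch, assuming depth frames arrive as 2-D arrays of millimetre values; this is an illustration of the idea, not code from any of the programs mentioned.)

```python
import numpy as np

def closest_point(depth):
    """Closest-point tracking: find the single pixel nearest the sensor.
    `depth` is a 2-D array of distances in mm; 0 means 'no reading'."""
    valid = np.where(depth > 0, depth, np.inf)  # ignore missing readings
    y, x = np.unravel_index(np.argmin(valid), valid.shape)
    return int(x), int(y)

def motion_amount(prev, curr, threshold_mm=30):
    """'Blob'-style response: fraction of the scene that moved between
    frames, no matter which body part moved."""
    moved = np.abs(curr.astype(int) - prev.astype(int)) > threshold_mm
    return moved.mean()

# Toy 4x4 depth frames (mm): a 'hand' raised near the sensor at (x=1, y=2).
prev = np.full((4, 4), 2000)
curr = prev.copy()
curr[2, 1] = 800
print(closest_point(curr))        # -> (1, 2)
print(motion_amount(prev, curr))  # -> 0.0625 (1 of 16 pixels moved)
```

This also shows why closest-point tracking can fail the pupils Anthony describes: if the part a pupil can move is not the part nearest the sensor, `closest_point` never reflects their movement, whereas `motion_amount` responds to any movement at all.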

Anyway, the more we use it, the more we learn about the pupils’ interaction with it and how we can use it. It’s quite strange, as there is no real precedent for this gesture-based technology, so we’re not sure how the pupils are going to interact with it, or whether they are going to interact with it at all!

On the wiki, in the evidence section, is an essay I did on one teenager’s interaction with the program; it goes into some detail on his interaction with the system and the benefits he gained from it.

What is important in all this is developers like yourself, who have put things out as shareware so that we can use them. Only two of the programs we use were meant for special needs pupils (and both are totally free!); all the others we’ve adopted, and all of them have either been free, or the creators have given us access to them for free so we can give them a try. What they often see as a bit of fun coding, or a client project used for a few days or a week, is to us a very exciting tool to use with our pupils!

Evil Twin as seen on KinectSEN’s wiki page

Our school has also been fortunate to buy two eye-gaze controlled computers, which have been very, very useful with some of our pupils, and also an interactive floor projector which lets them roll around on the floor and make great things happen. The iPads have been super too. It’s all gesture-based technology that is beginning to enable our pupils to do so much. As I often say: for me and you, technology makes what we do in our everyday lives easier; for our pupils, it makes impossible things possible.

Now, that could be a blog post all by itself. However, I just want to add some personal opinions, since this felt so special to me.

First of all, notice how powerful the Kinect is. I’ve argued about how easily it enabled me to build a computer-vision-based app, and now people have already iterated on that work. Second, though the Kinect was built with able-bodied users in mind, it has proven usable by people with physical disabilities. And it can detect them too, without that famous Kinect calibration pose. Third, the Kinect has the advantage of fully functional and hackable drivers and SDKs, both closed and open source. This can only lead to good things: many developers can start making things and release them into the wild, where they might prove useful to someone else. Obviously this has been my case, but then again, “Evil Twin” was largely based on Dan Shiffman’s OpenKinect for Processing example. So you can see the snowball rolling from there.
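For the curious, the core trick behind an “evil twin” style effect is tiny: threshold the depth image to get a body silhouette, then draw a horizontally flipped copy alongside the real one, so the twin moves the opposite way. The actual app was built on Processing and OpenKinect; the sketch below is only a NumPy illustration of the idea, with the depth range values picked arbitrarily for the example.

```python
import numpy as np

def evil_twin(depth, near_mm=500, far_mm=1500):
    """Build a body silhouette from a depth frame, plus its mirrored 'twin'.
    `depth` is a 2-D array of distances in mm; pixels between near_mm and
    far_mm are treated as the person."""
    silhouette = (depth >= near_mm) & (depth <= far_mm)
    twin = silhouette[:, ::-1]  # flip left-right: the twin mirrors every move
    return silhouette, twin

# Toy 3x5 frame: the 'person' occupies the two leftmost columns.
depth = np.full((3, 5), 3000)
depth[:, :2] = 1000
person, twin = evil_twin(depth)
print(person.astype(int))  # silhouette on the left
print(twin.astype(int))    # same silhouette, mirrored to the right
```

In a real app, both masks would be rendered each frame in different colours, which is what produces the “it is them but not them” effect Anthony describes.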

This experience has taught me many things, most notably about my position and responsibility toward my surroundings. As somebody who can make things, I think I should give back to society more often, especially where it helps. So if you’re interested in developing more apps for education, or even for kids with special needs, don’t hesitate to contact me. I’d really love to help.

Have a happy holiday everyone.
