Blog Archives

KinectSEN – exploring gesture based technology to engage all learners

Last Thursday, I had the pleasure of being invited to join the professional learning community (PLC) looking into gesture based technology and the impact it can have on special needs learners. Gesture Based Technology (GBT) covers technology that uses a natural user interface for its input. This includes the Kinect, iPads, Eyegaze and mobile floor projectors, to name a few, and at present it is mainly found in gaming consoles at home. The power of natural user interfaces is that they allow students with SEN to be included in sessions and enable them to explore movement, creativity and engagement. From the evidence that I have seen so far, it gives students an opportunity to be actively involved in affecting their environment and allows them to do things that they simply could not do before. What is great to see is the instant effect students can have using this technology, achieved by them moving in whatever way they can.

One of the mobile interactive floor projectors

The PLC is co-ordinated by the fantastic Anthony Rhys (@trinityfieldsit), and looking around Trinity Fields School it was amazing to see the progress the school has made in enabling students to interact in their lessons using GBT. The day focused on how this technology can be used to help students who are often withdrawn and shy to engage in various activities. It was also great to see evidence of other students using the technology to encourage creativity, movement, engagement and exploration.


Over the next few months, I hope to blog about the progress made in incorporating GBT into our schools and present evidence for this. I came away from the day even more enthused to make this happen, and it was great to meet other practitioners just as interested in using the technology to help engage those students with severe learning difficulties. We are currently looking into adding this to our sensory room so that we have an interactive floor and wall display at a fraction of the cost that some SEN companies would charge.

If you are interested in GBT and the use of the Kinect and would like to find out more about how different schools are incorporating this technology please visit the Kinect Wiki site at


Processing Update

I mentioned in my previous post that I would keep you updated on my progress with Processing, and it seems I have fallen foul of enjoying the summer holidays too much. Nevertheless, I will get an update in before the holidays end. I have progressed in learning to code in Processing, creating some simple applications, and have started to use the Kinect to interact with different applications. Here is a quick summary of what I have done using the 'Getting Started with Processing' and 'Making Things See' textbooks I mentioned in a previous post.

  • Learnt to create various shapes and incorporate pictures into applications.
  • Learnt the functions of variables, for loops, if statements etc. (lots of coding vocabulary).
  • Coded different programs that allow these objects to move across the screen and react to key presses on the keyboard and mouse.
  • Learnt to use the depth camera of the Kinect to gain a better understanding of how it recognises pixels and distances, to add functionality to the applications we are making.
  • Incorporated the Kinect into the apps so that I can get objects to track the closest point to the Kinect, e.g. a hand. This has led me to create a simple drawing app and a 'Minority Report'-style application using photos.

The next step is to start learning to incorporate the skeletal tracking and 3D functions of the Kinect into applications so that they are more intuitive for the user.

If you get a chance, have a look at the wiki set up by Anthony Rhys (@trinityfieldsit), and in particular the Processing section, which has links to various applications that he has created, available to download as PDEs to run in the Processing interface.

Also on the site are examples of some pretty amazing applications which give you an idea of what is possible with the Processing language – here is the link.

The intention of this project is to have a group of like-minded individuals working together to create interactive applications that can be used with the fantastic group of students we work with. So if you are interested in helping us, please contact us!


Po-Motion Interactive Wall in the Sensory Room

This post will focus on how we have used some software called Po-Motion to create an interactive wall display in our Sensory Room. Po-Motion used to be available for free, but there is now a small charge of around £20 to purchase the software (though you can trial it for 30 days). It allows you to create an interactive wall or floor display using just a web-cam, projector and computer.

The main plus point for us was the cost – many items for the sensory room cost hundreds, even thousands, of pounds, and companies are able to charge these amounts because the items are classed as specialist equipment (though many of them actually cost a fraction of that to make and simply allow students with Profound and Multiple Learning Difficulties to engage with sensory stimulation, which I find a little frustrating, but enough said on that). In addition, it allowed us to install an excellent sensory tool which many students could access without us having to redesign our sensory room.

Anyway, rather than write about it, I have made a video showing how we have set up Po-Motion and some of the scenes you can use with it.

When we first installed the system, we thought it would be a standalone tool, but we have since found that it interacts with whatever else is going on in the room. For instance, changing the colour of the lights affects what is shown on the wall. So we now have a piece of equipment that works well not only on its own but also in conjunction with other pieces of equipment.

Areas for thought –

  • Positioning of the camera – when installing the system, we found that the positioning of the camera was very important in getting the best response from the display. Rather than pointing the camera at the wall, it needs to be positioned to cover where the student will be standing or sitting so that it can detect their movement. Also, in our setup we have the camera facing the display – another setup would be to have the camera facing the student. This is something we might change after we have tested the system with the students.
  • Lighting – for the camera to pick up movement there needs to be a little bit of light. In the dark, it is very difficult to pick up movement unless using an infra-red camera. (Po-Motion also do a version of the software that works with IR cameras)
  • Web camera vs Kinect – something we are thinking about is getting a Microsoft Kinect for the sensory room. The Kinect is a more sophisticated camera and would allow us to run different pieces of software from one piece of equipment. Also, in the dark environment of a sensory room it would pick up movement far better, especially when using the IR version of the software. This is something we are looking to set up in the near future, so I will keep you posted.

Anyway, now that we have the system installed, it will be interesting to see how the students react to it. I would also be very interested to hear from others who have installed the system themselves and which scenes they felt worked particularly well.
