This week, I had the pleasure of attending the GestureSEN PLC – (more info here) – where discussions were based upon how we use video evidence in a purposeful way to aid our reflections on the progress made by students operating at the P-levels. My write-up from the day will come soon, but it was during Andrew Walker’s (@andtomac) excellent presentation that the Gartner Hype Cycle was presented.
The Gartner Hype Cycle provides a graphical representation of the maturity and adoption of new technologies and is presented below:
The graph shows the cycle that all technology goes through, from the moment the idea is conceived to mainstream adoption. This was the first time I had seen this graph, and as someone who is interested in the use of technology in education, it got me thinking about the education technology we use in the classroom and where different technologies would sit on the graph. Would we say tablet computers are moving towards the Plateau of Productivity? Virtual reality in education is certainly an area which I believe has reached the Peak of Inflated Expectations and is possibly about to hit the trough. Some technologies will never get past the Trough of Disillusionment and will disappear into obscurity.
For SEN, where would Eyegaze technology sit? LeapMotion? Kinect? Switches?
Everyone will have their own opinion on where technologies sit on the graph, depending on their experiences. Being presented with this graph has allowed me to reflect on the implementation of technology in the classroom and the importance of evaluating technology thoroughly to ensure it has an impact on teaching and learning.
For more information on the Gartner Hype Cycle, click on the link: http://www.gartner.com/technology/research/methodologies/hype-cycle.jsp
Since the Gesture Based Technology PLC was created in November last year, our focus has expanded to many devices, and this has led to the setup of a central portal page that links to all the hardware that we are currently using with our students.
So if you are interested in using iPads, Eyegaze, Kinect, LeapMotion or any gesture-based tech, visit the GestureSEN wiki for information about how to use these devices to promote engagement, movement and creativity with students with severe learning difficulties.
Tuesday 23rd July was the day my LeapMotion was delivered. I was very excited about this piece of technology and the impact it could have on access and engagement for the students I work with.
For those who don’t know, the LeapMotion is a small, USB-stick-sized device that works like a mini-Kinect to track the movements of the fingers. Its ‘working area’ is an invisible cube, about a metre across, above the device, and it will pick up fine hand and finger movements. These can then be converted instantly into effects on the screen, causing visual and audio effects. Unlike the Kinect, which does whole-body tracking, the Leap will only track hand and finger movement, which potentially limits the audience we would use the device with.
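To give a feel for how that invisible cube becomes effects on screen, here is a minimal illustrative sketch of the idea: a fingertip position inside a roughly cubic working area is normalised and mapped to screen pixels. The cube dimensions, function name and parameters below are my own assumptions for illustration, not part of the Leap SDK itself.

```python
# Illustrative sketch only: maps a fingertip position inside the Leap's
# roughly cubic 'working area' to pixel coordinates on a screen.
# All bounds (in millimetres) are assumed values, not the real SDK's.

def finger_to_screen(x_mm, y_mm, screen_w=1920, screen_h=1080,
                     cube_half_width=500, cube_bottom=100, cube_top=600):
    """Convert a fingertip position (mm, relative to the device) to pixels.

    x_mm: left/right position, with 0 directly above the device.
    y_mm: height above the device.
    """
    # Normalise each axis to the 0..1 range, clamping at the cube edges.
    nx = min(max((x_mm + cube_half_width) / (2 * cube_half_width), 0.0), 1.0)
    ny = min(max((y_mm - cube_bottom) / (cube_top - cube_bottom), 0.0), 1.0)
    # Screen y grows downward, so invert the height axis.
    return int(nx * (screen_w - 1)), int((1.0 - ny) * (screen_h - 1))

# A fingertip held centrally, midway up the cube, lands near mid-screen.
print(finger_to_screen(0, 350))  # → (959, 539)
```

In practice an app would read these positions many times a second and draw its visual or audio effect at the mapped point, which is why the positioning of the device relative to the user matters so much.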
My first impressions of the Leap are that it is very accurate in detecting hand and finger movement, and I quickly went to Airspace (the Leap Motion app store) to see what has been developed for the device. When playing around with the apps, I found those that require fine motor skills were quite difficult to use, for example Blocks, a Jenga-like game which requires you to ‘pinch’ your fingers to remove the blocks. However, where I feel the Leap will be most useful for our students is in its ability to track hand movements, and the apps Midnight, Flocking, Airharp and Vitrun beautifully demonstrated this. I found myself engaged with these apps, in particular Midnight, and spent time trying to manipulate the environment to see what I could create.
After the initial trial at home, I was eager to try the Leap out with our students. Even though it was the last day of term, I still managed to find time for one of our students to have a go with the LeapMotion. The student has muscular dystrophy and has limited movement in his arms. I was intrigued to see how he would get on with the Leap, as I had found, even when I used it myself, that the positioning of the device was crucial. He started with Midnight and really enjoyed making changes to the environment in front of him. He soon got to grips with changing fingers to make things happen and mentioned that he liked the different colours he was making on screen.
Like any typical teenage boy, he is quite keen on playing computer games, so he tried out Cut the Rope. This game requires quite fine motor skills, using your finger to swipe the ropes to cut them. This is where I noticed that the positioning of the device is crucial for those with mobility difficulties. He was quite determined to continue the game, but he did get frustrated on occasions when his finger swipes were not fast enough to cut the rope. The last app he tried was Vitrun, and he had greater success with this one, as it requires you to hold your hand out flat above the device to move the ball forwards and backwards. Again, he enjoyed this app and was engaged for long periods of time, which was promising to see.
Apps that I recommend trying:
Midnight – $2.39 – A beautiful particle system that is controlled using your hand. It also plays music to add to the experience. Watch the video below:
Flocking – Free – Fish will follow your fingers on screen. A really nice and simple sensory application.
Airharp – $1.19 – Exactly what it says: play a harp using your hands. Nice interaction for simple cause and effect.
Vitrun Air – $3.59 – A simple game that involves moving a ball through a range of obstacles with very easy controls.
I am impressed with the Leap and can see potential for it in an SEN setting as an enabling device that allows those with limited movement to interact with a screen. The factors that will be important to its success are similar to those for all devices such as iPads, switches and Eyegaze: positioning and app selection. Now that the Leap has been released, I am hoping that more developers will create content for it, and I look forward to seeing what they develop.
Anthony Rhys (@trinityfieldsit) has set up a LeapSEN wiki which, over the next couple of months, we hope to fill with information about how to get the best out of the Leap for SEN students as part of the Gesture Based PLC group. For more information about this and other gesture-based tech, visit http://kinectsen.wikispaces.com/
After returning home from another fantastic meeting discussing and examining the impact of gesture-based technology on the students we work with, I wanted to share some of the findings from the day. Big thanks to Heronsbridge School for hosting the meeting. I always enjoy going to new schools and having a look around, and there were many fantastic examples of how they support and develop their students’ learning throughout the school.
The meeting started with an update on where we were in terms of future developments, in particular the introduction of the Leap Motion and the Kinect One; we discussed the potential of these pieces of technology and the impact they could have on our students. If you have not seen the Kinect One, have a look at my previous post here. An exciting development with the PLC is being able to have a hand in developing our own applications with the Cariad Interactive team, and this is certainly something to look forward to.
On to the evidence sharing, where there were some fantastic videos of students using the technology. What was great to see from the examples shown was the engagement of students who would usually struggle to concentrate on tasks for long periods of time. They clearly enjoyed interacting with the various programmes, like Somantics and Visikord, and through perseverance students showed excellent progress in their engagement. For example, a student from Pen Maes produced vocalisations that they never did at school, and it was great to see their excitement at interacting with technology that enabled them to have control over their environment.
In the afternoon, Barri Farrimond from the MUSE Project showcased some of their music technology applications, which aim to increase independence and engagement using music-making tools that enable access for all. One of the applications shown was Looper Dooper (currently in BETA, but sign up to the mailing list to get the free download). It’s a really simple music-making app that creates loops, and you can record any sounds into them. Below is a video of Barri explaining how it works:
In addition, he showed a Kinect application which acts like a Soundbeam and which, when finished, will be another useful tool to use with our students.
Again, another worthwhile day. Those who are interested in learning more about gesture-based technology should check out the KinectSEN wiki.
Those of you who follow this blog will know that over the last year I have started to get involved in the use of gesture-based technology. This has been an exciting journey so far, and it has been fantastic to meet and work with so many like-minded people, like Anthony Rhys (@trinityfieldsict), Susan McCarthy (@LittleAngelsSch, @SusancMcCarthy), Andrew Walker (@andtomac), Ceri Williams (@cerirwilliams) and Keith Manville (@open_sen), to name a few.
Today at our school, we had the opportunity to invite Hector Minto (@hminto) from Tobii and Lee Blemings (@sensoryguru) from SensoryGuru to run an EyeGaze Clinic with our students and staff at Oak Grove College. It was great to see our students use the technology in many different ways, from eye control to, what I believe is more important, eye tracking. It certainly changes the playing field for assessing students, as it allows you to see exactly where students are looking, which is a great tool for educators. By seeing where students are looking, you can instantly talk about exactly what they are looking at and test their comprehension. When looking at EyeGaze, you instantly think about using it with PMLD students, but the system can be used for so much more than that. Below is an example of what can be done just using an Oxford Reading Tree book and eye tracking:
In addition, there is more and more software becoming available, starting at a sensory cause-and-effect level and going up to choosing and communicating.
There were so many positive comments from staff, with the majority saying ‘that is amazing’ and ‘so when are we getting one?’ I will certainly be looking to get one in our school as soon as possible!
If you would like more information about Eyegaze and how it is being used in special schools, please take a look at http://eyegazesen.wikispaces.com/