Kinect PLC Meeting 7th Feb
It’s been 3 months since the last Kinect meeting, and this one gave the schools an opportunity to share their early journeys in using gesture-based technology.
The morning consisted of Heronsbridge, Trinity Fields and Oak Grove sharing their evidence of using the Kinect so far. The videos showcased some brilliant examples of how the technology is engaging those students who are often hard to engage. They also highlighted that sessions need to be short, with a variety of different programs to keep students’ interest, especially for those working at P4–P6. Another point for the PLC to consider is getting the technology embedded in classrooms so that staff and students can use it without having to set it up each time. This is one of the aims for Oak Grove College and something that I want to work on with staff over the next few months (hopefully our sensory room setup will be complete soon!!)
Andrew Walker (@andtomac) then presented on using an evidence-based system to help the PLC record the progress of students using the Kinect. At Exeter House School, they have been using the Engagement Profile and Scale developed by the SSAT Complex Learning Difficulties and Disabilities Research Project. It was developed to support staff in focusing on the child’s engagement as a learner and creating personalised learning pathways. There is also free online training available from the DfE website, which talks through how to use the profile and scale. It certainly seems a logical step for the PLC to adopt this as an assessment framework, and it will be useful to compare evidence at the next PLC meet.
After a short break, Dr Wendy Keay-Bright (Reader in Inclusive Design at Cardiff University and developer of Somantics and ReacTickles) talked through the background story of how these apps came to fruition and gave a brilliant insight into the journey the team had been on. It was great to get a better understanding of how and why the apps were developed, and Wendy gave some excellent advice on developing applications for students with ASC. What was great to hear was how the students were involved in the process from the beginning. Below are the slides from the presentation:
After lunch, Hector Minto (@hminto) gave a talk about the use of EyeGaze technology as a tool to promote sensory exploration and early tracking to aid reading assessment. The presentation focused on the EyeGaze Learning Curve and how the system could be used to move students along this curve from sensory exploration right up to communication.
Hector concentrated on the first two stages of the curve, Sensory and Eye Tracking. He demonstrated the SensoryFX software, which is used to engage students and train them in using the EyeGaze system. The activities are designed to engage students so that you can develop their skills using a range of different programs. The next stage, eye tracking, was demonstrated using a simple e-book, and I have included a video to showcase this in more detail. What was good to see here is that you are actually able to track where the student is looking, and you can test comprehension by asking key questions. (You will need to look closely to see the cursor.)
The price of EyeGaze has reduced considerably, and I believe it is now at a stage where it is worth considering. It gives practitioners another tool to help with assessment and allows progress to be shown with students working at the lower end of the P-levels, where progress is more difficult to demonstrate.
At the end of the day, the group discussed the next steps and what each school would do before the next meeting. It was a great day, and it was brilliant to see so many different practitioners, from teachers and researchers to developers and consultants, looking to promote the use of gesture-based technology with SEN students.
If you are interested in finding out more, please visit the PLC wiki, which contains more information on using the Kinect with SEN students and showcases some of the evidence that is being collected.