Blog Archives

Innovative Learning using Technology in a Sensory Way

Recently I was asked to run an INSET workshop on the use of technology to promote engagement and create opportunities for learners with complex learning difficulties and disabilities (CLDD). The focus of the presentation was to look at how technology can have a positive impact on students' learning, the reasons for using technology, the tools that are available and the assessment systems that could be used.

Big thanks to Ian Bean (@SENICT) and Anthony Rhys (@trinityfieldsit), who freely share some great resources – I have added links to their sites below, and these were really appreciated by those who attended. Also, the YouTube video linked in the Prezi is from Anthony's school, Trinity Fields, who are real leaders in using gesture-based technology with their students; this work has led to them being awarded the 3rd Millennium Award from Naace – certainly a school that we use as a benchmark for what we do.

Below is the link to the Prezi and links to the documents that I shared with colleagues:

Link to Prezi

Useful Links

Switch Progression Roadmap – via Ian Bean’s site

Link to PDF on using a Timer to limit students' time on iPads via Ian Bean

IpadSEN – Wiki run by Anthony Rhys

 

Gesture Based Technology Wiki

Since the Gesture based Technology PLC was created in November last year, our focus has expanded to many devices and this has led to the setup of a central portal page that links to all the hardware that we are currently using with our students.

So if you are interested in using iPads, Eyegaze, Kinect, LeapMotion or any gesture based tech visit the GestureSEN wiki for information about how to use these devices to promote engagement, movement and creativity with students with severe learning difficulties.

 

GestureSEN

LeapMotion Arrives!!!!

Tuesday 23rd July was the day that my LeapMotion was delivered. I was very excited about this piece of technology and the impact it could have on access and engagement for the students that I work with.


For those who don't know, the LeapMotion is a small, USB-stick-sized device that works like a mini-Kinect to track the movements of the fingers. Its 'working area' is an invisible cube about a metre across above the device, and it will pick up fine hand and finger movements – these can then be converted instantly into effects on the screen, producing visual and audio feedback. Unlike the Kinect, which does whole-body tracking, the Leap will only track hand and finger movement, which potentially limits the audience that we would use the device with.
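
For the more technically minded, the snippet below is a rough sketch of the kind of data the device exposes, written against the Python bindings that shipped with the classic Leap Motion SDK. The module and property names are as I remember them from the SDK's sample code, so treat this as an illustration under those assumptions rather than a definitive example.

```python
# A minimal sketch, assuming the classic Leap Motion SDK's Python bindings
# (the Leap.py module bundled with the SDK, not installable via pip).
# It simply prints palm and fingertip positions as frames arrive.
import sys
import Leap


class PrintHandsListener(Leap.Listener):
    def on_connect(self, controller):
        print("Leap Motion connected")

    def on_frame(self, controller):
        frame = controller.frame()
        for hand in frame.hands:
            # Positions are reported in millimetres above the device
            print("Palm at", hand.palm_position)
            for finger in hand.fingers:
                print("  Fingertip at", finger.tip_position)


if __name__ == "__main__":
    listener = PrintHandsListener()
    controller = Leap.Controller()
    controller.add_listener(listener)
    print("Press Enter to quit...")
    sys.stdin.readline()
    controller.remove_listener(listener)
```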

My first impressions of the Leap are that it is very accurate in detecting hand and finger movement, and I quickly went to Airspace (the Leap Motion app store) to see what has been developed for the device. When playing around with the apps, I found those that require fine motor skills quite difficult to use – for example Blocks, a Jenga-like game, which requires you to 'pinch' your fingers to remove the blocks. However, where I feel the Leap will be most useful for our students is in its ability to track hand movements, and the apps Midnight, Flocking, Airharp and Vitrun demonstrated this beautifully. I found myself engaged with these apps, in particular Midnight, and spent time trying to manipulate the environment to see what I could create.


Midnight app running using Leap Motion

After the initial trial at home, I was eager to try the Leap out with our students. Even though it was the last day of term, I still managed to find time for one of our students to have a go with the LeapMotion. The student has muscular dystrophy and has limited movement in his arms. I was intrigued to see how he would get on with the Leap, as I had found, even when using it myself, that the positioning of the device was crucial. He started with Midnight and really enjoyed making changes to the environment in front of him. He soon got to grips with changing fingers to make things happen and mentioned that he liked the different colours he was making on screen.


Like any typical teenage boy, he is quite keen on playing computer games, so he tried out Cut the Rope. This game requires quite fine motor skills – using your finger to swipe the ropes to cut them. This is where I noticed that positioning the device correctly is crucial for those with mobility difficulties. He was quite determined to continue the game, but he did get frustrated on occasions when his finger swipes were not fast enough to cut the rope. The last app that he tried was Vitrun, and he had greater success with this one, as it requires you to hold your hand out flat above the device to move the ball forwards and backwards. Again, he enjoyed this app and was engaged for long periods of time, which was promising to see.

Apps that I recommend to try:

Midnight – $2.39 – A beautiful particle system that is controlled using your hand. It also plays music to add to the experience.

Flocking – Free – Fish follow your fingers on screen. A really nice and simple sensory application.

Airharp – $1.19 – Exactly what it says: play a harp using your hands. Nice interaction for simple cause and effect.

Vitrun Air – $3.59 – A simple game that involves moving a ball through a range of obstacles with very easy controls.

I am impressed with the Leap and can see potential for it in an SEN setting as an enabling device that allows those with limited movement to interact with a screen. The factors that will be important to the success of this device are the same as for all devices such as iPads, switches and Eyegaze – positioning and app selection. Now that the Leap has been released, I am hoping that more developers will create content for it, and I look forward to seeing what they develop.

Anthony Rhys (@trinityfieldsit) has set up a LeapSEN wiki, which over the next couple of months we hope to fill with information about how to get the best out of the Leap for SEN students, as part of the Gesture Based PLC group. For more information about this and other gesture-based tech, visit http://kinectsen.wikispaces.com/

Gesture Based Technology – EyeGaze

Those who follow this blog will know that over the last year I have started to get involved in the use of gesture-based technology. This has been an exciting journey so far, and it has been fantastic to meet and work with so many like-minded people, such as Anthony Rhys (@trinityfieldsict), Susan McCarthy (@LittleAngelsSch, @SusancMcCarthy), Andrew Walker (@andtomac), Ceri Williams (@cerirwilliams) and Keith Manville (@open_sen), to name a few.

Today at our school we had the opportunity to invite Hector Minto (@hminto) from Tobii and Lee Blemings (@sensoryguru) from SensoryGuru to run an EyeGaze Clinic with our students and staff at Oak Grove College. It was great to see our students use the technology in many different ways, from eye control to what I believe is more important, eye tracking. It certainly changes the playing field for assessing students, as it allows you to see exactly where they are looking, which is a great tool for educators. By seeing where students are looking, you can instantly talk about exactly what they are looking at and test their comprehension. When looking at EyeGaze, you instantly think about using it with PMLD students, but the system can be used for so much more than that. Below is an example of what can be done using just an Oxford Reading Tree book and eye tracking:

In addition, there is more and more software becoming available, starting at a sensory cause-and-effect level and going up to choosing and communicating.

There were so many positive comments from staff, with the majority saying 'that is amazing' and 'so when are we getting one?' I will certainly be looking to get one in our school as soon as possible!!

If you would like more information about Eyegaze and how it is being used in special schools, please take a look at http://eyegazesen.wikispaces.com/

Also, for more information on other uses of gesture-based technology, check out Kinect in SEN PLC and iPads in SEN PLC.

Computing and SEN

As the dust starts to settle on the proposed National Curriculum changes, I have observed and read many blogs on how the change from ICT to Computing presents many challenges. I agree with the thinking behind the changes, in that we need to teach students how to use different hardware so that they can be the next generation of developers rather than just consumers. However, I feel that many teachers are already doing this by adapting the existing curriculum and incorporating programming elements. I also feel that computing is not the be-all and end-all, and that there are many other elements of ICT that should be taught to enable students to create various forms of digital media and become responsible digital citizens. If you have not already done so, read Matt Britland's Guardian blog post 'There is room for both computing and ICT in schools' – it sums up brilliantly the need for, and benefits of, teaching both ICT and Computing together.

So, moving on to our challenge as a school: I teach in a generic secondary special school for learning difficulties, with almost all the students operating at well below the average NC level for English and Maths (they would not be at our school if they were working at expected levels!!). The school caters for students working from P1 up to NC Level 5, so a huge range in terms of ability. For KS3, the new curriculum states:

‘use two or more programming languages, one of which is textual, each used to solve a variety of computational problems; use data structures such as tables or arrays; use procedures to write modular programs; for each procedure, be able to explain how it works and how to test it’

DfE, 2013

Now, with many of our students having severe literacy difficulties, they are expected to learn two programming languages on top of everything else. I have done some coding myself and understand that different computer languages are similar, but I still feel this is quite a challenging target for our students. Nevertheless, I believe it is important that students learn to use computers for much more than viewing the latest YouTube video on their smartphone, and I look forward to adapting the curriculum to meet their needs. I have already seen my colleague Keith Manville (@open_sen) work with students at NC 1-2, coding simple sketches using Processing. The way in which he adapted this was to give the students different chunks of code to play around with, and they soon came to understand that the changes they made to the code had effects on what they saw on the screen. This is certainly one way in which we will have to adapt the curriculum to suit the needs of our students.
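
To give a flavour of that 'tweak the values and see what happens' approach, here is a small, hypothetical example in Python using the built-in turtle module – an analogue of, not a copy of, the Processing sketches Keith used. The values at the top are the chunk that students would be encouraged to change so they can immediately see the effect of their edits on screen.

```python
# A minimal sketch of the "change the values and watch the screen" idea,
# using Python's built-in turtle module.
import turtle

# --- Values for students to change ---------------------------------
CIRCLE_COUNT = 12        # how many circles to draw
CIRCLE_SIZE = 60         # radius of each circle
PEN_COLOUR = "purple"    # try "red", "blue", "green"...
BACKGROUND = "black"
# --------------------------------------------------------------------

screen = turtle.Screen()
screen.bgcolor(BACKGROUND)

pen = turtle.Turtle()
pen.speed(0)
pen.color(PEN_COLOUR)

# Draw a ring of circles, turning a little between each one
for _ in range(CIRCLE_COUNT):
    pen.circle(CIRCLE_SIZE)
    pen.right(360 / CIRCLE_COUNT)

turtle.done()
```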

As far as a curriculum for SEN goes, over the last few weeks I have been thinking about the tools we could use to deliver it. The tools listed below are based on my own experience and on researching what others on Twitter and the web are using. They are listed in order of progression (obviously more thought will need to be put in to make sure we meet all the subject content for each Key Stage). This list is nowhere near complete, and as I write there is a Google Doc being put together by Sheli Blackburn (@SheliBB) called Computing KS1-KS4, which is collating all the tools you could use to meet the requirements of the curriculum.

Kodu – a simple visual programming language created by Microsoft. It uses simple sequencing to allow students to program their own worlds and create games, stories, etc. I have used this successfully with a range of students and am amazed at how easily they picked it up.

Scratch – a programming language that makes it easy to create your own interactive stories, animations, games, music and art. Again, it uses simple blocks of predefined code that students can build up, changing the variables as they go.

Greenfoot – teaches object orientation with Java. It is visual and interactive, with visualisation and interaction tools built into the environment.


Raspberry Pi/Arduino/Processing/Python – there has been a lot written about the Raspberry Pi, and the Arduino is a similar concept. This route encourages students to look at the hardware as well as the software, and involves them in writing code to run on the hardware. The nice thing about the Arduino is that it lends itself to robotics projects. It uses an IDE (integrated development environment) written in Java and derived from the Processing IDE; like Processing, it was designed to introduce programming to artists and other newcomers unfamiliar with software development and hardware integration.
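
As an illustration of the kind of first hardware-plus-code activity this route opens up, here is a short, hypothetical Python example for the Raspberry Pi. It assumes an LED wired to GPIO pin 18 and the commonly used RPi.GPIO library, and is a sketch of the idea rather than a lesson plan.

```python
# A minimal sketch, assuming a Raspberry Pi with an LED (plus resistor)
# wired to GPIO pin 18 and the RPi.GPIO library installed.
# Students can change BLINK_COUNT and DELAY to see the effect in hardware.
import time
import RPi.GPIO as GPIO

LED_PIN = 18      # BCM pin number the LED is wired to
BLINK_COUNT = 10  # how many times to flash
DELAY = 0.5       # seconds on / seconds off

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)

try:
    for _ in range(BLINK_COUNT):
        GPIO.output(LED_PIN, GPIO.HIGH)  # LED on
        time.sleep(DELAY)
        GPIO.output(LED_PIN, GPIO.LOW)   # LED off
        time.sleep(DELAY)
finally:
    GPIO.cleanup()  # reset the pins when the program ends
```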

There will be few students at present who would be able to reach the Raspberry Pi/Python stage, but we need to have this progression in place to allow students to work their way towards being able to code using hardware like the Pi, especially when they reach KS4 and move into the Sixth Form. In addition, looking at the student cohort who are likely to access the Computing curriculum, they range from P7 to NC Level 5. This is a large range to differentiate for, and we will have to ensure that the tools used are individualised to meet their needs and abilities, in order to allow them to access the subject areas and make progress.

The new curriculum certainly presents a challenge for our setting, though I feel that incorporating Computer Science alongside ICT allows students to learn creative skills alongside digital literacy and media skills, ensuring that our students are well equipped for the digital world they will be entering. I would be very interested in what other special schools are planning in terms of a curriculum for Computing, so please add your comments.
