As the dust starts to settle on the proposed National Curriculum changes, I have read many blogs on the challenges the change from ICT to Computing presents. I agree with the thinking behind the changes, in that we need to teach students how to use different hardware so that they can be the next generation of developers rather than just consumers. However, I feel that many teachers are already doing this by adapting the existing curriculum to incorporate programming elements. I also feel that computing is not the be-all and end-all, and that there are many other elements of ICT that should be taught to enable students to create various forms of digital media and become responsible digital citizens. If you have not already done so, read Matt Britland’s Guardian blog post ‘There is room for both computing and ICT in schools’ – it sums up brilliantly the needs and benefits of teaching both ICT and Computing together.
So, moving on to our challenge as a school: I teach in a generic secondary special school for learning difficulties, with almost all the students operating well below the average NC level for English and Maths (they would not be at our school if they were working at expected levels!). The school caters for students working at P1 up to NC 5 – a huge range of abilities. For KS3, the new curriculum states:
‘use two or more programming languages, one of which is textual, each used to solve a variety of computational problems; use data structures such as tables or arrays; use procedures to write modular programs; for each procedure, be able to explain how it works and how to test it’
Now, with many of our students having severe literacy difficulties, they are expected to learn two programming languages, one of them textual. I have done some coding myself and understand that different computer languages are similar, but I still feel this is quite a challenging target for our students. Nevertheless, I believe it is important that students learn to use computers for much more than viewing the latest YouTube video on their smartphone, and I look forward to adapting the curriculum to meet their needs. I have already seen my colleague Keith Manville (@open_sen) work with students at NC 1-2, coding simple sketches using Processing. He adapted this by giving the students different chunks of code to play around with, and they soon understood that the changes they made to the code affected what they saw on the screen. This is certainly one way in which we will have to adapt the curriculum to suit the needs of our students.
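To give a flavour of the chunk-of-code approach, a starting sketch might look something like this (my own illustration, not Keith's actual code) – students change the numbers and colour values and immediately see the effect on screen:

```processing
// A simple sketch for students to experiment with.
// Changing the numbers below alters the colour, size
// and behaviour of the circle that follows the mouse.
void setup() {
  size(400, 400);   // window size
  background(0);    // black background
}

void draw() {
  fill(255, 0, 0);                  // try changing these colour values
  ellipse(mouseX, mouseY, 50, 50);  // try changing the size numbers
}
```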
As far as a curriculum for SEN goes, over the last few weeks I have been thinking about the tools we could use to deliver this. The tools listed below are based on my own experience and on researching what others on Twitter and the web are using. They are listed in order of progression (obviously more thought will need to be put in to make sure we meet all the subject content for each Key Stage). This list is nowhere near complete, and as I write there is a Google Doc being put together by Sheli Blackburn (@SheliBB) called Computing KS1-KS4, which is collating all the tools you could use to meet the requirements of the curriculum.
Kodu – a simple visual programming language created by Microsoft. It uses simple sequencing to allow students to program their own worlds and create games, stories and more. I have used this successfully with a range of students and was amazed at how easily they picked it up.
Scratch – a programming language that makes it easy to create your own interactive stories, animations, games, music and art. Again, it uses simple blocks of predefined code that students can build up, changing the variables as they go.
Greenfoot – teaches object orientation with Java. It is visual and interactive, with visualisation and interaction tools built into the environment.
Raspberry Pi / Arduino / Processing / Python – much has been written about the Raspberry Pi, and the Arduino is a similar concept. These encourage students to look at the hardware as well as the software, and involve them in writing the code that drives the hardware. The nice thing about the Arduino is that it lends itself to robotics projects. It uses an IDE (integrated development environment) written in Java and derived from the Processing IDE. Like Processing, it was designed to introduce programming to artists and other newcomers unfamiliar with software development and hardware integration.
There will be few students at present who would be able to reach the Raspberry Pi / Python stage, but we need to have this progression in place to allow students to work their way towards being able to code using hardware like the Pi, especially when they reach KS4 and move into the Sixth Form. In addition, the student cohort likely to access the Computing curriculum ranges from P7 to NC 5. This is a large range to differentiate for, and we will have to ensure that the tools used are individualised to meet students' needs and abilities, allowing them to access the subject areas and make progress.
The new curriculum certainly presents a challenge for our setting, though I feel that incorporating Computer Science with ICT allows students to learn creative skills alongside digital literacy and media skills, ensuring that our students are well-equipped for the digital world they will be entering. I would be very interested in what other special schools are planning in terms of a curriculum for Computing, so please add your comments.
Since learning to code, I have explained to colleagues that if they have any ideas or needs for specific apps, they should come and talk to me or my colleague Keith Manville (@open_sen) and we will look at how we could design an application to meet their needs. This happened to Keith a few weeks ago, and I don’t want to say too much as he has already written about the process of designing the app on his excellent blog, which you can find here.
What he has created is a simple visual and audio app that draws Bézier curves and ellipses depending on where the mouse is positioned on the screen. In addition, it plays notes from a MIDI synthesiser using the SoundCipher library created by Andrew R Brown. The outcome is a simple cause-and-effect app that stimulates the user not only visually but also through sound. The app can also be used on a touchscreen, which increases its accessibility for some students. In the short time I have spent testing this app with students, I have found that different students react to it differently. Some are more interested in the visual shapes being produced, whilst others are motivated by the audio coming from the app when they touch the screen.
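Keith's actual code is on his blog, but a rough sketch of the idea in Processing might look something like this (my own simplification – the mapping of mouse position to pitch is an assumption on my part, not his design):

```processing
import arb.soundcipher.*;  // Andrew R Brown's SoundCipher library

SoundCipher sc;

void setup() {
  size(600, 400);
  background(0);
  sc = new SoundCipher(this);
}

void draw() {
  stroke(255);
  noFill();
  // shapes are drawn at the mouse position
  ellipse(mouseX, mouseY, 40, 40);
  bezier(mouseX, mouseY, mouseX + 50, mouseY - 50,
         mouseX - 50, mouseY + 50, width/2, height/2);
}

void mousePressed() {
  // map the mouse position to a MIDI pitch and play it
  float pitch = map(mouseX, 0, width, 40, 80);
  sc.playNote(pitch, 100, 0.5);
}
```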
In addition, I have adapted the code slightly so that switch users can access the app. Instead of the shapes being drawn at the mouse's x and y position, they are drawn at random positions. Again, I have found similar results, with students engaged and stimulated by the visuals and sounds being produced.
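The change itself is small: wherever the original reads from mouseX and mouseY, the switch version uses random() instead. A minimal illustration (my own, assuming the switch is mapped to a key press):

```processing
void setup() {
  size(600, 400);
  background(0);
}

// any key press (or a switch mapped to a key) triggers a shape
void keyPressed() {
  stroke(random(255), random(255), random(255));
  noFill();
  // shapes appear at random positions rather than the mouse position
  ellipse(random(width), random(height), random(20, 80), random(20, 80));
}
```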
As part of our ongoing project to produce applications for students with special educational needs, we have wanted to make the applications easy to share with others. We have set up a wiki at http://processingsen.wikispaces.com/ – if you click on the applications page you will find some of the applications we have coded, available to download and try with your students. Both Keith and I would greatly appreciate any feedback and comments on the applications, as this will help improve them and further applications in the future.
N.B – If you are interested in seeing the work we are doing with the Kinect in the classroom please visit the PLC page at http://kinectsen.wikispaces.com.
On Wednesday 24th November, our school took part in the national Big Draw Day. The day focuses on a particular artist and their work, and it gives students the opportunity to participate in a range of cross-curricular lessons based on that artist's work. This year the focus was on the work of graffiti and visual artist Keith Haring. Based around Haring’s work, students were encouraged to 'take a line for a walk', and this was the main focus of the day.
My colleague Keith Manville (@open_sen) had been working on an application based around generative art, and between us we looked at the possibility of running it as a workshop for the day. The aim would be for the students to create their own art using ICT, achieved by taking a line for a walk. An example of how the sketch runs is shown in the video below:
The program runs automatically and the user controls the change of colour by pressing the space bar for a random colour, ‘m’ for monochrome or ‘b’ for black. In addition, the user can pause the sketch at any time, and the application allows the user to export their image as a JPEG. This application was coded in Processing, which you can read more about in an earlier post here.
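For anyone curious how the key controls work, the skeleton of a sketch like this is straightforward in Processing. This is my own sketch of the idea, not Keith's code – the 'p' (pause) and 's' (save) keys in particular are my assumptions, as the post only says those features exist:

```processing
float x, y;                     // current position of the wandering line
color current = color(255);
boolean paused = false;

void setup() {
  size(800, 600);
  background(0);
  x = width/2;
  y = height/2;
}

void draw() {
  if (paused) return;
  stroke(current);
  // the line takes itself for a walk in small random steps
  float nx = constrain(x + random(-10, 10), 0, width);
  float ny = constrain(y + random(-10, 10), 0, height);
  line(x, y, nx, ny);
  x = nx;
  y = ny;
}

void keyPressed() {
  if (key == ' ') current = color(random(255), random(255), random(255)); // random colour
  else if (key == 'm') current = color(random(255));                      // monochrome grey
  else if (key == 'b') current = color(0);                                // black
  else if (key == 'p') paused = !paused;                                  // pause/resume
  else if (key == 's') saveFrame("walk-####.jpg");                        // export as JPEG
}
```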
We found that the simplicity of the program meant that a wide range of students could access the application and create some beautiful pieces of art.
Alongside this, we decided to run a Kinect session based on some of the applications we have found, which have been coded in Processing. The programs we used were created by Amnon Owed and are based around using the Kinect to detect the user and interact with the images on-screen. More information on how these are coded can be found in his useful tutorial here.
The first one we used was the Kinect Flow application, which turns the user into a wavy-line polygon and tracks their movement across the screen. What I noticed with this application was how instantly it attracted some of our ASC students. They wanted to explore what happened to the image when they moved their bodies. This was really interesting, as these students would usually shy away from taking part in physical activity, but were really active during this session.
The second application, again created by Amnon Owed, pours shapes over you. The user can use their body to collect them and bat them away. The tracking is very good with this app, and I found that it even worked for wheelchair users. The app would also pick up two users, so it was good for collaborative teamwork between students.
After we had run the sessions, we had some time to reflect on the day, and overall we felt the students had enjoyed the different applications they had experienced. In terms of the line art sketch app, we found that students enjoyed making the art and were putting thought into when they should change the colours. However, some students found that they could exit the app by pressing the ESC key, and this is something that we will disable in future versions of the app (it reminded me of students exiting apps on the iPad before Guided Access was added).
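For anyone wanting to do the same in their own sketches, the usual Processing trick is to intercept ESC in keyPressed() and clear the key before the sketch sees it:

```processing
// Pressing ESC normally closes a Processing sketch.
// Clearing the key in keyPressed() stops the exit.
void keyPressed() {
  if (key == ESC) {
    key = 0;  // swallow the ESC press so the app stays open
  }
}
```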
The Kinect applications we used were not specifically designed with SEN students in mind, though the natural user interface of the Kinect allowed the students to instantly pick up what they had to do. It has given me some food for thought when it comes to coding my own applications for the students, and it has developed my understanding of how to code for the Kinect. If you are interested in learning more about using the Kinect with SEN, visit the excellent wiki being developed by a group of schools using this tech with their students: kinectsen.wikispaces.com
Big Draw Day motivated me to continue to code and make applications for our students. The sessions continued to show how these applications encourage creativity, movement, engagement and exploration. To finish, I will leave you with a video of an application that I am currently coding – nowhere near the finished product, but it gives you a flavour of some of the applications we hope to create.
I mentioned in my previous post that I would keep you updated on my progress with Processing, and it seems I have fallen foul of enjoying the summer holidays too much. Nevertheless, I will get an update in before the holidays end. I have progressed in learning to code in Processing, creating some simple applications, and have started to use the Kinect to interact with different applications. Here is a quick summary of what I have done using the 'Getting Started with Processing' and 'Making Things See' textbooks I mentioned in my previous post.
- Learnt to create various shapes and incorporate pictures into applications.
- Learnt the functions of variables, for loops, if statements etc. (lots of coding language)
- Coded different programs that allow these objects to move across the screen and react to key presses on the keyboard and mouse.
- Learnt to use the depth camera of the Kinect to gain a better understanding of how it recognises pixels and distances, to add functionality to the applications we are making.
- Incorporated the Kinect into the apps so that I can get objects to track the closest point to the Kinect, e.g. a hand. This has led me to create a simple drawing app and a 'Minority Report'-like application using photos.
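Put together, the first few of those pieces look something like this minimal bouncing-shape sketch (my own example, covering variables, an if statement, movement and a key press):

```processing
int x = 0;       // variable holding the shape's position
int speed = 3;   // variable holding how far it moves each frame

void setup() {
  size(400, 200);
}

void draw() {
  background(0);
  x = x + speed;
  if (x > width || x < 0) {   // if statement: bounce at the edges
    speed = -speed;
  }
  ellipse(x, height/2, 30, 30);
}

void keyPressed() {
  speed = -speed;   // react to a key press by reversing direction
}
```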
The next step is to start learning to incorporate the skeletal tracking and 3D functions of the Kinect into applications so that they are more intuitive for the user.
If you get a chance, have a look at the wiki set up by Anthony Rhys (@trinityfieldsit), and in particular the Processing section, which has links to various applications he has created, available to download as PDEs to run in the Processing interface.
Also on the site are examples of some pretty amazing applications which give you an idea of what is possible with the Processing language – here is the link.
The intention of this project is to have a group of like-minded individuals working together to create interactive applications that can be used with the fantastic group of students we work with. So if you are interested in helping us, please contact us!
This summer, I am going to start to learn coding, specifically Processing. This is a development platform that allows you to write software to make images, animations and interactions. It was originally developed to make programming graphics easier than using other languages such as C++ and Java.
Now, you might be asking why I am learning this language and what benefit it might have for the students I work with. I have previously written about different natural user interfaces like the Microsoft Kinect and the software I have used with them – Visikord and Po-motion. These programs are really good, and I have seen some great engagement from students whilst using them. What is clear is that students love to explore the different aspects of the applications, and the natural user interfaces allow them to do things they simply could not do or access before. Discussions with colleagues at school and on Twitter (@trinityfieldsit), and looking at what Ceri Williams (@cerirwilliams) has been working on, have prompted me to start learning to code so that I can develop different programs that users can interact with using the Kinect. I am also interested in how different programs work, and I hope that by learning to code simple interactive graphics programs, I will gain a better understanding of how developers create apps and software.
By being able to develop our own applications, the hope is that we will be better placed to respond to the needs of our students and create exciting and engaging applications. It is going to be a long process, but if you are interested I would recommend the following books –
'Getting Started with Processing' – Casey Reas and Ben Fry – an introduction to the Processing language, with practical examples to work through.
'Making Things See' – Greg Borenstein – a hands-on guide that takes you through the steps to create applications that use the Microsoft Kinect with the Processing language.
I will post my progress on this project here on the blog, but in the meantime, if you are interested in using the Kinect with SEN, check out the Kinect in the Classroom page of my blog or look at the Kinect in SEN wiki, which represents the work of the Personal Learning Community of @trinityfieldsit, @LittleAngelsSch, @cerirwilliams and myself. The idea of the wiki is to showcase the excellent work being done by those using the Kinect in SEN.