Digital technology assessment? Hmmm, so we are teaching great stuff and the kids are loving it. You are learning new skills, the school has invested money in new resources and, if you are lucky, some great PL opportunities. You may even have had D.T. incursions or excursions; visiting robots certainly add to the buzz! The D.T. unit you developed for the term was a success, the Principal is impressed and the parents can’t stop talking about it. But really, the whole point is the result: did your students achieve what was expected?
They may have mastered how to use ‘Scratch’, created a game and had lots of fun in the process, but how do you collate and record what they did when it is all digital? And are they really using computational thinking if they are just following the provided steps and dragging block code into an editor? We may think we are teaching D.T. by engaging with these resources, but perhaps we are not really challenging our students, nor collecting real evidence.
There are many resources out there that are recommended to us as great for teaching digital technologies. Lots are FREE (don’t we love that?) and some cost us a great deal of money (plus time!). Each has value, or does it? For example, I have used Cargo-Bot with a previous class. For those of you who haven’t tried it, here is a video to give you some idea of how it works. It is a great app for demonstrating the use of symbolic code, and it can be very challenging. In fact, so challenging that you become frustrated. For our high school students, this resulted in many students going off task or cheating by googling how to solve the various tasks/levels. Well, we did want them problem solving!
If we have to provide evidence of a student’s skill and knowledge from using this app for assessment purposes, what are we going to say? Student A got to level 3 and student B reached level 8, so student B clearly understands symbolic code, patterns and repetition (iteration)? Or is student B just better at googling the answers?
Let’s talk about using Ozobots. As you know, I am a big fan of these little robots, to the point that I bought my own to play with before introducing them as a teaching device. The Ozobot website provides many lesson plans and workshops. There are challenges along the way and something for every age group (K-12). I have used many of the provided resources, and they are great, but I still felt that evidence of a student’s true computational thinking abilities was not always there. Students often help each other, which is a skill we are looking for (collaborative problem solving), but when it comes to assessment for reporting, it is individual. One of the ways I have been dealing with this is by modifying the supplied Ozobot lesson plans/workshops. For instance, there is a simple worksheet activity which has students draw a simple compound-shaped track and then write the code needed to program the robots. Yes, I said write: a little bit of practice writing code and some evidence collection. With me, the students only attempt this after they have become familiar with the basic online editor games, Shape Tracer and Shape Tracer 2.
After finishing the simple activity sheet, I set them a new task: using graph paper and textas, they must draw their own track. I set a few rules:
- there must be seven or more objects on the page (much like a bird’s-eye view of a map, with buildings)
- they must have colour changes within the route (the robots flash various colours)
- the robots must travel around the buildings/objects
- they need a start and finish point or they may create a loop (iteration)
I keep the rules simple because what I am really after comes next. The students create very different tracks; some are very simple and some extremely complex. I then set the next task, which is writing the code (the sequence, i.e. the algorithm) for programming the route. They have their original worksheet to refer to, which has a section with the correct coding terminology and choices, such as ‘right turn 90 degrees’, ‘slight left turn 45 degrees’, etc. This supports the students: they don’t need to concentrate on the terminology; they can focus on reading their route and finding the code they need for their program to run successfully. For assessment purposes, it gives me an indication of whether they understand the different codes, and whether they can problem-solve by selecting the correct code, add the extra information needed (e.g. number of steps) and create a sequence (algorithm).
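To make the ‘sequence = algorithm’ idea concrete, here is a minimal sketch in Python of how I think about what the students are producing. The command names are my own stand-ins for the worksheet terminology, not Ozobot’s actual instruction set; the point is simply that a written route is data, a list of steps walked through in order.

```python
# A minimal sketch (NOT Ozobot's actual command set) of treating a student's
# written route code as data: a sequence of steps that a simple checker
# walks through in order, just as the robot would.

# Hypothetical command vocabulary, modelled on the worksheet terminology.
KNOWN_COMMANDS = {
    "move forward",                # takes a number of steps
    "right turn 90 degrees",
    "left turn 90 degrees",
    "slight right turn 45 degrees",
    "slight left turn 45 degrees",
    "set light colour",            # takes a colour name
}

def check_sequence(steps):
    """Walk the written sequence in order; report any unknown commands."""
    errors = []
    for position, (command, _argument) in enumerate(steps, start=1):
        if command not in KNOWN_COMMANDS:
            errors.append(f"Step {position}: unknown command '{command}'")
    return errors

# One student's route, written as a sequence (algorithm):
route = [
    ("move forward", 4),
    ("right turn 90 degrees", None),
    ("set light colour", "red"),
    ("move forward", 2),
    ("slight left turn 45 degrees", None),
]

print(check_sequence(route) or "Sequence OK")
```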
The next step is block coding their written code on the computer; we use the Ozoblockly editor for this. Once again, students need to demonstrate that they can read and understand their own code as they drag and drop the block code into the required sequence (algorithm). They then load the program onto the robot (from the computer monitor) and run it to test whether their coding is correct. This is where they find errors and correct their coding (both online and on their paper coding).
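That ‘run it and see’ step can also be sketched in code: if you can predict where a sequence finishes, you can compare that against the drawn track, which is the same error-finding the students do when the real robot runs their Ozoblockly program. Again, this is illustrative only; real Ozobot movement is not a simple grid walk.

```python
# Illustrative only: predicting where a written sequence finishes on the
# graph paper, so the predicted end point can be compared with the drawn
# track. Real Ozobot movement is not a grid walk; this just mirrors the
# error-finding students do when the robot runs their program.
import math

def simulate(route, start=(0, 0), heading=0):
    """Return the finishing (x, y); heading 0 points 'up' the page."""
    x, y = start
    for command, argument in route:
        if command == "move forward":
            x += argument * math.sin(math.radians(heading))
            y += argument * math.cos(math.radians(heading))
        elif "right" in command:              # clockwise turn
            heading += 90 if "90" in command else 45
        elif "left" in command:               # anticlockwise turn
            heading -= 90 if "90" in command else 45
        # colour changes don't move the robot, so they are ignored here
    return round(x, 2), round(y, 2)

route = [
    ("move forward", 4),
    ("right turn 90 degrees", None),
    ("move forward", 2),
]
print(simulate(route))  # expect (2.0, 4.0): up 4, turn right, across 2
```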
So what have they learned, and what can we assess from this activity? Quite a bit. Here are a few areas (from various grades) which we have covered, with a small sketch of branching and iteration after the list:
- Different types of data, and how the same data can be represented in different ways (ACTDIK008)
- Data is represented using codes (ACTDIK015)
- Design, modify, follow and represent, both diagrammatically and in written text, simple algorithms (sequences of steps) involving branching (decisions) and iteration (repetition) (ACTDIP019)
- Implement and use simple visual programming environments that include branching (decisions), iteration (repetition) and user input (ACTDIP020)
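For the last two descriptors in particular, here is a small sketch of what iteration and branching add on top of a plain sequence. The commands are my invented stand-ins again, not Ozoblockly’s real blocks:

```python
# Illustrative only: what iteration (repetition) and branching (decisions)
# add on top of a plain sequence. Invented stand-in commands, not
# Ozoblockly's real blocks.

def build_route(laps):
    route = []
    for lap in range(laps):              # iteration: repeat the pattern
        route.append(("move forward", 3))
        route.append(("right turn 90 degrees", None))
        if lap == laps - 1:              # branching: decide on the last lap
            route.append(("set light colour", "green"))  # signal the finish
    return route

for step in build_route(4):              # four right turns close the loop
    print(step)
```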
Anyone else have assessment ideas to share? What have you been using and doing to collect assessment data?