Computer Science end-of-first-year design thinking session

Near the end of the academic year in 2013, I facilitated a design thinking session with three Computer Science undergraduates. All three of the students had been part of Christian Smith’s academic writing course groups. Christian had introduced them to Evernote-based collaborative working, Screenflow presentation recording, smart boards, collaborative writing in Google Drive, and open-space working in the Teaching Grid’s Experimental Teaching Space. In a focus group in Week 8, the students discussed these inspirational experiences, and demonstrated an excellent understanding of the relationship between technology practices and learning. Christian’s teaching had clearly inspired them to generate interesting new ideas. This design thinking session aimed to turn that inspiration into a lo-fi prototype.

We began by considering technology likes and dislikes. The students unanimously expressed a preference for well-designed, task-focussed, uncluttered apps that would allow them to concentrate on the academic content and activities, and to work collaboratively where possible. These technologies should work across platforms, so that a student can work on a range of devices (from phones to desktops), all kept in sync without the need for manual actions. The students were very aware of good, flowing, task-focussed design. A narrative emerged out of this discussion of design values, concerning apps to help with getting to, working in and making the most of lectures. This was motivated by the common experience that time flies very quickly and it can be hard to keep up!

We mapped the narrative out on the floor of the Teaching Grid, and then recorded a walk-through. This is a summary of the key points (more to be added):

The action takes place in week 4 of the first term of the first year of a Computer Science course. It is 8.45, and a small group of students are in the kitchen of their halls of residence. They laugh as a variety of beeps, comedy ringtones and music tracks play simultaneously from their various mobile devices. 15 minutes to the lecture. One of the students checks the notification: luckily she did, as there has been a change of venue. Ramphal 1.3? Where? She clicks on the link to view it on the map. They had better get moving, in the direction indicated by Find My Lecture. On the way, another of the students reads through the brief description of the lecture, updated by the lecturer last night (it seems that progress has been faster than expected, so the lecture plans have been updated a little). He summarises the description for the other students, and they have a brief discussion of what it might mean. But where is 1.3? Find My Lecture points them in the right direction, and by the time they get into the room more details of the lecture have appeared on their mobile devices (phones and tablets of various sizes, running various operating systems according to preference, along with a few traditional laptops).

They get to their seats just in time. Next task: choose a note taking template in the Notetaking app (Find My Lecture automatically switched over to it). They all choose the default for this lecture (recommended by the lecturer), except for one student who prefers a different layout. The slides are loaded into the template. The lecture starts. Note taking begins. For one student this means typing detailed text into the Timeline. For the others, it means scribbling annotations on the slides as they appear. In both cases, a system of differently coloured Smart Highlights is used. For example, when a green highlight is drawn over a phrase, a task is automatically added to the student’s Task List saying that they need to write a definition for the phrase that has been highlighted. They can also add keyword tags selected from the course taxonomy. The notes are added to their note taking Timeline, associated with the current slide (and eventually with the audio/video recording of the lecture, once it has been completed). To make this easier, a system of Hot Keys or Gestures (depending on device and preference) allows the students to add notes and take actions. For example, there is a Hot Key for marking a point in the lecture that is not well understood and needs more explanation.
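The Smart Highlights idea described above amounts to a simple rule mapping highlight colours to automatic actions. A minimal sketch of that rule follows; the `Notebook` class, its method names, and the green-means-define behaviour are illustrative assumptions drawn from the walk-through, not part of any actual prototype.

```python
# Hypothetical sketch of the Smart Highlights rule from the walk-through:
# a green highlight over a phrase adds a "write a definition" task.

from dataclasses import dataclass, field

@dataclass
class Notebook:
    tasks: list = field(default_factory=list)      # the student's Task List
    timeline: list = field(default_factory=list)   # notes keyed to slides

    def highlight(self, phrase: str, colour: str, slide: int) -> None:
        # Every highlight becomes a note attached to the current slide.
        self.timeline.append({"slide": slide, "phrase": phrase, "colour": colour})
        # Colour-specific behaviour: green means "define this later".
        if colour == "green":
            self.tasks.append(f"Write a definition for '{phrase}'")

nb = Notebook()
nb.highlight("hash table", "green", slide=7)
nb.highlight("see lab sheet", "yellow", slide=7)
print(nb.tasks)   # → ["Write a definition for 'hash table'"]
```

Other colours could map to other actions (e.g. "ask in the seminar"), with the mapping itself drawn from the course taxonomy.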

All the time the lecturer is monitoring feedback. This might be in the form of comments and questions posted back to her, or it might be alerts based upon Hot Key selections by the students. For example, if over 50% of the students press the “don’t understand” Hot Key, the lecturer gets an alert and can take action (she can set the threshold level for this). This happens, and the lecturer decides to modify the lecture. She introduces a new slide, added live. The new slide appears in the students’ Notetaking apps. She then adds another slide, and writes a couple of multiple choice questions onto it. The students get this set of questions to answer on their devices. Their answers are communicated back to the lecturer’s Presentation app and collated. 95% get it right. Reassured, she carries on with the lecture. Meanwhile, three of the students are co-writing an idea for a project based on ideas from the lecture. They are able to show notes to each other, switching between each other’s views. Alongside their notes they are co-writing a text: the pitch for the project.
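The “don’t understand” alert described above is essentially a configurable threshold check over live student signals. A minimal sketch, assuming a simple in-memory count; the class and method names, and the 50% default, are illustrative rather than taken from any real system:

```python
# Hypothetical sketch of the lecturer's alert logic: fire an alert once the
# share of students pressing the "don't understand" Hot Key reaches a
# lecturer-configurable threshold (50% in the walk-through).

class LectureFeedback:
    def __init__(self, enrolled: int, threshold: float = 0.5):
        self.enrolled = enrolled
        self.threshold = threshold     # the lecturer can adjust this
        self.confused = set()          # students who pressed the Hot Key

    def press_dont_understand(self, student_id: str) -> bool:
        """Record a press; return True if the alert should fire."""
        self.confused.add(student_id)  # each student is counted once
        return len(self.confused) / self.enrolled >= self.threshold

fb = LectureFeedback(enrolled=4)
print(fb.press_dont_understand("ana"))   # → False (1 of 4 students, 25%)
print(fb.press_dont_understand("ben"))   # → True  (2 of 4 students, 50%)
```

Using a set rather than a counter means a student tapping repeatedly cannot trip the alert on their own, which matches the intent of a class-wide signal.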

The lecture concludes, and the audio/video recording is saved and added to the students’ timelines. They can now review the whole lecture with their notes in sync with the recording. Alternatively, they can look at a summary of the lecture and their notes, generated by the Notetaking app. As a post-lecture task, and in preparation for the follow-on seminar, they are asked to add a short summary text of their own and share it with their seminar groups, thus ensuring that there is plenty to discuss in the seminar.