Understanding and shaping student engagement live and online

This lecture was originally created as part of the LDC APP PGR course, introducing postgraduate research students to teaching. I’ve delivered it in various forms now (from 30 minutes to 2 hours) and it has always worked well. It is all about designing to gauge and shape behavioural, emotional and cognitive engagement. The longer version includes some advice about using online tools. This is put into the Warwick context, where most use of online tools is to “sustain and amplify” good class teaching – the Extended Classroom approach.

Peer-learning methods are used in the lecture, with ResponseWare, to illustrate how we can understand and shape student engagement. Simple MCQs are used in some cases, with and without set answers. Text-based responses to some questions are also used, with results presented as word clouds. There is also a numeric response question (about how often one should stop and prompt student thinking and discussion during a lecture) with a range specified as correct.
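For anyone wanting to reproduce the word-cloud step outside ResponseWare – say, from responses exported as plain text – here is a minimal sketch using the Python wordcloud package (the file names are just placeholders):

```python
# Minimal sketch: build a word cloud from free-text poll responses.
# Assumes responses have been exported to a plain-text file, one per line
# ("responses.txt" is a placeholder name). Requires the wordcloud package.
from wordcloud import WordCloud, STOPWORDS

with open("responses.txt", encoding="utf-8") as f:
    text = " ".join(line.strip() for line in f if line.strip())

cloud = WordCloud(
    width=800,
    height=400,
    background_color="white",
    stopwords=STOPWORDS,  # drop common filler words
).generate(text)

cloud.to_file("responses-cloud.png")  # image to drop into the slides
```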

I’ve added notes to the slides to explain what I was doing and how I used ResponseWare.

Method: the VRIP matrix for analysing current and ideal tech adoption

Over the last few days I’ve been involved in yet another of the long, winding discussions that occur in the academic tech business concerning how a particular requirement is best satisfied, and what part institutional IT should play in providing a solution. In this case it is “video and audio streaming live events” (yes, that one again). The debate is messy in two dimensions:

  1. “Live event” in an academic context covers a wide range of quite different things, from lectures through to physical theatre workshops.
  2. There’s a huge range of possible technology elements, including cameras, mics, encoders, streaming platforms, and add-ons to common tools (YouTube, Facebook), with a super complex matrix of features and quality levels.

Fortunately we now have analytical tools that can help us – so long as they are used systematically. Here are some notes I have written to explain one of them to colleagues.

The VRIP matrix (as I’ve decided to call it so as to sound more impressive) emerged out of work on digital capabilities and the visitor/resident model (Dave White, Helen Beetham, Alison Le Cornu, Lawrie Phipps, Donna Lanclos, James Clay and others). This is what the matrix looks like. The idea is that a tool/technique can be placed onto the matrix to indicate the type of adoption and integration into institutional services. This can be used to show the situation as it is, or the ideal we think we should design and work towards. We can also indicate the extent of adoption by using a colour-coding scheme. Typically we map out the situation for a specific group (e.g. undergraduates).

The matrix can be used to graph the current situation – e.g. where most people are at with a tool or technique now. It may also be used to show the ideal degree/type of adoption and support.

At the institutional end of the matrix we place things that are, or should be, chosen, provided and supported by the institution. At the other end, we put things that are personally (or communally) chosen, owned and supported.

At the visitor end of the visitor/resident axis we put tools and techniques that people use less often and with which they don’t become familiar (they often have to relearn them each time they visit). At the resident end are the tools and techniques that they live and breathe all the time.
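As a rough illustration of how such a mapping might be drawn (the tools, placements and adoption levels below are hypothetical, purely to show the mechanics – this is not survey data), a matplotlib sketch:

```python
# Illustrative VRIP-style mapping: visitor/resident on the x axis,
# personal/institutional on the y axis, colour indicating extent of adoption.
# Tools, coordinates and adoption levels are made-up examples.
import matplotlib.pyplot as plt

# (name, visitor→resident score, personal→institutional score, adoption 0–1)
tools = [
    ("Lecture capture", 0.3, 0.9, 0.8),
    ("Personal YouTube channel", 0.7, 0.2, 0.3),
    ("Live event streaming", 0.2, 0.5, 0.1),
]

fig, ax = plt.subplots(figsize=(6, 6))
for name, vr, pi, adoption in tools:
    ax.scatter(vr, pi, s=200, c=[[1 - adoption, adoption, 0.2]])
    ax.annotate(name, (vr, pi), xytext=(6, 6), textcoords="offset points")

ax.axhline(0.5, color="grey", linewidth=0.5)  # quadrant guides
ax.axvline(0.5, color="grey", linewidth=0.5)
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
ax.set_xlabel("Visitor → Resident")
ax.set_ylabel("Personal → Institutional")
ax.set_title("VRIP mapping (hypothetical placements)")
plt.show()
```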

So where is “streaming a lecture” on the matrix, for specific groups of people? That’s an empirical question for which we haven’t really got an answer.

And where should it be? That’s a complex strategy question.

And how do we get to the point at which a tool/technique is in the right place on the matrix for the right people?

So this is the kind of sophisticated analysis that needs to be backed up by an investigation (of the kind I’m doing for VR at the moment).

 

Workshop: Design Thinking techniques for effective participation in the design process

I’ll be running this workshop on 14th March 2017 at 12.30, Oculus Building OC1.01, University of Warwick.

The University is a design-rich environment: we design all kinds of aspects of the educational process – courses, tools, technologies, spaces, organisations, publications. Across all of these fields there is a desire to do designing more collaboratively. Effective participation by staff and students should ensure outcomes that fit more neatly with needs and capabilities, stick in use for longer, spread more widely and grow our capabilities for further development.

Professional designers have developed a broad repertoire of techniques for ensuring effective participation. In this workshop, we will learn about and try out techniques that focus upon the language and dialogue used by design collaborations throughout the lifecycle of a design (right through to supporting its use once implemented).

Design Language

Designs are typically described and recognised using a remarkably limited and unexamined vocabulary. Building and critically assessing a richer shared vocabulary is essential for successful designing. This may counter unhelpful assumptions and cognitive biases, and is especially important when aiming for inclusive and universally accessible designs. Finding just the right words to describe an actual or possible aspect of a design may also unlock new possibilities or new pathways for investigation. Working with language in this way, bringing together all of the perspectives involved in the design process (including users), helps to ensure a higher level of engagement and a sense of ownership.

How do you and your collaborators describe your designs? How might that language be refined and enriched?

Design Patterns

A design pattern is a statement of a problem plus a pattern of actions and interactions that addresses that problem, with links to related patterns. It is elaborated with information on its originating context and the concerns, values and problems out of which it arose, along with advice on implementation and customisation. The pattern is usually headed with a catchy and meaningful title. In some disciplines the inclusion of diagrams and images is considered essential. In education, this might be best achieved with a storyboard or even a video. All of these elements are intended to act as a guide to design activity and a prompt for thinking and prototyping.

Could you benefit from stating your design patterns more explicitly? Could that enable more objective and precise collaborative designing?
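As a sketch of how the elements described above might be captured in a structured, shareable form (the field names and example content are mine, not a standard schema), something like this could work:

```python
# Sketch of a structured record for a design pattern, following the elements
# described above. Field names are illustrative, not a standard schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignPattern:
    title: str                      # catchy, meaningful title
    problem: str                    # the problem being addressed
    actions: List[str]              # pattern of actions and interactions
    originating_context: str        # where and why the pattern arose
    implementation_advice: str      # advice on implementation and customisation
    related_patterns: List[str] = field(default_factory=list)
    media: List[str] = field(default_factory=list)  # diagrams, storyboards, videos

# Hypothetical example, loosely based on the engagement lecture above.
pattern = DesignPattern(
    title="Stop and Think",
    problem="Students drift during long uninterrupted lectures.",
    actions=["Pause at intervals", "Pose a question", "Invite paired discussion"],
    originating_context="Large-group lecturing with response systems available.",
    implementation_advice="Keep prompts short; vary between MCQ and open text.",
    related_patterns=["Peer Instruction"],
)
```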

Extended Classroom Pedagogy-First Cards

This is an H5P interactive presentation introducing a second set of Extended Classroom cards. This set has been produced by Sara Hattersley and Emma King of LDC (academic staff training) at Warwick. Our first (green) set looked at 12 technologies and their application to enhance learning, teaching and the student experience. This (blue) set starts from the other side, taking a set of themes or ambitions in TEL and considering how technology may be used to help.

Why I gave my EU referendum vote to my 11 year old son

I’m not usually a political blogger. But this country is making me very angry. Here’s why.

I’m 45 years old and I am (on paper at least) very highly educated. I got there the hard way – working class, etc. – but still got into a Russell Group University and got an excellent education. Just the kind of achievement that UK families aspire to.

I now work with young people. I work at a University. Here is a simple fact for our predominantly old electorate: they are smarter than us; they are smarter than we will ever be; they’ve been through an increasingly sophisticated and challenging education system; and they are very hard working – alcohol consumption amongst the young is dropping dramatically. We have to trust the young. And most importantly, we are responsible for creating a political system that they care about and will engage with.

Yes, let’s face it. We have screwed up.

I have two sons – 5 and 11. The 11 year old is already doing school work at a level that I did when I was 16. Yes 16. He’s not unusual. The education system has changed. The expectations have changed. I gave him my vote because he is disenfranchised and will remain effectively disenfranchised by an elderly electorate who are not qualified to make big decisions that will have a massive impact on his life chances.

Old people – have some humility. Know when to give way.

Extended Classroom presentation for LTSMG conference

This is a version of a presentation that I gave at the UK Learning and Teaching Space Managers Conference at Warwick in 2016. It is an introduction to the Extended Classroom strategy for increasing collaboration and consistency in the adoption of technology enhanced learning practices at Warwick. I use an adaptation of Dave White’s residents/visitors concept to explain how we need to create learning spaces in which teachers and students can feel at home and feel more free to “design” their use of technology.

Crowd sourcing questions as a student-staff partnership activity

Yesterday we had a great lecture by Prof Sue Rigby of Lincoln University. It was a rich and complex talk, with lots of good examples and ideas. Sue talked about students creating questions, which she argued should be at least as important as answering them. That is a good way of explaining what higher education is about: why it is a developmental, character-building, intellectual-capacity-building activity. There was much more in her lecture, but the point about students creating questions reminded me of a design idea I sketched out many years ago – a system that would allow students and staff to collectively create, use and review good questions. Participants could find relevant questions by others, try them out and review them, thus helping us to find questions that work especially well in all kinds of contexts – academic and personal development.

Here’s the design that I came up with at the time:

[Design sketch images]
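The original sketches aren’t reproduced here, but as a purely illustrative outline (my naming, not the original design), the core entities of such a question-sharing system might look like this:

```python
# Illustrative outline of the core entities for a crowd-sourced question bank:
# students and staff create questions, others try them out and review them.
# Names and fields are hypothetical, not taken from the original design.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Review:
    reviewer: str
    rating: int        # e.g. 1 to 5: how well the question worked
    context: str       # where it was tried (module, seminar, self-study...)
    comment: str = ""

@dataclass
class Question:
    author: str
    text: str
    topics: List[str] = field(default_factory=list)
    reviews: List[Review] = field(default_factory=list)

    def average_rating(self) -> float:
        """Mean rating, used to surface questions that work especially well."""
        if not self.reviews:
            return 0.0
        return sum(r.rating for r in self.reviews) / len(self.reviews)

q = Question(author="student_a", text="What would falsify this theory?", topics=["critical thinking"])
q.reviews.append(Review(reviewer="staff_b", rating=5, context="first-year seminar"))
print(q.average_rating())  # 5.0
```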

What are the pedagogic challenges facing teachers in HE today?

Yesterday I wanted to run a simple demo of Turning Point ResponseWare in a drop-in session at our new Oculus teaching and learning building. I left a single question active for the whole session, and just invited anyone present to have a go. My aim was to illustrate the mechanics of the system, but also to show a useful technique – an open survey that could be used to trigger dialogue with a diverse range of participants.

Earlier in the week I had noticed this entirely unofficial sign appearing outside of many of our lecture theatres:

[Image: unofficial “no phones” sign]

Clearly someone is having a bit of a struggle to hold their students’ attention. It is a divisive issue, with pro- and anti-phone views amongst staff and students – but also many who just don’t know what all the fuss is about. This got me wondering whether my assumptions about the issues that staff worry about in HE teaching are accurate. So I started a simple list. And during the drop-in session I put that list up as a ResponseWare question, inviting participants to say which of the issues they want to address. This was very much a trial, not proper research – just a way to see who recognises these issues and the terms used to describe them. Here’s my list:

  1. Students using shallow learning strategies.
  2. Poor lecture attendance.
  3. Students inattentive in lectures.
  4. Disengaged students.
  5. Achieving a balance between coverage and depth.
  6. Inappropriate, misaligned, assessment methods.

And here are the completely unrepresentative results:

[Chart: ResponseWare results for the teaching issues question]

I don’t think these results mean very much. More importantly, every respondent recognised the issues, and understood the meaning of the terms, with the exception of “shallow learning” – that was a surprise.

I had a few more suggestions for issues to add to the list, and no doubt it will grow much bigger as I dig deeper.

More issues to be added here:

  • Overcrowding.
  • Illness and exhaustion (teacher and student).
  • Too much time spent manicuring the VLE, too little time to spend engaging with students.
  • Too little contact time.
  • Risk averse students.
  • Risk averse teachers.
  • Seemingly good students not achieving as expected in assessments.
  • Lack of or poor access to learning materials and facilities.