Category Archives: Uncategorized

20 ways to use ResponseWare

The following activities may be conducted using ResponseWare:

1. Attendance registration – students, signed into the system with their username and password, indicate their presence in the room in response to an attendance poll; the teacher sees a list of students who are present and a list of those who are not, and the data may then be analysed, downloaded, or uploaded through a VLE integration (e.g. Moodle).

2. Revision exercises – may occur towards the end of a module, near to exam time, or regularly throughout, often at the start of a lecture to revise what was covered in a previous lecture.

3. Introduction, ice-breaker, warm-up – an intensive series of questions at the start of a lecture, to get the students in the right frame of mind.

4. Maths and statistics – ResponseWare includes a ‘numeric response’ question type, in which students must respond with the correct number, or with a number within a specified range.

5. Enhancing student engagement by more frequently testing understanding and providing micro-feedback – the most common use of ResponseWare. The aim is to prevent the teacher’s actions and the students’ understanding from drifting apart: the teacher can quickly judge whether the students have understood a topic and are ready to move on, and the students can recognise their own understanding and progress.

6. Peer learning – the teacher asks the students to discuss a question before, or sometimes after, they answer it, perhaps asking them to compare answers with someone who gave a different response (the method pioneered by Professor Eric Mazur of Harvard University).

7. Confidence-based testing – the teacher poses a question but does not immediately provide the answer; instead they ask the students to state how confident they are in their response (via a Likert-scale question in ResponseWare), then reveal the answer and get the students to reflect on the accuracy of their self-efficacy assessment (especially important in disciplines such as medical training, but also used in Economics – pioneered by Dr Fabio Arico of UEA to address the Dunning-Kruger effect amongst students).

8. Rhetorical questioning – the teacher poses a question that is deliberately designed to highlight misconceptions, contentions, false assumptions, leading to deeper investigation and discussion, and potentially to dispelling errors that prevent students from mastering threshold concepts.

9. Gathering creative responses – ResponseWare includes several mechanisms for gathering ideas from the audience, including a system that builds a word cloud from the responses.

10. Decision making – allowing students to make choices, either through straightforward voting on a list of options or by arranging options into ‘priority rankings’.

11. Crowdsourcing choices and definitions – questions and answers can easily be added during a session, meaning that we can gather ideas from students (for example alternative definitions of a word), add them as questions, and get them to vote on them (this has been used at Warwick in medical research).

12. Working against the clock – the teacher controls how long students have to respond to a question, either manually or with a countdown timer, thus encouraging the students to think fast. This can be combined with a rhetorical question to push students to respond intuitively or on the basis of possibly unsound assumptions.

13. Enhancing student engagement through competitions – competitions may be between individuals, or the class may be divided (before or during the live session) into teams, with points awarded for correct answers and scores automatically compiled into a leaderboard.

14. Speed scoring – when the teacher records and tracks individual student performance, or when polling is used within a competition, the points scored may be adjusted for speed, so that faster responses earn higher scores.

15. Gathering instant student feedback – in addition to making inferences based upon responses to ordinary questions, the teacher may explicitly ask for feedback at any time during a lecture, simply by adding an anonymous feedback question.

16. Module evaluation – more comprehensive feedback surveys may be conducted quickly and efficiently at any time, combining the benefits of gathering feedback in-class (not later online) with the benefits gained by doing this in a digital (not paper) format.

17. Learner/learning analytics – with the ability to download data into Excel or Access (as CSV files) or upload it into the VLE, we may easily apply sophisticated analysis algorithms to data from individual or multiple lectures, looking at learning gain and other dimensions from an individual student or group perspective (a minimal sketch of this kind of analysis appears after this list).

18. Demographic learner/learning analytics – we can create demographic groupings before or during lectures, and analyse responses accordingly (for example, if we are teaching students from two different disciplines, we can analyse the differences between their responses).

19. Evaluating the impact of specific teaching and learning activities – using the learning analytics potential of ResponseWare to evaluate the efficacy of specific teaching techniques and activities, testing (for example) the constructive alignment (Biggs) between Intended Learning Outcomes, Learner Activities and Assessment Activities.

20. Social, outreach and fun uses – we have also seen ResponseWare used in many contexts beyond the conventional lecture, as it is an easy-to-use and fun set of tools.
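As a footnote to items 17 and 18, the snippet below is a minimal sketch of what that kind of analysis might look like once the data has been exported. The file name (responses.csv), the column names (student_id, discipline, session, correct) and the session labels (pre/post) are all assumptions made for illustration – they are not ResponseWare’s actual export format – so adjust them to whatever your own export contains.

```python
# Illustrative only: assumes a hypothetical CSV with one row per student per
# question, containing the columns student_id, discipline, session ('pre' or
# 'post') and correct (0/1). Adjust to match your real export.
import pandas as pd

df = pd.read_csv("responses.csv")

# Proportion of questions answered correctly by each student in each session.
by_session = (
    df.groupby(["student_id", "session"])["correct"]
      .mean()
      .unstack("session")          # one column per session: 'pre', 'post'
)

# Learning gain per student (item 17): post-session score minus pre-session score.
by_session["gain"] = by_session["post"] - by_session["pre"]

# Demographic comparison (item 18): average gain for each discipline group.
disciplines = df.drop_duplicates("student_id").set_index("student_id")["discipline"]
print(by_session.join(disciplines).groupby("discipline")["gain"].mean())
```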

A problem with assessment in super-selective institutions

“As argued in Chapter 1, good teaching narrows the initial gap between Robert and Susan therefore producing a smaller spread of final grades than that predicted by the initial spread of ability. The distribution of results after good teaching should not be bell shaped but skewed, with high scores more frequent than low scores. At university level there is therefore every reason not to expect a bell curve distribution of assessment results in our classes.” Biggs & Tang (2011) Teaching for Quality Learning at University (4th edition), p.200

In a super-selective university this is even more so. If we assume a high-quality intake, with a very narrow spread of capabilities, then the eventual attainment spread should be extremely narrow. When we compare a student who achieved 65% (student 1) with a student who achieved 80% (student 2), that difference might in reality mean very little. It might simply be the product of entirely extraneous variables or random events (student 1 having a cold during exam week, for example).
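To put a rough number on that intuition, here is a minimal simulation sketch. Every figure in it is an assumption chosen purely for illustration (a tightly clustered intake around 72% with a spread of 3 marks, and roughly 6 marks of exam-day noise from illness, sleep, question luck and so on), not real data; the point is simply that, under those assumptions, a 15-mark gap between two essentially equal students is not rare.

```python
# Illustrative simulation only – every number here is an assumption, not data.
import random

random.seed(1)

TRUE_MEAN, TRUE_SD = 72, 3   # assumed: a tightly clustered, high-quality intake
NOISE_SD = 6                 # assumed: exam-day noise (illness, sleep, question luck)
PAIRS = 10_000

def observed_mark(true_ability):
    """One student's exam mark: true ability plus exam-day noise, clamped to 0-100."""
    return max(0.0, min(100.0, random.gauss(true_ability, NOISE_SD)))

# How often do two students of near-identical true ability end up 15+ marks apart
# (the 65% vs 80% gap in the example above)?
big_gaps = sum(
    abs(observed_mark(random.gauss(TRUE_MEAN, TRUE_SD))
        - observed_mark(random.gauss(TRUE_MEAN, TRUE_SD))) >= 15
    for _ in range(PAIRS)
)
print(f"{100 * big_gaps / PAIRS:.1f}% of pairs differ by 15 or more marks")
```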

Unless we can demonstrate a difference in kind between the high achiever and the slightly lower achiever, the numerical gap is close to meaningless. It might be (and I think I see this happening) that academics invest a great deal in identifying and applying those differences in kind – “student 2 really got it, they have become a proper philosopher/physicist/economist”.

“The categories of honours (first class, upper second, lower second) originally suggested qualities that students’ work should manifest: a first was qualitatively different from an upper second, it was not simply that the first got more sums right.” ibid. p.210

But that then is also open to subjective biases, and Biggs and Tang don’t really seem to have an answer to it. They are very much entrapped by their strict adherence to definitive “intended learning outcomes” within the system of constructive alignment. Hussey and Smith’s alternative combination of ILOs and “emergent learning outcomes” within an “articulated curriculum” leaves room for student creative input, risk taking, genuine innovation, individuation and other (possibly) less determinate characteristics of learning as research/innovation/creativity. As such, the curriculum offers opportunities for more significant and transformative student input, and consequently for aspects of student transformation-through-learning that can be meaningfully assessed and reported upon. Having experienced such learning activities, and achieved unforeseeable outcomes, the student is more likely to value and build upon their success. Thus the learning itself, and the transformation being evaluated, is a more reliable indicator of the student’s future capabilities. And that IS what we are looking for when we assess students in the university.

“The extent to which emergent learning outcomes (ELOs) contribute to the achievement of intended learning outcomes (ILOs) varies. Some emergent outcomes are relatively close to the intended learning outcomes and can be perceived to contribute directly towards their achievement. The contribution of others is less direct, being capable of inclusion on the basis of their contribution to the student’s knowledge of the subject in general, whilst the contribution of other emergent learning outcomes is to the field of studies in general and might be included on those terms. Yet other ELOs contribute to the overall development of the students as autonomous, self-managing learners, far beyond the field of study.” Hussey & Smith (2003) “The Uses of Learning Outcomes”, Teaching in Higher Education, Vol. 8, No. 3, pp. 357–368.

Understanding and shaping student engagement live and online

This lecture was originally created as part of the LDC APP PGR course, introducing postgraduate research students to teaching. I’ve done it in various forms now (from 30 minutes to 2 hours) and it has always worked really well. It is all about designing to gauge and shape behavioural, emotional and cognitive engagement. The longer version includes some advice about using online tools. This is put into the Warwick context, where most use of online tools is to “sustain and amplify” good class teaching – the Extended Classroom approach.

Peer-learning methods are used in the lecture, with ResponseWare, to illustrate how we can understand and shape student engagement. Simple MCQs are used in some cases, with and without set answers. Text-based responses to some questions are also used, with results presented as word clouds. There is also a numeric response question (about how often one should stop and prompt student thinking and discussion during a lecture) with a range specified as correct.

I’ve added notes to the slides to explain what I was doing and how I used ResponseWare.

Why I gave my EU referendum vote to my 11 year old son

I’m not usually a political blogger. But this country is making me very angry. Here’s why.

I’m 45 years old and I am (on paper at least) very highly educated. I got there the hard way – working class etc. – but still got into a Russell Group University and got an excellent education. Just the kind of achievement that UK families aspire to.

I now work with young people. I work at a University. Here is a simple fact for our predominantly old electorate: they are smarter than us; they are smarter than we will ever be; they’ve been through an increasingly sophisticated and challenging education system; and they are very hard-working – alcohol consumption amongst the young is dropping dramatically. We have to trust the young. And most importantly we are responsible for creating a political system that they care about, that they will engage with.

Yes, let’s face it. We have screwed up.

I have two sons – 5 and 11. The 11-year-old is already doing school work at a level that I did when I was 16. Yes, 16. He’s not unusual. The education system has changed. The expectations have changed. I gave him my vote because he is disenfranchised and will remain effectively disenfranchised by an elderly electorate who are not qualified to make big decisions that will have a massive impact on his life chances.

Old people – have some humility. Know when to give way.

30+ aspects of learning and teaching that can be enhanced with technology

In preparation for some design thinking workshops, I have compiled a list of good reasons that people give for changing practice (often through the application of technology). My aim is to illustrate the breadth of the tweaks that we can make. There are of course many more possibilities than are listed here.

  1. Enhance student engagement – physical, emotional, cognitive.
  2. Enhance teacher engagement – physical, emotional, cognitive.
  3. Reduce resource consumption – time, money, materials.
  4. Widen participation in higher education or a specific discipline.
  5. Widen/enrich opportunities (including global connections).
  6. Improve feedback and dialogue on design/delivery with students and others.
  7. Enable real-world impact for student work – academia, business, social, political etc.
  8. Develop transferable and enduring student capabilities.
  9. Ensure students understand the value of their learning.
  10. Explain ideas effectively.
  11. Improve feedback to and dialogue with students.
  12. Assessment that is accurate, relevant, meaningful, appropriate, timely – constructively aligned.
  13. Smooth operation: reduce/eliminate errors, misunderstandings, contentions, inconsistencies.
  14. Speed up and make clearer orientation (where am I in time, space, process etc.).
  15. Improve facilities for students’ independent study.
  16. Improve student and staff welfare – physical, emotional.
  17. Find out what really works and why in teaching and learning – research.
  18. Share knowledge and good practice.
  19. Challenge, disrupt, critique, surpass habits and assumptions.
  20. Create and sustain a community of practice.
  21. Accurately understand my students (capabilities and needs).
  22. Help students to accurately understand themselves.
  23. Allow students to experience otherwise inaccessible experiences.
  24. Facilitate students to make quality objects.
  25. Facilitate students to take managed risks.
  26. Record and track incidents and tasks together.
  27. Ensure fairness and equal opportunities.
  28. Identify priorities for action together.
  29. Plan a series of actions together.
  30. Monitor and adjust a plan together.
  31. Make personal productivity more resilient.


Student Champions framework published by HEA

The HEA have just published the report that I wrote last year.

Student Champions: a competency framework, process model and developmental approach for engaging students in the enhancement of learning, teaching and the student experience in higher education

This report is based on a collaboration between the Academic Technology Team, LDC, Classics and Life Sciences. It is intended for use by everyone involved in enhancing learning, teaching and the student experience (LTSE) in HE.

The framework describes how students can and do perform essential roles within the enhancement of LTSE – as part of special projects (such as those now funded by WIHEA) and through everyday practice.

A set of intermeshing competencies is described for 9 essential roles:

  1. informed advocate;
  2. technical facilitator (spaces, learning designs, technologies etc.);
  3. social facilitator;
  4. admin process facilitator;
  5. project facilitator;
  6. creative-critical friend;
  7. researcher;
  8. horizon watcher and visionary;
  9. design participant.

The framework demonstrates how all of these competencies are essential for a continual enhancement process, so as to ensure that innovations fit with the needs and ambitions of their users, stick for a reasonable length of time, spread to more people and more contexts, and enable continual growth in our capability for further improvements.

The student champion approach (as implemented as Digichamps by WIHEA) encourages staff and students to form teams and work together to develop and apply this full range of competencies.

https://www.heacademy.ac.uk/sites/default/files/hea_warwick_fl.pdf