GUEST POST: Instructional Design on Lockdown
By Leonard Houx
Leonard Houx is a senior instructional designer at Cass Business School, where he designs instruction that is simultaneously effective, efficient and engaging – and helps others learn to do this as well! Over the last ten years working in online learning, Leonard has worked in the private, charity and university sectors. He has written about online learning for the Guardian, the Association for Learning Technology and E-Learning Age.
With many universities still physically closed and planning for a continued foray into online course delivery, numerous experts have published pieces advising teachers on how to teach online.
There are some deeply thoughtful and genuinely helpful pieces out there. Yet the focus seems to be almost entirely on high-level principles, such as presence and empathy, or on media types, for example synchronous vs asynchronous delivery.
What experts have been less ready to do is describe concretely what the online learning should look like. Specifically, what could a real-life lesson activity sequence look like for the novice online teacher?
In what follows, I describe three basic activity sequences for online lessons – from simple to intermediate – using a mix of tools. I then follow this by describing each of the suggested tools (or constituent parts) in greater depth and the rationale behind each.
Each sequence is simple to structure, but if each constituent part is executed well, it will engage students more and leave them less exhausted.
The purpose of each lesson design has been to incorporate into online teaching as many as possible of the following six evidence-based educational practices, as identified by The Learning Scientists:
1. Concrete examples: illustrate ideas with examples that students can easily grasp
2. Dual coding: integrate words with images
3. Elaborative questions: ask questions that help students connect new learning with prior learning
4. Retrieval practice: have students practice with test questions on what they remember
5. Interleaved practice: mix practice test questions from a variety of lessons
6. Spaced practice: leave intervals of time between practice tests
For more detail on each of these, refer to the Learning Scientists blog.
The first and most basic sequence for a lesson in an online, for-credit module is a video coupled with a quiz, a forum discussion and a follow-up email.
Next, we add a review quiz, allowing us to incorporate retrieval practice and, with that, spaced practice and interleaved practice.
Finally, in the third lesson structure, we add the webinar, which, largely in the form of responsive teaching, lets you incorporate all six of these evidence-based practices.
I now turn to describe each of these online activity elements.
Video recording
I recommend you start with video not because it is, in itself, more pedagogically effective (1). It is because recorded video – more than webinars, audio, web resources or online texts – allows you to do the most with it that is pedagogically effective.
Compared to texts and audio, video gives you more means to manage learners’ cognitive load. In particular, video allows you to integrate visuals with your words more easily and in more ways. This integration of the visual and verbal is often referred to as ‘dual coding’ or the ‘multimedia principle’, and it is one of the Learning Scientists’ top six most reliable evidence-based practices. Likewise, it is easier in video to progressively reveal those graphics, which also reduces cognitive load. It also feels more natural to use informal language when speaking compared to writing – and this too decreases cognitive load.
Videos are also easier to navigate and control. Students can watch them when they want, as much as they want and at whatever speed they want (some students report watching lecture videos at 1.25 or 1.5 speed). You can break them into shorter parts, which evidence suggests helps with engagement: a study of learners on free MOOCs found videos under six minutes to be optimal (2). Adding a chapterised structure also promises to give students more control and help them learn more effectively (3).
Compared to webinars, videos are a little less scary. If you make a mistake, you can just record it again. If you deliver a webinar and can’t get the microphone to work for the first 20 minutes, that has happened; you can’t take it back. That doesn’t mean don’t do webinars – they have their advantages as well, as we will see. And, after all, control isn’t everything (especially at this historical moment), and online text is even more forgiving and easier to fix than video. But still, video can give you a bit more wiggle room to fix mistakes. (A word of warning for perfectionists: this control can be as much a curse as a blessing, as you can descend into an endless pursuit of perfection.)
Lastly, compared to text, video carries a sense of presence that many students will value as they feel increasingly isolated and may feel the loss of what in British universities are called “contact hours”.
Here you – having read Understanding How We Learn – will want to incorporate the evidence-based practices of using multiple concrete examples and of integrating meaningful visuals with your words, keeping on-screen text to a minimum.
Your videos will be fundamental to your students’ success. But videos are – beyond pressing ‘play’, ‘next’ and ‘1.25 speed’ – entirely passive. And this creates a kind of visibility problem. You usually cannot tell much about what the student has watched beyond whether they have opened the page. More importantly, you cannot tell what your learners have learned: in lieu of any retrieval practice, they have only the false sense of fluency that comes with a memory not of the learning itself but of having been told something. This is made worse with video, as students tend to perceive video as making it easier to learn (4). So you don’t have a check-for-understanding loop, because just watching something doesn’t mean you understood it. But you also don’t have an accountability loop, because you can’t, from a technical perspective, tell if they were even looking at the screen. This is why the check-for-understanding (CFU) quiz is so important.
CFU quiz
The next thing you want to build is a quiz – or, more specifically, quizzes – each coupled with a video or set of videos. Hence our block would look like:
Video1 + Quiz1
Video2 + Quiz2
Video3 + Quiz3
or, perhaps
Video1a + Video1b + Video1c + Quiz1
Video2a + Video2b + Video2c + Quiz2
Video3a + Video3b + Video3c + Quiz3
Coupling each video with a quiz helps your learners take an active role in their learning, and gives you a clearer picture of both their engagement and their understanding.
Quizzes tend to enhance learners’ focus on the videos. Researchers found that when a lecture video was interspersed with designated review quizzes, tested students reported fewer instances of mind wandering (19%) than those who did not have tests (41%) and those who used the time for restudy (39%). Tested students also showed improved learning on a test of the final section of the lecture (89%, vs 65% for restudy and 70% for non-tested), indicating a learned habit of increased focus on the videos (5).
Quizzes likewise tend to be viewed positively by students and have a high engagement rate. In a case study from an Abnormal Psychology module, Jennifer Hillman found students to have a positive view of pre-class test questions, giving a 4.4 to the question “How helpful do you think the online quizzes were…to your learning of material in the course?” and a 4.7 to the question “If you used the online quizzes to study for exams, how helpful were they?” (6). At Cass Business School, our videos coupled with quizzes have the highest completion rate of all our non-required activities. This fits with what experts have observed about online behaviour: users tend to commit their time and energy based on immediate cost/benefit analyses. Jakob Nielsen calls this ‘interaction elasticity’ (after price elasticity): usage increases as difficulty decreases, and vice versa. Quizzes tend not to be difficult, as they require little effort (often the click of a radio button) and carry little risk (they are not marked and not seen by other students), yet offer something immediate in return (feedback and completion). For automated questions, the feedback is usually unambiguous – as is the sense of completion.
An immediate quiz like this does not bring the student a tremendous ‘desirable difficulty’, as Robert Bjork would call it: it is not difficult for a student to remember material they have just watched, and so it does not do as much to reinforce long-term learning as a quiz that appears a day or two later (7).
What a coupled or adjacent quiz like this does do, however, is more fundamental: it checks that students understood the content. Here the quiz provides an ideal level of clarity, as its questions allow for more precise diagnosis later. Also, with the results shown in a table in most VLEs, the teacher can easily scan them for patterns.
I recommend here a quiz where the answers can be automatically graded and given feedback, therefore avoiding short essay questions with model answers. Automated questions can be, for example, true/false, multiple choice, quantitative, fill-in-the-blank and hotspot questions.
You can use automated questions for simple, factual learning, but they can also be used for deeper learning. They can be used for procedural learning, such as applying formulas. You can ask scenario-based questions, which can have a stronger correspondence with real-life situations. You can also use multiple choice to ask elaborative questions (e.g. why, how).
Here I would couple a quiz with each video. For example, if you break your lesson into four videos, you also have four accompanying quizzes. The way I do this in a VLE is to build the quiz and embed the video in the introductory body of the quiz activity. I also turn on activity completion in the VLE so that when the student completes the quiz, the activity is marked as complete. This allows for that ‘accountability loop’ Lemov mentions, and in a more meaningful way than if just a video had been uploaded.
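If you want to go a step further than scanning the VLE’s results table by eye, most VLEs will let you export quiz results to a spreadsheet. Below is a minimal sketch – assuming a hypothetical CSV export with one row per student answer and columns named student, question and correct – of flagging the questions most students got wrong, so you know what to address in your follow-up email or webinar. The file name and column names are illustrative only; your VLE’s export will differ.

import csv
from collections import defaultdict

def questions_to_review(csv_path, threshold=0.5):
    """Return (question, proportion_correct) pairs below the threshold.

    Assumes a CSV export with one row per student answer and columns:
    student, question, correct (where correct is "1" or "0").
    """
    tallies = defaultdict(lambda: {"correct": 0, "total": 0})
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            tallies[row["question"]]["total"] += 1
            tallies[row["question"]]["correct"] += int(row["correct"])
    # Flag questions where fewer than `threshold` of students answered correctly
    return sorted(
        (q, t["correct"] / t["total"])
        for q, t in tallies.items()
        if t["correct"] / t["total"] < threshold
    )

# Example (hypothetical file name):
# for question, rate in questions_to_review("quiz1_results.csv"):
#     print(f"{question}: {rate:.0%} correct")

None of this is required – a quick visual scan of the results table does the same job for a small class – but it can save time once you have several quizzes running each week.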
Forum discussion
A forum allows you to do a more open-ended, elaborative activity where students can connect the lesson with their prior knowledge and compare their ideas with other students.
I should say that I suggest using a discussion forum with some strong reservations. I have seen that, too often, engagement is not as good as it should be. Many students find discussion forums difficult and spend a surprising amount of time preparing their responses; many more do not post at all, feeling intimidated or saying that they don’t see the point. And academics new to online learning may not notice limited participation, because the forum is populated with interesting posts from the brighter students. They fail to notice that the other students are neither participating nor benefitting from elaborating their learning in their own words.
Likewise, many academics struggle to write questions that are open-ended enough to sustain a group discussion, but offer enough structure to be engaging and productive. Instead, they ask the equivalent of an essay question. The strong students, who usually post first anyway, exhaust the question in the first three posts. The weaker students, who usually post later, are left paraphrasing what has already been said, or simply writing “I agree with what she said”.
It should not be surprising, therefore, that in their literature review of online discussions, Laura Schindler and Gary Burkholder find that “recent research suggests that levels of critical thinking in discussions remain low” (8). Too many teachers seem to treat discussion forums like a social constructivist Field of Dreams: build it and they will come. Sadly, the only activity that field will see is students pressing the back button so they can find something more interesting to do.
That said, a well-designed forum activity can still do a lot. It can allow students to compare responses, to voice concerns, to see that other students and their teachers are out there, to articulate their learning in their own words and thereby integrate the lesson with their existing knowledge, to navigate meanings and interpretations.
Schindler and Burkholder recommend keeping discussion prompts structured: that is, “clear, detailed, [and] specify instructions for participation and time parameters”. They also recommend providing “initial and response posts exemplars”.
What can be effective is asking students to provide a concrete example of something (give an example of foreshadowing; give an example of a disappointing IPO). Here students can tie concepts to their own experience and benefit from others’ examples. Even better, you can ask students elaborative “why” and “how” questions to get them to connect the content to prior learning.
One way to overcome the “she already said it” problem is to set up the discussion (if you can) so that students cannot see others’ posts until they have posted themselves. This way, their answer will not be influenced by the posts they see, and they do not need to worry about their post being original. In this case, a good prompt might be an elaborative question, for example, “why would Shakespeare have Macbeth say that he thought he heard a voice cry out ‘Sleep no more!’?” You can also ask students to diagram what they are learning on paper and upload images of their work, for example, “draw a fishbone diagram mapping the causes of Kodak’s downfall as a business”.
Lastly, you can use a forum to allow students to ask you questions. What do they feel they do not understand? What do they want to know more about?
Follow-up email
A follow-up email gives you the chance to do some responsive teaching. You can review students’ participation and performance in the quiz and discussion and try to correct gaps and misunderstandings. Sending a follow-up email also reassures students that you are there with them, supporting them. It also gives them a prompt to keep up with the lessons.
I recommend doing this through the VLE, if possible, through an announcement forum. This way, students are pointed back to the VLE and have a record of your message within the online course it relates to.
Non-automated quiz
This is where you can ask questions in short essay form. Studies have shown that students learn more from essay questions than from essays, and these will have a stronger correlation with any later essay-writing work they might have to do for their summative assessments.
On the other hand, essay questions can be riskier in the sense that they only work when students have a higher level of understanding. Students have a more difficult time using the feedback from model answers. Likewise, there is a risk that they give empty or circular answers. This is why it is advisable to have the automated CFU questions in place first.
That said, again, essay questions do benefit students more as students have to express the answers in their own words and thereby integrate the lesson better with their prior knowledge.
Likewise, there is more of a chance here to dig deeper: to ask fuller elaborative questions about why and how, and to open up further questions for students that you can discuss in a webinar.
Webinar
Finally, we add the webinar. The webinar is a great tool but, as we have seen, it is not as good for giving learners control over content, for breaking material into parts, or for accessibility.
However, used well, the webinar can be incredibly powerful: it can uniquely incorporate all six of the Learning Scientists’ strategies for effective learning and, with our other activities in place, it can use these capabilities to tie the lesson together, keep students on track and deliver rich, multi-layered, responsive teaching.
How can webinars use all six of the Learning Scientists’ strategies?
· Dual coding: use multimedia – slides with visuals and worked examples alongside your narration.
· Concrete examples: include examples in your webinar, as well as in discussion and questions.
· Retrieval practice: webinar software like Zoom and Adobe Connect provides survey and/or quiz tools, which also help increase student engagement.
· Interleaving: questions can and should be interleaved.
· Spaced practice: by interleaving and testing students in a regular webinar, you build in spacing between practice sessions.
· Elaboration: in the webinar, you can elaborate on questions that students asked during the week and you can build on activity within the webinar itself. After a quiz or survey question, you can ask students elaborative follow-up questions, for example, “those who answered (b): why did you choose that?” This also helps you to build engagement and keep the conversation on track.
This is all the more powerful because, in this sequence, you have seen how students have handled the activities and can do some real responsive teaching. That is, your webinar is not, as so many students complain, just a lecture transferred online; rather, it is given in response to all the work so far. It is feedback. But it is not just feedback; it is interactive feedback. As Dylan Wiliam has said in an interview, “as a general principle, don’t ever give feedback to students unless you make the time… for them to respond to the feedback” (9). The webinar gives a perfect opportunity for this.
Lastly, the webinar takes these capabilities and adds a sense of timing and urgency to your lesson. Unlike the other, asynchronous, activities, the webinar’s limited availability – its scarcity – is one of its greatest strengths. Scarcity increases the students’ sense of urgency to participate, and to participate meaningfully in the webinar they need to complete the week’s activities. In social psychology, this influencing power is called the ‘scarcity principle’ (10). In this way, the webinar doesn’t just motivate students to attend the webinar itself, but to complete all the activities, since the webinar gives feedback on them.
This kind of shepherding is critical online. With the flexibility that online learning offers, it can be easy for students to imagine that timing doesn’t matter and they can just do the activities in any order and at the last minute. Students are not always the best judges of how to sequence and time their learning. Keeping at least one activity pinned down in time can help keep them on track (11).
Conclusion
In conclusion, what I have shown is a design, so there are a lot of trade-offs involved. There are also a lot of ways – not all of them equal – that it can be done. Perhaps you want to add a collaborative assignment, or you have a VR simulation and your students all have headsets. But if you are a teacher contemplating how to move beyond emergency online teaching, this could be a good place to start.
References:
(1) Clark, R. E. (1983). Reconsidering research on learning from media. Review of educational research, 53(4), 445-459.
(2) Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. In Proceedings of the First ACM Conference on Learning @ Scale (L@S '14).
(3) Zhang, D., Zhou, L., Briggs, R. O., & Nunamaker, J. F. (2006). Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information & Management, 43(1), 15-27. doi:10.1016/j.im.2005.01.004
(4) Choi, H. J., & Johnson, S. D. (2005). The effect of context-based video instruction on learning and motivation in online courses. The American Journal of Distance Education, 19(4), 215-227.
(5) Szpunar, K. K., Khan, N. Y., & Schacter, D. L. (2013). Interpolated memory tests reduce mind wandering and improve learning of online lectures. Proceedings of the National Academy of Sciences, 110(16), 6313-6317.
(6) Hillman, J. (2012). The impact of online quizzes on student engagement and learning. Last accessed October 20, 2015.
(7) Karpicke, J. D., & Roediger, H. L. (2007). Expanding retrieval practice promotes short-term retention, but equally spaced retrieval enhances long-term retention. Journal of Experimental Psychology: Learning, Memory, and Cognition, 33(4), 704-719. http://dx.doi.org/10.1037/0278-7393.33.4.704
(8) Schindler, L. A., & Burkholder, G. J. (2014). Instructional Design and Facilitation Approaches That Promote Critical Thinking in Asynchronous Online Discussions: A Review of the Literature. Higher Learning Research Communications, 4(4), 11-29.
(9) Wiliam, D. (2017). Assessment, marking and feedback. In C. Hendrick & R. Macpherson (Eds.), What Does This Look Like in the Classroom? Bridging the gap between research and practice (p. 28). Melton, Woodbridge, UK: John Catt.
(10) Highhouse, S., Beadle, D., Gallo, A., & Miller, L. (1998). Get 'em while they last! Effects of scarcity information in job advertisements. Journal of Applied Social Psychology, 28(9), 779-795.
(11) Kirschner, P. A., & van Merriënboer, J. J. (2013). Do learners really know best? Urban legends in education. Educational psychologist, 48(3), 169-183.