The Learning Scientists


Episode 3 - Bite-Size Research on Retrieval Practice Formats


This episode was funded by The Wellcome Trust.

Show Notes:

This is our first bite-size research episode, where we briefly describe research findings on a specific topic. This week, Megan Sumeracki talks about retrieval practice.

In our second episode, we introduced retrieval practice, or bringing information to mind. We know from a century of research that retrieval practice improves learning, and it is a very flexible strategy: there are a lot of different ways to practice retrieval.

One easy way to implement retrieval practice in the classroom is to give students frequent low-stakes or no-stakes quizzes. But the next natural question is, what retrieval format should I use?

The two most common formats are short answer and multiple choice. Some research shows that short-answer quizzes improve learning more than multiple-choice quizzes because they require the students to produce the answer (1). Yet multiple-choice quizzes are often easier to administer and to grade, and we know this is very important for busy teachers. So what to do? (Spoiler alert: based on my honors thesis and the work of others, the format does not have a huge impact on learning. The important thing is to make sure students practice retrieval in some way.)

In 2005, Park (2) created a hybrid format to try to combine the benefits of short-answer and multiple-choice formats. Sixth-grade students would first try to answer a question in short-answer format, and then could click a "next" button to see the multiple-choice alternatives and select the correct answer. The catch was that the multiple-choice alternatives only showed up for a brief amount of time, so the students really had to try to produce the answer before clicking next. Park found that the hybrid quiz led to a little more learning than a standard multiple-choice quiz after a few days.
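To make the mechanics concrete, here is a minimal sketch of that hybrid flow in Python, assuming a simple console quiz rather than Park's actual software. The question, the alternatives, and the five-second display window are hypothetical values chosen only for illustration.

```python
# A rough sketch of the hybrid question flow described above (not Park's actual
# system): the learner first attempts a short answer, and only then are the
# multiple-choice alternatives shown, briefly, before a selection is made.
import time

def ask_hybrid_question(prompt, alternatives, display_seconds=5):
    # Step 1: short-answer attempt -- the learner must try to produce the answer.
    typed_attempt = input(f"{prompt}\nType your answer, then press Enter: ")

    # Step 2: reveal the multiple-choice alternatives for a limited time only,
    # so skipping straight to recognition is not an attractive strategy.
    print("\nAlternatives (shown briefly):")
    for letter, option in zip("ABCD", alternatives):
        print(f"  {letter}. {option}")
    time.sleep(display_seconds)
    print("\n" * 40)  # crude way to scroll the alternatives off the screen

    # Step 3: the learner selects one of the (now hidden) alternatives.
    choice = input("Which letter was correct? ").strip().upper()
    return typed_attempt, choice

if __name__ == "__main__":
    attempt, choice = ask_hybrid_question(
        "Which practice strategy involves bringing information to mind?",
        ["Highlighting", "Retrieval practice", "Rereading", "Copying notes"],
    )
    print(f"Short-answer attempt: {attempt!r}; multiple-choice selection: {choice}")
```

The point of the brief display window is simply to preserve the production attempt: the learner cannot skip straight to recognizing the answer among the alternatives.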

In 2008, I was really interested in quiz formats and decided to conduct my undergraduate honors thesis on this topic at Purdue University. In my experiments (3), students were randomly assigned to one of several conditions, each with a different retrieval-practice format. Some students answered multiple-choice questions, some answered short-answer questions, and others answered hybrid questions. Finally, some students were in a control group and didn't answer questions at all. All of the students read a text, took a quiz (except the control group), and then read statements containing the correct answers to all of the quiz questions. One week later, we gave the students an assessment test.

My thesis advisor and I found that retrieval practice, regardless of format, improved learning over the control group.

[Figure: Data from Smith & Karpicke, 2014 (3), Experiment 4]

However, we also found that the retrieval format didn't matter much. Across four experiments, any differences we found between formats were quite small.

[Figure: Data from Smith & Karpicke, 2014 (3), Experiment 4]

At first, my advisor and I were really surprised by this! But after conducting a systematic review of the literature and running four experiments of our own, it seems that the retrieval practice format does not have a huge effect on learning. Others have also found little to no difference between retrieval practice formats (e.g., 4, 5, 6). In another paper, published after mine, researchers found no format differences among younger, middle school students (7).

Main Takeaway:

Retrieval practice improves learning, and we can be pretty sure of this based on a century of research. However, the type of format you use is not likely to make a huge difference to learning. 

You can read a blog post based on this research here, and you can find the published paper containing my honors thesis experiments here. We hope you enjoyed this bite-size research podcast! Check back on the first Wednesday of next month, when we’ll be releasing a podcast about spaced practice.


Subscribe to our Podcast!

Go to our show on iTunes or wherever you get your podcasts.

RSS feed: http://www.learningscientists.org/learning-scientists-podcast/?format=rss

References:

(1) Kang, S. H. K., McDermott, K. B., & Roediger, H. L. (2007). Test format and corrective feedback modify the effects of testing on long-term retention. European Journal of Cognitive Psychology, 19, 528-558.

(2) Park, J. (2005). Learning in a new computerised testing system. Journal of Educational Psychology, 97, 436-443.

(3) Smith, M. A., & Karpicke, J. D. (2014). Retrieval practice with short-answer, multiple-choice, and hybrid tests. Memory, 22, 784-802.

(4) Clariana, R. B., & Lee, D. (2001). The effects of recognition and recall study tasks with feedback in a computer-based vocabulary lesson. Educational Technology Research & Development, 49, 23-36.

(5) Williams, J. P. (1963). Comparison of several response modes in a review program. Journal of Educational Psychology, 54, 253-260.

(6) Gay, L. R. (1980). The comparative effects of multiple-choice versus short-answer tests on retention. Journal of Educational Measurement, 17, 45-50.

(7) McDermott, K. B., Agarwal, P. K., D'Antonio, L., Roediger, H. L., & McDaniel, M. A. (2014). Both multiple-choice and short-answer quizzes enhance later exam performance in middle and high school classes. Journal of Experimental Psychology: Applied, 20, 3-21.