Faculty Focus
February 1, 2017

By Maryellen Weimer, PhD

There’s a lot of talk these days about evidence-based instructional practices, so much that I’ve gotten worried we aren’t thinking enough about what that means. Let me see if I can explain with an example.

Recently I’ve been trying to locate the evidence that supports quizzing, wondering if it merits the evidence-based label. Tracking down this evidence in our discipline-based research is challenging because although quizzing has been studied across our disciplines, it’s not easily searchable. My collection of studies is good, but I know it’s not complete. As you might suspect, the results are mixed; they are more positive than negative, but still, a significant number of researchers don’t find that quizzes affect learning outcomes.

I’ve been looking closely at a set of seven studies, which you will find listed at the end of the article. (These studies were randomly selected—no empirical objective here.) Not all of the studies report the same positive results, but if they are viewed collectively, the use of quizzes seems to yield some impressive benefits. Students reported they spent more time reading and more time studying between tests, and that they were more motivated to come to class prepared when the course included quizzes. These quizzes also increased student participation, lowered failure rates, improved exam scores, resulted in better overall course grades, and did not lower course evaluations. That all sounds pretty good, doesn’t it?

But the devil is in the details, as in the specific combination of factors and conditions that produced the results. When I looked closely at this subset, I was amazed at the array of details that could potentially affect whether quizzes improve learning.

  • Are they pop quizzes or scheduled on the syllabus?
  • What types of questions are used (multiple choice, short answer, etc.)?
  • What’s the relationship between quiz questions and questions on the exam (same questions, similar questions, or completely different)?
  • How many quizzes are given throughout the semester?
  • When are the quizzes given—before content coverage or after? How soon after?
  • Do students take the quizzes in class or online?
  • Are the quizzes graded or ungraded? If graded, how much do they count?
  • Is the lowest score dropped?
  • What kind of feedback are students provided?

In addition to these design details, there are content variables derived from what’s being taught, the level at which it’s taught, the type of course, and the instructional method used to deliver it. And then there are student variables, such as their year in college and academic performance to date. In all likelihood, the classroom climate exerts some influence on the outcomes as well.

What this evidence tells us is that given a particular set of conditions, quizzes produce positive results, in most cases a range of them. And that gives us three things to consider. First, based on studies done in our disciplines, quizzes are an evidence-based instructional strategy only in a general sense. If your course design details and teaching context aren’t the same as those in the study, you aren’t assured the same results.

Second, to be sure that your quizzes produce the desired results, you need evidence. You can conduct your own empirical analysis. One of the benefits of all these different studies is that they illustrate a range of ways quiz performance can be analyzed. That will give you the best evidence, but you can also do something quasi-empirical. You can compare exam scores in sections with and without quizzes. You can ask students how a course with quizzes affects their attendance, preparation, and study habits. Or you can carefully, thoughtfully, and objectively observe how quizzes are affecting learning. What we need to stop doing is assuming that just because an instructional strategy has been studied and judged effective, we can use that same strategy and accrue the same benefits.
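
If you are comfortable with a bit of scripting, that with-and-without comparison can be as simple as a two-sample test on final exam scores. Here is a minimal sketch in Python; the scores and section labels are hypothetical placeholders, and Welch's t-test is just one reasonable choice of comparison, not the method used in the studies above.

    # Compare final exam scores from a section that used quizzes with a
    # section that did not. The numbers below are made-up placeholders;
    # replace them with your own gradebook exports.
    from scipy import stats

    quiz_section = [78, 85, 91, 72, 88, 69, 94, 81, 77, 90]
    no_quiz_section = [74, 80, 68, 71, 83, 65, 79, 76, 70, 84]

    # Welch's t-test does not assume the two sections have equal variance.
    t_stat, p_value = stats.ttest_ind(quiz_section, no_quiz_section, equal_var=False)

    print(f"Quiz section mean:    {sum(quiz_section) / len(quiz_section):.1f}")
    print(f"No-quiz section mean: {sum(no_quiz_section) / len(no_quiz_section):.1f}")
    print(f"Welch's t = {t_stat:.2f}, p = {p_value:.3f}")

Even a simple comparison like this will not establish cause, since the two sections may differ in other ways, but it turns "I think the quizzes are helping" into a claim you can actually examine.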

Finally, looking at a set of studies (whether on quizzing or a range of other instructional strategies) illustrates the profound importance of instructional design. So often, when we decide on an instructional approach, we just do it. Without much thought or purposeful decision-making, we come up with a way to use quizzes. And yet it’s those easy, seemingly minor decisions about the details that determine the outcome.

Remember, though, that you haven't gotten the whole story here. You've gotten a summary of a sample of studies done in our disciplines. Regular, repeated testing has been studied elsewhere. In the next post, we'll continue this consideration of what it means to be an evidence-based instructional strategy.

Resources:

Azorlosa, J. W. (2011). The effect of announced quizzes on exam performance: II. Journal of Instructional Psychology, 38, 3-7.

Batsell, W. R., Jr., Perry, J. L., Hanley, E., and Hostetter, A. B. (2017). Ecological validity of the testing effect: The use of daily quizzes in introductory psychology. Teaching of Psychology, 44 (1), 18-23.

Braun, K. W., and Sellers, R. D. (2012). Using a “daily motivational quiz” to increase student preparation, attendance and participation. Issues in Accounting Education, 27 (1), 267-279.

Hardsell, L. (2009). The effect of quiz timing on exam performance. Journal of Education for Business, 84 (3), 135-141.

Hatteberg, S. J., and Steffy, K. (2013). Increasing reading compliance of undergraduates: An evaluation of compliance methods. Teaching Sociology, 41 (4), 346-352.

Johnson, B. C., and Kiviniemi, M. T. (2009). The effect of online chapter quizzes on exam performance in an undergraduate social psychology course. Teaching of Psychology, 36 (1), 33-37.

Kouyoumdjian, H. (2004). Influence of unannounced quizzes and cumulative final on attendance and study behavior. Teaching of Psychology, 31 (2), 110-111.


Top Hat’s interactive, cloud-based teaching platform enables professors to engage students inside and outside the classroom with compelling content, tools and activities. Millions of students at 700 leading North American colleges and universities use the Top Hat teaching platform. Visit https://tophat.com/ today!


Are you facing an “engagement gap” in your online classroom?

Here’s how to close it.

In traditional classrooms, you interface with, well, faces.

But in the online classroom, your interactions are with usernames and occur remotely at different times and locations.

Not surprisingly, student engagement can suffer. Thankfully, there are proven principles and techniques that can create not only greater engagement but better learning outcomes for your online students. Discover them in a three-part series of 20-Minute Mentor programs led by B. Jean Mandernach, PhD.

Each of the programs in the three-pack features steps you can implement right away in your courses.

In What Are the Best Questioning Strategies for Enhancing Online Discussions?, you’ll discover how to design and facilitate asynchronous discussions that encourage critical inquiry, promote learning, and capture student interest.

What Three Things Should I Do Each Week to Engage Online Students? focuses on personalized interactions with your students using methods that strengthen faculty–student rapport while respecting your time and schedule.

How Do I Design Innovative Assignments to Foster Learning in Online Classrooms? will help you devise assignments that are engaging, impactful, and learning-focused. You’ll learn how to integrate online-appropriate formative and summative assignments into your courses to create significant added value for your students.

Together, these fast, focused presentations will equip you with a handy, effective “engagement toolkit.” You’ll see the results through better learning outcomes, better retention, and more satisfying online experiences for you and your students alike. Order your Fostering Learning and Engagement in the Online Classroom Three-Pack today!



2718 Dryden Drive, Madison, WI 53704, United States

You received this email because you are subscribed to Emails from Faculty Focus from Magna Publications.
Update your email preferences to choose the type(s) of email you receive, or to unsubscribe from future emails.

Please do not reply to this email; this address is not monitored. If you have a question or concern, contact our customer service team at support@magnapubs.com or 608-246-3590.