
Article Review: The Impact of Embedded Interactivity on Video Effectiveness

Welcome to “Article Review”, where I put on my scholarly cap to give an academic article its due in critical analysis.

This week’s article: 

“An investigation of effects of instructional videos in an undergraduate physics course” by Olha Ketsman, Tareq Daher, and Juan A. Colon Santana

Synopsis

This study investigates whether students learn more from watching an interactive video with embedded questions or from watching the same video uninterrupted and answering the same questions afterward.

The study’s authors ran a comparative experiment on 111 participants—all students in the same large university physics class—with 54 randomly assigned to the control group and 27 randomly assigned to the experimental group (p. 273). All participants watched the same 44-minute video and answered the same 20 assessment questions, either interspersed throughout the interactive video or at the conclusion of the passive video. Students then responded to a 23-question post-survey that assessed their attitudes and preferences about whether quizzes should be within or after the video lecture.

The study uses a convergent design that places equal weight on quantitative methods (i.e., “How do students’ assessment results compare?”) and qualitative methods (i.e., “What are students’ attitudes and preferences?”). While the study did not find a statistically significant difference in achievement between the two groups, its qualitative analysis indicated that students prefer embedded questions, and that the added interactivity of embedded questions keeps students engaged longer and helps them retain material better (p. 282).
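To make the quantitative side concrete, here is a minimal sketch (in Python) of the kind of between-groups comparison described above: quiz scores from an embedded-question group versus a post-video group, tested with an independent-samples t-test. The scores and group sizes below are invented for illustration and are not the study’s data.

```python
# Hypothetical illustration of comparing two groups' quiz scores.
# All numbers are made up; this is not the article's dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Invented scores out of 20 questions for each condition
embedded_scores = rng.normal(loc=14.5, scale=3.0, size=50).clip(0, 20)
post_video_scores = rng.normal(loc=14.0, scale=3.0, size=50).clip(0, 20)

# Independent-samples t-test between the two conditions
t_stat, p_value = stats.ttest_ind(embedded_scores, post_video_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# A p-value above the conventional 0.05 threshold would mirror the study's
# null result: no statistically significant difference in achievement.
```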

Based on these findings, the study’s authors recommend embedding questions in instructional videos when possible, particularly in combination with chunking and scaffolding.

Strengths

This study is valuable in its examination of several aspects of constructivist learning theory as they apply to instructional video. The “to embed or not to embed” question is an important one that most teachers will find applicable to their own materials.

The study’s choice to use a longer video is particularly useful: the longer the video, the more instructors need to consider how to keep students focused on the material instead of multitasking, skipping ahead, or shifting their attention to social media and other distractions.

The authors did a good job of explaining their process and the rationale for incorporating certain methods and materials. They recruited a reasonable sample size (111 students out of a 130-person class) and randomly assigned participants to the control or experimental group.

Finally, the combination of quantitative and qualitative analysis seemed well-balanced and resulted in a helpful, dynamic approach.

Critique

This study is organized, clearly written, and well-structured. It avoids the jargon-heavy word salad that academic articles can devolve into and does not overly rely on passive voice. However, it also has several typos (e.g., a noun missing its article in two different places; a missing possessive apostrophe) that raise questions about the quality of editorial oversight on the part of the journal and peer reviewers.

The section examining qualitative data is particularly interesting. The students’ feedback helps conceptualize why a “preference for integrated video” might be more than just predilection. Between the multiple-choice results (see Figure 1 below) and the verbatim extracts from students’ written responses, the authors illustrate how preference translates into engagement.

[Figure 1: students’ multiple-choice survey responses]

For the most part, the authors appear cognizant of their study’s limits and weaknesses, namely that the study may have been affected by its reliance on self-reported reflections, and that there was only one experiment, only one instructional video, and only one 20-question quiz (p. 282). I also suspect that the makeup of the experiment’s participants—namely that they were all in their late teens through mid-twenties, mostly majoring in a related subject, and overwhelmingly male—may have affected the results. I agree with the authors’ conclusion that more research should be conducted to see whether this study’s findings hold up under repetition and with a larger sample size, whether they apply to non-STEM courses, and whether embedded interactivity can increase long-term student engagement and retention.

Best Uses

This study makes a strong case for choosing interactive videos with embedded quizzes. University-level STEM instructors could use its findings to guide their selection of software (such as TechSmith Relay or Camtasia) and their design of instructional videos and assessments.

Its findings could also be extrapolated to a wider audience of educators seeking to improve their use of video in instruction.

References

Ketsman, O., Daher, T., & Colon Santana, J. A. (2018). An investigation of effects of instructional videos in an undergraduate physics course. E-Learning and Digital Media, 15(6), 267–289. https://doi.org/10.1177/2042753018805594
