Promoting Student Engagement in Videos Through Quizzing

Cummins, S., Beresford, A. R., & Rice, A. (2016). Investigating engagement with in-video quiz questions in a programming course. IEEE Transactions on Learning Technologies, 9(1), 57-66.

The use of videos to supplement or replace lectures that were previously delivered face-to-face is standard in many online courses. However, these videos often encourage passivity on the part of the learner. Other than watching and taking notes, there may be little to challenge the video-watching learner to transform the information into retained knowledge, to self-assess whether or not they understand the content, or to demonstrate their ability to apply what they have learned to novel situations. Since engagement with videos is often the first step towards learning, Cummins, Beresford, and Rice (2016) tested whether students can become actively engaged in video materials through the use of in-video quizzes. They had two research questions: a) “how do students engage with quiz questions embedded within video content” and b) “what impact do in-video quiz questions have on student behavior” (p. 60).

Utilizing an Interactive Lecture Video Platform (ILVP) they developed and open sourced, the researchers were able to collect real-time student interactions with 18 different videos developed as part of a flipped classroom for programmers. Within each video, multiple-choice and text-answer questions were embedded and automatically graded by the system. Video playback stopped automatically at each question and students were required to answer. Correct answers automatically resumed playback, while students had the option of retrying incorrect answers or moving ahead. Correct responses were discussed immediately after each quiz question once playback resumed. The questions were written at the Remember, Understand, Apply, and Analyse levels of Bloom’s revised taxonomy. In addition to the interaction data, the researchers administered anonymous questionnaires to collect student thoughts on the technology and on the behaviors they observed, and they also evaluated student engagement based on question complexity. Degree of student engagement was measured as the number of students answering the quiz questions relative to the number of students accessing the video.
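To make the engagement measure concrete, here is a minimal sketch of computing a per-question engagement ratio from an interaction log. The log layout, field names, and the engagement_by_question helper are my own assumptions for illustration; they are not part of the ILVP system described in the paper, which only reports the ratio of answering students to accessing students.

```python
from collections import defaultdict

def engagement_by_question(events):
    """Fraction of students who accessed a video that also answered each
    embedded question. `events` uses an assumed format: dicts with keys
    'student', 'video', 'action' ('view' or 'answer') and, for answers,
    'question'. The measure mirrors the paper; the data layout does not."""
    viewers = defaultdict(set)       # video -> students who accessed it
    answerers = defaultdict(set)     # (video, question) -> students who answered

    for e in events:
        if e["action"] == "view":
            viewers[e["video"]].add(e["student"])
        elif e["action"] == "answer":
            answerers[(e["video"], e["question"])].add(e["student"])

    ratios = {}
    for (video, question), students in answerers.items():
        accessed = len(viewers[video])
        ratios[(video, question)] = len(students) / accessed if accessed else 0.0
    return ratios

# Tiny made-up log: two students viewed v1, one answered q1
log = [
    {"student": "s1", "video": "v1", "action": "view"},
    {"student": "s2", "video": "v1", "action": "view"},
    {"student": "s1", "video": "v1", "action": "answer", "question": "q1"},
]
print(engagement_by_question(log))  # {('v1', 'q1'): 0.5}
```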

According to Cummins et al. (2016), students were likely to engage with the video through the quiz, but question style, question difficulty, and the overall number of questions in a video affected the likelihood of engagement. In addition, student behaviors varied in how often and in what ways this engagement took place. Some students viewed videos in their entirety while others skipped through them to areas they felt were relevant. Others employed a combination of these techniques. The authors suggest that, based both on the observed interactions and on questionnaire responses, four patterns of motivation are present during student engagement with the video: completionism (complete everything because it exists), challenge-seeking (only engage with questions they felt challenged by), feedback (verify understanding of the material), and revision (review materials repeatedly). Interestingly, the researchers noted that student recollection of their engagement differed in some cases from their actual recorded behavior, but the authors suggest this may show that students were answering the questions not within the context of the quiz but within other contexts not recorded by the system. Given the evidence of student selectivity in responding to questions based on motivation, the authors suggest that a diverse approach to question design within videos will offer something for all learners.

While this study makes no attempt to assess the actual impact on learner performance and retention (due to the type of class and the assessment designs within it relative to the program), it does show that in-video quizzes may offer an effective way to promote student engagement with video-based materials. It is unfortunate the authors did not build an assessment structure into this research design so as to collect some measure of learning. However, the platform they used is available to anyone (https://github.com/ucam-cl-dtg/ILVP-prolog), and other systems of integrated video quizzing are available (e.g., TechSmith Relay) which, when combined with keystroke and eye-movement recording technology, could capture similar information. This opens up the ability to further test how in-video quizzing impacts student performance and retention.

In terms of further research, one could envision a series of studies using similar processes to examine in-video quizzing in greater depth, not only for data on how it specifically impacts engagement, learning, and retention, but also for how these may vary with video purpose, length, context, and the knowledge level of the questions. As Schwartz and Hartman (2007) noted, design variations across video genres may depend on learning outcomes, so assessing whether this engagement only exists for lecture-based videos or transfers to other genres is intriguing. As Cummins et al. (2016) explain, students “engaged less with the Understand questions in favour of other questions” (p. 62), which suggests that students were actively selecting what they engaged with based on what they felt was most useful to them. Thus, further investigation of how to design more engaging and learner-centered questions would be useful for knowledge retention. In addition, since the videos were meant to replace lecture sessions and ranged in length from 5 minutes and 59 seconds to 29 minutes and 6 seconds, understanding how length impacts engagement would help determine whether there is a point at which student motivation, and thus learning, wavers. While the authors do address some specifics as to where drop-offs in engagement occurred relative to specific questions, they do not offer a breakdown of engagement versus the relative length of the video, and they acknowledge that the number of questions varied between videos (three had no questions at all) and that there was no connection between the number of questions and video length; one way such an analysis might be set up is sketched after this paragraph. Knowing more about the connections between in-video quizzing and student learning, as well as the variables which impact this process, could help to better assess the overall impact of in-video quizzing and allow us to optimize in-video quizzes to promote student engagement, performance, and retention.
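As a rough illustration of the follow-up analysis suggested above, the snippet below relates per-question engagement to where each question falls in its video, normalized by video length. The function name, the data structure, and the timing fields are assumptions made for this sketch; Cummins et al. (2016) do not report their data in this form.

```python
def engagement_vs_position(questions):
    """Pair each question's engagement with its normalized position in the video.

    `questions` is an assumed structure: dicts with 'video_length_s' (total
    seconds), 'time_s' (when the question appears), and 'engagement' (fraction
    of viewers who answered). The sorted (position, engagement) pairs could
    then be binned or plotted to look for drop-off points."""
    points = [
        (q["time_s"] / q["video_length_s"], q["engagement"])
        for q in questions
        if q["video_length_s"] > 0
    ]
    return sorted(points)

# Made-up numbers showing engagement falling toward the end of a video
sample = [
    {"video_length_s": 600, "time_s": 120, "engagement": 0.82},
    {"video_length_s": 600, "time_s": 300, "engagement": 0.74},
    {"video_length_s": 600, "time_s": 540, "engagement": 0.55},
]
for pos, eng in engagement_vs_position(sample):
    print(f"{pos:.0%} through video -> {eng:.0%} engagement")
```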

Schwartz, D. L., & Hartman, K. (2007). It is not television anymore: Designing digital video for learning and assessment. In R. Goldman, R. Pea, B. Barron, & S. J. Derry (Eds.), Video research in the learning sciences (pp. 349-366). Mahwah, NJ: Lawrence Erlbaum Associates.
