The Role of Technology in Personalized Learning

Izmestiev, D. (2012). Personalized learning: A new ICT-enabled education approach. UNESCO Institute for Information Technologies in Education Policy Brief, March 2012. Available at http://iite.unesco.org/pics/publications/en/files/3214716.pdf

In reflecting on the future of education, Izmestiev’s policy brief for the United Nations Educational, Scientific and Cultural Organization (UNESCO) Institute for Information Technologies in Education (IITE) offered that learning will become more personalized. Lamenting “the ‘one size fits all’, full-time classroom-based model” (Izmestiev, 2012, p. 1), Izmestiev (2012) argued that the future of education requires “a new education paradigm characterized by greater flexibility and choice options for each individual student” (p. 1) whereby the educator matches “what is taught and how it is taught with the needs of each individual learner” (p. 1). As Izmestiev (2012) noted, such an idea is not new, but its implementation has been hampered by the practical issues educators face, such as managing workload while still meeting the needs of a diverse body of learners. However, the growth of information technology, digital educational resources and digital content delivery systems now offers educators greater opportunity to reach this ideal of personalized learning.

But what exactly is personalized learning, and what role can technology play? Reflecting on policies within the U.K., California, Calgary and British Columbia, Izmestiev (2012) offered that “personalized learning is a methodology, according to which teaching and learning are focused on the needs and abilities of individual learners within classroom groups supervised by the teacher” (p. 3). David Miliband, former United Kingdom Minister of State for School Standards, outlined five components of personalized learning:

  • An emphasis on assessing individual learners’ strengths, weaknesses, interests and needs through “a range of assessment techniques, with an emphasis on formative assessment that engages the learner” (Izmestiev, 2012, p. 4)
  • Use of effective teaching and learning strategies that enable and emphasize self-directed learning
  • Offering learners the ability to engage in “the selection of curriculum content as well as in the development of individually tailored learning program” but “with clear pathways through the system” (Izmestiev, 2012, p. 4)
  • Organizing the class around student progress, such that school resources and design are redirected towards meeting that focus
  • Connecting learning outside the classroom through community partnerships and socially engaging activities

In reflecting on these components, Izmestiev (2012) offered that “information and communication technologies (ICTs) and digital content development tools” have made personalized learning more available (p. 5). The author noted that learning management systems are now used to collect assessment data in a managed workflow through a variety of assessment forms. Newer technologies allow the learner to move at their own pace along a guided pathway, using system-based recommendations or adjustments in learning strategies and content to meet individual student needs, while still being encouraged to progress towards specified learning goals. As Izmestiev (2012) commented, “using Web 2.0 tools and social networks, learners can interact with each other beyond the classroom,” broadening where, when and with whom learners can engage in meaningful, goal-directed learning activities. Within the personalized learning paradigm, the author offered that the teacher’s role shifts “from instruction to mentoring, advising and consulting,” which necessitates refocusing professional development and teacher training (Izmestiev, 2012, p. 7). However, he also cautioned that there are risks to personalized learning when it is poorly implemented. These include a potential decrease in teacher-student and student-student interactions, as well as a diminishing of teacher engagement within the learning process in favor of more technology-augmented learning. In the author’s view, well-intentioned implementation coupled with teacher professional development can address these risks.
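The system-based recommendation logic described here can be sketched as a simple mastery-threshold rule: remediate the first pathway topic a learner has not yet mastered, otherwise advance towards the next learning goal. This is a hypothetical illustration only, not the mechanism of any specific platform discussed in the brief; the function name, threshold value and data shapes are assumptions.

```python
def recommend_next(scores, pathway, mastery=0.8):
    """Return the next step for a learner on a guided pathway.

    scores:  dict mapping topic -> latest formative-assessment score (0.0-1.0)
    pathway: ordered list of topics forming the guided pathway
    mastery: threshold below which a topic is flagged for remediation
    """
    for topic in pathway:
        if scores.get(topic, 0.0) < mastery:
            # Learner has not yet mastered this topic: adjust content here.
            return ("remediate", topic)
    # All pathway topics mastered: progress towards the next learning goal.
    return ("advance", None)
```

For example, a learner scoring 0.9 on the first topic but 0.5 on the second would be routed back to the second topic rather than moved forward.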

In reflecting on this, one of my interests within educational technology is exploring personalized learning: how the key affordances within its design could make for effective and engaged learning for the individual learner; the issues of design and implementation which impact both learners and institutions moving towards personalized learning; and whether personalized learning is an effective means of increasing collaborative learning experiences for groups of learners. Digital platforms specifically designed for personalized, active and adaptive learning experiences are already being utilized in schools to assist student learning of key concepts as well as hands-on skill training, and many textbook publishers are implementing these with their books. However, as Pane et al. (2017) noted, while preliminary data show some potential for personalized learning to positively impact learner performance and motivation, “the field lacks evidence on which practices are most effective or what policies must be in place to maximize the benefits,” and more research is needed (Pane et al., 2017, p. 7).

Additional readings for Week #15

Chen, C. M. (2008). Intelligent Web-Based Learning System with Personalized Learning Path Guidance. Computers & Education, 51(2), 787-814.

Huang, Y.-M., Liang, T.-H., Su, Y.-N., & Chen, N.-S. (2012). Empowering Personalized Learning with an Interactive E-Book Learning System for Elementary School Students. Educational Technology Research and Development, 60(4), 703-722.

Hwang, G.-J., Sung, H.-Y., Hung, C.-M., Huang, I., & Tsai, C.-C. (2012). Development of a Personalized Educational Computer Game Based on Students’ Learning Styles. Educational Technology Research and Development, 60(4), 623-638.

Kerr, P. (2016). Adaptive learning. ELT Journal, 70(1), 88-93.

Pane, J. F., Steiner, E. D., Baird, M. D., Hamilton, L. S., & Pane, J. D. (2017). How Does Personalized Learning Affect Student Achievement? Santa Monica, CA: RAND Corporation.

Shaw, C., Larson, R., & Sibdari, S. (2014). An Asynchronous, Personalized Learning Platform: Guided Learning Pathways (GLP). Creative Education, 5, 1189-1204.

Promoting Student Engagement in Videos Through Quizzing

Cummins, S., Beresford, A. R., & Rice, A. (2016). Investigating Engagement with In-Video Quiz Questions in a Programming Course. IEEE Transactions on Learning Technologies, 9(1), 57-66.

The use of videos to supplement or replace lectures previously delivered face-to-face is standard in many online courses. However, these videos often encourage passivity on the part of the learner. Other than watching and taking notes, there may be little to challenge the video-watching learner to transform the information into retained knowledge, to self-assess whether they understand the content, or to demonstrate their ability to apply what they have learned to novel situations. Since engagement with videos is often the first step towards learning, Cummins, Beresford, and Rice (2016) tested whether students can become actively engaged with video materials through the use of in-video quizzes. They had two research questions: a) “how do students engage with quiz questions embedded within video content” and b) “what impact do in-video quiz questions have on student behavior” (p. 60).

Utilizing an Interactive Lecture Video Platform (ILVP) they developed and open sourced, the researchers collected real-time student interactions with 18 different videos developed as part of a flipped classroom for programmers. Within each video, multiple-choice and text-answer questions were embedded and automatically graded by the system. Video playback stopped automatically at each question, and students were required to answer. Correct answers automatically resumed playback, while students had the option of retrying incorrect ones or moving ahead. Correct responses were discussed immediately after each quiz question when playback resumed. Questions were written at the Remember, Understand, Apply, and Analyse levels of Bloom’s revised taxonomy. In addition to the interaction data, the researchers administered anonymous questionnaires to collect student thoughts on the technology and on behaviors they observed, and also evaluated student engagement based on question complexity. Degree of student engagement was measured by the number of students answering the quiz questions relative to the number of students accessing the video.
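The pause, grade, and resume behavior reported here can be sketched as a short loop. This is a hypothetical reconstruction of the interaction the paper describes, not the ILVP source code; the function name, the `respond` callback, and the data shapes are all assumptions for illustration.

```python
def play_with_quizzes(questions, respond):
    """Simulate playback of a video with embedded auto-graded questions.

    questions: list of (timestamp, correct_answer) pairs in playback order
    respond:   callable taking a timestamp and returning the student's
               answer, or None if the student chooses to skip ahead
    Returns the number of questions answered correctly.
    """
    correct_count = 0
    for timestamp, correct_answer in questions:
        # Playback pauses at the question; the student must respond.
        response = respond(timestamp)
        while response is not None and response != correct_answer:
            # Incorrect answers may be retried, or skipped by returning None.
            response = respond(timestamp)
        if response == correct_answer:
            correct_count += 1
        # Playback then resumes, and the correct response is discussed.
    return correct_count
```

For instance, a student who answers the first question correctly, then answers the second incorrectly and skips it, would register one correct answer while still reaching the end of the video.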

According to Cummins et al. (2016), students were likely to engage with the video through the quiz, but question style, question difficulty, and the overall number of questions in a video impacted the likelihood of engagement. In addition, student behaviors varied in how often and in what ways this engagement took place. Some students viewed videos in their entirety while others skipped through them to areas they felt were relevant; others employed a combination of these techniques. The authors suggest, based both on the observed interactions and on questionnaire responses, that four motivational patterns are present during student engagement with the video: completionism (complete everything because it exists), challenge-seeking (engage only with questions they felt challenged by), feedback (verify understanding of material), and revision (review materials repeatedly). Interestingly, the researchers noted that student recollection of their engagement differed in some cases from actual recorded behavior, but the authors suggest this may show that students are answering questions not within the quiz itself but within other contexts not recorded by the system. Given the evidence of student selectivity in responding to questions based on motivation, the authors suggest that a diverse approach to question design within videos will offer something for all learners.

While this study makes no attempt to assess the actual impact on learner performance and retention (due to the type of class and the assessment designs within it relative to the program), it does show that in-video quizzes may offer an effective way to promote student engagement with video-based materials. It is unfortunate the authors did not build an assessment structure into this research design so as to collect some measure of learning. However, the platform they utilized is available to anyone (https://github.com/ucam-cl-dtg/ILVP-prolog), and other systems of integrated video quizzing are available (e.g., TechSmith Relay) which, when combined with keystroke- and eye-movement-recording technology, could capture similar information. This opens up the ability to further test how in-video quizzing impacts student performance and retention.

In terms of further research, one could envision a series of studies using a similar process to examine in-video quizzing in greater depth, collecting data not only on how it specifically impacts engagement, learning and retention but also on how these may be affected by variables such as video purpose, length, context, and the knowledge level of the questions. As Schwartz and Hartman (2007) noted, design variations with regard to video genres may depend on learning outcomes, so assessing whether this engagement only exists for lecture-based videos or transfers to other genres is intriguing. As Cummins et al. (2016) explain, students “engaged less with the Understand questions in favour of other questions” (p. 62), which suggests that students were actively selecting what to engage with based on what they felt was most useful to them. Thus, further investigation of how to design more engaging and learner-centered questions would be useful for knowledge retention. In addition, since the videos were designed to replace lecture sessions and ranged in length from 5 minutes and 59 seconds to 29 minutes and 6 seconds, understanding how length impacts engagement would help determine whether there is a point at which student motivation, and thus learning, wavers. While the authors do address some specifics as to where drop-offs in engagement occurred relative to specific questions, they do not offer a breakdown of engagement versus the relative length of the video, and they admit that the number of questions varied between videos (three had no questions at all) and that there was no connection between the number of questions and video length. Knowing more about the connections between in-video quizzing and student learning, as well as the variables which impact this process, could help to better assess the overall impact of in-video quizzing and allow us to optimize in-video quizzes to promote student engagement, performance and retention.
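A breakdown of engagement against relative video position could be computed directly from interaction logs of the kind ILVP records. A minimal sketch, assuming a hypothetical log of per-question (timestamp, viewers, answering students) tuples, and using the paper's engagement measure of answering students relative to students accessing the video:

```python
def engagement_profile(video_length, question_log):
    """Map each question to (relative position in video, engagement ratio).

    video_length: total video length in seconds
    question_log: list of (timestamp, viewers_at_question, students_answering)
    """
    return [
        (timestamp / video_length, answering / viewers if viewers else 0.0)
        for timestamp, viewers, answering in question_log
    ]
```

Plotting such profiles across videos of different lengths would let a researcher see whether engagement drop-off tracks absolute time, relative position, or question density.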

Schwartz, D. L., & Hartman, K. (2007). It is not television anymore: Designing digital video for learning and assessment. In R. Goldman, R. Pea, B. Barron, & S. J. Derry (Eds.), Video research in the learning sciences (pp. 349-366). Mahwah, NJ: Lawrence Erlbaum Associates.