Digital Games, Design and Learning: A Meta-Analysis

Clark, D. B., Tanner-Smith, E. E., & Killingsworth, S. S. (2016). Digital games, design, and learning: A systematic review and meta-analysis. Review of Educational Research, 86(1), 79–122.

In this article, Clark, Tanner-Smith, and Killingsworth (2016) offer a refined and expanded evaluation of research on digital games and learning. To ground their study, the authors summarize three prior meta-analyses of digital games. From these three studies and their findings, the authors develop two core hypotheses about how digital games affect learning, which they then tested in their meta-analysis. These two core hypotheses were further examined for what the authors term moderator conditions, from which the authors developed sub-hypotheses for each core hypothesis to test as well. Searching databases spanning “Engineering, Computer Science, Medicine, Natural Sciences, and Social Sciences” for research published between 2000 and 2012, the authors identified studies that examined digital games in K-16 settings, addressed “cognitive, intrapersonal and interpersonal learning outcomes” (p. 82), and either compared digital games against non-game conditions or used a value-added approach (something the prior meta-analyses ignored) to compare standard and enhanced versions of the same game. In addition, the studies had to meet a set of criteria covering specifics of game design, participant parameters, and pre- and post-test data that could be used to assess change in outcomes. Overall, the authors identified 69 studies that met the parameters outlined in their research procedures. From this population they discerned the following significant patterns:

  1. In media comparisons of game versus non-game conditions, students in digital game conditions demonstrated significantly better outcomes overall relative to students in the non-game comparison conditions (p. 94). This held for both cognitive and intrapersonal outcomes (p. 95); the number of studies with interpersonal outcomes was too small to establish statistical significance.
  2. In value-added comparisons of standard and enhanced versions of games, students playing the enhanced versions showed “significant positive outcomes” relative to the standard versions (p. 98). While there were too few studies of most specific features to allow cross-comparisons, one feature, enhanced scaffolding (personalized, adaptive play), appeared in enough studies and showed a significant overall effect (p. 99).
  3. In examining game conditions, games that allowed the learner multiple play sessions outperformed single-session games when compared against non-game conditions. Game duration (time played) appeared to have no effect on overall outcomes (p. 99). These results did not vary even when the visual aspects of the games were taken into account.
  4. In contrast to what was seen in previous meta-analyses, there was no difference in outcomes between games paired with additional non-game instruction and those without it (p. 99).
  5. There were significant differences across player configurations within games. Overall, single-player games had the greatest impact on learning outcomes relative to group game structures, and these outcomes were highest in single-player games with no formal collaboration or competition (p. 100). However, games with collaborative team competition had significantly larger effects on learning outcomes than single-player competitive games.
  6. Games that engaged the player in a greater variety of in-game actions had greater impact than games offering only a small variety of on-screen actions that changed little over the course of play.
  7. Regarding the visual and narrative perspective qualities of the games, both simple and more complex designs were effective for learning outcomes, but overall schematic (symbolic or text-based) games were more effective than cartoon-style or realistic games.

In reflecting on their findings, the authors recognize limitations stemming from both their search parameters and their methodological breakdowns for analysis. They encourage further examination of studies that fell outside their range (for example, simulation games) and closer examination of the subtleties of the individual studies included in their analysis before any larger generalizations are made about the specifics of best practices for game design.

Perhaps the most interesting aspect of this study is not the outcomes it presents for future study (even though these are great food for thought about intentional game design for educational purposes) but the proposition it makes that educational technology researchers should “shift emphasis from proof-of-concept studies (“can games support learning?”) and media comparison analyses (“are games better or worse than other media for learning?”) to cognitive-consequences and value-added studies exploring how theoretically driven design decisions can influence situated learning outcomes for the broad diversity of learners within and beyond our classrooms” (p. 116).



Online learning as online participation

Hrastinski, S. (2009). A theory of online learning as online participation. Computers & Education, 52(1), 78–82.

In this article, Hrastinski (2009) argues that online participation is a critical and often undervalued aspect of online learning, and that models which relegate it to a solely social aspect of learning ignore its larger contributions to how students connect to materials and to each other in the online environment. In support of this argument, Hrastinski offers an overview of the literature on online participation, which highlights that online learning is “best accomplished when learners participate and collaborate” (p. 79) and that this translates into better learning outcomes when measured by “perceived learning, grades, tests and quality of performances and assignments” (p. 79). To evaluate online participation, Hrastinski conceptualizes it as more than counting how often a student contributes to a conversation: online participation is “a process of learning by taking part and maintaining relations with others. It is a complex process comprising doing, communicating, thinking, feeling and belonging which occurs both online and offline” (p. 80). Reflecting on the work of others, Hrastinski offers the view that participation creates community, which in turn supports collaboration and the construction of knowledge-building communities whose members foster learning in each other and in the group at large. Learning through participation requires physical tools for structuring that participation and psychological tools to help the learner engage with the materials, which suggests examining aspects of motivation to learn when designing materials directed toward participation. He argues this means we should look at participation as more than counting how much someone talks or writes, and instead develop activities that require engagement with others in a variety of learning modes.

While it is encouraging to see participation framed as a critical component of online learning, and to see reflection on ways students might demonstrate online participation through more than just discussion boards, Hrastinski (2009) offers little in the way of concrete examples to demonstrate how he sees this theory of online participation playing out across these different learning modes. He may have omitted examples to avoid encouraging a formulaic approach to online participation, but including either examples or fuller descriptions of how he sees faculty constructing both the physical and psychological tools of online participation would have helped those less familiar with them visualize the growing range of ways to structure online engagement.

As I have a deep interest in examining the ways community and culture are structured in online classes and the impacts this has on learning, I found this article both interesting and encouraging for future research avenues. In particular, the rethinking he proposes of how we see online participation being constructed is encouraging, and I would like to explore whether faculty and students see this idea of “what is participation” similarly or differently, and how these perceptions connect to the ways they both approach online learning and evaluate it.