Photo Credit: <a href="https://www.flickr.com/photos/81325557@N00/8989239906/">carfreedays</a> via <a href="http://compfight.com">Compfight</a> <a href="https://creativecommons.org/licenses/by-nc-nd/2.0/">cc</a>
Following my last post about the benefits of metacognitive instruction for comprehension in digital reading contexts, here is another APA-formatted (groan!) summary of research into the best learning approaches for increasing motivation, achievement, and more in the context of digital media.
In the study Improving Learning Achievements, Motivations and Problem-Solving Skills through a Peer Assessment-Based Game Development Approach (Hwang, Hung, & Chen, 2014), researchers probe whether incorporating peer assessment into a game development approach to learning would add even more benefit to an already successful learning activity.
In recent years the use of digital games in the classroom has become more popular due to the “interactive and enjoyable learning opportunities” they offer (Hwang et al., 2014, p. 130). There are many benefits to using digital games as a learning tool. Ali (2009) found that because learning concepts are presented in an abstracted, graphical form, students were better able to understand the concepts being taught and to solve problems more easily (as cited in Hwang et al., 2014). Games have also been shown to motivate students and to promote independent learning ability (Burguillo, 2010; Tuzun et al., 2009, as cited in Hwang et al., 2014).
Recently there has also been a push toward having students actually design or develop games themselves; according to Hayes (2008), doing so could further enhance students’ information-processing ability and cultivate their problem-solving capability (as cited in Hwang et al., 2014). Using this technology as a knowledge-building tool, rather than simply a visual aid, allows students to be engaged in “interpreting, analyzing, synthesizing, and organizing their knowledge” (Hwang et al., 2014, p. 132).
Researchers chose to integrate a peer assessment component into the game development exercise to see whether the added community would further motivate students and provide the ongoing critical feedback needed to produce better work overall. Additionally, playing and critiquing another’s work would give students the chance to reflect on “the advantages and drawbacks of their learning performance, which enables them to understand themselves better than their teachers could” (Chen, 2010, as cited in Hwang et al., 2014, p. 133).
This quasi-experiment involved 167 Taiwanese grade 6 students, half assigned to the experimental group and half to the control group. Both groups had already received a month’s instruction in environmental education, a course taught in Taiwan’s science curriculum. Students were to use the computer program Kodu to “design a game for instructing the knowledge of a sustainable town, that is an ideal and pollution-free town suitable for people to live in” (Hwang et al., 2014, p. 134). Both the control and experimental groups were given instruction in how to operate the software. For 6 weeks, students had 45 minutes per week to create and develop their worlds. Students in the experimental group were then given an additional 15 minutes per week for peer assessment; each week they were paired with a new partner who would play their game and provide feedback.
To equip students to give informative feedback, assessment guidelines were created through a collaboration between two teachers with 10 years’ experience teaching this specific curriculum and an experienced e-learning researcher. The objective of the peer assessment activity was to “engage students in making reflections and sharing ideas instead of scoring the games” (Hwang et al., 2014, p. 135). The six evaluation criteria focused on game enjoyment, game innovation, appearance, completeness of the content, accuracy of the content, and relevance to the learning objectives, all graded on a three-point evaluation scheme.
A pre-test and a post-test were used to evaluate the outcome of the experiment, each consisting of a comprehension test on the environmental content being studied, a learning motivation scale, and a problem-solving skill scale (Hwang et al., 2014, p. 135). The learning motivation scale and the problem-solving skill scale both used a five-point Likert scale, and the Cronbach’s alpha values were both 0.93, “presenting high reliability of the scale” (Hwang et al., 2014, p. 133).
T-test results on the pre-test data of the experimental and control groups showed no significant difference between them. Using ANCOVA with the pre-test scores as the covariate, researchers then compared the groups’ post-test scores and found a significant difference: those in the experimental group, who had the added peer assessment activity, showed significantly better learning achievements.
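For readers less familiar with this two-step analysis, it can be sketched in a few lines of Python. The scores below are invented for illustration only (they are not the study’s data): an independent-samples t-test checks that the groups started out equivalent on the pre-test, and the ANCOVA is approximated by regressing post-test scores on the pre-test covariate plus a group dummy.

```python
import numpy as np
from scipy import stats

# Hypothetical pre-test scores (NOT the study's data), 5 students per group.
control_pre      = np.array([70.0, 72.0, 68.0, 71.0, 69.0])
experimental_pre = np.array([69.0, 71.0, 70.0, 72.0, 68.0])

# Step 1: independent-samples t-test on the pre-test.
# A non-significant p-value suggests the groups began on equal footing.
t_stat, p_value = stats.ttest_ind(control_pre, experimental_pre)
print(f"pre-test t = {t_stat:.3f}, p = {p_value:.3f}")  # p well above 0.05 here

# Simulated post-test: both groups improve, the experimental group by ~10 more.
control_post      = control_pre + 5 + np.array([0.3, -0.2, 0.1, -0.3, 0.1])
experimental_post = experimental_pre + 15 + np.array([-0.1, 0.2, -0.3, 0.3, -0.1])

# Step 2: ANCOVA-style adjustment via ordinary least squares -- regress the
# post-test on the pre-test (covariate) and a group dummy (0 = control, 1 = experimental).
pre   = np.concatenate([control_pre, experimental_pre])
post  = np.concatenate([control_post, experimental_post])
group = np.concatenate([np.zeros(5), np.ones(5)])

X = np.column_stack([np.ones(10), pre, group])  # intercept, covariate, group
coef, *_ = np.linalg.lstsq(X, post, rcond=None)
print(f"adjusted group effect: {coef[2]:.2f} points")  # ~10 by construction
```

The group coefficient is the post-test difference after adjusting for where each student started, which is exactly the comparison the researchers report.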
For the learning motivation scale, the pre-test comparison between the control and experimental groups again showed no significant difference. Following the 6 weeks, researchers found a significant difference between the experimental and control groups, showing that the peer assessment activity greatly benefited the motivation of the experimental group.
Using the same method as the two prior tests, researchers found that students in the experimental group showed a significant advantage over those in the control group in “improving their problem solving conception” (Hwang et al., 2014, p. 139).
The results of the experiment show that the simple addition of a peer assessment activity can have a huge impact on student motivation, content mastery, and problem-solving abilities. This is an excellent reminder of the power of collaborative learning. In my grade 5/6 classroom at Connaught I use many digital tools to motivate and assist student learning, but until reading this paper I hadn’t thought about how beneficial it might be to have students create their own games to further facilitate their content mastery. Many people have started facilitating this type of instruction through MinecraftEdu, a version of Minecraft in which teachers can manipulate the world students build and interact in, in order to create learning adventures for the children. There is nothing stopping teachers from handing the reins to the children and having them create their own adventures for their peers to learn from.
Using peer assessment with digital tools, whether as part of a game development approach or not, is something I will pay more heed to. Educational programs such as Duolingo, Prodigy, and DIY all allow students to build a community of friends. Giving students time to reflect upon one another’s work and to provide feedback would no doubt add motivation, content mastery, and improved problem-solving abilities, as seen in the results of this study. This type of assessment might also prove useful with the Google Classroom my class is using, as the program allows the owner of a document to share it with whomever they wish. Those who receive the shared file can choose either ‘edit’ or ‘suggest’; the latter gives the reviewer the opportunity to point out mistakes in the writing sample or to add comments about the content. It is a powerful tool because it doesn’t make permanent changes to the document the way ‘edit’ mode would. The original owner of the document can see the ‘suggestions’ in real time and can later choose to accept or ignore them. I believe this additional audience would motivate students to put more effort into the quality of their content, and to feel more pressure to edit their work beforehand, knowing their peers aren’t as tactful with their criticism as their teacher might be.
Of course, what I believe made this study so successful was that the researchers were careful to give students the opportunity to learn how to critique others’ work effectively. Topping (2005) states that when peer assessment is “implemented with thoughtfulness about what form of organisation best fits the target purpose, context, and population, and with reasonably high implementation integrity, results are typically very good” (p. 635). Being deliberately thoughtful about structuring peer learning in my own classroom will, I am sure, make a big difference in how my students approach their interactions in class, leading to better overall student achievement.
Ali, A. (2009). A conceptual model for learning to program in introductory programming
courses. Informing Science and Information Technology, 6, 517–529.
Burguillo, J. C. (2010). Using game theory and competition-based learning to stimulate student
motivation and performance. Computers & Education, 55(2), 566–575.
Chen, C.H. (2010). The implementation and evaluation of a mobile self- and peer-assessment
system. Computers & Education, 55(1), 229–236.
Duolingo. (2015). Duolingo [application software]. Retrieved from https://en.duolingo.com
Hayes, E. (2008). Game content creation and IT proficiency: An exploratory study. Computers &
Education, 51(1), 97–108.
Hwang, G.-J., Hung, C.-M., & Chen, N.-S. (2014). Improving learning achievements, motivations and
problem-solving skills through a peer assessment-based game development approach.
Educational Technology Research and Development, 62(1), 129–145.
Mayer, R. E., & Wittrock, M. C. (2006). Problem solving. In P. A. Alexander & P. H. Winne
(Eds.), Handbook of educational psychology (pp. 287–304). Mahwah: Erlbaum.
Microsoft FUSE Labs. (2015). Kodu Game Lab [application software]. Retrieved from
Microsoft. (2015). Minecraft [application software]. Retrieved from https://minecraft.net
Prodigy. (2015). Prodigy [application software]. Retrieved from https://www.prodigygame.com
Topping, K. J. (2005). Trends in peer learning. Educational Psychology: An International
Journal of Experimental Educational Psychology, 25(6), 631–645.
Tuzun, H., Yılmaz-Soylu, M., Karakus, T., Inal, Y., & Kızılkaya, G. (2009). The effects of
computer games on primary school students’ achievement and motivation in geography
learning. Computers & Education, 52(1), 68–77.