Metacognitive Instruction on Student’s Reading Comprehension in Digital Reading Contexts

Photo Credit: Monica Blatton via Compfight cc

Digital citizenship and digital literacy have been at the forefront of my thinking over the past few months. Taking two classes this semester, I have had the opportunity to look at different approaches to learning through an educational psychology framework. These approaches, whether behaviour-analytic, metacognitive, or peer learning, all have interesting and important implications for how we integrate technology into our classrooms. For my other class, I have been able to focus my research on efficacy studies of how these technological aids can support learning, and on the best ways to instruct students to use these tools effectively.

David White’s work on Digital Visitors and Residents helped me to better understand that, in order for technological tools to be useful, students need opportunities to learn the metacognitive strategies that go along with them. I found a very interesting article by Lan, Lo, and Hsu (2014) entitled The Effects of Meta-Cognitive Instruction on Students’ Reading Comprehension in Computerized Reading Contexts: A Quantitative Meta-Analysis. As the title suggests, the strategy under review is metacognitive strategy instruction. The authors of this meta-analysis wanted to see whether metacognitive strategies used to improve students’ reading comprehension would be as effective when used in a digital context. The following is an overview of the meta-analysis, and of the most salient approaches one might take when incorporating digital reading into our classrooms.

The authors believe this type of study deserves attention, given that text is now primarily accessed through technological devices. According to Puntambekar and Stylianou (2005), cognitive monitoring is necessary when accessing text through digital means because such text is nonlinear and flexible; students need to “plan what to read next, and closely monitor ongoing learning” (as cited in Lan, Lo, & Hsu, 2014, p. 187). Referencing Prensky’s now-famous article on digital natives versus digital immigrants (2001), the authors point out that because our youth are floating in a sea of technology, they are accessing information and learning in a fundamentally different way from past generations. As Prensky puts it, the assumption that “the same methods that worked for the teachers when they were students will work for their students now is no longer valid” (as cited in Lan et al., p. 187).

The authors used a thorough, if at times confusing, approach to their meta-analysis. They followed Petitti’s (2000) steps for conducting a meta-analysis, in which “studies need to be systematically identified, then filtered according to the inclusion criteria” (p. 187). Using online databases such as PsycINFO, the authors ran careful searches correlating technology with cognition and reading comprehension. Five journals were also searched manually because of their “prominent role in such fields as metacognition, reading/literacy, and digital learning” (p. 188).

To be included in the meta-analysis, studies needed to meet the inclusion criteria. These criteria stipulated that the studies: 1) involved explicit metacognitive instruction; 2) were published in English in refereed journals; 3) included the use of electronic text; 4) adopted an experimental or quasi-experimental design; and 5) provided enough quantitative information to allow for calculation of effect sizes (p. 188). After the 57 studies they had found were run through this filter, only 17 met the criteria.

Effect sizes were used to determine the “magnitude and strength of the metacognitive strategies via the computerized reading context” (p. 189). Cohen’s d (Cohen, Manion, & Morrison, 2011) is the standardized difference between two population means. To calculate Cohen’s d, means and standard deviations were taken from each study; the authors then “subtracted the control group’s mean from the experimental group and divided it by the pooled standard deviation. Magnitude of effectiveness was decided according to Cohen’s criteria: d=.80 is a large effect, d=.50 moderate, and d=.20 small effect” (p. 189).
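The calculation described above can be sketched out in a few lines of Python. The numbers below are hypothetical, purely to illustrate the formula; they are not taken from any of the 17 studies in the meta-analysis.

```python
# Sketch of the effect-size calculation described above (hypothetical numbers).
# Cohen's d = (experimental mean - control mean) / pooled standard deviation
import math

def cohens_d(mean_exp, sd_exp, n_exp, mean_ctrl, sd_ctrl, n_ctrl):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(
        ((n_exp - 1) * sd_exp**2 + (n_ctrl - 1) * sd_ctrl**2)
        / (n_exp + n_ctrl - 2)
    )
    return (mean_exp - mean_ctrl) / pooled_sd

# Hypothetical study: experimental group scores 8 points higher on a
# comprehension measure, both groups with a standard deviation of 10.
d = cohens_d(mean_exp=78.0, sd_exp=10.0, n_exp=30,
             mean_ctrl=70.0, sd_ctrl=10.0, n_ctrl=30)
print(round(d, 2))  # 0.8 -> a "large" effect by Cohen's criteria
```

Running each study's reported means and standard deviations through a formula like this is what allows a meta-analysis to compare results on a common scale.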

Studies were organized into four main groups:

  1. Regulation as instruction: These seven studies focused on improving self-directed thinking through prompts, either embedded in the programs or provided by tutors, to comprehend electronic text. In all studies save two, global monitoring questions, both embedded and from tutors, “helped students to be more consciously aware of what they were reading; therefore having better performance in the comprehension measures” (p. 199).
  2. Strategy cues with think-aloud as instruction: Only two studies used strategy cues with think-aloud as instruction, both in the context of English speakers learning a second language. Large effect sizes “indicated that multimedia annotation strongly supported vocabulary recognition and comprehension” (p. 193).
  3. Vocabulary and comprehension support as instruction: These four studies were either “developing students’ metacognitive awareness or involving students in metacognitive processes” (p. 194). Results found that narrated animations, glossary terms, and motivational content improved “comprehension, perceived difficulty, affect, and motivation” (p. 194) when reading digital texts. These may have been beneficial due to the multi-dimensional sensory supports that the included video and audio provided.
  4. Computerized environment versus hard copy: These four studies looked at the differences between reading text in a computerized environment and reading it on traditional paper. The results showed that while computerized environments may provide motivation, there was no significant benefit in terms of students learning and implementing metacognitive strategy training (identifying the task, planning, performing, and evaluating) (p. 195).

The researchers concluded that self-directed metacognitive strategies proved the most effective of the four metacognitive approaches reviewed. In the other three areas, while there were promising results, in many cases they were not statistically significant. This was in large part because many of the studies relied on technologies that did not differ much from traditional metacognitive cues (p. 199). Studies that made use of audio and visual cues produced much more beneficial results relative to control groups (p. 199).

This meta-analysis was helpful in reminding me of the importance of using the right technologies to support reading comprehension, and of ensuring students are properly taught how to use self-directed metacognitive strategies when reading such texts. Software such as Google Read & Write incorporates many of the metacognitive assists used in the studies (the ability to define and highlight text, and a research button to find audio/video that helps with comprehension and context) in a simple-to-use interface. The app works within a word-processing environment as well as on any page on the internet. According to the results of this meta-analysis, providing students with instruction on how to properly use such a tool is integral to its being a help rather than a hindrance (p. 199).

When choosing an article on the efficacy of metacognitive instruction, I did so with the intent of finding something that would be helpful in my classroom. Specifically, I was interested in how to facilitate metacognitive strategies that would best align with using technology to read online texts. A large number of students on IIPs now have technology written into their goals, and I was hoping to find positive results on the use of cognitive strategies in tandem with the technological tools that facilitate them.

Unfortunately, due to the low number of studies that fit the inclusion parameters, I feel the evidence for this type of instruction wasn’t as pronounced as it could have been. One problem, I believe, was the varying types of software being used alongside the metacognitive instruction. Studies were drawn from as far back as 1988, and even in the last three years user functionality in software has improved dramatically, which suggests that past technologies may not have offered the functionality that would complement the metacognitive strategies being implemented when reading scrolling text.

The researchers also found that results weren’t as pronounced with at-risk youth, or with children whose reading proficiency was below grade level. Again, I believe there are factors that may have played a part. A meta-analysis unfortunately doesn’t give you all the details of the studies it draws on, and the expectation is that you take the researchers’ conclusions at face value. I would imagine there were likely issues with the software being used, which distracted or impeded the children due to its poor design or workflow. Having had the opportunity to work with a variety of software programs aimed at assisting students with comprehension of texts (Kurzweil being the main one), I have seen time and time again that these programs will crash or are unexpectedly unreliable. Students quickly learn to mistrust the software, and will prefer to ignore the assists entirely.


Cohen, L., Manion, L., & Morrison, K. (2011). Research methods in education (7th ed.). New York, NY: Routledge.

Lan, Y.C., Lo, Y.L., & Hsu, Y.S. (2014). The effects of meta-cognitive instruction on students’ reading comprehension in computerized reading contexts: A quantitative meta-analysis. Educational Technology & Society, 17(4), 186-202.

Petitti, D.B. (2000). Meta-analysis, decision analysis, and cost-effectiveness analysis: Methods for quantitative synthesis in medicine (2nd ed.). New York, NY: Oxford University Press.

Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, 9(5), 1-6.

Puntambekar, S., & Stylianou, A. (2005). Designing navigation support in hypertext systems based on navigation patterns. Instructional Science, 33(5-6), 451-481.

Text Help. (2015). Google Read & Write [Application software]. Retrieved from:
One thought on “Metacognitive Instruction on Student’s Reading Comprehension in Digital Reading Contexts”

  1. Jeremy, great post. I too am very much intrigued by research that touches on ways we can engage students in tapping into metacognition. I will definitely take a look at the Cohen reference you cited. Over the past year, I have been reading Hattie’s work on metacognition and teacher/student feedback loops. If you have a chance (and of course if you haven’t already read it), be sure to check out John Hattie’s Visible Learning (at least one of his three books!)
