This capstone paper was written for my Master of Arts in Educational Technology at Concordia University, St. Paul under the guidance of Professor Lynn Henry and second reader Professor Tina Monosmith.
Abstract
Current research showed students were severely deficient in the media literacy skills needed to competently engage with society and culture. Students were generally unable to critically source or evaluate news and posts on social media. Historically, media literacy terminology, definitions, and skills lacked general consensus, as noted in several meta-analytical reviews. This paper focused on three areas of media literacy on which research agreed: access and use, analysis and evaluation, and creation. Research showed strategies could be implemented to empower students to access information and technology, analyze and evaluate sources of information, and use creation to model media industry processes. A review of 18 empirical, qualitative, and mixed-methods studies indicated short media literacy interventions could improve students’ critical thinking skills when paired with media studies courses. Longer interventions were successful when fully integrated into school culture. Although more cumbersome to structure and implement than shorter interventions, these semester-long, year-long, and interdisciplinary academy programs demonstrated great effectiveness and durability over longer periods of time.
Keywords: media literacy, media studies, critical thinking
Dedication
This work is dedicated to the teachers who were most fundamentally responsible for molding me into a media-literate person and consummate educator.
From my formative years at Mahtomedi:
Claudine Goodrich for inspiring me to appreciate literature, song, and poetry, and giving my writing a voice. Your belief in me lit a fire inside for everything else that was to follow.
Nancy Breening for making me literate, shaping my use of language, and reminding me the English language has more than 100,000 words—never use the word “got.” It was because of your routine and purposeful feedback that I grew as a writer.
Gordon Donnelly, whose unyielding expectation for the best in all his students ferried me along on my own journey. Your intolerance of willful ignorance established the framework for me to become a critical thinker, to question, and to hunger for deeper knowledge. Rest in peace, old friend.
John Anderl and Mark Smeltzer of the Communication department at Century College. I still refer back to your lessons about mass communications and interpersonal relationships.
To my wife, Aria, who is also a teacher. Thank you for teaching me something new every day, whether it’s a lesson about love or patience.
And to my mother and father, because no teacher is more responsible for the earliest of life lessons than one’s parents. You always told me that I would understand someday. Now, I do.
Technology Interventions for Media Literacy
False news, propaganda models, and manipulated messages are not new to the twenty-first century. Throughout history, cultures have been bombarded with media messages: they were commonplace in the Roman Empire and the Third Reich, during the McCarthy era, and under the Soviet regime at the height of communism. These messages historically manifested themselves through news media, advertisements, and state-sponsored agendas. Freedom of the press has been a crucial component of democracy in the modern world, helping provide checks and balances on other media and world powers.
However, three fundamental differences exist between media today and that of previous eras: the speed with which news is distributed, the ability to instantly fact-check any statement, and the increased longevity of a news report. Previous generations received publications that were heavily vetted before going to print (Fabry, 2017). Conversely, news today is more about harnessing viral stories first and fact-checking later. Silverman (2015) pointed to a bevy of troubling journalistic practices propelled by social media metrics, asserting, “Incentives favor moving fast and publishing content that is likely to spread” (Rumor and Debunking, para. 23) and that “trust and accuracy are sacrificed for clickability” (Alarming Dissonance, para. 33). Adding to this problem is the lifespan of media and the ability to access it online well beyond its original airdate or publication date. For perspective, 500 years ago, Martin Luther challenged the Catholic Church with his German-language translation of the Bible; even so, it took over a decade for that feat to put knowledge in the hands of commoners. With technology today and the proper media literacy skills, citizens can evaluate media messages at the press of a button. Arguably, most do not take advantage of the modern ease of accessing and evaluating information. Compared with the past, media today is more quickly published and fact-checked; it is accessible to the masses with a greater reach and for far longer than ever before. This ease of obtaining information becomes troubling when critical thinking skills are omitted. Breakstone et al. (2019) suggested, “Education moves slowly. Technology doesn’t. If we don’t act with urgency, our students’ ability to engage in civic life will be the casualty” (p. 3).
Importance of the Topic
Current research showed that high school students were ill-prepared to interact with media (Breakstone et al., 2019; Melro & Pereira, 2019), but recent interventions suggested ways of promoting media literacy to allow students to evaluate media more critically (Hobbs & Frost, 2003; McGrew, 2019; McGrew, 2020; Pérez et al., 2018). This research appeared to be consistent across secondary and post-secondary age levels.
Several problems contributed to students’ media literacy incompetence. Students gravitated to familiar sources and those that were quick and efficient to access, focusing on superficial details such as titles and keywords to determine relevance rather than credibility (List et al., 2016). Teachers have observed students struggle to identify biases in media and to determine reliability across multiple sources in lessons covering media studies. Furthermore, native advertisements and sponsored content have been designed to blend in with traditional news, making it increasingly difficult for the general public to distinguish credible news from clickbait, infotainment, and fake news. Compounding these issues in recent years was the hyperpolarization of news and the ability of consumers to find sources with which they agree, engendering confirmation biases. The drought of Cronkitean journalism and the consolidation of media conglomerates paradoxically enable consumers to hear only voices with which they agree while leaving relatively few options at the level of media ownership, concentrating control of the steady diet of information consumers receive.
Students’ deficit in critical thinking as it relates to media literacy comes at the cost of well-informed members of society, a reality that threatens civic engagement. Because education evolves at a slower pace than technology advances, schools run several years behind the environment in which their students are developing. The school ecosystem needs to adopt best practices for teaching students the analytical and evaluative skills needed for civic engagement.
Research Question
Education plays a vital role in preparing students to access technology and to critically evaluate media. The Essential Question for the Educational Technology graduate program at Concordia University, St. Paul is: “In light of what we know about how children learn and education policy and practice, how shall educators best utilize technology to enhance student achievement?” Of Jones-Jang et al.’s (2021) three solutions (information providers, crowdsourcing, and audience), this paper’s focus specifically targets the domain educators have the most control over: the student audience. It seeks to answer the question: “To what extent does research show technology can be most effectively used in high school media studies programs to facilitate media literacy?” This is especially important for fostering students’ growth in critical thinking, civic engagement, and digital citizenship (Martens & Hobbs, 2015), all of which concern the media students consume and create.
Scope of Research
This paper analyzes recently published scholarly journal articles that involved technology and media literacy for high school media studies classes. Because of its relevance to this age demographic, research pertaining to middle school and undergraduate students was also utilized. Several studies related to the behavior of the general public, which included the targeted high school demographic, were examined because of their relevance to current trends across age demographics, further highlighting the need for media literacy education.
Because of the media literacy deficit, media studies teachers will want to employ strategies that empower students to critically engage with media. The studies covered in this paper relate to how technology can be harnessed to enhance media literacy with a specific emphasis on identification and evaluation of news. In a time when information spreads faster than ever before, students need to build the skills to responsibly source, share, and create credible media messages.
Definition of Terms
It is important to define the terminology used within this literature review to allow for greater comprehension of the aspects described. The following should be used as a guide to understand how this body of research structured these terms.
Civic Engagement
The American Psychological Association defined civic engagement as “individual and collective actions designed to identify and address issues of public concern” (Delli Carpini, n.d.). Civic engagement includes activities like participating in the political process, volunteering, and voicing local issues. Media literacy positively contributes to civic engagement.
Critical Thinking
Scriven and Paul (1987) defined critical thinking as “the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action” (as cited in The Foundation for Critical Thinking, para. 3). Analytical and evaluative skills play a vital role in the critical thinking skills required for media literacy.
Fake News
This term encompasses news that is untrue. Information that fits this definition could present a false context or consist of false, manipulated, misleading, imposter, or fabricated content; another subset could be classified as satire or parody (Melro & Pereira, 2019). These falsehoods and deceptions are either intentional or unintentional products of the publication process.
Lateral Reading
Lateral reading is the ability to read across various sources of information rather than within the context of a single source.
Media Literacy and Other Forms of Literacy
The term “media literacy” was difficult to define. A meta-analysis by Jeong et al. (2012) highlighted that there was no standardized definition of media literacy despite agreement that the concept involved skills and knowledge related to critical understanding of media and its use; this could be examined through critical, cultural, or social scientific lenses. A review of literature by Ilomäki et al. (2014) revealed 34 different iterations of the concept across 76 studies. Within these, media literacy, digital literacy, and technology literacy fell within different disciplines. Perhaps one of the most confusing elements was the lack of standardized terminology. Term confusion remained a problem because different studies highlighted different, overlapping aspects of each concept.
The most fundamental components of media literacy are the ability to access, analyze, evaluate, and create media with emphasis placed on the role of communication (Hobbs et al., 2013; Lee, 2014). Media literacy could not exist outside of critical thinking: both analysis and evaluation are skills involved in the critical thinking process. Therefore, media literacy cannot be achieved without all four of its parts: accessing, analyzing, evaluating, and creating. Other studies also listed the use of and reflection on media messages as important instructional pieces (Martens & Hobbs, 2015). This paper defined media literacy by these criteria. For all practical purposes, the terms “digital literacy” and “media literacy” were congruent, with only a few nuanced differences. The literature tended to use these terms synonymously, though it must be underscored that digital literacy emphasized more of the technology aspects whereas media literacy usually highlighted the competencies relating to mass media. Both terms referenced the important component of communication. A shared subset of analytical and evaluative skills related to critical thinking was within the scope of each, most prominently with media literacy. Other variations included information literacy, news literacy, and civic online reasoning. Each of these had a specific role in understanding what media contains, how it is constructed, the characteristics of its medium, and the specific context through which it arises.
Media Studies
The term “media studies” was used in this paper to discuss the numerous content areas that share similar outcomes related to communications and media. Some of these include visual communications (video production, graphic design, photojournalism), journalism, and mass communications. These classes predominantly focused on news media and sourcing information or were more generally related to building critical thinking skills required for both media creation and consumption.
Summary
This paper considered the question, “To what extent does research show technology can be used most effectively in high school media studies programs to facilitate media literacy?” Chapter Two contains a literature review of 18 primary studies to explore how teachers can promote critical thinking through technology use. This task is of increasing importance as youth turn more to the Internet to access, analyze, evaluate, create, and share media. The literature review will investigate critical thinking; media literacy and traits surrounding it, such as predicting factors, false news, sourcing, access and operation of information through technology, engaging with and consuming media, and the act of creation; and potential interventions to solve problems. Chapter Three will provide a discussion of the research, limitations of the studies, and possible applications for how technology can be utilized to empower media literacy for media consumption.
Chapter Two: Literature Review
Media literacy was fundamentally tied to civic engagement through three distinct sets of similar tasks: access and use; analysis, evaluation, and reflection; and creation. This literature review was organized to discuss the research for each of these components in the order of progression according to Bloom’s Digital Taxonomy: remembering, understanding, applying, analyzing, evaluating, and creating (Grantham, 2014). The “Access and Use” section was best aligned with the tasks for remembering, understanding, and applying, especially in the context of using digital and technology tools. The “Analyzing and Evaluating” tasks shared their names with the taxonomy’s levels, although the reflection component was absent from the taxonomy; for this purpose and because of its relatedness, reflection was grouped with Bloom’s analyzing and evaluating tasks. Finally, Chapter Two will conclude with creation, the highest order of processes in the taxonomy. It is important to note that Bloom’s Extended Digital Taxonomy included sharing as the highest functional level, akin to publishing what one has created (Grantham, 2014). For this reason, creating and sharing were organized together in the “Creating and Publishing” section later in this review.
The skills covered in each section also aligned themselves with the International Society for Technology in Education (ISTE) (2016) Standards for Students to create empowered learners, digital citizens, knowledge constructors, innovative designers, computational thinkers, creative communicators, and global collaborators. There was significant overlap between the goals of the ISTE Standards and Bloom’s Digital Taxonomy. Both were important lenses through which media literacy and civic engagement in education could be examined.
Access and Use
Media literacy tasks that required Bloom’s remembering, understanding, and applying processes for accessing and using information and technology were addressed in this section. These lower-ordered processes required the knowledge and application of functional skills to complete digital tasks, such as operation (finding, opening, closing, maximizing, and minimizing programs, navigating file and folder structures, and using the mouse) and usage (saving, deleting, undoing, copying, pasting, and searching) (Lee, 2014). These tasks aligned with similar technology skillsets identified in Bloom’s Digital Taxonomy (Grantham, 2014). One must not only be able to complete tasks but also know what is possible through technology. Additionally, for one to be media literate, one must also possess the skills to parse through and find information (Livingstone et al., 2008, as cited in Jones-Jang et al., 2021). In other words, media literacy’s access and use skills were dependent on both technology literacy and information literacy to complete tasks.
Access skills pertaining to locating credible and relevant information were covered in the next section because of the analytical and evaluative criteria required for users to make judgments about the information being retrieved. It must be stressed that those higher-ordered skills transfer back to enhance the lower-ordered processes of access and use.
Lee’s (2014) empirical study of a digital technological community center (DTCC) in a mid-size Southern city in the United States consisted of 122 participants and was fundamental to understanding the role of access skills in building media literacy. The participants’ ages ranged from 18 to 68, with a large concentration of poor, unskilled, male African American laborers and an average age of 45.9 years. Over 80% self-identified as deficient in computer skills. The study focused almost entirely on the technology literacy domain for operating computer systems and accessing information for civic engagement through a week-long intervention consisting of digital literacy training measured with a pre- and a post-performance test. The training program lasted one hour per day over five consecutive days, covering essential computer tasks for operating the system, creating documents, accessing social media, and searching for information (Lee, 2014). The results showed growth in the participants’ abilities to complete tasks on the computer and access the Internet. Lee (2014) contended that higher socioeconomic status contributed to better media literacy and that gender and educational attainment were not factors in the development of digital literacy skills; age appeared to be the variable most likely to affect digital literacy skills development. Lee (2014) clarified that 48% of participants had their high school diploma while 27% had not graduated from high school, determining that prior education was not related to the acquisition of digital skills. However, age seemed to adversely affect one’s acquisition of these technology-based skills. These findings suggested a strong value in media literacy education to prepare students to be computer literate.
Martens and Hobbs (2015) suggested a strong correlation between media literacy and civic engagement. The empirical study by Breakstone et al. (2019) of high school students’ media literacy reported that one’s current grade level was a significant predictor correlated with stronger civic online reasoning scores on the study’s tasks. The findings were not clear whether grade level could be used as a proxy for age, an indicator of prior academic achievement, or both. However, maternal education levels were used as a proxy for socioeconomic status to predict performance. Breakstone et al. (2019) found that students in lower grade levels had greater deficits in media literacy proficiency than those in higher grade levels, suggesting grade level was a predictor of higher media literacy skills that positively affected outcomes on the assessment by 0.029 points per grade level on a total composite score of two points. Maternal education level contributed 0.014 points per level, or about half the value of a grade level. Compared with rural populations, urban students scored 0.107 points higher and were more than twice as advantaged as suburban students. Compared with White and Asian students, Hispanic students were only slightly disadvantaged, whereas students of two or more races, other races, and Black students (in that order) were progressively more disadvantaged in civic online reasoning. Martens and Hobbs (2015) also linked the importance of reading and comprehension skills to students’ ability to form critical analysis skills. These data supported the need for effective media literacy education of youth and adolescents and for schools to help students overcome potential financial barriers to acquiring and using devices.
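To put these coefficients on a common scale, a brief back-of-the-envelope comparison may help; this is a hedged illustration that treats the reported values as additive effects on the two-point composite, which is an assumption about the study’s underlying model rather than a stated feature of it:

\[
\underbrace{3 \times 0.029}_{\text{three grade levels}} = 0.087 \;<\; \underbrace{0.107}_{\text{urban vs. rural gap}}
\]

Read this way, the reported urban advantage over rural students exceeded the entire expected gain from three additional years of schooling.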
Likewise, research suggested that the effectiveness of news media literacy messages was dependent on students’ previous exposure to media literacy education. Vraga and Tully (2016) studied the effects of a short media literacy public service announcement (PSA). The study measured media literacy between students enrolled in an undergraduate media course and those enrolled in a non-media course across three universities. Students enrolled in non-media courses were sampled from George Mason University on the East Coast, while media-related courses were sampled from the University of Iowa and the University of Wisconsin–Madison in the Midwest. This quantitative experiment was measured through a student response survey. Both groups of students were shown a PSA on the role of the press and critical consumption prior to watching a political talk show that was carefully engineered for the experiment (Vraga & Tully, 2016). Watching the PSA raised critical evaluation scores for students enrolled in a media course; no effect was found for non-media students.
The study noted the potential for self-selection biases in course enrollments, which could skew the data in favor of those with a predisposition towards media literacy. It also outlined the need for greater controls. Differences existed in political demographics: media studies students in the Midwest leaned strongly Republican, and almost half of East Coast participants were non-White and born outside of the United States (Vraga & Tully, 2016). These data stressed the need for additional studies with more evenly distributed population controls.
The findings appeared to indicate that effects of media literacy messages needed to be connected to media literacy education. Positive effects were largely related to prior media literacy exposure. This fact further supports Martens and Hobbs’ (2015) conclusion that media literacy education is a necessity in earlier education.
Watson and Pecchioni’s (2011) case study of a university-level health communications course in Louisiana spanned three years; qualitative data were collected through focus groups and interviews. Students were required to create 10- to 15-minute documentaries in groups on a health topic of their choosing. There were 24 groups in total over the three years of the study, which took place over the course of a 16-week semester each year. A significant feature of this case study was the difference between the first two years and the third year. Whereas in previous years technical training was optional, in the final year, students were required to complete a Lynda.com training on Final Cut Pro and produce a short video for practice, both outside of class. While the training added about four hours of work to students’ schedules, 94% of students successfully completed it (Watson & Pecchioni, 2011). Student interviews suggested the trainings lessened frustrations with technology and familiarized students with the technical tools, from the camera to the editing software, required to complete the documentary. This furthered the case for teaching technology skills for access, as it provided students with a greater understanding of the tools and techniques required. It also allowed greater creativity during the production phase because students knew what they were capable of creating with the camcorder and software provided (Watson & Pecchioni, 2011). While these technical components were tertiary to the focus of health communications, the trainings adequately provided the media literacy skills for access.
While Watson and Pecchioni’s (2011) study emphasized the role of technology literacy, the study by Hobbs et al. (2013) supported this notion while incorporating other important facets. Their study had a diverse sample population of 85 video production students from Thurston High School in suburban Detroit, Michigan. Both studies emphasized the need for pre-production training. In Hobbs et al. (2013), students self-reported feeling more competent in nontechnical skills, such as gathering information and conducting interviews, than in the technical tasks of video production, such as editing and fixing video and making professional news videos. The same study also underscored the importance of information literacy. Hobbs et al. (2013) suggested, “Teachers might consider providing more explicit instruction in pre-production to help students learn information gathering skills, practice of broadcast journalism, and nature of evaluating informational claims through comparison/contrast” (p. 243).
Access skill acquisition in these cases could have been negatively affected by self-selection biases. Students in video production groups tended to specialize in different roles throughout the production process, making it inevitable that not all students would acquire the same skills (Hobbs et al., 2013). Furthermore, the authors noted that uneven participation levels could be attributed to counseling decisions to place low-performing and at-risk students into these types of courses. “Many of today’s students see no connection between the classroom and the culture, and these attitudes have a negative impact on motivation” (Bachman et al., 2008, as cited in Hobbs et al., 2013, p. 231). Schools need to improve participation by making studies relevant to their student populations. Findings suggested that high levels of in-class participation in production activities correlated with media literacy, though this was possibly related to the self-selection biases mentioned earlier. These findings might also suggest that increased motivation, or student engagement, could improve media literacy acquisition, although it is unclear whether the motivation was a product of assignment to the video production course or whether students were predisposed to engagement because of prior interest in the subject (Hobbs et al., 2013).
Insufficient reading, listening, and viewing comprehension may have contributed to the lack of information literacy skills. Martens and Hobbs (2015) also suggested that media literacy levels could be adversely impacted by lower comprehension levels, thus affecting students’ access. However, media literacy skills for media analysis were also shown to increase comprehension in a landmark quantitative randomized-assignment study of a yearlong 11th grade English media and communications class (Hobbs & Frost, 2003). It appeared that interventions for comprehension and media literacy could build on each other with a stacking effect.
In summary, analysis and evaluation skills were somewhat dependent on one’s ability to use technology to access information. It also appeared important to learn skills associated with information seeking. In a case study of Stony Brook University’s news literacy program, built on the review of hundreds of documents and 28 interviews, Fleming (2014) asserted: “Accessing news in news literacy meant developing the ability to identify it” (p. 152). The next section will look closer at the analytical and evaluative skills necessary to identify and examine news. These critical thinking skills took on a special context in the fight against fake news as users consume and share media online. Especially among adolescents, social media rose to prominence as the platform for obtaining information first, as demonstrated in Portugal (Melro & Pereira, 2019) and India (Negi, 2018). Access and use alone could not satisfy the requirements of media literacy. To this effect, critical thinking skills were of increasing importance to a holistic concept of media literacy and are covered at length in the next section.
Engaging With and Consuming Media: Analysis, Evaluation, and Reflection
Analysis, evaluation, and reflection skills closely aligned with critical thinking skills and were shown to be prominent components of media literacy education. The research suggested that socioeconomic status was a defining factor in predicting media literacy skills (Breakstone et al., 2019) and implied that access contributed to stronger digital literacy skills (Lee, 2014). Based on these findings, one’s media literacy levels for analysis and evaluation could be impaired or advanced by levels of access. These skills were not found to be entirely dependent on access to technology, however, because the abilities to analyze and evaluate information were properties shared with critical thinking skills, which were not found to be technology dependent.
A research study from Stanford University provided reason for alarm in education. Breakstone et al. (2019) uncovered several concerning results regarding a deficit in media literacy. This empirical study contained qualitative elements. Its analytic sample contained 3,446 high school students who were representative of the demographics from urban, suburban, and rural populations in the Northeast, Midwest, South, and West regions of the United States. This sample was further reduced to 3,402 students because of technical problems. A Qualtrics survey was administered during regular social studies periods and captured students’ abilities to evaluate video evidence, compare webpages, evaluate articles, explain claims on social media, and analyze a website’s homepage. These tasks focused on assessing how well high school students could discern who was behind information, uncover what other sources said, and cite evidence for their positions (Breakstone et al., 2019).
Six tasks were evaluated on a rubric from zero to two points (Beginning: 0; Emerging: 1; Mastery: 2) and were scored by two independent raters from an external consulting firm. The scores from these six tasks were then composited into one score ranging from zero to two. Fifty-two percent of participants believed a misleading video; only three students out of more than 3,000 participants determined the source of the video when explaining their reasoning. Two-thirds of high school students struggled to discern the difference between news and sponsored content, known as native advertising. On four of the six tasks, 90% of students received no credit. Most troubling, more than 96% of student participants struggled to identify potential causes of bias and credibility when analyzing a website (Breakstone et al., 2019). Of these students, only 13 received a “Mastery” score on their final composite of the six tasks, and only 10 received an “Emerging” score between 1.50 and 1.99. A larger group of 88 students scored between 1.00 and 1.49. However, more than 58% received no credit on any of the tasks, and 38.7% received some credit but fell below the threshold to earn a composite score of 1.00 (Breakstone et al., 2019). These unsettling statistics emphasized the importance of developing strong critical thinking skills through media literacy education, whether as a media creator or a media consumer.
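As a minimal illustrative sketch, and not the researchers’ actual scoring procedure, the compositing and banding described above could be modeled as follows; the mean-based composite and the exact band cut points are assumptions, since the study reports score ranges but not its aggregation rule:

```python
# Hypothetical model of the scoring described above. The mean-based
# composite and the band cut points are assumptions for illustration;
# only the 0-2 rubric and the reported score ranges come from the study.
def composite_score(task_scores):
    """Average six rubric scores (0 = Beginning, 1 = Emerging, 2 = Mastery)."""
    assert len(task_scores) == 6 and all(0 <= s <= 2 for s in task_scores)
    return sum(task_scores) / len(task_scores)

def composite_band(score):
    """Map a 0-2 composite onto the ranges reported in the study."""
    if score == 2.0:
        return "Mastery"        # only 13 of 3,402 students
    if score >= 1.5:
        return "Emerging"       # 1.50-1.99; only 10 students
    if score >= 1.0:
        return "1.00-1.49"      # 88 students
    return "Below 1.00"         # roughly 97% of the sample

# Example: mastering two tasks but receiving no credit on the other four
# yields a composite of 4/6, or about 0.67, well below the 1.00 threshold.
print(composite_band(composite_score([2, 2, 0, 0, 0, 0])))  # Below 1.00
```

Framed this way, even a student who fully mastered two of the six tasks would still fall in the lowest reported band, which underscores how demanding the composite threshold was.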
List et al. (2016) executed a relatively small exploratory study of 31 undergraduate students using mixed-methods research and statistical analysis. Participants were social science majors from the mid-Atlantic, mostly female, with an average age of 22. List et al. (2016) aimed to understand students’ source selections, grouped into two categories: epistemic (concerned with quality) or nonepistemic (concerned with relevance). Students were restricted to using a pre-created source library for consistency. The results showed that students were primarily concerned with matters of search relevance rather than indicators of credibility. Students frequently opted to use sites that were familiar to them. The ease of use and the ability to quickly find answers seemed to take precedence. Surface-level reasons for source selection were cited most frequently, especially for closed-ended questions.
The findings from List et al. (2016) at the undergraduate level corroborated the study by Coiro et al. (2015) at the middle school level. Coiro et al. (2015) included a qualitative analysis based on empirical data from the seventh-grade demographic. A random sample of 773 students from a multi-year research project was taken from 42 school districts in two states. Two 16-item assessments were assigned over two days, and some tasks required students to search the open Internet. Student responses captured their rationales for choosing sources of information and whether they believed the websites were credible. Although students were largely able to identify a website’s author, they struggled progressively in three other domains: evaluating an author’s point of view, the author’s expertise, and the website’s overall reliability (Coiro et al., 2015). Both List et al. (2016) and Coiro et al. (2015) agreed that students, whether at the middle school or undergraduate level, were more concerned with content relevance than credibility when searching for information. Again, these studies highlighted a deficit in the critical thinking skills needed for students to make the jump from accessing information to analyzing, evaluating, and reflecting upon it.
A study of first-year undergraduates at two Portuguese universities concurred with the need to develop students’ critical thinking skills. The study used a mixed-methods, explanatory model based on quantitative data from 562 questionnaires completed by students with different majors, 332 of whom were female and 230 of whom were male (Melro & Pereira, 2019). This sample was further channeled into a focus group of 45 participants to understand students’ perceptions of fake news and of critical thinking (Melro & Pereira, 2019). The authors revealed that two-thirds of students could not identify an article as fake news, especially when the story was promoted by someone well-known. Students rarely questioned information unless it concerned a topic about which they were passionate. Furthermore, Melro and Pereira (2019) agreed with the interpretation of List et al. (2016) that students placed factors of relevance at a higher priority than factors of credibility, especially when the information aligned with one’s perspective. While students of other majors struggled in the study by Melro and Pereira (2019), those in communication sciences performed best on the assessment, which exemplified the need for media literacy education.
Kahne and Bowyer (2017) discovered through a quantitative survey of people ages 15 to 27 that a majority of youth (58%) agreed with false statements that matched their ideological perspectives. Of further concern, those with greater political knowledge were more likely to seek information confirming their biases than to be motivated to seek accurate information (Kahne & Bowyer, 2017). Findings indicated that media literacy, although not the only strategy, was more effective than education on political knowledge in helping students question information. Results suggested that media literacy and critical thinking skills should take a more prominent place in earlier education (Kahne & Bowyer, 2017).
Tully and Vraga (2018) agreed with this perspective, albeit providing more optimistic conclusions for stronger partisans. The researchers sampled 946 undergraduate students from the Public Speaking course and the Interpersonal and Group Communications course at George Mason University to measure news media literacy through pre-test and post-test assessments. Their study noted that stronger partisans showed greater gains in news media literacy in the areas of messages and meaning and in their own self-perceived media literacy (Tully & Vraga, 2018). The findings agreed with Hobbs et al. (2013) that individual beliefs about media literacy correlated with growth in news media literacy irrespective of political affiliation. However, Tully and Vraga (2018) revealed that partisan preference had an impact on the growth of media literacy: Democrats showed growth correlated with the value they placed on news media literacy, whereas Republicans saw less value in news media literacy and made fewer gains comparatively. Perhaps of greatest importance, Tully and Vraga (2018) suggested that gravitation toward and appreciation of cognitive tasks for critical thinking were strong predictors of news media literacy.
It should be noted that the empirical study of the general public by Jones-Jang et al. (2021) conflicted with previous studies in its findings. The results concluded that information literacy was more significant than media literacy in combatting fake news. However, this finding could have been a by-product of study limitations attributed to the way skills were partitioned for each literacy. Most of the research concurred that media literacy and its analytical and evaluative domains were fundamental for students to engage with the news. This could be best summarized by the work of Martens and Hobbs (2015), which discovered a positive correlation between media literacy and news analysis skills. Their study compared tracked populations within a school that had an integrated media literacy program with selective admissions (one of two integrated programs), an open-admissions media literacy track (one of five tracks), and students enrolled in non-media literacy tracks. Results strongly indicated that media literacy supported information-seeking behaviors, critical thinking, and knowledge of how media works, supporting the idea that media literacy positively affects civic engagement behaviors (Martens & Hobbs, 2015).
Some studies discussed interventions for building student critical thinking skills related to media literacy. These interventions addressed the massive deficits in media literacy to improve students’ ability to analyze, evaluate, and reflect on media decisions.
A different study associated with Stanford University recognized this need and proposed a short intervention to increase students’ civic reasoning. McGrew et al. (2019) examined four classes with two control groups and two treatment groups to determine the effectiveness of a small-scale intervention for civic online reasoning, which the authors defined as “the ability to effectively search for, evaluate, and verify social and political information online” (p. 486). The tasks were scored with rubrics by multiple scorers to achieve consistency. The study’s population was a smaller sample of 67 undergraduate students from a West Coast university in a critical thinking and writing course. The sample population was highly diverse, composed of over 80% minority students, and half of the student body were first-generation college students. As Breakstone et al. (2019) suggested, both factors would put these students at a severe disadvantage for media literacy.
The findings by McGrew et al. (2019) indicated that short interventions could help increase students’ abilities to critically evaluate information found online by using heuristics. The interventions led students to evaluate sources of information by asking the questions: “What is the evidence? Who is behind the information? What do other sources say?” (McGrew et al., 2019, p. 486). Students, historians, and fact-checkers were observed to determine differences in behavioral approaches to digital resources. The intervention ultimately hinged on employing lateral reading, like fact-checkers, rather than the vertical reading of sources more common among both students and historians (McGrew et al., 2019). This meant the goal was to migrate users away from looking for internal indicators of credibility on the same webpage. Instead, users needed to move off-site to see what other sources said and to seek out potential biases and problems behind the reliability of the information.
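One way to see the logic of the heuristic is as a checklist that fails a source whenever a question cannot be answered from independent, off-site reading. The sketch below is purely hypothetical; the data structure and function are illustrative assumptions and not part of McGrew et al.’s published materials:

```python
# Hypothetical encoding of the three-question heuristic; the data
# structure and function below are illustrative assumptions, not the
# published study materials.
QUESTIONS = (
    "What is the evidence?",
    "Who is behind the information?",
    "What do other sources say?",
)

def lateral_check(notes):
    """Return the questions still unanswered after off-site reading.

    `notes` maps each question to what the reader found in *other*
    sources (lateral reading); a missing or empty entry means the
    reader relied only on the page itself (vertical reading).
    """
    return [question for question in QUESTIONS if not notes.get(question)]

# Example: the evidence was verified, but ownership and corroboration
# were not, so two questions still require lateral follow-up.
print(lateral_check({"What is the evidence?": "primary study linked"}))
```

The point of the sketch is that the heuristic treats credibility as incomplete until every question has been answered from outside the page, mirroring the fact-checkers’ behavior the intervention sought to teach.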
McGrew et al. (2019) also noted moderate but statistically significant gains in the treatment group’s article- and evidence-evaluation skills. Importantly, the intervention occupied only two class periods out of a 15-week semester course; thus, short interventions could be employed to create measurable improvements in media literacy. However, the authors were unsure whether their three-question intervention would transfer to different contexts and sustain its effects over longer periods of time (McGrew et al., 2019).
In a subsequent quantitative study, McGrew (2020) tested the same three-question intervention with 68 11th graders in a history class through a comparison of pre-test and post-test results. Among this younger population, the three-question intervention demonstrated significant improvements for students in the treatment group in lateral reading (an increase from 9% to 32%) and in analyzing evidence (from 14% and 36% to 73% and 89% on two respective tasks). Even more, 90% of students attempted to investigate who was behind the information they read (McGrew, 2020). Although the study showed student growth in the investigation portion of the tasks, a gap remained between the intention to investigate and actual evaluation skills. One limitation of this study was that students described their own decisions; software to track students’ online behavior as they evaluated sources would have provided more insight. McGrew (2020) recommended that students receive more support in evaluation tasks and further practice with open searches on the Internet.
The study by Coiro et al. (2015) unearthed some key findings about critical skills in middle school students and offered recommendations. First, their literature review noted that no clear standards of uniformity existed for how information was organized online, which could make it confusing for students to navigate (Britt & Gabrys, 2001; Rouet, Ros, Goumi, Macedo-Rouet, & Dinet, 2011, as cited in Coiro et al., 2015). This was supported by Rouet (2006), who posited, “multiple source use is a particularly cognitively demanding task for students” (as cited in List et al., 2016, p. 47). Coiro et al. (2015) further noted that these problems were not isolated to the middle school demographic; older students struggled to go beyond content relevance as well. The authors proposed five solutions for educators:
- Encourage students to consider information about authors.
- Clarify how to elaborate on areas of author expertise.
- Scaffold inferences about the consequences of an author’s point of view.
- Model strategies for dealing with conflicting information.
- Demonstrate the value of using multiple indicators of reliability. (Coiro et al., 2015, pp. 293-295)
These suggestions aligned neatly with the studies by McGrew (2020) and Pérez et al. (2018). Both studies cited the SEEK acronym (Source, Evidence, Explain, Knowledge) intervention by Wiley et al. (2009, as cited in McGrew, 2020; Pérez et al., 2018), which sought to invest students in sourcing information, evaluating provided evidence, explaining the context of the evidence, and assimilating new information with prior knowledge.
The study by Pérez et al. (2018) sampled 137 ninth grade students from four public French secondary schools. Two classes were studied in each school to provide a control and a treatment group. The study was prepared in advance through a pilot study of 12 students aged 13 to 16 who did not participate in the final study. Concurring with the work of McGrew (2020), the study by Pérez et al. (2018) tested an intervention to help students identify an author’s position, motivation, and media quality. By having students examine sources for an author’s credentials and possible conflicts of interest, treatment groups more routinely referred to source information when determining the material’s reliability. The process for examining sources included strategies for understanding pre-publication criteria, such as quality control and the validation of information. This skill became particularly important when considering quality control measures for different types of documents and authors’ motivations to publish them (Pérez et al., 2018). The intervention had lasting, long-term impacts on students’ ability to choose the more reliable source of information between two texts and to justify their choices. However, students still struggled to reference source material when drawing conclusions from what was read (Pérez et al., 2018). Perhaps this was a limitation of the study’s design, which focused mostly on rating tasks instead of writing about how source information was utilized (Pérez et al., 2018).
In addition to the work of Martens and Hobbs (2015), two studies demonstrated a more comprehensive plan for teaching analysis and evaluation instead of focusing on shorter interventions. These studies examined a yearlong media literacy program in an English department’s 11th grade media and communications class (Hobbs & Frost, 2003) and a news literacy program at Stony Brook University spanning one semester (Fleming, 2014).
The landmark study by Hobbs and Frost (2003) demonstrated that media literacy curriculum could be as rigorous as traditional academic coursework. The authors specified that their findings were specific to secondary language arts classes. The results of the study suggested that media literacy interventions could increase reading comprehension and analytic skills across formats and improve the identification of omitted information and construction techniques in media. Interventions could also aid in the ability to identify main ideas, summarize, and understand information from readings, listening exercises, and watching media. Students were also better equipped to make connections to the economic functions of news media and wrote longer paragraphs than the control group. Those who received media literacy instruction throughout the year interpreted textual evidence more critically than those without (Hobbs & Frost, 2003).
Because this was the first large-scale study of media literacy interventions, numerous limitations were mentioned. Like other studies reviewed, self-selection bias was a concern. The study also measured different populations with far less data about practices for the control group. Their study was a springboard for further research and indicated positive results for embedding media literacy into language arts courses (Hobbs & Frost, 2003). Evidence for these findings was further expanded by Vraga and Tully (2016), who suggested that media literacy interventions were effective when paired with media-focused courses. This concept was further supported by Martens and Hobbs’ (2015) research, which accentuated the value of situating media literacy interventions within the context of media studies coursework rather than leaving them to stand alone in other disciplines; these interventions showed positive results across cohesive social studies, English, and media production courses in an interdisciplinary academy program (Martens & Hobbs, 2015).
The journalism program at Stony Brook University provided a unique look at how industry professionals have attempted to solve some of these problems related to news literacy. Fleming’s (2014) case study examined feedback from administrators, lecturers, news fellows, and students about a news literacy course and its content. One of the limitations of the study and the program at Stony Brook University was the dual intention to create critical thinkers while also instilling a strong appreciation for journalism. The program could have been overly optimistic in its perception of journalists, tending towards a more idealistic view of those in the news media rather than assuming a more critical role (Fleming, 2014).
One weakness of Fleming’s study was that it did not deliver enough qualitative feedback on the actual performance of the program. The abstract noted that the study would cover the experiences of majors and non-majors but offered little comparison between the two groups’ perceptions of growth in media literacy. Student and faculty feedback appeared selectively chosen to favor the program at Stony Brook. Fleming admitted that the study could not “be generalized to make any larger claims about other news literacy programs” (Fleming, 2014, p. 159). While the author criticized the perspective that the program was a blind attempt to praise journalism, she agreed that the program did little to enhance students’ abilities to recognize and analyze bias in media ownership. The author also suggested that further quantitative studies needed to follow her work, noting that the program at Stony Brook University was not grounded in any scholarly research.
Even so, Fleming’s case study brought to light several frameworks for fostering critical thinking, which were created by the university’s dean, Howard Schneider. These frameworks followed the domains for media literacy (accessing, analyzing, and evaluating), although the domain of creation was displaced by “appreciation.” Fleming (2014) recommended that Stony Brook University’s journalism program be contextualized in the same way as music appreciation courses because the creation element was absent. One of these valuable frameworks was the Deconstruction Guide, which enabled students to discern fact-based journalism from emotional narratives (Fleming, 2014). This guide was similar in content to, and more extensive than, the three-question intervention presented by McGrew et al. (2019) and the aforementioned SEEK intervention (McGrew, 2020; Pérez et al., 2018). The Deconstruction Guide aimed to identify the reliability of sources, evaluate evidence and fairness, and examine key facts related to story elements (Fleming, 2014).
Another framework was the Taxonomy of Information Neighborhoods to help students develop the ability to identify goals, methods, practitioners, and outcomes of the various types of media messages ranging from entertainment to propaganda and raw information (Fleming, 2014). Schneider also established a set of ten key skills and concepts to support students in distinguishing news media and its structures.
Potter (2010, as cited in Fleming, 2014) agreed that media literacy was necessary for advancing critical thinking skills and proposed four points:
- Media can harm individuals.
- The purpose of media literacy is to teach people how to guard against being harmed by media.
- Media literacy skills and abilities must be cultivated.
- Media literacy is multidimensional, meaning people are influenced by media cognitively, attitudinally, emotionally, physiologically, and behaviorally. (p. 148)
Thus, a crucial aspect of the analytical and evaluative steps is to critically approach texts and interact with them through various lateral processes to enable a search for truth, validity, and credibility. It is important to discuss the overlap between Fleming’s (2014) case study and the approach McGrew et al. (2019) took to enhance critical thinking skills. In the latter of these studies, a stronger emphasis was placed on determining the ownership of media messages than in the former study. Both studies contained frameworks to build media literacy by having students question the content and narratives of media messages.
Hobbs and Frost (2003) determined that students in the treatment group were more aware of media construction techniques than the control group after the year-long media studies course with media literacy instruction. The researchers used a quasi-experimental, non-equivalent group design because all students in the treatment group received fully-integrated media literacy education through their school’s 11th grade English language arts curriculum (Hobbs & Frost, 2003). In contrast, the control group’s curriculum focused on world literature, and these students in a separate school were assessed by portfolio. Teachers in Hobbs and Frost’s (2003) treatment group adopted five framing questions to help guide student media analysis skills:
Who is sending this message and what is the author’s purpose? What techniques are used to attract and hold attention? What lifestyles, values, and points of view are represented in this message? How might different people interpret this message differently? What is omitted from this message? (p. 336)
Additionally, treatment group students received instruction in four key areas: “(a) advertising, persuasion, and propaganda; (b) the analysis and construction of news and nonfiction; (c) approaches to storytelling in dramatic fiction; and (d) the representation of gender, race, and ideology in media messages” (York & Aubry, 1999 as cited in Hobbs & Frost, 2003, p. 336). When compared with the control group, students who received fully-integrated media literacy instruction in their English language arts classes demonstrated significantly higher empirical scores in media literacy, as determined through a univariate analysis of reading, listening, and viewing comprehension, writing skills, media construction techniques, identifying purpose, and comparison-contrast (Hobbs & Frost, 2003).
Overall, students in each of the studies seemed to struggle with the critical thinking skills needed to choose sources of information and to analyze and evaluate reliability factors. However, short interventions demonstrated both short- and long-term benefits with varying degrees of success (McGrew et al., 2019; McGrew, 2020; Pérez et al., 2018; Vraga & Tully, 2016). Studies showed strong correlations between long-term media literacy programs and students’ analysis and evaluation abilities, confirming that media literacy is inextricably entwined with critical thinking (Fleming, 2014; Hobbs & Frost, 2003; Watson & Pecchioni, 2011). These effects also seemed to be tied to other positive outcomes when paired with specific disciplines, such as improving comprehension, strengthening the revision of work, deepening the understanding of concepts, and building technology-related skills (Lee, 2014; Martens & Hobbs, 2015; Watson & Pecchioni, 2011). These studies demonstrated the advantages of holistic media literacy programs. The application of these research studies will be further explored in Chapter Three.
Creating and Publishing
According to Bloom’s Digital Taxonomy (Grantham, 2014), creating and sharing were the highest-level cognitive processes. Lee’s (2014) work exemplified basic creation tasks related to digital literacy, such as creating folders and resumes. Many of the tasks taught through the week-long intervention fell within the domain of building technical skills that could allow users to access information and technology. Although the tasks were technically classified as “creation,” one must view them through the lens of the intervention’s relevance to the middle-aged adult population. These creation skills were important for providing access and opportunities, although the tasks in Lee’s (2014) study might not have necessarily translated into the higher-ordered processes classified by Bloom’s Digital Taxonomy. However, the exceptional deficits within the targeted demographic in Lee’s (2014) study stressed the importance of not skipping over those rudimentary technological tasks. Without those simple creation tasks, the base knowledge would not exist for users to create through more challenging, higher-ordered processes in later education.
Watson and Pecchioni’s (2011) case study drew attention to the positive effects of using project-based learning in a health communications course. Students who successfully completed their documentary projects received a well-rounded experience in creating and consuming media. Proficiency was attained through hands-on experience with multimedia software and the examination of professional documentaries. By critically examining professional examples, students were better able to see the structuring of content and the stylistic voice that camera angles gave to productions. Through creating their own documentaries, students replicated these processes. Because of the totality of their experiences, university students practiced industry-related skills to fully understand the concept of their own documentary and the processes needed to script, film, and revise their work for targeted audiences (Watson & Pecchioni, 2011). The steps for revision were crucial for students; revision took place through peer and instructor feedback during the first two years of the study and through screenings and critiques of rough cuts during the final year.
Watson and Pecchioni (2011) established that technology’s role in their health communications course was as a tool for creation. Teaching students how to access and operate the software was not the main outcome of the documentary assignment; rather, the focus was to enhance students’ critical thinking skills. Their study demonstrated deeper-level connections with the material through students’ analysis and evaluation of their topic’s subject matter. Evidence suggested that creation was the final step for demonstrating access, analysis, and evaluation skills and was essential to further developing critical thinking skills. When students were required to develop a documentary about a health topic of their choosing, findings qualitatively suggested deeper levels of analysis and evaluation than in the oral presentations and written papers that had been used previously (Watson & Pecchioni, 2011). The documentaries were graded through instructor-scored rubrics for content, organization, and production elements to assess engagement with course content rather than technical skills (Watson & Pecchioni, 2011). More specific descriptions of the grading criteria could not be ascertained. Watson and Pecchioni (2011) concluded:
The documentaries were much stronger than they had been in the past, and this seemed to be primarily due to them learning more about the technical aspects of the software early on so they could focus on the creative elements and having earlier deadlines that resulted in them having more time to edit and refine in the last two weeks. (p. 316)
Findings could suggest that project-based learning can strongly foster critical thinking and media literacy. This seemed especially true when technology and technical media skills were applied through authentic processes.
One obstacle that needed to be overcome was how students specialized in their roles throughout the production stages of Hobbs et al.’s (2013) video production course. Individual students gained expertise in specific task-related roles, such as filming, editing, and on-screen work. Despite the authentic nature of these specializations, individual participation levels varied in comparison to the whole group; for example, about one-third of students did not participate during in-class pre-production tasks (Hobbs et al., 2013). The variability of individual experiences “could negatively impact the ability to measure program effectiveness” (Hobbs et al., 2013, p. 244). The authors noted the self-selection bias inherent in media production courses and room for further research to answer whether the class itself facilitated individual growth or whether students’ intrinsic motivation was responsible (Hobbs et al., 2013; Vraga & Tully, 2016). Tully and Vraga (2018) further expanded upon this question, suggesting that those with higher levels of enjoyment of critical thinking were more likely to develop higher levels of news media literacy than those who did not enjoy critical thinking. These findings, however, did not fully resolve the discrepancy between media literacy acquisition and production skills acquisition.
The literature suggested that students attained higher levels of critical thinking skills, greater mastery over access and operational processes, and deeper understanding of material when creation tasks employed strategies to use technology as a communication platform, such as producing a documentary in a health communications class (Watson & Pecchioni, 2011) and creating a news story in a video production class (Hobbs et al., 2013). Furthermore, qualitative evidence through interviews and quantitative evidence from assessment scores suggested stronger connections to course materials were attained through the early use of technology training and framework interventions (Watson & Pecchioni, 2011).
Through comprehensive media literacy education inclusive of both the creation and consumption of media, students developed greater civic reasoning to discern bias, credibility, and ownership in news media. For example, McGrew et al. (2019) noted statistically significant differences between pre-test and post-test performance, with students in the treatment group 2.15 times more likely to score higher on the post-test than on the pre-test.
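For context, figures of this kind are typically reported as odds ratios. The following is a minimal worked illustration, assuming McGrew et al.’s (2019) 2.15 statistic follows that convention; the percentages below are hypothetical and are not taken from the study:

\[ \mathrm{OR} \;=\; \frac{p_{\text{post}}/(1 - p_{\text{post}})}{p_{\text{pre}}/(1 - p_{\text{pre}})} \;=\; 2.15 \]

Under this reading, if 40% of pre-test responses earned high scores (odds of 0.40/0.60 ≈ 0.67), the post-test odds would be 0.67 × 2.15 ≈ 1.43, corresponding to roughly 59% of responses earning high scores.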
Summary
Research suggested three areas of focus for media literacy: access and use, analysis and evaluation, and creation. Access and use was best demonstrated through studies that intervened to address perceived technology illiteracies or that built upon previous technology training (Hobbs et al., 2013; Lee, 2014; Watson & Pecchioni, 2011). These interventions sometimes bridged the gap between the technical skills needed for access and the critical thinking processes more routinely associated with the analysis and evaluation of media, such as identifying credible news (McGrew et al., 2019; Pérez et al., 2018). Concurring with Bloom’s Digital Taxonomy (Grantham, 2014), studies attuned to the full gamut of media literacy showed impressive indicators of proficiency, perhaps suggesting that each individual domain scaffolded acquisition of the others. This postulation will be further explored in this paper’s conclusion. The studies addressed in this paper illustrated the need to teach technology literacy for basic and more specialized tasks that model the media industry’s workflow. The studies also showed the need for students to be trained as critical thinkers when sourcing information and before engaging on social media.
Chapter Three: Discussion
Research shows that effective interventions for fostering media literacy focus on three key domains: access and use, analysis and evaluation, and creation. This section explores how the research supports using technology to facilitate critical thinking.
Insights and Application
Numerous insights from the research can be called upon in high school media studies classrooms to facilitate media literacy among students. These suggestions appear to work best when skills for technology access have been taught first as a foundation (Lee, 2014). Although critical thinking is independent of technology use, when considering the various dimensions of media literacy, it is crucial for students to understand what is possible with technology and how to accomplish specific tasks.
Research shows that students can become critical examiners of news media through courses with elements of project-based learning. In Watson and Pecchioni’s (2011) case study, students in the study’s third year made the largest observable gains because of rigorous and authentic processes that mimicked the media industry. Through the creation of documentaries, students learned processes that increased their access to technology through software training. Furthermore, students improved their analytic and evaluative skills through critical examination of professional documentaries and through instructor and peer feedback on their own documentaries, which also required an understanding of how to tailor productions for a target audience (Watson & Pecchioni, 2011). The task of creating a documentary required students to synthesize their knowledge for specific purposes, building upon all domains of media literacy. Therefore, one could recommend that robust media literacy training requires extended durations for practice and understanding, such as semester-long or year-long programs with production-style elements (Fleming, 2014; Hobbs & Frost, 2003; Hobbs et al., 2013; Martens & Hobbs, 2015; Watson & Pecchioni, 2011).
Research also appears to show a strong correlation between interdisciplinary classes and media literacy, demonstrating positive effects for classes outside of the media studies focus when planned in collaboration with media studies programs (Martens & Hobbs, 2015). Schools would be wise to incorporate media literacy throughout all subject areas as appropriate, from basic technology access skills to critical thinking exercises and opportunities to create and publish digital artifacts to share with broader communities, as encouraged by the ISTE standards (International Society for Technology in Education, 2016). Integrating media literacy holistically will require a highly planned and coordinated approach by teachers and administrators and is of the utmost importance at a time when students across America are deficient in media literacy.
However, the difficulty schools face in restructuring entire curricula should be recognized. Instead, research also shows lasting benefits from smaller interventions that can be implemented more quickly (Fleming, 2014; McGrew et al., 2019; McGrew, 2020; Pérez et al., 2018; Vraga & Tully, 2016). Problems with media literacy were largely consistent among middle school, high school, and undergraduate students, and the benefits of the studied interventions were consistent between high school and undergraduate populations, specifically targeting the growth of analysis and evaluation skills.
Several studies investigated short interventions that were easy to implement within existing curricula. Fleming (2014) discussed the Deconstruction Guide and the Taxonomy of Information Neighborhoods used at Stony Brook University to improve undergraduate students’ access, analysis, and evaluation of news media. Two studies involving McGrew demonstrated the effectiveness of a three-question intervention that encouraged students to uncover evidence, determine who is behind the information, and compare websites with other sources (McGrew et al., 2019; McGrew, 2020). Both were consistent with the work of Pérez et al. (2018), which sought to teach students to investigate an author’s position and motivation and the quality of the publication. Vraga and Tully (2016) demonstrated the effectiveness of a short media literacy PSA specifically in media studies classes, whereas other short heuristic interventions appeared to show applicability to classes outside of media studies (McGrew, 2020).
The recommendation of this review is to begin media literacy education early in elementary school to build key skills students will use throughout their education and, eventually, in their professional lives. Familiarity with computerized tasks narrows the gap in access to information and in readiness for creation tasks (Lee, 2014). Heuristics that teach students to investigate source dimensions should be ingrained into students’ critical thinking development. Additionally, programs that include tasks for creation appear to produce the deepest synthesis when access and evaluation are planned into instructional time, especially when students are required to practice using those skills. Along with practice, teachers must also be aware of students’ tendencies to specialize in production-related roles. Although there are real-world benefits to specialization, teachers must carefully plan methods to ensure all students receive sufficient pre-production training and essential analytical and evaluative skills (Hobbs et al., 2013).
Limitations and Future Research
The studies reviewed in this paper shared several key limitations. The first concerns the definition and skillset of media literacy. More clearly established definitions are needed for the various types of literacies, with explicit criteria to measure each one empirically. This limitation was noted by Ilomäki et al. (2016), whose review of 76 studies identified 34 different terms. The terminology remained a point of confusion across the studies reviewed here: Jones-Jang et al. (2021) highlighted information literacy, while others emphasized civic online reasoning (McGrew et al., 2019; McGrew, 2020), media literacy (Hobbs & Frost, 2003; Hobbs et al., 2013; Martens & Hobbs, 2015), and news media literacy (Tully & Vraga, 2018; Vraga & Tully, 2016). Each type of literacy overlaps with the others. For best results, media literacy should be formally defined, and the individual skills belonging to each type of literacy should be tabulated. Both recommendations would help disentangle the actual skills used to critically examine media that is created or consumed and would allow effect sizes to be quantified for each skill and extrapolated to the individual literacies.
Toward this goal, further meta-analyses and research studies of media literacy skills should be conducted. These could render a quantifiable skills list that illustrates the overlap among the numerous literacies. Future studies should determine the impact of interventions on each skill’s acquisition rather than on the literacy as a whole. It is necessary to understand more completely which skills are the most interrelated in their development. Furthermore, it is important to establish a scope and sequence for skill acquisition from early elementary school through high school.
Another limitation of the literature stems from its focus on scanning information. List et al. (2016) examined the step between selecting and scanning information to determine epistemic or nonepistemic reasons for source selections. The study examined reliability- and relevance-related criteria for selecting sources of information. However, drawing conclusions about source selections based on searching alone, without scanning or processing information, is methodologically limiting. Future studies should measure students’ reasoning about source selection during the scanning and processing stages to determine whether those steps also correlate positively with epistemic rationales. Likewise, although the web interfaces in the study by List et al. (2016) were purposely designed to control for outside factors, a more qualitative approach using the open Internet might yield more authentic results, which could then be explored through individual interviews or focus groups.
Several studies relied on relatively small populations (Fleming, 2014; Hobbs et al., 2013; Lee, 2014; List et al., 2016; McGrew et al., 2019; McGrew, 2020; Watson & Pecchioni, 2011). Larger studies are needed to draw broader conclusions about media literacy across various demographics. This appears to hold especially true regarding student motivation, as Hobbs et al. (2013) questioned whether students became motivated because of the media studies course or whether already-motivated students excelled because of a self-selection bias. It would be of interest to replicate the study by Hobbs et al. (2013) with large-scale random assignment to a video production course to determine whether media literacy skills were acquired through interest and whether motivation was a product of self-selection or arose naturally within the class.
Furthermore, larger studies failed to control for political biases or to investigate the degree to which potentially biased questions or targeted demographics affected measured outcomes (Breakstone et al., 2019; Vraga & Tully, 2016). As Kahne and Bowyer (2017) suggested, political bias can adversely affect one’s ability to identify fake news. However, Tully and Vraga (2018) later determined that political partisans experienced the most growth in news media literacy on two of five tasks. Considering this conflicting information surrounding partisan politics and media literacy, further studies should explore the relationship between the two. Similarly, it would be of interest to discover the relationship between instructors’ political affiliations and any impact on students’ media literacy acquisition, also accounting for effects based on students’ partisan preferences. To control for discrepancies between self-described political affiliation and actual policy preferences, a survey could be used to rate participants’ conservative or liberal leanings.
Curiosity is another facet that should be investigated. It is well established in the field that curiosity is strongly correlated with critical thinking; this dimension was explored in Hobbs et al. (2013) and as “need for cognition” in Tully and Vraga (2018, p. 169). It is also established that critical thinking is a component of media literacy (Hobbs & Frost, 2003; Vraga & Tully, 2016). If curiosity drives critical thinking, and critical thinking underpins media literacy, then curiosity should likewise be closely connected to media literacy. Future studies should measure the correlation between curiosity and a predisposition toward media literacy.
A few other limitations warrant consideration. Past studies have relied on fact-checkers as arbiters of truth, which becomes troubling if fact-checkers are assumed to be immune to pressures of corruption. This warning was issued by award-winning investigative reporter Matt Taibbi (2021), who underscored the recent politicization of the fact-checking business. Future studies need to consider possible agendas behind the fact-checked information they use, because such information could skew empirical data on media literacy along partisan lines.
Studies should also measure the effect size of media literacy interventions by discipline. Although the research reviewed suggests that media literacy is best paired with relevant media studies classes, further evidence is needed on outcomes in media-related but separately siloed subject areas. Another potential case study would extrapolate the findings of the academy model outlined by Martens and Hobbs (2015) to discover further applications for interdisciplinary media literacy courses. Likewise, further research could compare media literacy acquisition in traditional school programs versus extracurricular programs to note the benefits and drawbacks of each.
Conclusion
Media literacy needs to be interwoven into all areas of school culture, but most especially within media studies courses, where it will likely have the greatest impact. The research suggested that media literacy is best achieved when technological skills are taught to allow for access, analytical and evaluative skills are scaffolded to train critical thinking, and creation is relied upon to place students in authentic roles that fully synthesize their learning. Creation must draw upon access, analysis, and evaluation for interventions to be most effective. Although each part is possible separately, combining these skills is necessary to achieve the greatest student impact. To this end, students must critically assume the roles of both consumers and producers of media to obtain the highest form of media literacy.
References
Breakstone, J., Smith, M., Wineburg, S., Rapaport, A., Carle, J., Garland, M., & Saavedra, A. (2019). Students’ civic online reasoning: A national portrait. Stanford History Education Group & Gibson Consulting. https://purl.stanford.edu/gf151tb4868
Coiro, J., Coscarelli, C., Maykel, C., & Forzani, E. (2015). Investigating criteria that seventh graders use to evaluate the quality of online information. Journal of Adolescent & Adult Literacy, 59(3), 287–297. https://doi.org/10.1002/jaal.448
Delli Carpini, M. (n.d.). Civic engagement. American Psychological Association. https://www.apa.org/education/undergrad/civic-engagement
Fabry, M. (2017, August 24). The history of fact checking jobs in news journalism. Time. https://time.com/4858683/fact-checking-history/
Fleming, J. (2014). Media literacy, news literacy, or news appreciation? A case study of the news literacy program at Stony Brook University. Journalism & Mass Communication Educator, 69(2), 146–165. https://doi.org/10.1177/1077695813517885
Grantham, N. (2014, August 18). Bloom’s ‘Digital’ Taxonomy – printable reference table. Fractus Learning. https://web.archive.org/web/20141102002423/https://www.fractuslearning.com/2014/08/18/blooms-digital-taxonomy-overview/
Hobbs, R., & Frost, R. (2003). Measuring the acquisition of media-literacy skills. Reading Research Quarterly, 38(3), 330–355. https://doi.org/10.1598/RRQ.38.3.2
Hobbs, R., Donnelly, K., Friesem, J., & Moen, M. (2013). Learning to engage: How positive attitudes about the news, media literacy, and video production contribute to adolescent civic engagement. Educational Media International, 50(4), 231–246. https://doi.org/10.1080/09523987.2013.862364
Ilomäki, L., Paavola, S., Lakkala, M., & Kantosalo, A. (2016). Digital competence – an emergent boundary concept for policy and educational research. Education and Information Technologies, 21(3), 655–679. https://doi.org/10.1007/s10639-014-9346-4
International Society for Technology in Education. (2016). ISTE standards for students. https://www.iste.org/standards/for-students
Jeong, S., Cho, H., & Hwang, Y. (2012). Media literacy interventions: A meta-analytic review. Journal of Communication, 62(3), 454–472. https://doi.org/10.1111/j.1460-2466.2012.01643.x
Jones-Jang, S., Mortensen, T., & Liu, J. (2021). Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. American Behavioral Scientist, 65(2), 371–388. https://doi.org/10.1177/0002764219869406
Kahne, J., & Bowyer, B. (2017). Educating for democracy in a partisan age: Confronting the challenges of motivated reasoning and misinformation. American Educational Research Journal, 54(1), 3–34. https://doi.org/10.3102/0002831216679817
Lee, S. (2014). Digital literacy education for the development of digital literacy. International Journal of Digital Literacy and Digital Competence, 5(3), 29–43. https://doi.org/10.4018/ijdldc.2014070103
List, A., Grossnickle, E., & Alexander, P. (2016). Undergraduate students’ justifications for source selection in a digital academic context. Journal of Educational Computing Research, 54(1), 22–61. https://doi.org/10.1177/0735633115606659
Martens, H., & Hobbs, R. (2015). How media literacy supports civic engagement in a digital age. Atlantic Journal of Communication, 23(2), 120–137. https://doi.org/10.1080/15456870.2014.961636
McGrew, S., Smith, M., Breakstone, J., Ortega, T., & Wineburg, S. (2019). Improving university students’ web savvy: An intervention study. British Journal of Educational Psychology, 89(3), 485–500. https://doi.org/10.1111/bjep.12279
McGrew, S. (2020). Learning to evaluate: An intervention in civic online reasoning. Computers and Education, 145, 103711. https://doi.org/10.1016/j.compedu.2019.103711
Melro, A., & Pereira, S. (2019). Fake or not fake? Perceptions of undergraduates on (dis)information and critical thinking. Medijske Studije, 10(19), 46–67. https://doi.org/10.20901/ms.10.19.3
Negi, U. (2018). Fake news and information literacy: A case study of Doon University, Dehradun. International Research: Journal of Library and Information Science, 8(2).
Pérez, A., Potocki, A., Stadtler, M., Macedo-Rouet, M., Paul, J., Salmerón, L., & Rouet, J. (2018). Fostering teenagers’ assessment of information reliability: Effects of a classroom intervention focused on critical source dimensions. Learning and Instruction, 58, 53–64. https://doi.org/10.1016/j.learninstruc.2018.04.006
Scriven, M., & Paul, R. (1987). A statement presented at the 8th Annual International Conference on Critical Thinking and Education Reform, Summer 1987. http://www.criticalthinking.org/pages/defining-critical-thinking/766
Silverman, C. (2015, February 10). Lies, damn lies, and viral content. Columbia Journalism Review. https://www.cjr.org/tow_center_reports/craig_silverman_lies_damn_lies_viral_content.php
Taibbi, M. (2021, May 24). “Fact-checking” takes another beating. TK News on Substack. https://taibbi.substack.com/p/fact-checking-takes-another-beating
The Foundation for Critical Thinking. (n.d.). Defining critical thinking. http://www.criticalthinking.org/pages/defining-critical-thinking/766
Tully, M., & Vraga, E. (2018). Who experiences growth in news media literacy and why does it matter? Examining education, individual differences, and democratic outcomes. Journalism & Mass Communication Educator, 73(2), 167–181. https://doi.org/10.1177/1077695817706572
Vraga, E., & Tully, M. (2016). Effectiveness of a non-classroom news media literacy intervention among different undergraduate populations. Journalism & Mass Communication Educator, 71(4), 440–452. https://doi.org/10.1177/1077695815623399
Watson, J. A., & Pecchioni, L. L. (2011). Digital natives and digital media in the college classroom: Assignment design and impacts on student learning. Educational Media International, 48(4), 307–320. https://doi.org/10.1080/09523987.2011.632278