
1.7: Web Evaluation Skills: A “Bleak” Track Record

    • Walter D. Butler; Aloha Sargent; and Kelsey Smith
    • Pasadena City College, Cabrillo College, and West Hills Community College

    “At present, we worry that democracy is threatened by the ease at which disinformation about civic issues is allowed to spread and flourish” (Wineburg et al., “Evaluating Information”).

    How are we doing when it comes to recognizing disinformation and navigating the information disorder landscape?

    Web Evaluation Skills: Students

    Icon of student with backpack and book

    In 2016, a few months before the U.S. Presidential election, the Stanford History Education Group published an influential study on Web literacy. Their report, titled “Evaluating Information: The Cornerstone of Civic Online Reasoning,” was concerned with the spread of disinformation online and how it might threaten our democracy. The study asked nearly 8,000 students (in middle school, high school, and college) to perform five Web evaluation tasks. The results were shocking:

    • 80% of students couldn’t distinguish “sponsored content” from news articles on websites
    • 67% of students failed to recognize potential bias in online information
    • 65% of students took online images at face value
    • Almost all students struggled to evaluate information on social media

    In 2019, as yet another U.S. Presidential election approached, the Stanford History Education Group conducted a similar study of civic online reasoning, this time with a sample of 3,446 high school students. The results?

    • 52% of students believed a grainy video shot in Russia constituted “strong evidence” of voter fraud (ballot stuffing) in the U.S.
    • Two-thirds of students couldn’t tell the difference between news stories and ads (“Sponsored Content”)
    • 96% of students did not consider how bias impacts website credibility (for example, ties between a climate change website and the fossil fuel industry)

    These studies conclude by describing students’ ability to reason about online information as either “bleak” or “troubling” (Wineburg et al., “Evaluating Information”; Breakstone et al.).

    However, as we will see below, this trouble is not limited to students.

    Web Evaluation Skills: “Experts”

    Icon of three people on laptops

    In 2017, the Stanford History Education Group conducted another study, “Lateral Reading: Reading Less and Learning More When Evaluating Digital Information.” Here, they assessed the Web evaluation skills of presumed experts: Stanford undergraduates, History professors, and professional fact-checkers. The study found that even Stanford students and professors with PhDs in History struggled to identify credible sources on the Web.

    For example, in one task, the participants were presented with two websites that provided information on bullying, and they were given up to ten minutes to determine which was the more reliable site. One of the websites (American Academy of Pediatrics) was from the largest professional organization of pediatricians in the world, while the other site (American College of Pediatricians) had been labeled a hate group because of its virulently anti-gay stance. The result?

    • Only 50% of the historians identified the reliable website
    • Only 20% of the undergrads identified the reliable website
    • 100% of the fact-checkers were able to quickly identify the reliable website

    Sources

    Breakstone, Joel, et al. “Students’ Civic Online Reasoning: A National Portrait.” Stanford Digital Repository, 14 Nov. 2019. Licensed under CC BY-NC-ND 3.0.

    Image: “College Student” by Gan Khoon Lay, adapted by Aloha Sargent, is licensed under CC BY 4.0.

    Image: “Users” by Wilson Joseph, adapted by Aloha Sargent, is licensed under CC BY 4.0.

    Wineburg, Sam, et al. “Evaluating Information: The Cornerstone of Civic Online Reasoning.” Stanford Digital Repository, 22 Nov. 2016. Licensed under CC BY-NC-ND 3.0.

    Wineburg, Sam, and Sarah McGrew. “Lateral Reading: Reading Less and Learning More When Evaluating Digital Information.” Stanford History Education Group Working Paper No. 2017-A1, 6 Oct. 2017, dx.doi.org/10.2139/ssrn.3048994.


    This page titled 1.7: Web Evaluation Skills- A “Bleak” Track Record is shared under a CC BY-NC 4.0 license and was authored, remixed, and/or curated by Walter D. Butler; Aloha Sargent; and Kelsey Smith via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.