
2.4: Takeaway 4: Discussions about algorithms barely, if ever, make it into the classroom


    Although students and faculty expressed deep concern about the ways algorithms shape and influence how we learn about the world, we were surprised to find that the topic rarely came up in the college classroom. Instead, many students, like amateur sleuths, had discovered algorithms through keen observation, noticing that the content they saw was personalized and differed from what their friends were seeing.

    Students’ suspicions were often confirmed in informal discussions with friends or relatives. As one student said, “A lot of times I learn about technology by copying what other people are doing, like my cousin was using a VPN, so I started using one too.”

    When we asked instructors about the courses they teach, and whether they had helped students think about how information is created and encountered on different online platforms, the responses were telling. Of the 37 faculty interviewed, only 10 gave answers that addressed algorithmic platforms, and most of the strategies they mentioned were superficial, such as introducing students to DuckDuckGo as an alternative to Google for searching the web.

    One professor described a case study that used news reports of a past event to demonstrate how news is socially constructed. He encouraged students to look not at whether the claims made are true or false, but at “what social and cultural work those truth claims are doing.” Still, the professor did not explicitly tie the case study to the current information landscape or explore how algorithms could filter meaning.

    Far more commonly, faculty had simply not considered the changing information landscape. These instructors often expressed societal concerns about personalization, mourning “the loss of a common culture” and pointing to deeper epistemological implications, but those concerns did not influence their teaching. As one instructor put it, “The genie is being let out of the bottle and it is a problem we’re stuck with now.”

    The vast majority of faculty members said they saw value in encouraging students to use peer-reviewed research and in instilling critical thinking practices through close reading and textual analysis. In a few cases, faculty said they had scheduled a librarian visit to their courses to provide information literacy lessons.

    One faculty member seemed surprised to realize that discussing algorithms had never occurred to her:

    I’m concerned societally that lots of us are getting sort of one perspective on events, and it isn’t being presented, as you know, the most accurate perspective on events. And that we would all benefit from seeing a wider variety, or having more sort of similarity in the things that we see but I’ve never, that I can think of, specifically talked about the way that that search algorithms or algorithms that are providing news might, might have an influence.

    Others thought algorithmic literacy should be part of a college education — so long as someone else took the lead.

    It probably should be happening in every classroom space, or be sort of introductory things required of all students entering higher ed. Ideally, that would be really useful as an instructor who’s working with juniors and seniors, I would like to not have to teach those skills.

    And yet, when asked, this instructor, like most, said she did not discuss in her own courses how algorithms influence the information environment. An important finding about faculty was their belief that greater attention to generic critical thinking and rhetorical analysis skills early in the four years of college would prepare students to navigate the current environment. Indeed, students at one institution pointed to a required course on critical thinking as the place where such discussions belonged.

    But many instructors seemed to assume critical thinking skills taught traditionally were sufficient. “With critical thinking,” one instructor said, “you have to teach them how to think for themselves and how to pose certain questions to themselves about the universe and their place in it and using scientific inquiry, no matter what it is that you’re tackling, no matter what your research project.” Since critical thinking is part of every discipline, one instructor argued, there was no pressing need for change.

    Some students agreed. In one focus group, students said their institution valued critical approaches, and because this was such a strong institutional value, they were confident they could apply those skills widely (though they had previously expressed concern and a sense of helpless inevitability about the workings of algorithms). Students elsewhere tended to see no particular connection between what they were asked to do for school and the kinds of information practices they needed for everyday life. As one stated, “My instructors usually encourage us to use the databases provided by the school that can let you know that you’re getting academic journals or peer reviewed journals that are solid information.”

    Essentially, students saw algorithmic systems as part of life, but the information they needed for school assignments as having nothing to do with life beyond college. Some scoffed at the outdated advice about the internet provided in their courses: “They talk a lot in school about .org or .edu. But now with YouTube videos, they don’t really have those things.” Because most students believed they knew more than their teachers about algorithmic technologies, they saw no value in addressing those technologies in the college classroom. In one focus group, students agreed that their professors were too clueless about technology to cover it in courses, but they “forgave” their professors because they had other valuable knowledge to share. In another group, a student was dismissive:

    Usually, it’s like a two-day thing about ‘This is how you make sure your sources are credible.’ Well, I heard that in high school, you know, and that information is just kind of outdated for the caliber that the internet is today. I mean it’s just not the same as what it used to be.

    Despite being alarmed about trends in technology, students found personal experience to be a better teacher than anything that might happen in a college classroom. Many expressed a preference for learning from peers. Yet, students were intrigued by the implications of algorithms in matters of ethics, public policy, and social inequality, and many linked their college experiences to issues of algorithmic justice. One student, who had learned in the focus group about personalization efforts by learning management systems (LMS) companies, saw these changes looming large:

    Without knowledge, there’s no way you can create change, so it’s like, you can’t fix climate change, if you don’t understand where it comes from, or you don’t know what exists, which I think is a big problem, and now I think especially of Canvas, I had no idea that LMS was tracking my information for other uses. We’re not seeing these things here, so people don’t care about them, because they’re not seeing it in the media, they’re not seeing it in their schools, or wherever we get our information from, it therefore leaves private industries to take over without anyone else knowing.

    This understanding of the interplay between individual and societal impacts was evident across the focus groups. Students often expressed confidence in their own ability to work with or around algorithms, having witnessed their rise as a self-described “pivot generation,” but they were worried about the wider implications for others (see sidebar, “The pivot generation”).

    When the age of algorithms is in the classroom

    Though few faculty members had ready answers for our questions about the classroom, there were some noteworthy exceptions. One social science professor described how he tied social theory to the ways students present themselves online.

    They very quickly see how the whole ecosystem is designed to convey not a realistic version of themselves, but an idealized one that allows them to perform a certain kind of identity. And then that links to social movements, and current events, and how we are framing current events through these platforms for consumption, but also to display our own selves as part of a group or another.

    Though the topic was particularly relevant to his discipline, this instructor went on to argue that students are “eager to engage in that kind of inquiry, it’s crucial for any informed citizen of the world, and every student should be learning this.” Another faculty member frequently demonstrated how tracking can be revealed on websites, opening discussion of how large corporations collect and use information: “I share that with every class that I teach.”

    Similarly, an instructor asked students to think about how wearable technology like Fitbit made their personal health data valuable in a variety of unforeseen ways, saying, “perhaps the most profound issue of our time is understanding how science and technology in society interact with one another, and it clearly interests students at all grade levels. We’re crazy if we don’t address it.”

    Other faculty members raised the topic more circumspectly. Courses that taught quantitative reasoning provided an opening. As one instructor said, “I can’t have a data source, and not talk about biases and heuristics.” Students also drew connections between algorithms and critical approaches to data: learning about confirmation bias and statistical modeling, and questioning the sources of data, contributed to their ability to understand the ways algorithms shape what they see, even when algorithms were not addressed directly in class. The well-being aspects of technology also provided openings:

    We talk about mental health and the effect of living online, and in that sense it kind of comes up. We don’t get into specifics about different platforms, but I’d gladly have that conversation with them.

    Though only a minority of faculty members had found ways to incorporate algorithmic literacy into their courses, several thought it was important for their institution and speculated that the solution was to develop new interdisciplinary courses. One faculty member suggested that the best way to teach about the intersection of technology and society was to bring humanities professors into the discussion, since they are accustomed to reasoning through ethical questions. Another faculty member, who teaches an information literacy course, argued that the power of algorithms should be addressed repeatedly throughout the curriculum as a crucial part of understanding the role of information in society; a single “vaccination” approach might be counterproductive.

    Students often don’t think about how their cell phones track so much of their lives and how the internet tracks so much of their lives in turn, and what this could mean for them as individuals and for society as a whole. So just by bringing that back into the forefront of their mind, they’re often very surprised. When it’s just brought up once, maybe twice, it’s something that you can easily push aside, because it’s a scary thought. And if you only think about it once and then you get kind of freaked out, you maybe just want to never think about it again. So it needs to be reinforced over and over again.

    All in all, though, faculty in our interviews were divided. Nearly all expressed great concern about the effect of algorithms on our information environment, yet only a few embraced the challenge of incorporating discussion of algorithms into their courses. Others were hopeful about adding something new to the curriculum, but the majority believed that their current emphasis on critical thinking, and on encouraging students to use peer-reviewed rather than internet-based sources, remained applicable and sufficient. As one faculty member pointed out, however, there’s no guarantee those lessons have lasting value for students: “Are they going to learn the lesson so well that they will take it with them when they leave and apply it to everything they see for the rest of their lives? That is the challenge.”

    The “pivot generation”


    Throughout the focus group sessions, students expressed concerns about the ability of people older than themselves to navigate systems designed for algorithmic attention and persuasion. As one student said, “My grandfather says ‘Oh, man, Facebook is so addicting.’ I’m like, ‘Yeah, because it’s designed to keep you there, like heroin.’” Another student offered an incisive reflection, “Everyone was so focused on making sure that kids learned that they forgot they also needed to teach grandparents.”

    Students were even more worried about the effect of technology on younger people growing up with tablets and phones, especially related to privacy and wellness.* As one student explained:

    Now that kids are like learning with iPads and all this new technology from the first day of school, it’s important to make them aware of all the things that are going on behind the scenes, like how they’re personalizing and using all the data from it. Our generation is kind of different, because we’ve learned how to do it right when it came out, and so we are more aware of what’s going on. But if you grew up completely with all this technology and all that, then you would just have no idea about potentially some of the negative effects that it could have.

    Faculty also differentiated the news awareness of this group of students from that of others they had taught in recent years. As one noted, “They all know about it already, there’s a sense that they have a set of shared knowledge, because they are very up to the minute. They are aware of a difference in terms of how quickly they get information, and they’re aware that it spreads differently through their generation than it does through my generation.” A student echoed this perspective, but worried that these changes came with a cost:

    When I listen to younger kids, like kids that are in middle school and high school, they’re talking about climate change, and bias in the media and corruption in politics. And I’m, like, all I cared about when I was 13 was whether my mom would let me get like a bigger pen; what I wanted was moon boots.

    While there are many valid criticisms of categorizing people by birth cohort,† students in this study characterized themselves in generational terms as “the pivot generation.” This self-characterization is itself unlike anything in prior PIL studies with college students: this sample of students identified as members of a distinct group who came of age at a pivotal moment in the history of technology. Or, as one student put it:

    Since we were raised at least for a period of time without this omnipresent influence of social media, we had more of a choice to join the world of social media than the next generation. And I think that we have a lot more perspective on it than the other generations will have.

    * Some research suggests young children are fairly sophisticated about testing the limits of technology, though many questions remain. See Tanya Basu (6 December 2019), “Why kids don’t trust Alexa,” MIT Technology Review, www.technologyreview.com/s/6...nt-trust-alexa; Judith H. Danovich (2019), “Growing up with Google: How children’s understanding and use of internet-based devices relates to cognitive development,” Human Behavior and Emerging Technologies 2, 81-90, https://doi.org/10.1002/hbe2.142

    † See, for example, Mark Bullen, Tannis Morgan, and Adnan Qayyum (2011), “Digital learners in higher education: Generation is not the issue,” Canadian Journal of Learning and Technology 37(1), https://www.learntechlib.org/p/42755/; Ellen Johanna Helsper and Rebecca Eynon (2010), “Digital natives: Where is the evidence?” British Educational Research Journal 36(3), 503-520, https://doi.org/10.1080/01411920902989227
