3.2: Recommendations

    These recommendations are provided for key stakeholders — educators, librarians, administrators, and journalists — involved in promoting truth and knowledge in the “post-truth” era. They are grounded in findings from this study, in lessons learned from a decade of PIL research, and in discussions with a cross-disciplinary panel of experts convened at the Harvard Graduate School of Education in November 2019.78 Above all, they are aimed at enhancing algorithmic literacy.

    For the most part, they make students partners in algorithmic education in order to promote widespread awareness of algorithms and of strategies that may preserve human agency and autonomy. Striking a balance between the idealistic and the practical, they build on what teachers, librarians, and journalists already do to advance public understanding of, and engagement with, information in a fast-changing world.

    Recommendation 1: Use peer-to-peer learning to nurture personal agency and advance campus-wide learning.

    Students in our focus groups almost always identified their peers as knowledge sources about algorithms. This self-described “pivot generation” felt they were better positioned to question and counter personalized information flows than their older family members and instructors. They learned how systems worked through observation and comparing notes and workarounds with their friends and younger relatives.

    In other cases, they connected this personal experience to broader classroom discussions of social justice and politics, bringing a sophisticated critical lens to bear on complex issues related to the flow of information. In stark contrast, the faculty we interviewed for this study were concerned and curious about how algorithmic systems are influencing public life, but they rarely felt personally prepared to broach these issues in their classrooms.

    Together, these findings point to the potential of “students as partners” initiatives to raise awareness of algorithms on campus. The emerging literature on developing partnerships between students and faculty, staff, and administrators provides models for practice that actively and authentically involve students in curriculum and service design, and in research and teaching.79 Campus teaching and learning centers often have programs already in place to support these partnerships.

    In this scenario, educators’ disciplinary understanding of information and society could be combined with students’ knowledge and creative practices to develop a curriculum that meets the needs of students as citizens, not just as scholars. Rather than advocating algorithmic abstinence, as many of the faculty we talked to might choose to do, such a partnership could lead students to examine how algorithms affect their lives. It could lead them to consider when and how they might respond, from employing defensive tactics to pushing for social or legislative change.

    At the next level, co-learning positions students as peer teachers. Peer-to-peer teaching is well established in college classrooms, including its use to foster digital literacy.80 Formalizing the everyday learning habits students already have, while leveraging shared experiences, concerns, and language, may be an effective way to increase algorithmic literacy on campus.

    This teaching role could be further supported and extended beyond the classroom with students providing assistance for learners, including faculty and staff, as they develop digital and algorithmic literacy. To highlight students’ ownership of their knowledge, a skill-share session could be organized on campus with stakeholders from student affairs, academic departments, IT, and the library in which students lead conversations about the social implications of data-driven decision systems and provide hands-on training on tools and strategies, such as surveillance self-defense.

    As institutions examine their own use of personal data, particularly in learning analytics, there is also a role for students to play in designing for transparency,81 and such efforts could open campus-wide discussions about the intersection of algorithms and agency. At the very least they might spark investigations by student journalists.82

    The concern students showed for the naiveté of those with less awareness of the media environment could inform peer learning outreach beyond the institution to the wider community. As communication studies and political science departments develop deliberative democracy programs,83 students could lead discussions about the social and political ramifications of algorithms and how they influence our understanding of current events, discussions that are especially relevant during a heated political season. (This peer learning model could also be folded into school and adult learning partnerships, as sketched out in Recommendation 2.)

    Reconceiving students as partners and incorporating co-learning into the curriculum will require some thoughtful realignment of roles, while calling for vulnerability and trust.84 Teachers need to be willing to ask and welcome questions they cannot answer. Students must take responsibility for developing and sharing their knowledge and listening to one another.85 Both parties must be willing to set aside the reassuring familiarity of hierarchies of power and information and, instead, encourage curiosity and tolerate ambiguity, even if the result of such learning is difficult to predict and entails risk for both students and the faculty who must, ultimately, assign grades.

    Finally, we cannot leave it solely to students to be change agents on campus; librarians and educators need to address their own knowledge gaps. Instructors need to develop a greater understanding of how algorithms affect their own teaching and research and shape the lives of their students. Librarians could take the lead on campus, forming communities of interest among faculty, identifying campus experts, sponsoring an “algo book club,” and using instructional partnerships to help instructors integrate algorithmic literacy into their courses. To build their own knowledge, librarians could form a journal club to share readings, add a “skill share” or “what’s new” technology component to regular staff meetings, and strategize how the library can support learning about the age of algorithms through programming, services, interdepartmental initiatives, and the library’s instruction program.86

    Recommendation 2: The K-20 student learning experience must be interdisciplinary, holistic, and integrated.

    Students in this study described their exposure to information literacy and critical thinking from elementary school through college as scattered, inadequate, and disconnected. Critical thinking practices that focus on closely analyzing texts can be valuable, but they must be accompanied by a nimbler set of evaluative strategies for sorting through new information on the fly to cope with the volume of choices we face in a world saturated with information.87 News and information are no longer something we seek; they seek us, through a variety of channels that clamor for attention.

    As algorithms continue to have a pervasive presence in our lives, more must be done to make information literacy instruction coherent and holistic throughout K-20. This is especially true in light of concerns we heard from students about the targeting of children by algorithm-driven platforms promoting commercial products. Students need a greater understanding of how news and information work in, and on, society as they flow through, and are shaped by, algorithm-driven and market-influenced intermediaries. Moreover, we should explore how the domains of reading, writing, and quantitative skills connect to efforts across the lives of students to introduce them to media, news, digital, and information literacy. This integrative work will require the formation of alliances at the local and national levels.

    At a tactical level, local efforts could start with finding stakeholders on college campuses who have already built bridges to the local schools and into the community. These may be teacher education faculty, coordinators of programs that pair college students with elementary school students, community outreach programs, and student participants. Working through existing connections, educators, students, librarians from public, school, and academic sectors, and representatives from local news organizations could be invited to an open-ended discussion and workshop. Such a gathering could audit what students are learning, map connections across the learning experience, identify gaps, and seek ways to continue working together.

    Rather than focusing on inventing new curricula, these conversations could engage participants with questions such as: What are students at different levels actually learning? How can that learning be improved and scaffolded so there is greater cohesiveness from K-12 to college and beyond? What will it take to give learners of all ages a better grasp of the hallmarks of trustworthy information, and how can they learn about the ethical standards undergirding journalism, scholarship, and science? What do they need to know about how news content is produced and disseminated to develop discernment and, ultimately, where warranted, trust in the news? What do they need to know about how algorithm-driven systems work and affect the information they engage with? Considering the impact of algorithms on information systems and our daily lives, what else needs to be folded into existing instruction, and how can educators and the broader public get up to speed?

    More strategically, local leaders such as a school superintendent, a college president, or a dean of students could kick-start conversations by providing a meeting space and lending their imprimatur to the effort. At the same time, national organizations could work on bridging the different cultures and needs of K-12, college, and community educators, developing a shared understanding of what it means to be information literate today. For example, the American Library Association could facilitate a meeting of interested parties from the American Association of School Librarians, the Association of College and Research Libraries, and the Public Library Association to collaborate on a vision for integrating information literacy efforts across the lifespan.

    Another configuration would bring together representatives from professional organizations, such as the Society of Professional Journalists, the Association for Education in Journalism and Mass Communication, and the International Society for Technology in Education, and leading scholars in the field of Science and Technology Studies. Policy experts at the Knight Foundation, the Pew Research Center, and others in the nonprofit sector have a key role to play in seeking common ground and developing a blueprint for educating the public about the impact of algorithms on news and information. Library-focused organizations could meet with those in other domains to build connections. The idea would be to better coordinate and update information literacy and related programs and to promote public understanding.

    These national efforts would be costly, but not as costly as ignorance. Ideally, funding and sponsorship could be sought to design a sustainable cross-disciplinary curriculum toolkit that collects and curates exemplary learning materials for learners at all levels,88 and to launch a newsletter updating stakeholders on what’s new at the intersection of information, technology, and society.

    Recommendation 3: News outlets must expand algorithm coverage, while being transparent about their own practices.

    While participants in this study could see the personal effects of algorithms in the news and advertisements served up to them, it takes the contextualization of solid reporting to demonstrate that these are part of larger patterns with social consequences.89 Reporting that includes practical tactics — defensive practices like those we heard about from students — could help counter the narrative of helpless resignation we heard in our student focus groups and faculty interviews.90 New tools make it easier for journalists to track the trackers and investigate the algorithms.91 “Open Sourced,” an initiative from Vox, is a promising development, looking beyond hype and hysteria to focus on “explaining the risks and benefits when it comes to AI and digital privacy so you can make informed decisions.”92
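
    To make “tracking the trackers” concrete, here is a minimal sketch in Python of one such check: listing the third-party hosts a news page loads scripts, images, and iframes from. The URL is a placeholder, and a static parse like this misses resources injected by JavaScript, which the purpose-built tools cited above are designed to capture.

```python
# A rough sketch of "tracking the trackers": list the third-party hosts a news
# page pulls scripts, images, and iframes from. The URL below is a placeholder;
# purpose-built audit tools also catch resources loaded dynamically by JavaScript.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class ResourceCollector(HTMLParser):
    """Collect src URLs from script, img, and iframe tags."""

    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "img", "iframe"):
            for name, value in attrs:
                if name == "src" and value:
                    self.resources.append(value)


def third_party_hosts(page_url):
    html = urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
    parser = ResourceCollector()
    parser.feed(html)
    first_party = urlparse(page_url).hostname
    hosts = {urlparse(urljoin(page_url, src)).hostname for src in parser.resources}
    return sorted(h for h in hosts if h and h != first_party)


if __name__ == "__main__":
    for host in third_party_hosts("https://example.com/"):  # placeholder page
        print(host)
```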

    Greater public awareness may shape what companies do and what policies governments enact. There is evidence of deep concern about algorithmic personalization, particularly when it comes to news,93 and that backlash is having an effect on data-gathering efforts and corporate strategy.94 Public pressure may lead to improved access to information about proprietary algorithms, which would facilitate more third-party monitoring by experts and journalists; so far, the internet giants have shown little interest in facilitating this.95

    As the news industry has come under greater scrutiny, journalists have been called upon to help people distinguish objective news coverage from misinformation and outright lies. These demands, rooted in journalism’s duty to democracy, have grown as algorithms permeate daily life. Journalists must use their platforms to shed light on how algorithms work and demystify them in clear language. This requires developing deeper expertise of their own and working with a wider array of academic and technical experts to deepen their investigations.96

    Too often, journalism plays into dangerous anthropomorphization, granting the systems more agency and power than they actually have. We need reporting that examines both good and bad uses of algorithms, weighing their benefits and harms, and that tells the stories of their impact on individuals within the context of wider society.97 This need is real: in our focus groups and faculty interviews, it became clear that many people, no matter how educated, did not understand the way algorithms shape the flow of information.98

    As editors and journalists at professional news outlets weigh the benefits of integrating algorithms into their business and reporting models, they have a responsibility to be transparent and ethical about their own practices, too. Algorithm-based tools have also become integral to the production of news, from story generation to source identification.99 As journalists learn to use these resources, they need to understand the limitations and ethical implications of relying on automated filtering and decision making.100

    While audiences for news may be increasingly aware that algorithms shape what they see, they may not know that similar sets of filters determine what alerts a journalist to a story, which sources reporters contact, or which angles they develop.101 This confirms our findings about the general lack of knowledge our students — and others — have about the way news is produced. Media organizations need to be transparent about how they’re using these tools to create their content.

    Greater transparency, however, is even more urgently required around the use of algorithms by news outlets to target news and advertising.102 Students and faculty in this study expressed concern about the quality of news available to them. Many stated that algorithm-driven personalization increased their distrust of news and raised additional issues, particularly the potential for news silos. Some media organizations are responding to this concern by making their personalization policies more openly available,103 undertaking deeper studies to ensure a responsible approach,104 and in some cases, moving away from data sharing with other platforms.105 Still, on most news sites, transparency information is difficult to find, hard to read, and incomplete.106 To regain the trust of their audiences, media organizations need to be much clearer about what information they collect, how they use it, and with whom they share it.
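
    As a rough illustration of the “hard to read” problem, the sketch below estimates the reading grade level of a privacy-policy excerpt with the standard Flesch-Kincaid grade formula. The syllable counter is a crude heuristic, and the sample policy text is invented rather than quoted from any outlet.

```python
# A rough sketch of the "hard to read" claim: estimate the reading grade level
# of a privacy-policy excerpt with the Flesch-Kincaid grade formula. The
# syllable counter is a crude heuristic and the sample text is invented.
import re


def count_syllables(word):
    # Approximate syllables as runs of vowels; crude but serviceable here.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def fk_grade(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n_words = max(1, len(words))
    # Flesch-Kincaid grade: 0.39 (words/sentence) + 11.8 (syllables/word) - 15.59
    return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59


policy_excerpt = (
    "We may share the information we collect about you with affiliates, "
    "service providers, and advertising partners for personalization, "
    "analytics, and measurement purposes as described in this policy."
)
print(f"Estimated reading grade level: {fk_grade(policy_excerpt):.1f}")
```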

    Recommendation 4: Learning about algorithmic justice supports education for democracy.

    Despite their aura of sophisticated cynicism, students in our focus groups often became energized when discussing the impact of algorithms on equality, status, inclusion, and opportunities. As one student noted, “It worries me if it’s systemically allowing certain groups to succeed over others.” Another observed, “our moral compass seems to be broken online.” Comments like these present a rich opportunity for engaged learning and civic participation. Though students in our study expressed helplessness in the face of powerful corporations, they became motivated to challenge them as they learned more. This fault line between the perception of helplessness and a desire to create change is a productive site of emotional friction that opens opportunities to engage in “education for democracy.”107

    News reports remind us daily that there is work to be done: Cities debate the ethics of facial recognition systems108 and residents question the use of doorbell surveillance to monitor neighborhoods;109 legislators wrestle with regulating data-gathering;110 extremists and pedophiles use popular platforms to groom the vulnerable;111 librarians and educators raise concerns about commercial products that harvest data from students;112 software engineers question the morality of their work.113 Every day new issues surface. It is around pressing topics like these that students can break through a sense of helplessness, be emboldened with personal agency to grapple with complex issues, and feel empowered to take on the challenge of promoting algorithmic justice.

    At a practical level, individual instructors can look out for stories in the news that link their subject matter to issues of algorithmic justice: How does the digital surveillance of children influence child development? What information could help hospitals follow up with patients without introducing bias? How does microtargeting ads for jobs and housing relate to the history of redlining? Librarians who serve as liaisons to academic departments could support these efforts by creating ongoing curated collections of relevant news stories targeted to specific courses and disciplines, strengthening their own algorithmic literacy while broadening the working definition of information literacy on campus.

    By injecting current controversies around the algorithmic systems that influence our lives into their course material, educators can tie their disciplinary knowledge to pressing questions of ethics, fairness, and social justice. As we learned in PIL’s 2018 news study, the classroom is an influential incubator for the discussion of news and the interpretation of current events; almost two-thirds of survey respondents had learned about news from faculty discussions during the previous week.114

    Librarians have developed programs of their own that can be shared across and beyond the campus. Two programs funded by the Institute of Museum and Library Services develop capacity for librarians to take on the challenges of our digital environment. The Library Freedom Project trains librarians to become local experts on privacy practices who can take their knowledge into their communities.115 The Algorithmic Awareness Project is developing a curriculum, syllabi, and software for educating librarians and developing open educational resources.116 Many libraries have stepped up to develop guides and workshops for their communities,117 as well as credit-bearing courses and open educational resources.118 More than ever, librarians are bringing issues of social justice and information systems into their teaching, tying digital ethics to information literacy.119

    Several research programs are developing interesting resources for algorithmic literacy. Researchers at the University of Amsterdam’s Department of Media Studies are developing tools that allow users to compare and reflect on how social media platforms personalize and filter their information, enabling “data activism” — an intervention that could be used to help students understand and conduct research on social media.120 The UnBias Project in the U.K. has created “youth juries” to involve students in weighing concerns about algorithms and proposing solutions.121 Their Fairness Toolkit can be downloaded and adapted in K-20 classrooms to encourage civic learning and action.122 Student-centered activities like these promote a “collective approach to imagining the future as a contrast to the individual atomizing effect that such technologies often cause.” At the same time, a team of researchers at MIT’s Media Lab is working on an “Algorithmic Justice” program to help students understand how to navigate algorithmic systems from the inside out, by designing them from scratch using a series of low-cost “unplugged” activities.123
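
    To make the comparison concrete, here is a minimal sketch, under assumed data, of the kind of reflection such tools support: measuring how much two users’ personalized feeds overlap. The story identifiers and feed lists are invented; the actual tools cited above collect this data from live platforms.

```python
# A minimal sketch of comparing personalized feeds: the Jaccard similarity of
# two users' story lists (1.0 = identical feeds, 0.0 = no overlap). The story
# identifiers below are invented for illustration.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 1.0


user_a_feed = ["story-1", "story-2", "story-3", "story-4"]
user_b_feed = ["story-3", "story-4", "story-5", "story-6"]
print(f"Feed overlap (Jaccard): {jaccard(user_a_feed, user_b_feed):.2f}")  # 0.33
```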

    Instruction in racial literacy124 intersects powerfully with algorithmic literacy. The computer science curriculum must include a component of racial literacy to ensure future coders are cognizant of the ethical considerations they must bear in mind as they design new systems.125 History courses could develop a unit on how early computing efforts, melded with eugenics, contributed to the Holocaust, and link that history to contemporary controversies about the rise of online extremism. One programmatic idea is to pair a scholar of racism with a computer scientist to lead public discussions of how social media platforms and the communities that use them could address online harassment, race-based targeting, and the spread of extremist propaganda.
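
    As one hedged example of the kind of exercise such an ethics component might assign, the sketch below audits a toy decision system for demographic parity, that is, whether two groups are approved at comparable rates. The decisions, group labels, and function names are invented for illustration, not drawn from any curriculum cited here.

```python
# A toy fairness audit of the kind an ethics unit might assign: compare the
# rates at which a decision system approves members of two groups (demographic
# parity). All decisions and group labels below are invented.
def selection_rate(decisions, groups, group):
    outcomes = [d for d, g in zip(decisions, groups) if g == group]
    return sum(outcomes) / len(outcomes)


decisions = [1, 0, 1, 1, 0, 0, 1, 0]               # 1 = approved, 0 = denied
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]  # applicant group labels

rate_a = selection_rate(decisions, groups, "a")
rate_b = selection_rate(decisions, groups, "b")
print(f"group a: {rate_a:.2f}  group b: {rate_b:.2f}  "
      f"parity gap: {abs(rate_a - rate_b):.2f}")   # 0.75 vs 0.25, gap 0.50
```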

    Opportunities to introduce learning about algorithmic justice can be found throughout the curriculum. At the college level, a group of interested faculty from across the disciplines could conduct a curriculum-mapping project to engage students in the social dimensions of the present moment. As students proceed through their education, they could encounter the rich intersections of the humanistic, social, technical, and quantitative aspects of algorithms, connecting their learning across disciplines and their personal lives with broader social trends and issues of social justice.

    We are facing a global epistemological crisis: people no longer know what to believe or on what grounds they can determine what is true. It is imperative that truth-seeking institutions — education and journalism — take the lead in healing the social fractures that technology has widened. The technical infrastructure that channels and shapes so much of our understanding and social interaction was created in the utopian belief that making information universally available and giving every individual a voice would improve our lives. But as that infrastructure became an engine of surveillance and persuasion, trading in the intimate details of our lives to create sophisticated marketing tools for selling consumer goods and ideas, that utopian ideal has become dystopian. The power of machine learning and artificial intelligence has been unleashed without regulation or informed consent.

    It is no wonder both students and faculty in this study felt helpless and anxious about the future. These recommendations show a path forward. As students claim their authority as learners, as algorithmic literacy is woven into education across the curriculum and across students’ life spans, and as journalists give the public tools to understand this epistemological crisis, we will be better prepared to tackle both the unchecked power of algorithms and the social problems they expose and exacerbate. This education for democracy — both formal and beyond — can empower us to reclaim our role in shaping the future.

    References

    78. Note their comments found here
    79. Alison Cook-Sather, Melanie Bahti, and Anita Ntem (2019), Pedagogical partnerships: A how-to guide for faculty, students, and academic developers in higher education, Elon University Center for Engaged Learning, https://www.centerforengagedlearning...-partnerships/
    80. See for example, Julia Terry, Alyson Davies, Catherine Williams, Sarah Tait, and Louise Condon (2019), “Improving the digital literacy competence of nursing and midwifery students: A qualitative study of the experiences of NICE student champions,” Nurse Education in Practice 34, 192-198, DOI: https://doi.org/10.1016/j.nepr.2018.11.016; and the ePioneers program at Oxford Brookes University detailed here https://jiscinfonetcasestudies.pbworks.com/w/page/74349344/Digital%20Literacies%20at%20Oxford%20Brookes%20University
    81. Mollie Dollinger and Jason Lodge (2019), “What learning analytics can learn from students as partners,” Educational Media International 56(3), 1-15, DOI: https://doi.org/10.1080/09523987.2019.1669883; Christine Broughan and Paul Prinsloo (2019), “(Re)centring students in learning analytics: In conversation with Paulo Freire,” Assessment & Evaluation in Higher Education, 1-12, DOI: https://doi.org/10.1080/02602938.2019.1679716
    82. Op. cit. Zak Vescera 2019.
    83. Nancy L. Thomas and J. Kyle Upchurch (2018), “Strengthening democracy by design: Challenges and opportunities,” Journal of Public Deliberation 14(2), https://www.publicdeliberation.net/j...ol14/iss2/art9; Hayley J. Cole (2013), “Teaching, practicing, and performing deliberative democracy in the classroom,” Journal of Public Deliberation 9(2), https://www.publicdeliberation.net/j...ol9/iss2/art10
    84. Kelly E. Matthews, Lucy Mercer-Mapstone, Sam Lucie Dvorakova, Anita Acai, Alison Cook-Sather, Peter Felten, Mick Healey, Ruth L. Healey, and Elizabeth Marquis (2019), “Enhancing outcomes and reducing inhibitors to the engagement of students and staff in learning and teaching partnerships: Implications for academic development,” International Journal for Academic Development 24(3), 246-259, DOI: https://doi.org/10.1080/1360144X.2018.1545233; Abbi Flint (2015), “Students and staff as partners in innovation and change,” The Journal of Educational Innovation, Partnership and Change 1(1), https://journals.studentengagement.o...ticle/view/218
    85. Hayley Burke (2013), “Legitimizing student expertise in student-faculty partnerships,” Teaching and Learning Together in Higher Education 10, http://repository.brynmawr.edu/tlthe/vol1/iss10/6
    86. For examples of a range of peer learning initiatives around technology, see Elizabeth Bennett and Susan Folley (2015), D4 Strategic Project: Developing Staff Digital Literacies. External Scoping Report. University of Huddersfield, eprints.hud.ac.uk/id/eprint/2...ttExternal.pdf
    87. For examples of curricular approaches that emphasize practical evaluation of web sources, see Mike Caulfield’s “Check, Please” lesson plans (http://lessons.checkplease.cc) and the Stanford History Education Group’s Civic Online Reasoning curriculum collections (https://cor.stanford.edu/).
    88. See for example, Brett Gaylor’s 2015 project “Do Not Track,” donottrack-doc.com, an award-winning, interactive, personalized documentary.
    89. See for example, Brandi Geurkink (5 December 2019), “Congratulations YouTube… Now show your work,” Mozilla Foundation, https://foundation.mozilla.org/en/blog/congratulations-youtube-now-show-your-work/
    90. As examples, see Stuart A. Thompson and Charlie Warzel (20 December 2019), “How to track President Trump,” The New York Times, https://www.nytimes.com/interactive/2019/12/20/opinion/location-data-national-security.html; Geoffrey A. Fowler (27 June 2019), “Help desk: How to fight the spies in your Chrome browser,” Washington Post, www.washingtonpost.com/techn...chrome-browser
    91. See for example Twitter Trails, http://twittertrails.com/, the Princeton IoT Inspector, iot-inspector.princeton.edu/, and Tracking Exposed, https://tracking.exposed/
    92. Samantha Oltman and Joss Fong (10 December 2019), “Open Sourced: The hidden consequences of tech revealed,” Vox, www.vox.com/recode/2019/12/10/20991304/open-sourced-the-hidden-consequences-of-tech-revealed
    93. Nash 2019, op. cit.; Smith 2018, op. cit.; Elisa Shearer and Elizabeth Grieco (2 October 2019), “Americans are wary of the role of social media sites in delivering news,” Pew Research Center, www.journalism.org/2019/10/0...ring-the-news/; Jihii 2014, op. cit.
    94. Gartner Group (2 December 2019), “Gartner predicts 80% of marketers will abandon personalization efforts by 2025,” www.gartner.com/en/newsroom/press-releases/2019-12-02-gartner-predicts-80--of-marketers-will-abandon-person; Sara Fischer (19 November 2019), “New York Times dropping most social media trackers,” www.axios.com/new-york-times...241-424a-a398-12344c78ac32.html
    95. Jeremy B. Merrill and Ariana Tobin (28 January 2019), “Facebook moves to block ad transparency tools — including ours,” ProPublica, https://www.propublica.org/article/facebook-blocks-ad-transparency-tools; Craig Silverman (22 August 2019), “Facebook said it would give detailed data to academics. They’re still waiting,” Buzzfeed News, https://www.buzzfeednews.com/article.../slow-facebook
    96. Ren LaForme (15 March 2018), “The next big thing in journalism might be algorithm reporters,” Poynter, https://www.poynter.org/techtools/20...thm-reporters/
    97. Francesco Marconi, Till Daldrup, and Rajiv Pant (14 February 2019), “Acing the algorithm beat, journalism’s next frontier,” NiemanLab, https://www.niemanlab.org/2019/02/ac...next-frontier/
    98. Op. cit. John Wihbey (2019).
    99. Nicholas Diakopoulos (November 2018), “An algorithmic nose for news,” Columbia Journalism Review, www.cjr.org/tow_center/analg...e-for-news.php
    100. Paul Cheung (21 November 2019), “Journalism’s superfood: AI?” https://knightfoundation.org/article...-superfood-ai/; Op. cit. Nicholas Diakopoulos (November 2018); Amy Batt and Jacob Granger (11 November 2019), “Artificial intelligence is not the future - it is happening right now,” www.journalism.co.uk/news/ar...ow/s2/a747107/
    101. Op. cit. John Wihbey (2019), see especially Chapter 6, “Data, Artificial Intelligence, and the News Future.”
    102. Cheung (21 November 2019), op. cit.; Sophia Ignatidou (December 2019), “AI-driven personalization in digital media: Political and societal implications,” www.chathamhouse.org/sites/d...tal%20Media%20final%20WEB.pdf; United States, Cong. Senate, Filter Bubble Transparency Act, S. 2763, Washington: GPO, 2019, www.thune.senate.gov/public/_cache/files/c3a43550-7c36-4f77-b05c-d2275c0d568c/CE3DDB84DDB9284CC6D372833D039A20.filter-bubble-final.pdf; Max Z. Van Drunen, Natali Helberger, and Mariella Bastian (2019), “Know your algorithm: What media organizations need to explain to their users about news personalization,” International Data Privacy Law, DOI: https://doi.org/10.1093/idpl/ipz011
    103. “Personalization” (14 May 2018), The New York Times, https://help.nytimes.com/hc/en-us/ar...ersonalization
    104. The British Broadcasting Corporation (2018), “Responsible machine learning in the public interest: Developing machine learning and data-enabled technology in a responsible way that upholds BBC values,” https://www.bbc.co.uk/rd/projects/re...chine-learning
    105. Fischer 2019, op. cit.
    106. Tim Libert and Reuben Binns (2019), “Good news for people who love bad news: Centralization, privacy, and transparency on US news sites,” WebSci ’19, June 30-July 3, 2019, Boston, MA, https://timlibert.me/pdf/LIBERT_BINN...-GOOD_NEWS.pdf
    107. Philosopher John Dewey wrote about a progressive approach to education over a century ago in Democracy and Education (1916) and other works which continue to influence educators. See for example, Tomas Englund (2000), “Rethinking democracy and education: Towards an education of deliberative citizens,” Journal of Curriculum Studies 32(2), 305-313, DOI: https://doi.org/10.1080/002202700182772
    108. Blake Montgomery (31 July 2019), “Facial recognition bans: Coming soon to a city near you,” The Daily Beast, www.thedailybeast.com/facial-recognition-bans-coming-soon-to-a-city-near-you
    109. Caroline Haskins (3 December 2019), “How Ring went from ‘Shark Tank’ reject to America’s scariest surveillance company,” Vice, https://www.vice.com/en_us/article/zmjp53/how-ring-went-from-shark-tank-reject-to-americas-scariest-surveillance-company
    110. Dominic Rushe (26 November 2019), “Democrats propose sweeping new online privacy laws to rein in tech giants,” The Guardian, https://www.theguardian.com/world/2019/nov/26/democrats-propose-online-privacy-laws
    111. Anonymous (5 May 2019), “What happened after my 13-year-old son joined the alt-right,” Washingtonian, www.washingtonian.com/2019/05/05/what-happened-after-my-13-year-old-son-joined-the-alt-right/; Nellie Bowles and Michael H. Keller (7 December 2019), “Video games and online chats are ‘hunting grounds’ for sexual predators,” The New York Times, https://nyti.ms/2qx6fmn
    112. Kyle M. L. Jones and Barbara Fister (14 October 2019), “Kyle M.L. Jones: The datafied student and the ethics of learning analytics,” [email interview] by Barbara Fister, Project Information Literacy, Smart Talk Interview, no. 32, www.projectinfolit.org/kyle-jones-smart-talk.html; Rebecca Koenig (17 October 2019), “At Educause, a push to monitor student data is met with concerns about privacy and equity,” Edsurge, https://www.edsurge.com/news/2019-10...acy-and-equity; Kyle M.L. Jones (2 July 2019), op. cit.; Data Doubles, https://datadoubles.org/; Audrey Watters, Hack Education, http://hackeducation.com/
    113. Arielle Pardes (22 November 2019), “Google employees protest to ‘fight for the future of tech,’” Wired, www.wired.com/story/googleem...t-retaliation/
    114. Op. cit. Head, et al. 2018, How students engage with news: Five takeaways for educators, journalists, and librarians.
    115. Library Freedom Project, https://libraryfreedom.org/
    116. Jason Clark, Algorithmic Awareness, https://github.com/jasonclark/algorithmic-awareness
    117. For example, a workshop and guide was developed to accompany a public event about mass surveillance (http://libguides.gustavus.edu/mayday)
    118. A textbook created by Barbara Fister, Rachel Flynn, and students in IDS 101 for a course titled Clickbait, bias, and propaganda in information networks is a handbook for understanding and evaluating information in a networked environment that includes student-authored chapters, https://mlpp.pressbooks.pub/informationnetworks/
    119. For example, the Library and Information Technology Association webinar, “Engaging with algorithm bias: How librarians can meet information literacy needs of computer science and engineering students,” 16 December 2019, www.ala.org/lita/engaging-alg...er-science-and
    120. Algorithms Exposed, https://algorithms.exposed/
    121. UnBias Project, unbias.wp.horizon.ac.uk/
    122. Fairness Toolkit, UnBias Project, unbias.wp.horizon.ac.uk/fairness-toolkit/
    123. Michelle Ma (13 May 2019), “The future of everything: How to teach kids about AI,” The Wall Street Journal, www.wsj.com/articles/how-to-teach-kids-about-ai-11557759541
    124. Winona Guo (27 May 2019), “Toward a culture of racial literacy,” Harvard Political Review, harvardpolitics.com/harvard/toward-a-culture/
    125. Op. cit. Jessie Daniels et al. 2019.

    Contributors and Attributions



    This page titled 3.2: Recommendations is shared under a not declared license and was authored, remixed, and/or curated by Alison J. Head, Barbara Fister, & Margy MacMillan.
