Democracy in the Time of “Hyperlead”: Knowledge Acquisition via Algorithmic Recommendation and Its Political Implication in Comparison with Orality, Literacy, and Hyperlink

  • Research Article
  • Published in Philosophy & Technology

Abstract

Why has ICT neither promoted democracy nor been brought under democratic governance? To answer this question, this research begins its investigation by comparing knowledge acquisition systems throughout history: orality, literacy, hyperlink, and hyperlead. “Hyperlead” is a newly coined concept that emphasizes the passivity of people acquiring knowledge and information via algorithmic recommendation technologies. The four systems are then compared in terms of their epistemological characteristics and political implications. It is argued that, while literacy and hyperlink contributed to the furthering of democracy, hyperlead poses a fundamental challenge to it, undermining human autonomy in decision-making and aggravating vertical and lateral polarization. In addition, the similarity between orality and hyperlead is addressed. Finally, suggestions to improve or avert the current trend are provided. What happened during the transition from orality to literacy, and subsequently to hyperlink, could serve as a reference for an alternative to hyperlead. Technical adjustments and appropriate regulations requiring more transparency in algorithmic recommendation systems would help us overcome hyperlead and preserve human autonomy.


Notes

  1. It should be noted, however, that empirical studies differ on whether there is sufficient evidence to back up this claim (Cho et al., 2020; Flaxman et al., 2016; Zuiderveen Borgesius et al., 2018).

  2. Among the five criteria for evaluating the democratic process suggested by Dahl, at least two, enlightened understanding and control of the agenda, are closely linked to truthful information and valid knowledge (Dahl, 1998: 37–38).

  3. Postman reports that it was in seventeenth-century England that people came to think children needed to learn to read; this, according to him, is when the notion of “childhood” began (Postman, 1988/1992: 153). Ong (1982/2002: 113–114) mentions the “tenaciousness of orality,” which lasted until the early twentieth century.

  4. Although the two are closely related, hyperlink refers to the connection between pieces of information, while hypertext is text with hyperlinks. Hypertext is often compared with printed text, and the difference has great implications not only for the activities of reading and writing but also for the contents of texts and how they are mediated. Bolter proposes the notion of “remediation” to characterize hypertext (cf. Bolter & Grusin, 2000). In this paper, however, we focus on how the hyperlink functions as a method of knowledge and information acquisition.

  5. The irony here is that for Plato himself the method to be used was dialogue, not writing. In Phaedrus, Plato expresses his contempt for the written text, as it can distort the truth, while what has been spoken can be repeated. Therefore, he calls writing pharmakon, something that can be both medicine and poison. However, Derrida points out that the issue is not necessarily speech versus writing; the word pharmakon itself is indeterminate and subject to interpretation (Derrida, 1981). While Derrida’s idea does not assume an independent reality, ironically again, it does not contradict the epistemology of literacy, which promotes the subject’s endeavor to find a better answer.

  6. One might be reminded of the aural world regained by electronic media, as analyzed by Marshall McLuhan. In different passages of Understanding Media, McLuhan describes how new electronic media bring back the characteristics of oral society. Radio, especially, brings back the experience of privacy and intimacy, the “tribal magic,” to the literate West (McLuhan, 1964: 99–101, 324–335). While McLuhan’s focus is on the resonating dimension of radio and its effects on the senses, this research concentrates on the transmission of knowledge and information.

  7. The attempt to develop so-called explainable AI addresses this problem from a different angle. Experts try to make AI explain itself so that humans can check and evaluate the steps taken in the intermediate stages of its operation. This effort seems meaningful in terms of not only the trustworthiness of AI but also human control over it. However, Coeckelbergh (2020: 118–119) observes that there are several further issues concerning the transparency and explainability of AI. He raises questions about the very possibility of its realization in terms of (i) technical and practical issues, (ii) ethical problems concerning the trade-offs between performance and explainability, and (iii) what an “explanation” really means.
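
To make the stakes of “explanation” concrete, here is a minimal sketch, in Python, of one common explainability technique: probing an opaque model with a local linear surrogate (a simplified version of the idea behind tools such as LIME). The toy scoring function and the feature names are illustrative assumptions, not an example from Coeckelbergh or from this paper.

    import numpy as np

    def black_box(x: np.ndarray) -> float:
        """Stand-in for an opaque model, e.g., a recommender's scoring function."""
        return float(np.tanh(2.0 * x[0] - 0.5 * x[1] + x[0] * x[1]))

    def local_explanation(f, x0: np.ndarray, eps: float = 1e-4) -> np.ndarray:
        """Estimate each feature's local influence at x0 via central differences.

        The resulting gradient acts as a local linear "explanation":
        the larger the magnitude, the more that feature moves the output.
        """
        grad = np.zeros_like(x0)
        for i in range(len(x0)):
            step = np.zeros_like(x0)
            step[i] = eps
            grad[i] = (f(x0 + step) - f(x0 - step)) / (2 * eps)
        return grad

    # Hypothetical user-behavior features feeding the model.
    x0 = np.array([0.8, 0.3])
    for name, w in zip(["clicks", "dwell_time"], local_explanation(black_box, x0)):
        print(f"{name}: local influence {w:+.3f}")

Even this toy example illustrates question (iii): the gradient is faithful only near x0, and whether such numbers count as an “explanation” for a lay user remains open.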

  8. There is a critique of the idea of the filter bubble and the echo chamber which insists that people can still make their own choices (Bruns, 2019). However, the empirical fact that some people are not influenced by the filter bubble does not mean that the structure is neutral. It is not difficult to imagine that hyperlead will limit the range of knowledge that the user might be interested in.

  9. Mill expresses reservations even about consolidating true opinions through investigation and questioning: “[T]hough… this gradual narrowing of the bounds of diversity of opinions [and consolidate true opinions] is necessary… we are not therefore obliged to conclude that all its consequences must be beneficial” (Mill, 1859/2009: 53). He argues for the need for a “contrivance” in education to create an atmosphere where students are pressed to defend themselves against “a dissentient champion” (Mill, 1859/2009: 53).

  10. Although Han-Gul was invented at the initiative of a king, aristocrats of the Chosun Dynasty resisted its wider use at the time, and it was intended for use by the lower classes. The wide and general use of Han-Gul today had to wait nearly five centuries.

  11. Cadwalladr, a reporter for the Guardian known for her investigation of Cambridge Analytica (Cadwalladr & Graham-Harrison, 2018), compares Facebook with North Korea. “If Facebook was a country, it would be a rogue state. It would be North Korea… And just as the citizens of North Korea are unable to operate outside the state, it feels almost impossible to be alive today and live a life untouched by Facebook, WhatsApp, and Instagram” (Cadwalladr, 2020). The comparison is ironic because North Korea would never allow such SNS services for the general public (Reddy, 2019).

  12. A similar attempt has been made in order to provide even more “efficient” recommendations. Ziarani and Ravanmehr (2021) provide a literature review of research on recommendation algorithms that try to accommodate “serendipity.” According to them, serendipity is “a criterion for making appealing and useful recommendations,” so that recommendations are not “predictable and boring” to customers. In this case, however, pursuing serendipity could be an even more sophisticated method of manipulation, as the sketch below suggests.
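
As an illustration of how a serendipity-oriented recommender might work, the following minimal sketch in Python re-ranks candidates by blending predicted relevance with an “unexpectedness” bonus. The rule, items, and scores are invented for illustration and are not taken from Ziarani and Ravanmehr.

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        item: str
        relevance: float       # predicted fit with the user's known taste
        unexpectedness: float  # distance from the user's usual profile

    def rerank(candidates, alpha=0.3):
        """Sort by (1 - alpha) * relevance + alpha * unexpectedness.

        Note that the platform, not the user, chooses alpha: this is
        precisely the manipulation worry raised in the note above.
        """
        score = lambda c: (1 - alpha) * c.relevance + alpha * c.unexpectedness
        return sorted(candidates, key=score, reverse=True)

    items = [
        Candidate("familiar_genre_film", relevance=0.9, unexpectedness=0.1),
        Candidate("adjacent_genre_film", relevance=0.6, unexpectedness=0.5),
        Candidate("off_profile_documentary", relevance=0.4, unexpectedness=0.9),
    ]
    for c in rerank(items, alpha=0.5):
        print(c.item)

With a high alpha the “surprising” item wins, and with alpha near zero the familiar one does; the same dial that produces serendipity can just as easily be turned to produce confinement.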

  13. https://gobo.social/about (accessed 18 April 2020). The Civic Media group is no longer active, and this introduction is no longer available. A briefer description can be found at https://www.media.mit.edu/projects/gobo/overview/ (accessed 22 June 2021).

References

  • Amazon (2021). The history of Amazon’s forecasting algorithm: The story of a decade-plus-long journey toward a unified forecasting model. 9 August. Retrieved July 18, 2022, from https://han.gl/pkFBd

  • Amer, K. & Noujaim, J. (2019). The Great Hack. Netflix Documentary.

  • Anderson, C. (2008). The end of theory. Wired Magazine, 23 June. Retrieved December 13, 2021, from https://goo.gl/iPkDVZ

  • Anderson, J., & Rainie, L. (2020). Many tech experts say digital disruption will hurt democracy. Pew Research Center Report, 21 February. Retrieved December 13, 2021, from https://han.gl/lKYJR

  • Barlow, J. P. (1996). A declaration of the independence of cyberspace. Retrieved December 13, 2021, from https://www.eff.org/cyberspace-independence

  • Bernays, E. (1928/2009). Propaganda. Trans. by Kang, M. G. [into Korean]. Gongzon.

  • Bilton, R. (2016). The Washington Post tests personalized “pop-up” newsletters to promote its big stories. NiemanLab, 12 May. Retrieved July 18, 2022, from https://han.gl/vvTgi

  • Bolter, J. D., & Grusin, R. (2000). Remediation: Understanding new media (Revised ed.). The MIT Press.

  • Bolter, J. D. (2001). Writing space: Computers, hypertext, and the remediation of print (2nd ed.). Routledge.

  • Broad, W. J. (1992). Clinton to promote high technology, with Gore in charge. The New York Times, 10 Nov. Retrieved December 13, 2021, from https://url.kr/kqezoa

  • Bruns, A. (2019). It’s not the technology, stupid: How the ‘echo chamber’ and ‘filter bubble’ metaphors have failed us. International Association for Media and Communication Research. Retrieved December 13, 2021, from https://eprints.qut.edu.au/131675/

  • Cadwalladr, C. (2020). Facebook is out of control. If it were a country it would be North Korea. The Guardian. 5 July. Retrieved December 13, 2021, from https://han.gl/MkosG

  • Cadwalladr, C., & Graham-Harrison, E. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian, 17 March. Retrieved December 13, 2021, from https://han.gl/rEw7t

  • Cho, J., Ahmed, S., Hilbert, M., Liu, B., & Luu, J. (2020). Do search algorithms endanger democracy? An experimental investigation of algorithm effects on political polarization. Journal of Broadcasting & Electronic Media, 64(2), 150–172. https://doi.org/10.1080/08838151.2020.1757365

  • Coeckelbergh, M. (2020). AI Ethics. The MIT Press.

  • Dahl, R. A. (1998). On democracy (2nd ed.). Yale University Press.

  • Derrida, J. (1981). Dissemination. Trans. by Barbara Johnson. University of Chicago Press.

  • Dreyfus, H. (2009). On the Internet (2nd ed.). Routledge.

  • Feenberg, A. (1999). Questioning technology. Routledge.

  • Flaxman, S. R., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly, 80(S1), 298–320. https://doi.org/10.1093/poq/nfw006

  • Floridi, L. (2021a). Trump, Parler, and regulating the Infosphere as our commons. Philosophy & Technology, 34, 1–5. https://doi.org/10.1007/s13347-021-00446-7

  • Floridi, L. (2021b). The end of an era: From self-regulation to hard law for the digital industry. Philosophy & Technology, 34, 619–622. https://doi.org/10.1007/s13347-021-00493-0

  • Hayles, N. K. (2010). How we read: Close, hyper, machine. ADE Bulletin, No. 150.

  • Härkönen, T., Vänskä, R., & Vahti, J. (2022). Digital power extends further into our daily lives than we realized. SITRA, 1 April. Retrieved July 18, 2022, from https://han.gl/FGWTv

  • Helbing, D., et al. (2017). Will democracy survive big data and artificial intelligence? Scientific American 25 February. Retrieved December 13, 2021, from https://han.gl/qdGCF

  • Held, D. (2006). Models of democracy (3rd ed.). Polity Press.

  • Jeong, J. H. (2021). Controversial “AI sexual harassment” chatbot also under suspicion of disclosing personal information. [in Korean] Segyeilbo, 9 Jan. Retrieved December 13, 2021, from http://www.segye.com/newsView/20210109506258

  • Kaiser, B. (2019). Targeted. HarperCollins.

  • Kant, I. (1781/1996). Critique of pure reason. Hackett Publishing Company.

  • Kearns, M., & Roth, A. (2020). The ethical algorithm. Oxford University Press.

  • Lashbrook, A. (2018). AI-driven dermatology could leave dark-skinned patients behind. The Atlantic 17 August. Retrieved July 30, 2021 from https://han.gl/RHEVr

  • Le Bon, G. (1895/2014). Psychologie des foules. Trans. by Min M. H. [into Korean]. Checksesang.

  • Lee, C., Shin, J., & Hong, A. (2018). Does social media use really make people politically polarized? Direct and indirect effects of social media use on political polarization in South Korea. Telematics and Informatics, 35(1), 245–254. https://doi.org/10.1016/j.tele.2017.11.005

  • Lee, W. (2021). South Korean AI developer shuts down chatbot following privacy violation probe. MLex, 13 Jan. Retrieved December 13, 2021, from https://han.gl/ybFRK

  • Levy, G. & Razin, R. (2020). Social media and political polarisation. LSE Public Policy Review, 1(1). https://doi.org/10.31389/lseppr.5

  • Lippmann, W. (1954). Public opinion. Macmillan.

  • Macnish, K., & Galliott, J. (Eds.). (2020). Big data and democracy. Edinburgh University Press.

  • Mayer-Schönberger, V., & Cukier, K. N. (2013). Big data: A revolution that will transform how we live, work, and think. Houghton Mifflin Harcourt Publishing.

  • McLuhan, M. (1964/2002). Understanding Media. Routledge.

  • McNamee, R. (2019). Zucked: Waking up to the Facebook catastrophe. Penguin Books.

  • Mill, J. S. (1859/2009). On liberty and other essays. Kaplan Publishing.

  • Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.

  • Obermeyer, Z., et al. (2019). Dissecting racial bias in an algorithm used to manage the health of populations. Science, 366, 447–453. https://doi.org/10.1126/science.aax2342

  • O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.

  • Ong, W. (1982/2002). Orality and literacy: The technologizing of the word. Routledge.

  • Pariser, E. (2012). The filter bubble: How the new personalized web is changing what we read and how we think. Penguin Books.

  • Park, C. S., Ha, D. C., & Son, W. C. (2019). Posthuman society and new norms. Acanet. [in Korean]

  • Peters, U. (2022). Algorithmic political bias in artificial intelligence systems. Philosophy & Technology, 35, 25. https://doi.org/10.1007/s13347-022-00512-8

  • Postman, N. (1985). Amusing ourselves to death: Public discourse in the age of show business. Penguin Books.

  • Postman, N. (1988/1992). Conscientious objections: Stirring up trouble about language, technology, and education. Vintage Books.

  • Reddy, S. (2019). Analysis: How does North Korea use social media? BBC. 5 July. Retrieved December 13, 2021, from https://monitoring.bbc.co.uk/product/c200xiwn

  • Sclove, R. E. (1995). Democracy and technology. Guilford Press.

  • Shaffer, K. (2019). Data versus democracy: How big data algorithms shape opinions and alter the course of history. Apress Media.

  • Son, W. C. (2005). Modern technology and democracy. Doctoral dissertation. Katholieke Universiteit Leuven.

  • Son, W. C. (2018). Propaganda in the ICT Era. Sogang Journal of Philosophy, 52, 125-154. https://doi.org/10.17325/sgjp.2018.52..125 [In Korean]

  • Sunstein, C. R. (2017). #Republic: Divided democracy in the age of social media. Princeton University Press.

  • Vaidhyanathan, S. (2018). Antisocial media: How Facebook disconnects us and undermines democracy. Oxford University Press.

  • Winner, L. (2020). The whale and the reactor: A search for limits in an age of high technology (2nd ed.). University of Chicago Press.

  • Yamaguchi, S. (2020). Why are there so many extreme opinions online? An empirical analysis comparing Japan, Korea, and the United States. SSRN. Retrieved December 13, 2021, from https://ssrn.com/abstract=3568457

  • Ziarani, R. J., & Ravanmehr, R. (2021). Serendipity in recommender systems: A systematic literature review. Journal of Computer Science and Technology, 36, 375–396. https://doi.org/10.1007/s11390-020-0135-9

  • Zuiderveen Borgesius, F., et al. (2018). Online political microtargeting: Promises and threats for democracy. Utrecht Law Review, 14(1), 82–96. Retrieved December 13, 2021, from https://han.gl/sGfXz

Funding

This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2019S1A5A2A01045745).

Author information

Corresponding author

Correspondence to Wha-Chul Son.

Ethics declarations

Ethical Approval

N/A.

Consent to Participate

N/A.

Consent to Publish

N/A.

Competing Interests

The author declares no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is part of the Topical Collection on Information in Interactions between Humans and Machines.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Son, WC. Democracy in the Time of “Hyperlead”: Knowledge Acquisition via Algorithmic Recommendation and Its Political Implication in Comparison with Orality, Literacy, and Hyperlink. Philos. Technol. 35, 80 (2022). https://doi.org/10.1007/s13347-022-00573-9
