The economy of attention in the age of (mis)information
© Bessi et al.; licensee Springer. 2014
Received: 3 July 2014
Accepted: 27 November 2014
Published: 31 December 2014
In this work we present a thorough quantitative analysis of the consumption patterns of qualitatively different information on Facebook. Pages are categorized, according to their topics and the communities of interest they pertain to, into a) alternative information sources (diffusing topics neglected by science and mainstream media); b) online political activism; and c) mainstream media. We find similar information consumption patterns despite the very different nature of the contents. We then classify users according to their interaction patterns across the different topics and measure how they responded to the injection of 2,788 intentionally false posts (parodistic imitations of alternative stories). We find that users prominently interacting with alternative information sources – i.e. those more exposed to unsubstantiated claims – are more prone to interact with intentional and parodistic false claims.
Keywords: Misinformation; Attention patterns; Rumor spreading
People populate their informational domain – i.e. the pool of information available to a member of society. The functioning of socio-technical systems, as of any socio-cognitive system, requires individuals to interact in order to acquire the information needed to cope with uncertainty. In particular, when dealing with content selection, the efficacy of such systems relies on the accuracy and completeness of information. To have complete information, individuals need perspectives in which all the relevant angles are presented as squarely and objectively as possible. However, the unprecedented diffusion of online social media has allowed the massive and proactive production of different perspectives and narratives. Along this path, research on trust needs to account for the relation between the information available and its role in shaping public opinion.
In fact, each decision requires cognitive strategies to reduce the level of uncertainty in the process of belief formation and revision with respect to the decision's consequences. Reputation systems are used to collect and analyze information about the performance of service entities, with the purpose of computing reputation scores for service objects and service entities. A fundamental assumption of reputation systems is that reputation scores can help predict the future performance of the respective entities and thereby reduce the uncertainty of relying parties during decision making.
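As a minimal illustration of the reputation-score idea, a beta-model score (in the spirit of Bayesian reputation systems) can be sketched as follows; the function name and the uniform Beta(1, 1) prior are our assumptions for illustration, not details of the systems cited in the text.

```python
# Sketch of a beta-model reputation score: the score is the expected
# probability of good future behaviour given past positive/negative
# outcomes, starting from a uniform Beta(1, 1) prior (our assumption).

def reputation_score(positive: int, negative: int) -> float:
    """Expected value of the Beta(positive + 1, negative + 1) posterior."""
    return (positive + 1) / (positive + negative + 2)

# An entity with 8 good and 2 bad past interactions:
score = reputation_score(8, 2)  # (8 + 1) / (8 + 2 + 2) = 0.75
```

With no observations the score is 0.5 (maximal uncertainty), and it converges toward the empirical rate of positive outcomes as evidence accumulates – which is how such scores reduce the relying party's uncertainty.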
However, the World Economic Forum, in its 2013 report, listed "massive digital misinformation" as one of the main risks for modern society. People's perceptions, knowledge, beliefs, and opinions about the world and its evolution get (in)formed and modulated through the information they can access, most of which comes from newspapers, television, and, more recently, the Internet. The world wide web has changed the way we pursue intellectual growth and shape ideas. In particular, large social networks, with their user-provided content, are facilitating the study of how the economy of attention leads to specific patterns for the emergence, production, and consumption of information.
Despite the enthusiastic rhetoric about the ways in which new technologies have boosted interest in debating politically or socially relevant issues, the role of the socio-technical system in fostering informed debate remains unclear. Indeed, the emergence of knowledge from this process has been dubbed collective intelligence or, even more rhetorically, the wisdom of crowds, although we have become increasingly aware of the presence of unsubstantiated or untruthful rumors.
In this paper we show a genuine example of how false information is particularly pervasive on social media, fostering at times a sort of collective credulity. We perform a thorough quantitative analysis of the consumption patterns of qualitatively different information on Facebook over a set of 50 pages with which 2.3 million users interacted. In order to study attention and consumption patterns for different contents, we divide the pages into categories according to the kind of narrative they support. More precisely, pages are categorized, according to their topics and the communities of interest they pertain to, into a) alternative information sources (diffusing topics neglected by science and mainstream media); b) online political activism; and c) mainstream media.
We then classify users according to their interaction patterns across the different topics and measure how they responded to the injection of 2,788 intentionally false posts (parodistic imitations of alternative stories). Our findings show a) similar information consumption patterns despite the very different nature of the contents, and b) that users prominently interacting with alternative information sources – i.e. those more exposed to unsubstantiated claims – are more prone to interact with intentional and parodistic false claims.
Narratives on online social media
On the Web several cultures coexist, and conspiracy thinking is one of the most prolific. The vast majority of unsubstantiated rumors on the web is related to conspiracy stories, which find on the Web a natural medium for their dissemination. Narratives grounded on conspiracy theories tend to reduce the complexity of reality and are able to contain the uncertainty they generate. They can create a climate of disengagement from mainstream society and from officially recommended practices – e.g. vaccinations, diet, etc. Conspiracy thinking exposes individuals to unsubstantiated (difficult to verify) hypotheses providing alternative explanations of reality. In particular, conspiracists are prone to explain significant social or political events as plots conceived by powerful individuals or organizations. As these kinds of arguments can sometimes involve the rejection of science, alternative explanations are invoked to replace the scientific evidence. For instance, people who reject the link between HIV and AIDS generally believe that AIDS was created by the U.S. Government to control the African American population. Since unsubstantiated claims are proliferating over the Internet, what could happen if they were used as the basis for policy making? Several tools have recently been designed to help users disambiguate misinformation and false news. On the other hand, basic questions remain about how the quality of (mis)information affects the economy of attention – concerning, for example, the virality of information, its lifespan, and its consumption patterns.
In this work we address the relationship between information sources and collective debates. In particular, our focus is on how alternative and mainstream news are used when people discuss and organize political actions online – i.e. political activism. More precisely, pages are categorized, according to their topics and the communities of interest they pertain to, into a) alternative information sources (diffusing topics neglected by science and mainstream media); b) online political activism; and c) mainstream media.
Moreover, the emergence of a very distinct group, namely trolls, has recently been noticed: users building Facebook pages as parodistic imitations of both alternative information sources and online political activism. Their activities range from controversial comments and the posting of satirical content mimicking alternative news sources, to the fabrication of purely fictitious, heavily unrealistic, and sarcastic statements. The pervasiveness of such unreliable content can lead users to mix up unsubstantiated stories with their satirical counterparts – e.g. the presence of sildenafil citrate (the active ingredient of Viagra™) in chem-trails, or the anti-hypnotic effects of lemons (more than 45,000 shares). Not infrequently, these memes went viral and were used as evidence in online debates by political activists.
Inspired by these lively and controversial social dynamics at the edge between virality and credulity, we address the quantitative analysis of the interplay between information sources and political discussion on the web by focusing on different profiles of users (according to the kind of content they are mostly exposed to) and how they are biased in selecting content. In particular, we first characterize information consumption patterns in the various categories (alternative news, mainstream news, and political activism) and then measure how the most polarized users respond to intentionally false claims diffused by a troll page.
Experiment setup and data collection
The impressive pervasiveness of unsubstantiated rumors online has been listed as one of the main risks for our society because of its effect on public opinion. Our experiment aims at understanding the effect of exposure to unsubstantiated claims on content selection criteria and, in particular, whether such exposure might lead users to interact with information that is intentionally false.
Firstly, we show users' attention patterns with respect to different kinds of content, and then we look at which users are more prone to interact with intentionally false information, according to the content they are usually exposed to. To do this, we focus on the Italian Facebook ecosystem and define three categories of pages according to the kind of information they promote. The categorization of the pages is based on their different social functions together with the type of information they disseminate. The first category includes all pages of mainstream newspapers; the second category consists of alternative information sources – pages which disseminate controversial information, often lacking supporting evidence and sometimes contradicting the official news (e.g. conspiracy theories, the alleged link between vaccines and autism, etc.); the third category is that of self-organized online political movements, whose role is to gather users to publicly convey discontent with the current political and socio-economic situation (e.g. one major political party in Italy conducts most of its activity online). The criteria used to define the page categories are based on the pages' self-description and the kind of content they disseminate. All national newspapers active on Facebook belong to mainstream news. Pages of political activism and alternative news have been identified with the help of Facebook groups very active in debunking unsubstantiated rumors. Notably, all pages of alternative news declare in their mission statements that their role is to disseminate information neglected by the "manipulated mainstream media".
[Table: Breakdown of Facebook dataset – likes to comments]
Results and discussion
The information space
Information consumption patterns
To provide a better picture of fruition patterns with respect to different information, we continue our analysis by zooming in to the level of posts. Recalling that the division of pages into categories accounts for the very distinctive nature of political discussion and of the kind of information used (mainstream or alternative), we focus on the coexistence, within the political discussion, of mainstream news and conspiracy-like information. Firstly, we analyze general fruition patterns in terms of the number of comments, likes, and shares for posts grouped by page category. Then, to characterize the online discussion around qualitatively different information, we measure the duration of collective debates for each post diffused by the different pages.
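The per-category fruition statistics described above can be sketched roughly as follows; all field names and numbers are invented for illustration, and treating debate duration as the time between the first and last comment on a post is our assumption about the measure, not the paper's exact definition.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical post records: category, engagement counts, and timestamps
# (in hours) of the first and last comment. All values are illustrative.
posts = [
    {"category": "alternative", "likes": 120, "comments": 30, "shares": 45,
     "first_comment": 0.0, "last_comment": 96.0},
    {"category": "mainstream", "likes": 200, "comments": 25, "shares": 60,
     "first_comment": 0.0, "last_comment": 88.0},
    {"category": "activism", "likes": 80, "comments": 40, "shares": 20,
     "first_comment": 0.0, "last_comment": 101.0},
]

# Group posts by page category.
by_cat = defaultdict(list)
for p in posts:
    by_cat[p["category"]].append(p)

# Mean engagement and mean debate duration per category.
stats = {
    cat: {
        "mean_likes": mean(p["likes"] for p in ps),
        "mean_shares": mean(p["shares"] for p in ps),
        # Debate duration: time between first and last comment on a post.
        "mean_debate_hours": mean(
            p["last_comment"] - p["first_comment"] for p in ps),
    }
    for cat, ps in by_cat.items()
}
```

Comparing the resulting per-category distributions (rather than just the means, as here) is what allows one to say that debates around mainstream and alternative content persist similarly.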
Information-based communities aggregate around shared narratives, and the debates among them contribute to the proliferation of political pages and alternative information sources that aim to organize and convey the public discontent (with respect to the crisis and the decisions of the national government) by exploiting the Internet's peculiarities. According to our results, collective debates grounded on different information persist similarly, independently of whether the topic is the product of a conspiracist or a mainstream source. In this portion of the Italian Facebook ecosystem, untruthful rumors spread and trigger viral debate, representing an important part of the information flow animating the political scenario and shaping public opinion.
Interaction with false information
The goal of our study is to detect potential bias, induced by exposure to untruthful rumors, in users' content selection criteria within an information environment where mainstream and alternative news reverberate in a similar way. We now want to measure users' attitude toward interacting with intentionally false, parodistic information.
Continuing our investigation, we want to understand whether this information context might affect users' selection criteria. Therefore, we measure the reaction of users to a set of 2,788 false posts injected by a troll page – i.e. a page promoting caricatural versions of alternative news and political activism stories (see Section 'Narratives on online social media' for further details).
To better discriminate users' behavior, we focus on users' activity rates across the various categories, looking for polarized users – i.e. users who are mostly exposed to one type of content among mainstream news, political activism, and alternative news. Our analysis relies on the following assumptions:
- the topic of a post is coherent with the theme of the page on which it was published;
- a user is interested in the topic of a post if he/she likes the post. A comment – although it reflects interest – is more ambiguous, and thus is not considered to express a positive preference for the topic;
- we neither have access to nor try to guess the page subscription lists of users, regardless of their privacy settings. Every step of the analysis involves only the active (participating) users of each page.
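A rough sketch of how such a polarization rule could be implemented is shown below; consistent with the assumptions above, only likes are counted. The 0.95 dominance threshold and the function names are our assumptions for illustration, not the paper's actual procedure.

```python
# Classify a user as polarized toward a category when the overwhelming
# share of his/her likes falls on pages of that category.
# The 0.95 threshold is an illustrative assumption.

def polarization(likes_per_category: dict, threshold: float = 0.95):
    """Return the dominant category if its share of the user's likes
    meets `threshold`; otherwise return None (user not polarized)."""
    total = sum(likes_per_category.values())
    if total == 0:
        return None
    category, count = max(likes_per_category.items(), key=lambda kv: kv[1])
    return category if count / total >= threshold else None

polarization({"alternative": 97, "mainstream": 3})   # -> "alternative"
polarization({"alternative": 60, "mainstream": 40})  # -> None
```

The threshold controls how strict "mostly exposed to one type of content" is taken to be; a lower value would admit more users into the polarized groups at the cost of noisier group membership.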
According to our results, users with strong preferences for alternative information sources – perhaps motivated by the will to avoid the manipulation attributed to mainstream media controlled by the government – are more susceptible to false information. This result suggests that those who take a less systematic (more heuristic) approach to evaluating evidence are more likely to end up with an account more consistent with their previous beliefs, even when the posts in question are parodistic.
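The comparison underlying this result can be sketched as follows: for each group of polarized users, compute the fraction who liked at least one of the injected troll posts. All user identifiers and group memberships below are invented for illustration.

```python
# Fraction of a user group that interacted with injected troll posts.

def interaction_rate(users: set, troll_likers: set) -> float:
    """Share of `users` that appear among the likers of troll posts."""
    if not users:
        return 0.0
    return len(users & troll_likers) / len(users)

# Hypothetical polarized groups and the set of users who liked
# at least one troll post.
alt_polarized = {"u1", "u2", "u3", "u4"}    # polarized toward alternative news
main_polarized = {"u5", "u6", "u7", "u8"}   # polarized toward mainstream news
troll_likers = {"u1", "u2", "u3", "u5"}

rate_alt = interaction_rate(alt_polarized, troll_likers)    # 0.75
rate_main = interaction_rate(main_polarized, troll_likers)  # 0.25
```

A higher rate for the alternative-news group than for the mainstream group, as in this toy example, is the pattern the paper reports at scale.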
The free circulation of content is drawing users' attention to critical matters such as the financial crisis, as well as to political arguments of all kinds. However, in this work we show that unsubstantiated rumors are pervasive in online social media and might affect users' belief formation and revision.
Information based on conspiracy theories can create a climate of disengagement from mainstream society and from officially recommended practices – e.g. vaccinations, diet, etc. Conspiracy thinking exposes individuals to unsubstantiated (difficult to verify) hypotheses providing alternative explanations of reality. In particular, conspiracists are prone to explain significant social or political events as plots conceived by powerful individuals or organizations. Furthermore, our results suggest that exposure to unsubstantiated rumors might facilitate interaction with intentionally false claims, such as the case of Senator Cirenga.
- Jøsang A, Quattrociocchi W: Advanced features in Bayesian reputation systems. In TrustBus, volume 5695 of Lecture Notes in Computer Science. Edited by: Fischer-Hübner S, Lambrinoudakis C, Pernul G. Springer, Berlin; 2009:105–114.
- Kreps DM, Wilson R: Reputation and imperfect information. J Econ Theory 1982, 27(2):253–279. doi:10.1016/0022-0531(82)90030-8
- Diamond DW: Monitoring and reputation: the choice between bank loans and directly placed debt. J Pol Econ 1991, 99(4):689–721. doi:10.1086/261775
- Jøsang A, Ismail R, Boyd C: A survey of trust and reputation systems for online service provision. Decis Support Syst 2007, 43(2):618–644. doi:10.1016/j.dss.2005.05.019
- Rice SC: Reputation and uncertainty in online markets: an experimental study. Inf Syst Res 2012, 23(2):436–452. doi:10.1287/isre.1110.0362
- Quattrociocchi W, Paolucci M, Conte R: Reputation and uncertainty reduction: simulating partner selection. In Trust in Agent Societies, volume 5396 of Lecture Notes in Computer Science. Edited by: Falcone R, Barber S, Sabater-Mir J, Singh M. Springer, Berlin, Heidelberg; 2008:308–325.
- Howell L: Digital Wildfires in a Hyperconnected World. World Economic Forum, Davos, CH; 2013.
- McCombs ME, Shaw DL: The agenda-setting function of mass media. Public Opinion Q 1972, 36(2):176–187. doi:10.1086/267990
- Lanham RA: The Economics of Attention: Style and Substance in the Age of Information. University of Chicago Press, USA; 2007.
- Qazvinian V, Rosengren E, Radev DR, Mei Q: Rumor has it: identifying misinformation in microblogs. In Proceedings of the Conference on Empirical Methods in Natural Language Processing, EMNLP '11. Association for Computational Linguistics, Stroudsburg, PA, USA; 2011:1589–1599.
- Dow PA, Adamic LA, Friggeri A: The anatomy of large Facebook cascades. The AAAI Press; 2013.
- Friggeri A, Adamic L, Eckles D, Cheng J: Rumor cascades. In Proceedings of the 8th International AAAI Conference on Weblogs and Social Media (ICWSM'14). Ann Arbor, MI; 2014.
- Feizy R: An evaluation of identity in online social networking: distinguishing fact from fiction. University of Sussex, UK; 2010.
- Guillory J, Spiegel J, Drislane M, Weiss B, Donner W, Hancock J: Upset now?: emotion contagion in distributed groups. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '11. ACM, New York, NY, USA; 2011:745–748.
- Bekkers V, Beunders H, Edwards A, Moody R: New media, micromobilization, and political agenda setting: crossover effects in political mobilization and media usage. Inf Soc 2011, 27(4):209–219. doi:10.1080/01972243.2011.583812
- Gonzalez-Bailon S, Borge-Holthoefer J, Rivero A, Moreno Y: The dynamics of protest recruitment through an online network. Scientific Reports; 2011.
- Garcia D, Mendez F, Serdült U, Schweitzer F: Political polarization and popularity in online participatory media: an integrated approach. In Proceedings of the First Edition Workshop on Politics, Elections and Data, PLEAD '12. ACM, New York, NY, USA; 2012:3–10. doi:10.1145/2389661.2389665
- Crespi I: The public opinion process. In How the People Speak. Lawrence Erlbaum Associates, USA; 1997.
- Lippmann W: Public Opinion. Penguin Books, UK; 1946.
- Levy P: Collective Intelligence: Mankind's Emerging World in Cyberspace. Perseus Publishing, USA; 1999.
- Buckingham Shum S, Aberer K, Schmidt A, Bishop S, Lukowicz P, Anderson S, Charalabidis Y, Domingue J, Freitas S, Dunwell I, Edmonds B, Grey F, Haklay M, Jelasity M, Karpištšenko A, Kohlhammer J, Lewis J, Pitt J, Sumner R, Helbing D: Towards a global participatory platform. Eur Phys J Special Topics 2012, 214(1):109–152. doi:10.1140/epjst/e2012-01690-3
- Malone TW, Klein M: Harnessing collective intelligence to address global climate change. Innovations: Technol Governance Globalization 2007, 2(3):15–26. doi:10.1162/itgg.2007.2.3.15
- Shadbolt N, Hall W, Hendler JA, Dutton WH: Web science: a new frontier. Philos Trans R Soc A: Math Phys Eng Sci 2013, 371(1987).
- Byford J: Conspiracy Theories: A Critical Introduction. Palgrave Macmillan, USA; 2011.
- Fine GA, Campion-Vincent V, Heath C: Rumor Mills: The Social Impact of Rumor and Legend. Social Problems and Social Issues, Transaction Publishers, USA.
- Hogg MA, Blaylock DL: Extremism and the Psychology of Uncertainty. Blackwell/Claremont Applied Social Psychology Series. Wiley, USA; 2011.
- Bauer M: Resistance to New Technology: Nuclear Power, Information Technology and Biotechnology. Cambridge University Press, UK; 1997.
- Quattrociocchi W, Caldarelli G, Scala A: Opinion dynamics on interacting networks: media competition and social influence. Scientific Reports; 2014.
- Mocanu D, Rossi L, Zhang Q, Karsai M, Quattrociocchi W: Collective attention in the age of (mis)information. CoRR, abs/1403.3344; 2014.
- Quattrociocchi W, Amblard F, Galeota E: Selection in scientific networks. Soc Netw Anal Mining 2012, 2(3):229–237. doi:10.1007/s13278-011-0043-7
- Bessi A, Coletto M, Davidescu GA, Scala A, Caldarelli G, Quattrociocchi W: Science vs conspiracy: collective narratives in the age of (mis)information. Submitted to PLOS ONE; 2014.
- Bessi A, Coletto M, Davidescu GA, Scala A, Quattrociocchi W: Misinformation in the loop: the emergence of narratives in OSN. In ItAIS 2014. Genova, Italy; 2014.
- Sunstein CR, Vermeule A: Conspiracy theories: causes and cures. J Pol Philos 2009, 17(2):202–227. doi:10.1111/j.1467-9760.2008.00325.x
- Bogart LM, Thorburn S: Are HIV/AIDS conspiracy beliefs a barrier to HIV prevention among African Americans? J Acquir Immune Defic Syndr 2005, 38(2):213–218. doi:10.1097/00126334-200502010-00014
- Kalichman SC: Denying AIDS: Conspiracy Theories, Pseudoscience, and Human Tragedy. Springer, Berlin; 2009.
- Ennals R, Trushkowsky B, Agosta JM: Highlighting disputed claims on the web. In Proceedings of the 19th International Conference on World Wide Web, WWW '10. ACM, New York, NY, USA; 2010:341–350. doi:10.1145/1772690.1772726
- McKelvey KR, Menczer F: Truthy: enabling the study of online social networks. In Proceedings of the 2013 Conference on Computer Supported Cooperative Work Companion, CSCW '13. ACM, New York, NY, USA; 2013:23–26.
- Onnela J-P, Reed-Tsochas F: Spontaneous emergence of social influence in online systems. Proc Natl Acad Sci 2010, 107(43):18375–18380. doi:10.1073/pnas.0914572107
- Ugander J, Backstrom L, Marlow C, Kleinberg J: Structural diversity in social contagion. Proc Natl Acad Sci 2012, 109(16):5962–5966. doi:10.1073/pnas.1116502109
- Lewis K, Gonzalez M, Kaufman J: Social selection and peer influence in an online social network. Proc Natl Acad Sci 2012, 109:68–72. doi:10.1073/pnas.1109739109
- Mocanu D, Baronchelli A, Gonçalves B, Perra N, Zhang Q, Vespignani A: The Twitter of Babel: mapping world languages through microblogging platforms. PLOS ONE 2013, 8(4):e61981.
- Adamic L, Glance N: The political blogosphere and the 2004 U.S. election: divided they blog. In LinkKDD '05: Proceedings of the 3rd International Workshop on Link Discovery. ACM, Chicago, IL, USA; 2005:36–43. doi:10.1145/1134271.1134277
- Kleinberg J: Analysis of large-scale social and information networks. Philos Trans R Soc A: Math Phys Eng Sci 2013, 371.
- Hannak A, Margolin D, Keegan B, Weber I: Get back! You don't know me like that: the social mediation of fact checking interventions in Twitter conversations. In Proceedings of the 8th International AAAI Conference on Weblogs and Social Media (ICWSM'14). Ann Arbor, MI; 2014.
- simplyhumans: Simply humans – benefic effects of lemons. Website, July 2014. Last checked: 31.07.2014.
- simplyhumans: Dieta personalizzata – benefic effects of lemons. Website, July 2014. Last checked: 31.07.2014.
- simplyhumans: Simply humans – benefic effects of lemons. Website, July 2014. Last checked: 31.07.2014.
- Ambrosetti G: I forconi: "il senato ha approvato una legge per i parlamentari in crisi". Chi non verrà rieletto, oltre alla buonuscita, si beccherà altri soldi. Sarà vero? Website, August 2013. Last checked: 19.01.2014.
- Facebook: Using the Graph API. Website, August 2013. Last checked: 19.01.2014.
- Semplicementeme: Website, August 2012. Last checked: 19.01.2014.
- Ellison NB, Steinfield C, Lampe C: The benefits of Facebook "friends:" social capital and college students' use of online social network sites. J Computer-Mediated Commun 2007, 12(4):1143–1168. doi:10.1111/j.1083-6101.2007.00367.x
- Joinson AN: Looking at, looking up or keeping up with people?: motives and use of Facebook. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '08. ACM, New York, NY, USA; 2008:1027–1036.
- Viswanath B, Mislove A, Cha M, Gummadi KP: On the evolution of user interaction in Facebook. In Proceedings of the 2nd ACM Workshop on Online Social Networks, WOSN '09. ACM, New York, NY, USA; 2009:37–42. doi:10.1145/1592665.1592675
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited.