The economy of attention in the age of (mis)information

Abstract

In this work we present a thorough quantitative analysis of the consumption patterns of qualitatively different information on Facebook. Pages are categorized, according to their topics and the communities of interest they pertain to, into a) alternative information sources (diffusing topics that are neglected by science and mainstream media); b) online political activism; and c) mainstream media. We find similar information consumption patterns despite the very different nature of the contents. We then classify users according to their interaction patterns among the different topics and measure how they responded to the injection of 2,788 false posts (parodistic imitations of alternative stories). We find that users prominently interacting with alternative information sources – i.e. those more exposed to unsubstantiated claims – are more prone to interact with intentionally false and parodistic claims.

Introduction

People populate their informational domain – i.e. the pool of information available to a member of society. The functioning of socio-technical systems, as of any socio-cognitive system, requires individuals to interact in order to acquire the information they need to cope with uncertainty. In particular, when dealing with content selection, the efficacy of such systems relies on the accuracy and completeness of the information. To have complete information, individuals need perspectives in which all the relevant angles are presented as fairly and objectively as possible. However, the unprecedented diffusion of online social media has allowed the massive and proactive production of different perspectives and narratives. Along this path, research on trust needs to account for the relation between the information available and its role in shaping public opinion.

In fact, every decision requires cognitive strategies to reduce the level of uncertainty in the process of belief formation and revision with respect to the decision’s consequences. Reputation systems are used to collect and analyze information about the performance of service entities, with the purpose of computing reputation scores for service objects and entities. A fundamental assumption of reputation systems is that reputation scores help predict the future performance of the respective entities and thereby reduce the uncertainty of relying parties during decision-making processes [1]-[6].

However, the World Economic Forum, in its 2013 report [7], listed “massive digital misinformation” as one of the main risks for modern society. People’s perceptions, knowledge, beliefs, and opinions about the world and its evolution get (in)formed and modulated through the information they can access, most of which comes from newspapers, television [8], and, more recently, the Internet. The World Wide Web has changed the way we pursue intellectual growth and shape ideas. In particular, large social networks, with their user-provided content, are facilitating the study of how the economy of attention leads to specific patterns for the emergence, production, and consumption of information [9]-[13].

Despite the enthusiastic rhetoric about the ways in which new technologies have boosted interest in debating politically or socially relevant issues [14]-[19], the role of the socio-technical system in fostering informed debate remains unclear. Indeed, the emergence of knowledge from this process has been dubbed collective intelligence or, even more rhetorically, the wisdom of crowds [20]-[23], although we have become increasingly aware of the presence of unsubstantiated or untruthful rumors.

In this paper we show a genuine example of how pervasive false information can be on social media, sometimes fostering a sort of collective credulity. We perform a thorough quantitative analysis of the consumption patterns of qualitatively different information on Facebook over a set of 50 pages with which 2.3 million users interacted. In order to study attention and consumption patterns of different contents, we divide pages into categories according to the kind of narrative they support. More precisely, pages are categorized, according to their topics and the communities of interest they pertain to, into a) alternative information sources (diffusing topics that are neglected by science and mainstream media); b) online political activism; and c) mainstream media.

We then classify users according to their interaction patterns among the different topics and measure how they responded to the injection of 2,788 false posts (parodistic imitations of alternative stories). Our findings show a) similar information consumption patterns despite the very different nature of the contents, and b) that users prominently interacting with alternative information sources – i.e. those more exposed to unsubstantiated claims – are more prone to interact with intentionally false and parodistic claims.

Narratives on online social media

Several cultures coexist on the Web, and conspiracy thinking is one of the most prolific. The vast majority of unsubstantiated rumors on the web are related to conspiracy stories and find in the Web a natural medium for their dissemination. Narratives grounded in conspiracy theories tend to reduce the complexity of reality and are able to contain the uncertainty they generate [24]-[26]. They can create a climate of disengagement from mainstream society and from officially recommended practices [27] – e.g. vaccinations, diet, etc. Conspiracy thinking exposes individuals to unsubstantiated (difficult to verify) hypotheses providing alternative explanations of reality [28]-[32]. In particular, conspiracists are prone to explain significant social or political events as plots conceived by powerful individuals or organizations [33]. As these kinds of arguments can sometimes involve the rejection of science, alternative explanations are invoked to replace the scientific evidence. For instance, people who reject the link between HIV and AIDS generally believe that AIDS was created by the U.S. Government to control the African American population [34],[35]. Since unsubstantiated claims are proliferating over the Internet, what could happen if they were used as the basis for policy making? Several tools have recently been designed to help users disambiguate misinformation and false news [36],[37]. On the other hand, basic questions remain about how the quality of (mis)information affects the economy of attention – concerning, for example, the virality of information, its lifespan, and its consumption patterns.

A large body of literature addresses the study of social dynamics on socio-technical systems [12],[38]-[44].

In this work we address the relationship between information sources and collective debates. In particular, our focus is on how alternative and mainstream news are used when people discuss and organize political actions online – i.e. political activism. More precisely, pages are categorized, according to their topics and the communities of interest they pertain to, into a) alternative information sources (diffusing topics that are neglected by science and mainstream media); b) online political activism; and c) mainstream media.

Moreover, we have recently witnessed the emergence of a very distinct group of users, namely trolls, who build Facebook pages as parodistic imitations of both alternative information sources and online political activism. Their activities range from controversial comments and the posting of satirical content mimicking alternative news sources, to the fabrication of purely fictitious, heavily unrealistic, and sarcastic statements. This pervasiveness of unreliable content might lead users to mix up unsubstantiated stories with their satirical counterparts – e.g. the presence of sildenafil citrate (the active ingredient of Viagra™) [45] in chem-trails, or the anti-hypnotic effects of lemons (more than 45,000 shares) [46],[47]. Not rarely, these memes become viral and are used as evidence in online debates by political activists [48].

Inspired by these lively and controversial social dynamics at the edge between virality and credulity, we address the quantitative analysis of the interplay between information sources and political discussion on the web by focusing on different profiles of users (according to the kind of content they are mostly exposed to) and on how they are biased in selecting content. In particular, we first characterize information consumption patterns in the various categories (alternative news, mainstream news, and political activism), and then we measure how the most polarized users respond to intentionally false claims diffused by a troll page.

Experiment setup and data collection

The impressive pervasiveness of unsubstantiated rumors online has been listed as one of the main risks for our society because of its effect on public opinion. Our experiment aims at understanding the effect of the exposure to unsubstantiated claims on content selection criteria and, in particular, whether such exposure might lead users to interact with information that is intentionally false.

Firstly, we show users’ attention patterns with respect to different kinds of content, and then we look at which users are more prone to interact with intentionally false information according to the content they are usually exposed to. To do this, we focus on the Italian Facebook ecosystem and define three categories of pages according to the kind of information they promote. The categorization of the pages is based on their different social functions together with the type of information they disseminate. The first category includes all pages of mainstream newspapers; the second category consists of alternative information sources – pages which disseminate controversial information, often lacking supporting evidence and sometimes contradicting the official news (e.g. conspiracy theories, the alleged link between vaccines and autism, etc.); the third category is that of self-organized online political movements – whose role is to gather users to publicly convey discontent against the current political and socio-economic situation (e.g. one major political party in Italy conducts most of its activity online). The criteria used to define page categories are based on the pages’ self-description and the kind of content they disseminate. All national newspapers active on Facebook belong to the mainstream news category. Pages of political activism and alternative news have been identified with the help of Facebook groups very active in debunking unsubstantiated rumors. Notably, all alternative news pages declare in their mission statements that their role is to disseminate information neglected by the “manipulated mainstream media”.

The resulting dataset is composed of 50 public pages for which we downloaded all the posts (and the related users’ interactions) over a timespan of six months (from September 1st, 2012 to February 28th, 2013). The data are publicly available as they come from a public online social site (Facebook); nevertheless, all information has been analyzed anonymously and in aggregate form. The entire data collection process was performed exclusively through the Facebook Graph API [49], which is publicly available, and for the analysis (according to the specification settings of the API) we used only publicly available data (users with privacy restrictions are not included in the dataset). The pages from which we downloaded data are public Facebook entities (they can be accessed by anyone). User content contributing to such pages is also public unless the user’s privacy settings specify otherwise. The exact breakdown of the data is presented in Table 1. For all categories, the focus of our analysis is on the interaction of users with the public posts – i.e. likes, shares, and comments. Finally, we got access to 2,788 posts from a troll page [50]. All of these posts are caricatural versions of political activism and alternative news stories, with the peculiarity that they always include false information and nonsensical claims. Despite its small size (7,430 unique users, 18,212 likes, 11,337 comments, and 9,549 likes to comments), the page was able to trigger several viral phenomena, one of which reached 100K shares. We use these troll memes to measure how responsive the social ecosystem under investigation is to the injection of intentionally false information. On Facebook, users have three main actions to interact with posts: likes (intended as positive feedback to the post), comments (a measure of the activity of online collective debates), and shares (intended as the attitude to share a given piece of information). Since the interpretation of comments and shares is more ambiguous [51]-[53], to measure the responsiveness with respect to the injection of intentionally false claims we account only for likes, as a good approximation of users’ engagement.
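
As an illustration of this collection step, the sketch below shows how the public posts of a page could be paginated out of the Graph API. This is not the authors’ code: the page id, token, and parameters are placeholders, and the endpoint follows the public Graph API conventions of that period.

```python
import requests

# Placeholders, not from the paper: a real page id and a valid access token are required.
PAGE_ID = "some_public_page"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

url = "https://graph.facebook.com/%s/posts" % PAGE_ID
params = {"access_token": ACCESS_TOKEN, "limit": 100,
          "since": "2012-09-01", "until": "2013-02-28"}

posts = []
while url:
    payload = requests.get(url, params=params).json()
    posts.extend(payload.get("data", []))
    # The Graph API paginates results: follow the "next" link until it is exhausted.
    url = payload.get("paging", {}).get("next")
    params = {}  # the "next" URL already embeds all query parameters
```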

Table 1 Breakdown of Facebook dataset

Results and discussion

The information space

Our dataset has two main entities: users and pages. In order to characterize information consumption patterns we represent the dataset as a bipartite network. As shown in Figure 1, a bipartite graph is a triple $G=(A,B,E)$, where $A=\{a_i \mid i=1,\dots,n_A\}$ and $B=\{b_j \mid j=1,\dots,n_B\}$ are two disjoint sets of vertices, and $E \subseteq A \times B$ is the set of edges, i.e. edges exist only between vertices of the two different sets $A$ and $B$.

Figure 1

Bipartite network and projections. Pages in red and users in blue. A user is linked to a page if he/she commented on or liked a post of that page. From the bipartite network it is possible to define the user and page projections.

The bipartite graph is described by the matrix $M$, defined as

$$M_{ij} = \begin{cases} 1 & \text{if an edge exists between } a_i \text{ and } b_j, \\ 0 & \text{otherwise.} \end{cases}$$

According to the bipartite transformation, in Figure 2 we show the projection on the page side for comments and likes – i.e. nodes are pages, and a link between two pages stands for the existence of users who commented on (or liked) posts of both. The connectivity pattern in the two networks is similar and very densely interconnected. Thus, alternative news, mainstream news, and political discussion have several users in common. Since a like stands for a positive feedback to a post, while a comment might be either negative or positive (discussion about the information), this transformation shows that online discussions and positive feedback on information belonging to different topics and contexts behave similarly. This first result suggests that different contents (mainstream news, alternative news, and political discussion) are consumed in a comparable way.
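
To make the projection concrete, here is a minimal sketch (not the authors’ code) of how the page projection follows from the incidence matrix $M$: the weight of the link between two pages is the number of users who interacted with both, i.e. the off-diagonal entries of $M^T M$.

```python
import numpy as np

# Toy incidence matrix M: rows are users, columns are pages;
# M[i, j] = 1 if user i liked or commented on a post of page j.
M = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])

# Page projection: W[j, k] = number of users who interacted with both pages j and k.
W = M.T @ M
np.fill_diagonal(W, 0)  # the diagonal only counts each page's own users; drop it
print(W)
```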

Figure 2

Page network. Projection of the bipartite network on pages for users’ likes (left) and users’ comments (right). Two pages are connected if at least one user liked (left) or commented on (right) posts of both. Alternative news (green), mainstream news (violet), and political activism pages have several users in common.

To further detail this similarity, in Figure 3 we show the empirical complementary cumulative distribution function of the edge weights – i.e. the number of users two pages have in common through comments or likes – for the two bipartite projections on pages. Recalling that an edge weight stands for the number of users who commented on (or liked) posts of both pages, we can see that the distributions are similarly shaped. The information space we are exploring is densely interconnected and presents similar connectivity patterns in terms of likes (positive feedback) and comments (discussion), meaning that unsubstantiated claims (information difficult to verify and often related to conspiracy theories), mainstream news, and political discussion are consumed and reverberate in a similar way on Facebook.
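
For reference, the empirical CCDF used in Figure 3 (and in the following figures) can be computed in a few lines; this is a generic sketch, with the toy weight list standing in for the actual edge weights.

```python
import numpy as np

def eccdf(values):
    """Empirical complementary CDF: P(X >= x) for each sorted observation x."""
    x = np.sort(np.asarray(values, dtype=float))
    y = 1.0 - np.arange(len(x)) / len(x)
    return x, y

# Toy data standing in for the edge weights of the page projection.
weights = [1, 1, 2, 3, 3, 3, 7, 12]
x, y = eccdf(weights)  # plot y against x on log-log axes to reproduce the figure style
```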

Figure 3

Page network. Empirical complementary cumulative distribution function (CCDF) of the edge weights in the page projection. Link weights stand for the number of common users between pages who commented (blue) or liked (red). The distributions are similar.

Information consumption patterns

To provide a better picture of consumption patterns with respect to different information, we continue our analysis by zooming in at the level of posts. Recalling that the division of pages into categories reflects the distinction between political discussion and the kind of information used (mainstream or alternative), we focus on the coexistence, within the political discussion, of mainstream news and conspiracy-like information. Firstly, we analyze general consumption patterns in terms of the number of comments, likes, and shares for posts grouped by page category. Then, to characterize the online discussion around qualitatively different information, we measure the duration of the collective debate for each post diffused by the different pages.

In Figure 4 we show the empirical complementary cumulative distribution function of the actions (likes, comments, and shares) on each post for the different categories of pages – i.e. alternative news, mainstream news, and political activism. The plot clearly shows that the consumption patterns of qualitatively different contents are similar.

Figure 4

Attention patterns. Empirical complementary cumulative distribution function (CCDF) of users’ actions (likes and comments) on posts grouped by page category. The distributions indicate similar consumption patterns across the various page categories. Qualitatively different information is consumed in a similar way.

Finally, in order to quantify the level of engagement produced by a piece of information, we measure the lifetime of each post – i.e. the temporal distance between its first and last comment – as a good approximation of the attention it received. In Figure 5 we show the probability distribution function of the lifetime of posts in each category.
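
The lifetime measure itself is straightforward to compute; below is a sketch using a hypothetical comment log (the column names are ours, not from the paper’s data schema).

```python
import pandas as pd

# Hypothetical comment log: one row per comment, keyed by the post it belongs to.
comments = pd.DataFrame({
    "post_id":   ["p1", "p1", "p1", "p2", "p2"],
    "timestamp": pd.to_datetime(["2012-09-01 10:00", "2012-09-01 12:30",
                                 "2012-09-04 08:15", "2012-10-02 09:00",
                                 "2012-10-02 09:45"]),
})

# Lifetime of a post = temporal distance between its first and last comment.
lifetime = comments.groupby("post_id")["timestamp"].agg(lambda t: t.max() - t.min())
print(lifetime)
```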

Figure 5

Posts lifetime. Distribution of posts’ lifetime – i.e. the temporal distance between the first and last comment – grouped by page category. The lifetime of posts is similar across categories.

Information-based communities aggregate around shared narratives, and the debates among them contribute to the proliferation of political pages and alternative information sources that aim to organize and convey public discontent (with respect to the crisis and the decisions of the national government) by exploiting the peculiarities of the Internet. According to our results, collective debates grounded in different information persist in a similar way, independently of whether the topic is the product of a conspiracist or a mainstream source. In this portion of the Italian Facebook ecosystem, untruthful rumors spread and trigger viral debates, representing an important part of the information flow animating the political scenario and shaping public opinion.

Interaction with false information

The goal of our study is to detect the potential bias induced by exposure to untruthful rumors on users’ content selection criteria, in an information environment where mainstream and alternative news reverberate in a similar way. We now want to measure the attitude of users toward interacting with intentionally false, parodistic information.

To better discriminate users’ behavior, we focus on users’ activity rates in the various categories. In Figure 6 we show how the number of actions of each user is distributed among the various categories. We rank users by their activity, and each curve shows the average fraction of activity per bin – i.e. data values falling in a given small interval (bin) are replaced by the average value of the interval. As we can see, users are markedly more active on political activism and alternative news than on mainstream news. User activity is dominated by alternative news and political discussion.
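
The per-user breakdown behind Figure 6 can be sketched as follows; the action log and the bin size are hypothetical stand-ins for the actual dataset.

```python
import numpy as np
import pandas as pd

# Hypothetical action log: one row per like or comment, tagged with the page category.
actions = pd.DataFrame({
    "user":     ["u1", "u1", "u1", "u2", "u2", "u3"],
    "category": ["alternative", "political", "alternative",
                 "mainstream", "political", "alternative"],
})

# Fraction of each user's activity falling in each category.
counts = actions.groupby(["user", "category"]).size().unstack(fill_value=0)
fractions = counts.div(counts.sum(axis=1), axis=0)

# Rank users by total activity; values within each rank bin are then
# replaced by the bin average, as described in the text.
order = counts.sum(axis=1).sort_values(ascending=False).index
fractions = fractions.loc[order]
binned = fractions.groupby(np.arange(len(fractions)) // 2).mean()  # bins of 2 ranks
```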

Figure 6

User activity on categories of pages. Fraction of user activity (likes and comments) in each category, per user, ranked by activity rate (total number of actions of a user). Each curve shows how a user’s total activity is distributed among the different page categories. Most of the activity is on political discussion (red) and alternative news (green).

Continuing our investigation, we want to understand whether this information context might affect users’ selection criteria. Therefore, we measure the reaction of users to a set of 2,788 false posts injected by a troll page – i.e. a page promoting caricatural versions of alternative news and political activism stories (see Section Narratives on online social media for further details).

To better discriminate users’ behavior, we focus on users’ activity rates in the various categories, looking for the polarized ones – i.e. users who are mostly exposed to one type of content among mainstream news, political activism, and alternative news.

We apply a classification strategy in order to discriminate typical users for each page category. In particular, we are interested in distinguishing users on the basis of their behavior. Having access to six months of historical likes, comments, and likes to comments on all posts within the timespan (and within the privacy restrictions), we quantify the interaction of each user with the posts in each category. In doing so, the following assumptions are in place:

  • the topic of the post is coherent with the theme of the page on which it was published;

  • a user is interested in the topic of a post if he/she likes the post. A comment – although it reflects interest – is more ambiguous, and is thus not considered to express a positive preference for the topic;

  • we neither have access to nor try to guess the page subscription list of the users, regardless of their privacy settings. Every step of the analysis involves only the active (participating) users on each page.

According to these assumptions, we rely solely on the likes given to posts. For instance, if a user likes 10 different posts on one or multiple pages of the same political movement, but never liked posts on any other topic, we label that user as associated with the political movement. Given this outline of the users’ distribution among the various categories, we want to see which users are more responsive, in terms of interaction, to the injection of false information. As before, we cannot use comments as discriminators, since they can represent either positive or negative feedback with respect to the published topic. Therefore, we focus only on the users who liked any of the 2,788 troll posts. As previously mentioned, troll posts are related to arguments debated by political activists or on alternative information sources, but with a clear parodistic flavor.

For instance, one of the most popular memes that explicitly spread a false rumor (in text form) reads: the Italian Senate voted for and accepted (257 in favor, 165 abstentions) a law proposed by Senator Cirenga aimed at funding, with 134 billion Euros, policy-makers looking for a job in case of defeat in the political competition. We were able to easily verify that this meme contains at least four false statements: the name of the senator, the total number of votes (higher than possible), the amount of money (more than 10% of the Italian GDP), and the law itself. This meme was created by a troll page and, on the wave of public discontent with Italian policy-makers, quickly became viral, obtaining about 35,000 shares in less than one month. Shortly thereafter, the image was downloaded and reposted (with the addition of a commentary) by a page describing itself as focused on political debate. Nowadays, this meme is among the arguments used by protesters demonstrating in several Italian cities. This is a striking example of the large-scale effect of misinformation diffusion on the opinion formation process. To characterize the spreading pattern of this outstanding example, in Figure 7 we show the total number of actions (likes and comments) made on the troll post about Senator Cirenga, grouped by users classified in the different categories.
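
A minimal sketch of the labeling rule described above (our reading of it, not the authors’ code: a user is labeled only if all of his/her likes fall in a single category) and of the subsequent count of labeled users liking troll posts:

```python
import pandas as pd

# Hypothetical like log on the monitored pages (troll page excluded).
likes = pd.DataFrame({
    "user":     ["u1", "u1", "u2", "u2", "u3", "u3", "u3"],
    "category": ["alternative", "alternative", "mainstream", "political",
                 "political", "political", "political"],
})

# Label a user with a category only if ALL of his/her likes fall in it.
per_user = likes.groupby(["user", "category"]).size().unstack(fill_value=0)
exclusive = per_user.gt(0).sum(axis=1) == 1       # active in exactly one category
labels = per_user[exclusive].idxmax(axis=1)       # u1 -> alternative, u3 -> political

# Hypothetical set of users who liked at least one troll post.
troll_likers = {"u1", "u3"}
interacting = labels[labels.index.isin(troll_likers)]
print(interacting.value_counts(normalize=True))   # fraction per category (cf. Figure 8)
```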

Figure 7

Cirenga post consumption. Cumulative number of comments per day on one of the most viral of the false posts, for classified users. The users most active in commenting are usual consumers of alternative news.

The post initially received a lot of attention from alternative news consumers and political activists, whose volume made it viral to such an extent that it is still used as an argument in political discussion. There are two spikes in the diffusion; however, the higher number of comments comes from users who are usual consumers of alternative information sources. As shown in Figure 8, by counting the polarized users who liked the posts, we find that the users most susceptible to interacting with false information are those mostly exposed to, and interacting with, unsubstantiated claims (i.e. posts on alternative information pages).

Figure 8

Users interacting with false information. Fraction of labeled users in the different categories – i.e. users usually exposed to one type of content among the three (alternative, mainstream, and political) – who interacted with the 2,788 intentionally false posts. The users most prone to interact with false claims are consumers of alternative information sources.

According to our results, users with strong preferences for alternative information sources – perhaps motivated by the will to avoid the manipulation allegedly exercised by mainstream media controlled by the government – are more susceptible to false information. Our result suggests that those who take a less systematic (more heuristic) approach to evaluating evidence are more likely to end up with an account that is consistent with their pre-existing beliefs, even when it comes from parodistic posts.

The free circulation of content is drawing users’ attention to critical matters such as the financial crisis, as well as to political arguments of any kind. However, in this work we show that unsubstantiated rumors are pervasive in online social media and might affect users’ belief formation and revision.

Narratives grounded in conspiracy theories can create a climate of disengagement from mainstream society and from officially recommended practices [27] – e.g. vaccinations, diet, etc. – by exposing individuals to unsubstantiated (difficult to verify) hypotheses that provide alternative explanations of reality [28]-[32], often framing significant social or political events as plots conceived by powerful individuals or organizations [33]. Furthermore, our results suggest that exposure to unsubstantiated rumors might facilitate the interaction with intentionally false claims, as in the case of Senator Cirenga.

References

  1. Jøsang A, Quattrociocchi W: Advanced features in Bayesian reputation systems. In TrustBus, volume 5695 of Lecture Notes in Computer Science. Edited by: Fischer-Hübner S, Lambrinoudakis C, Pernul G. Springer, Berlin; 2009:105–114.

  2. Kreps DM, Wilson R: Reputation and imperfect information. J Econ Theory 1982, 27(2):253–279. doi:10.1016/0022-0531(82)90030-8

  3. Diamond DW: Monitoring and reputation: the choice between bank loans and directly placed debt. J Pol Econ 1991, 99(4):689–721. doi:10.1086/261775

  4. Jøsang A, Ismail R, Boyd C: A survey of trust and reputation systems for online service provision. Decis Support Syst 2007, 43(2):618–644. doi:10.1016/j.dss.2005.05.019

  5. Rice SC: Reputation and uncertainty in online markets: an experimental study. Inf Syst Res 2012, 23(2):436–452. doi:10.1287/isre.1110.0362

  6. Quattrociocchi W, Paolucci M, Conte R: Reputation and uncertainty reduction: simulating partner selection. In Trust in Agent Societies, volume 5396 of Lecture Notes in Computer Science. Edited by: Falcone R, Barber S, Sabater-Mir J, Singh M. Springer, Berlin, Heidelberg; 2008:308–325.

  7. Howell L: Digital wildfires in a hyperconnected world. World Economic Forum, Davos; 2013.

  8. McCombs ME, Shaw DL: The agenda-setting function of mass media. Public Opin Q 1972, 36(2):176–187. doi:10.1086/267990

  9. Lanham RA: The economics of attention: style and substance in the age of information. University of Chicago Press, USA; 2007.

  10. Qazvinian V, Rosengren E, Radev DR, Mei Q: Rumor has it: identifying misinformation in microblogs. In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP ’11). Association for Computational Linguistics, Stroudsburg, PA, USA; 2011:1589–1599.

  11. Dow PA, Adamic LA, Friggeri A: The anatomy of large Facebook cascades. The AAAI Press; 2013.

  12. Friggeri A, Adamic L, Eckles D, Cheng J: Rumor cascades. In Proceedings of the 8th International AAAI Conference on Weblogs and Social Media (ICWSM ’14). Ann Arbor, MI; 2014.

  13. Feizy R: An evaluation of identity in online social networking: distinguishing fact from fiction. University of Sussex, UK; 2010.

  14. Guillory J, Spiegel J, Drislane M, Weiss B, Donner W, Hancock J: Upset now?: emotion contagion in distributed groups. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’11). ACM, New York, NY, USA; 2011:745–748.

  15. Bekkers V, Beunders H, Edwards A, Moody R: New media, micromobilization, and political agenda setting: crossover effects in political mobilization and media usage. Inf Soc 2011, 27(4):209–219. doi:10.1080/01972243.2011.583812

  16. González-Bailón S, Borge-Holthoefer J, Rivero A, Moreno Y: The dynamics of protest recruitment through an online network. Sci Rep 2011.

  17. Garcia D, Mendez F, Serdült U, Schweitzer F: Political polarization and popularity in online participatory media: an integrated approach. In Proceedings of the First Edition Workshop on Politics, Elections and Data (PLEAD ’12). ACM, New York, NY, USA; 2012:3–10. doi:10.1145/2389661.2389665

  18. Crespi I: The public opinion process: how the people speak. Lawrence Erlbaum Associates, USA; 1997.

  19. Lippmann W: Public opinion. Penguin Books, UK; 1946.

  20. Lévy P: Collective intelligence: mankind’s emerging world in cyberspace. Perseus Publishing, USA; 1999.

  21. Buckingham Shum S, Aberer K, Schmidt A, Bishop S, Lukowicz P, Anderson S, Charalabidis Y, Domingue J, Freitas S, Dunwell I, Edmonds B, Grey F, Haklay M, Jelasity M, Karpištšenko A, Kohlhammer J, Lewis J, Pitt J, Sumner R, Helbing D: Towards a global participatory platform. Eur Phys J Spec Top 2012, 214(1):109–152. doi:10.1140/epjst/e2012-01690-3

  22. Malone TW, Klein M: Harnessing collective intelligence to address global climate change. Innovations: Technol Governance Globalization 2007, 2(3):15–26. doi:10.1162/itgg.2007.2.3.15

  23. Shadbolt N, Hall W, Hendler JA, Dutton WH: Web science: a new frontier. Philos Trans R Soc A 2013, 371(1987).

  24. Byford J: Conspiracy theories: a critical introduction. Palgrave Macmillan, USA; 2011.

  25. Fine GA, Campion-Vincent V, Heath C (Eds): Rumor mills: the social impact of rumor and legend. Social Problems and Social Issues. Transaction Publishers, USA.

  26. Hogg MA, Blaylock DL: Extremism and the psychology of uncertainty. Blackwell/Claremont Applied Social Psychology Series. Wiley, USA; 2011.

  27. Bauer M: Resistance to new technology: nuclear power, information technology and biotechnology. Cambridge University Press, UK; 1997.

  28. Quattrociocchi W, Caldarelli G, Scala A: Opinion dynamics on interacting networks: media competition and social influence. Sci Rep 2014.

  29. Mocanu D, Rossi L, Zhang Q, Karsai M, Quattrociocchi W: Collective attention in the age of (mis)information. CoRR 2014, abs/1403.3344.

  30. Quattrociocchi W, Amblard F, Galeota E: Selection in scientific networks. Soc Netw Anal Min 2012, 2(3):229–237. doi:10.1007/s13278-011-0043-7

  31. Bessi A, Coletto M, Davidescu GA, Scala A, Caldarelli G, Quattrociocchi W: Science vs conspiracy: collective narratives in the age of (mis)information. Submitted to PLOS ONE; 2014.

  32. Bessi A, Coletto M, Davidescu GA, Scala A, Quattrociocchi W: Misinformation in the loop: the emergence of narratives in OSN. In ItAIS 2014. Genova, Italy; 2014.

  33. Sunstein CR, Vermeule A: Conspiracy theories: causes and cures. J Pol Philos 2009, 17(2):202–227. doi:10.1111/j.1467-9760.2008.00325.x

  34. Bogart LM, Thorburn S: Are HIV/AIDS conspiracy beliefs a barrier to HIV prevention among African Americans? J Acquir Immune Defic Syndr 2005, 38(2):213–218. doi:10.1097/00126334-200502010-00014

  35. Kalichman SC: Denying AIDS: conspiracy theories, pseudoscience, and human tragedy. Springer, Berlin; 2009.

  36. Ennals R, Trushkowsky B, Agosta JM: Highlighting disputed claims on the web. In Proceedings of the 19th International Conference on World Wide Web (WWW ’10). ACM, New York, NY, USA; 2010:341–350. doi:10.1145/1772690.1772726

  37. McKelvey KR, Menczer F: Truthy: enabling the study of online social networks. In Proceedings of the 2013 Conference on Computer Supported Cooperative Work Companion (CSCW ’13). ACM, New York, NY, USA; 2013:23–26.

  38. Onnela J-P, Reed-Tsochas F: Spontaneous emergence of social influence in online systems. Proc Natl Acad Sci 2010, 107(43):18375–18380. doi:10.1073/pnas.0914572107

  39. Ugander J, Backstrom L, Marlow C, Kleinberg J: Structural diversity in social contagion. Proc Natl Acad Sci 2012, 109(16):5962–5966. doi:10.1073/pnas.1116502109

  40. Lewis K, Gonzalez M, Kaufman J: Social selection and peer influence in an online social network. Proc Natl Acad Sci 2012, 109:68–72. doi:10.1073/pnas.1109739109

  41. Mocanu D, Baronchelli A, Gonçalves B, Perra N, Zhang Q, Vespignani A: The Twitter of Babel: mapping world languages through microblogging platforms. PLOS ONE 2013, 8(4):e61981.

  42. Adamic L, Glance N: The political blogosphere and the 2004 U.S. election: divided they blog. In Proceedings of the 3rd International Workshop on Link Discovery (LinkKDD ’05). ACM, Chicago, IL, USA; 2005:36–43. doi:10.1145/1134271.1134277

  43. Kleinberg J: Analysis of large-scale social and information networks. Philos Trans R Soc A 2013, 371.

  44. Hannak A, Margolin D, Keegan B, Weber I: Get back! You don’t know me like that: the social mediation of fact checking interventions in Twitter conversations. In Proceedings of the 8th International AAAI Conference on Weblogs and Social Media (ICWSM ’14). Ann Arbor, MI; 2014.

  45. simplyhumans: Simply humans – benefic effect of lemons. Website, July 2014. Last checked: 31.07.2014.

  46. simplyhumans: Dieta personalizzata – benefic effect of lemons. Website, July 2014. Last checked: 31.07.2014.

  47. simplyhumans: Simply humans – benefic effect of lemons. Website, July 2014. Last checked: 31.07.2014.

  48. Ambrosetti G: I forconi: “il senato ha approvato una legge per i parlamentari in crisi”. Chi non verrà rieletto, oltre alla buonuscita, si beccherà altri soldi. Sarà vero? Website, August 2013. Last checked: 19.01.2014.

  49. Facebook: Using the Graph API. Website, August 2013. Last checked: 19.01.2014.

  50. Semplicementeme. Website, August 2012. Last checked: 19.01.2014.

  51. Ellison NB, Steinfield C, Lampe C: The benefits of Facebook “friends”: social capital and college students’ use of online social network sites. J Comput-Mediat Commun 2007, 12(4):1143–1168. doi:10.1111/j.1083-6101.2007.00367.x

  52. Joinson AN: Looking at, looking up or keeping up with people?: motives and use of Facebook. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’08). ACM, New York, NY, USA; 2008:1027–1036.

  53. Viswanath B, Mislove A, Cha M, Gummadi KP: On the evolution of user interaction in Facebook. In Proceedings of the 2nd ACM Workshop on Online Social Networks (WOSN ’09). ACM, New York, NY, USA; 2009:37–42. doi:10.1145/1592665.1592675

Author information

Corresponding author

Correspondence to Walter Quattrociocchi.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

LR implemented the tools to download the data; QZ, WQ, and AB performed the analysis; AB, AS, and WQ wrote the paper. All authors read and approved the final manuscript.

Rights and permissions

Open Access  This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.

The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Bessi, A., Scala, A., Rossi, L. et al. The economy of attention in the age of (mis)information. J Trust Manag 1, 12 (2014). https://doi.org/10.1186/s40493-014-0012-y


  • DOI: https://doi.org/10.1186/s40493-014-0012-y
