Toby Hopp, Patrick Ferrucci, and Chris J. Vargo

Abstract

Recently, substantial attention has been paid to the spread of highly partisan and often factually incorrect information (i.e., so-called “fake news”) on social media. In this study, we attempt to extend current knowledge on this topic by exploring the degree to which individual levels of ideological extremity, social trust, and trust in the news media are associated with the dissemination of countermedia content, or web-based, ideologically extreme information that uses false, biased, misleading, and hyper-partisan claims to counter the knowledge produced by the mainstream news media. To investigate these possible associations, we used a combination of self-report survey data and trace data collected from Facebook and Twitter. The results suggested that sharing countermedia content on Facebook is positively associated with ideological extremity and negatively associated with trust in the mainstream news media. On Twitter, we found evidence that countermedia content sharing is negatively associated with social trust.

In the aftermath of the 2016 United States presidential election, then-President Barack Obama called the spread of false political information a corrosive threat to society (Higgins, McIntire, & Dance, 2016). Reflecting similar concerns, news organizations have devoted substantial amounts of column space to identifying actors who have profited from the spread of false, misleading, hyper-partisan, and sensationalized political information (e.g., Ohlheiser, 2016). Somewhat distressingly, traditional renderings of the public sphere hold that factually incorrect or heavily biased political information is not supposed to persist in the public sphere, as civil society will, theoretically, recognize its uselessness and force it out (Habermas, 1991). Indeed, the gatekeeping metaphor articulates the processes by which journalists dutifully guard the public sphere, sift through reams of information for verification, and then disseminate only accurate, pertinent, and important news (Shoemaker & Vos, 2009). However, the emergence of social media, coupled with rapidly declining levels of institutional and social trust (e.g., Edelman, 2017) and the weakening authority of professional journalists (e.g., Tong, 2018), has resulted in an information environment that is perhaps uniquely conducive to the spread of untrue or otherwise misleading political information.

In light of structural changes to informational, social, and political ecosystems, and with the so-called fake news epidemic as a backdrop, this study set out to better understand how false and hyper-partisan political information is spread on social media. We argue that information commonly referred to as fake news may be better understood as a type of “countermedia” content (e.g., Ylä-Anttila, Bauvois, & Pyrhönen, 2019), or web-based information that employs a combination of false, biased, and misleading claims as a means of countering the knowledge traditionally produced by the mainstream press. We specifically suggest that countermedia content is typically ideologically extreme and tends to invoke themes of mistrust in both social others and the mainstream news media.
Furthermore, drawing on prior work on selective sharing (Shin & Thorson, 2017), we predict that countermedia content is most likely to be shared on social media by those who are ideologically extreme, those with low levels of social trust, and those with low levels of trust in the mainstream news. These hypotheses are tested using a unique data set that combined trace data from Facebook and Twitter with self-reported survey data.

The mainstream news media as an epistemic authority

Most descriptions of the functions of the media in the 20th century implicitly utilize the theoretical vision of the public sphere cultivated by Habermas (1991). Essential to these accounts of the public sphere's proper functioning is the idea of gatekeeping, which is the process by which a piece of information is selected, altered, and shaped into a digestible message suitable for large-scale distribution and consumption (Shoemaker & Vos, 2009). During the gatekeeping process, a journalist gathers as much information as possible and then selects—based on a host of factors—the pieces of information most important for the democratic foundation of a society. Thus, for at least the last several decades, journalists have normatively engaged in a set of practices where they objectively assess information and provide a truthful, good-faith rendering of reality that represents all legitimate viewpoints (Schudson, 2001). In this way, the mainstream media assumed the role of an epistemic authority within the dominant public sphere (Bruns, 2018).

Presently, however, the mainstream press' capacity to serve as an epistemic authority is regularly challenged (e.g., Rojas, 2010; Tong, 2018). This erosion of authority creates new epistemic opportunities for non-journalistic entities. Indeed, according to Kreiss (2017, p. 452), “as a result of a world that has changed around them economically, technologically, and politically, professional journalists now face multiple crises … including of legitimacy.” Such a weakening of authority effectively undermines “journalism's ability to serve as a communicative institution in the civil sphere that ultimately protects democratic values” (Kreiss, 2017, p. 452).

The factors causing the dilution of the modern press' epistemic authority are both complicated and diffuse (e.g., Waisbord, 2018). That said, there is reason to suspect that the emergence of social and digital media infrastructures plays an especially critical role in the promotion of alternative sources of epistemic authority. On one hand, the connectivity (Ellison, Steinfield, & Lampe, 2011), information-acquisition (Herdağdelen, Zuo, Gard-Murray, & Bar-Yam, 2013), and shareability (Oeldorf-Hirsch & Sundar, 2015) affordances of digital and social media allow non-institutional actors to create and disseminate content that both bypasses traditional gatekeeping processes and directly challenges the mainstream press' epistemic authority. On the other hand, digital and social networking platforms have—in a myriad of ways—also negatively affected the business models that have, historically speaking, sustained professional journalism (Usher & Carlson, 2018). It should, then, be no surprise that scholars have repeatedly linked the current “post-truth” state of society to digital media platforms generally and to social media platforms specifically (e.g., Hannan, 2018; Lazer et al., 2018).
Below, we argue that the social media–based spread of false, biased, and/or misleading political information is most likely to occur when such information is perceived by audiences to be personally relevant and attitudinally consistent (e.g., Shin & Thorson, 2017). First, however, we present and define the concept of countermedia, which is central to this study.

Countermedia

Recently, scholars (e.g., Noppari, Hiltunen, & Ahva, 2019; Wasilewski, 2019; Ylä-Anttila, 2018; Ylä-Anttila et al., 2019) have argued that information commonly referred to as fake news in the popular and academic literatures may be better understood as countermedia content. Numerous difficulties arise when seeking to precisely conceptualize fake news as a social phenomenon. First, the word “fake” connotes intentionality: a manifest attempt on the part of the message creator to spread untrue information. This raises obvious issues, as it is often impossible to objectively assess communicator intent. Second, the phrase “fake news” lends itself to politicization (Lazer et al., 2018). Indeed, one recent study suggested that the use of the phrase may hamper audiences' ability to distinguish news content from false political information (Van Duyn & Collier, 2018). Third, social and political events are often subject to interpretation (Guess, Nagler, & Tucker, 2019). In many cases, it is difficult to establish specific criteria for what constitutes objectively false information. Fourth, many websites that publish false political information also “distort genuine news reports, and copy or repurpose content from other outlets” and “selectively amplify political events in an over-the-top style that flatters the prejudices of a candidate's supporters” (Guess, Nyhan, & Reifler, 2018, p. 4). Classifying all content appearing on these sites as outright false is inaccurate and has led authors such as Mourão and Robertson (2019, p. 1) to arrive at somewhat paradoxical conclusions, such as “fake news is defined more by partisanship and identity politics than misinformation and deception.”

Other proposed labels present similar problems. For instance, Wardle and Derakhshan (2017, p. 20) argued for the adoption of the terms “misinformation” (“information that is false, but not created with the intention of causing harm”) and “disinformation” (“information that is false and deliberately created to harm a person, social group, organization or country”). Similarly, the “mock news” label introduced by Ekström and Westlund (2019, p. 264) defines content in terms of “imitating the tone and appearance of news material” while being “comprised of intentionally fake content.” These terms again rely problematically upon both an assumption about the motivations driving content creators (i.e., that there exists an intent to inflict harm) and a presumption that the information is, indeed, factually incorrect, rather than simply biased in tone, framing, or story selection.

In light of the foregoing conceptual issues that pertain to the articulation of fake news, we adopt and adapt the term countermedia in this work. According to Ylä-Anttila et al. (2019, p. 1),
countermedia sites are “media outlets, but also tend to explicitly oppose ‘the (mainstream) media,’ as well as the establishment more generally (however ambiguously defined).” These sites produce informational content that combines “fact with fiction and rumors, sometimes intentionally blurring the lines or spreading outright lies, most often by cherry-picking, coloring, and framing information” (Ylä-Anttila, 2018, p. 357). Frequently, countermedia sites adopt the tone and appearance of traditional news while simultaneously rejecting the mainstream press' normative values of objectivity and verifiability. According to Noppari, Hiltunen, and Ahva (2019, p. 25), countermedia websites seek to “oppose, challenge and offer alternatives to established media coverage and discussion within specific areas.” In this way, countermedia sites provide counterknowledge, or “alternative knowledge which challenges establishment knowledge, replacing knowledge authorities with new ones, thus providing an opportunity for political mobilization” (Ylä-Anttila, 2018, p. 359).

Importantly, some prior research has rightly pointed out that disenfranchised groups can use alternative forms of non-mainstream journalistic information to pursue goals related to inclusion and social justice. Ylä-Anttila et al. (2019) addressed this tension by arguing that countermedia can be distinguished from other (normatively and democratically positive) forms of alternative media because countermedia content seeks to inhibit, rather than enhance, core democratic values, such as inclusion, veracity, and fairness. Taken as a whole, then, the countermedia conceptualization encompasses outright false information while also accounting for more frequently observed instances of hyper-partisan information that can be contrasted with the normatively objective knowledge produced by the mainstream press. In the sections below, we provide further clarification of the countermedia concept by using extant research to describe its typifying characteristics.

Ideological extremity as a countermedia attribute

Countermedia content is commonly ideologically extreme in nature. For instance, in Narayanan et al.'s (2018, p. 2) attempt to classify “junk news sites” (broadly defined as outlets that “deliberately publish misleading, deceptive or incorrect information purporting to be real news about politics, economics or culture”), the authors identified political bias as a key attribute, noting that “reporting in these outlets is highly biased and ideologically skewed, which is otherwise described as hyper-partisan reporting. These outlets frequently present opinion and commentary essays as news.” Likewise, Allcott and Gentzkow (2017, p. 216) remarked that so-called fake news websites produce content that very deliberately takes advantage of the “increasingly negative feelings each side of the political spectrum holds toward the other.” Lewis and Marwick (2017, p. 18) noted that ideologically extreme right-wing groups, such as “White supremacists, men's rights activists, 4chan and 8chan users, trolls, and conspiracy theorists,” frequently used various forms of false political information in the service of “redpilling,” or recruiting those with mainstream beliefs into extremist spaces.
Finally, Mourão and Robertson's (2019) recent content analysis of popular fake news sites found that approximately 55% of all articles contained some level of partisanship and that around a quarter of all articles were either moderately or extremely partisan in nature. Taken as a whole, this research suggests that countermedia sites frequently produce hyper-partisan content that contrasts sharply with the knowledge generated by the normatively non-partisan mainstream press.

Social mistrust as a countermedia attribute

In addition to producing ideologically extreme content, countermedia sites also commonly feature content that capitalizes on and stokes feelings of social mistrust. These sites commonly promote a worldview that divides people into two groups: those in the know (a small group of people, such as readers of the site) and “sheeple” (everyone else, and especially those who populate the dominant political sphere; e.g., Harambam & Aupers, 2014; Ylä-Anttila, 2018). The emphasis here is that most people are either ignorant or operating with ulterior motives (Weinberg, 2019) and are, therefore, unworthy of trust. For example, the prominent countermedia site Breitbart is frequently and explicitly critical of a host of social groups, including racial/ethnic minority groups, so-called social justice warriors, globalists, feminists, and Muslims (e.g., Hylton, 2017). By thematically invoking feelings of social mistrust, countermedia provide new forms of knowledge that are not based upon factuality or verifiability. Instead, these sites construct knowledge within the framework of individual and social identity attributes, effectively winnowing the subset of trustable others to a select few.

Mistrust in the mainstream news media as a countermedia attribute

As sources of epistemic renderings that run counter to those found in the mainstream press, countermedia sites simultaneously capitalize on and deepen mistrust in the mainstream news media. Scholars frequently tie the rise of fake news to collapsing levels of trust in the mainstream media (e.g., Allcott & Gentzkow, 2017; Lewis & Marwick, 2017). Because countermedia sites offer information that contradicts or modifies the reporting found in the mainstream news media, it is particularly advantageous for them to maintain an adversarial (and at times, antagonistic) relationship with the mainstream press. For instance, Breitbart regularly highlights instances of mainstream reporting that it excoriates as fake, false, and/or intentionally misleading (in other words, assuming the self-appointed role of mainstream-media watchdog). In this sense, countermedia often take on a “corrective” character (Rojas, 2010), as they seek to address and correct the inadequate (and ultimately, untrustworthy) reporting proffered by the mainstream news media.

Selective sharing of countermedia content

Countermedia dissemination relies heavily upon the sharing behaviors of individual users. Accordingly, understanding the individual attributes of those most likely to share this content is critical to a comprehensive understanding of the spread of democratically deleterious information online. Prior research by Noppari et al. (2019, p. 28) linked the consumption of Finnish countermedia sites to audiences of societal outsiders who were often ideologically extreme, held feelings of “social alienation,” and distrusted the mainstream generally and mainstream news specifically.
In this study, we extend these findings to the sharing of countermedia content using Shin and Thorson's (2017) concept of selective political information sharing. Specifically, these authors argue that social media users are “likely to share ideologically congenial information” (Shin & Thorson, 2017, p. 235), reasoning that “sharing is fundamentally a social activity intended for or motivated by the imagined audience” (Shin & Thorson, 2017, p. 236) and that sharing ideologically consonant information helps mitigate the dissonance incurred by the existence of contradictory information, particularly as it pertains to one's in-group. Other, similar work on social media–based content sharing suggests that people may selectively share attitudinally consonant content for a variety of reasons linked to personal relevancy, identity management, and personal influence (Bobkowski, 2015; Liang, 2018). More broadly, this line of inquiry can be attached to research on informational utility, which shows that people tend to digitally share news information that resonates with their own lives and is consistent with their political and social worldviews (Knobloch-Westerwick, 2015). Taken as a whole, the selective sharing perspective indicates that people share information on social media that they agree with and that they believe can make them look good to others, obtain/maintain social status, and/or accomplish personal influence–linked goals.

One important aspect of the selective sharing perspective relates to social media's capacity to afford users the ability to take corrective actions (Liang, 2018; Shin & Thorson, 2017). Corrective actions are reactive behaviors that involve individual actors “correcting” facts and narratives derived from the mainstream news media that are perceived to be faulty, incorrect, incomplete, or biased (Rojas, 2010). These behaviors commonly occur when an actor believes that some aspect of her or his worldview or personal/social identity is under threat. The social connection, information acquisition, and shareability affordances of social media facilitate the performance of corrective actions because many barriers associated with traditional forms of participation are reduced or removed altogether (e.g., Barnidge & Rojas, 2014).

In terms of partisan identity, there exists ample empirical support for the selective sharing perspective. Shin and Thorson (2017, p. 233) found that Twitter-based partisans selectively shared fact-check articles that “cheerlead their own candidate and denigrate the opposing party's candidate.” This finding is consistent with other research showing that social media users tend to like and share content that conforms to their partisan beliefs (Colleoni, Rozza, & Arvidsson, 2014). Perhaps most pertinent to the current study, studies on mainstream news sharing tend to show that Democrats/liberals are more likely than Republicans/conservatives to post links to news articles (e.g., Weeks & Holbert, 2013).

In this study, we build upon and extend the selective sharing approach to argue that congruency between countermedia content and personal attributes will result in a heightened likelihood of countermedia content sharing on social media. Notably, we look beyond straightforward partisan factors and, instead, focus on other critical features of the political and social self.
This line of inquiry is, we believe, important because partisan identification is itself an outcome linked to various features of one's traits, socialization, and worldviews (e.g., Caprara, Barbaranelli, & Zimbardo, 1999; Greene, 1999). It stands to reason that exploring factors more nuanced than one's identification as a Democrat, Republican, or Independent may yield a more precise rendering of the factors associated with online content sharing. Moreover, countermedia content does not always neatly or absolutely conform to the positions held by political parties (Noppari et al., 2019). In the sections below, we build upon the notion of selective sharing to theorize that those who are ideologically extreme, high in social mistrust, and high in news media mistrust will extract the most informational utility from countermedia and, hence, will be the most likely to share it on social media.

Ideological extremity and countermedia sharing

As an epistemic authority in the public sphere, the mainstream news media has traditionally embraced deliberative values of good-faith compromise between those with divergent—albeit legitimate—perspectives (Hallin, 1989). For this reason, the mainstream press has tended to reflect an era-specific, centrist position (Schlesinger, 1990). Such an approach is obviously unattractive to those with politically extreme beliefs, as those beliefs are implicitly understood as deviant by the mainstream journalistic apparatus; it should, then, not be a surprise that research shows that those with extreme political positions are unlikely to feel positive sentiment toward the mainstream media (Hallin, 1989). Indeed, research into fake news consumption has reliably indicated that individuals positioned at the extreme ends of the ideological spectrum consume and share false political information at comparably higher rates (e.g., Grinberg, Joseph, Friedland, Thompson, & Lazer, 2019; Guess et al., 2019; Guess et al., 2018). Mourão and Robertson's (2019) recent study indicated that fake news articles featuring partisanship were the most likely to be shared on both Facebook and Twitter. Of note, this study found that a folded measure of partisan strength better accounted for link sharing than directional measures of partisan identification/valence. Based upon these findings, we theorize that ideologically extreme social media users are unlikely to perceive the mainstream news as possessing informational utility. Conversely, the sort of counterknowledge provided by the often hyper-partisan countermedia does offer informational utility, resulting in the user taking corrective action by selectively sharing countermedia content.

H1: Ideological extremity will be positively associated with sharing countermedia content on social media.

Social mistrust and countermedia sharing

Normative approaches to political and civic participation suggest that commitment to the dominant public sphere is conditioned upon the belief that a wide variety of social others can be trusted and that these other actors share a mutual desire for social progress and betterment (Brants, 2013). This perception permeates mainstream news coverage, which has normatively adopted an approach that gives voice to multiple sides of a given issue or news item (Schudson, 2001). In other words, mainstream journalism tends to treat actors as good-faith entities. This can be contrasted with countermedia content, which often coalesces around themes of mistrust in others (Harambam & Aupers, 2014; Ylä-Anttila, 2018).
For individuals with low levels of social trust, it thus seems reasonable to propose that countermedia offers a great deal of informational utility (especially in comparison to mainstream news content), as it confirms their feelings that social others generally operate from a place of bad faith and should be considered adversarial in nature. Sharing countermedia content online, then, takes on a corrective form, as it can be used to signal that normative presumptions of good faith (including those held by the mainstream media) are faulty. Conversely, prior research on fake news verification behaviors suggests that those with high levels of general social trust are likely to assemble online networks comprised of diverse others. These networks lead to greater informational inflows, which, ultimately, stimulate verification behaviors that hamper the spread of fake news (Torres, Gerhart, & Negahban, 2018).

H2: Social trust will be negatively associated with sharing countermedia content on social media.

Mistrust in the mainstream news and countermedia sharing

Trust in the mainstream media is a critical precursor to its use (Tsfati & Cappella, 2003). Research has shown that those who do not trust the mainstream news obtain information from other sources (Ardèvol-Abreu & Gil de Zúñiga, 2017). For those who do not believe that the mainstream press is accurately reporting on important social and political issues, the alternative renderings of reality offered by countermedia sites should have enhanced informational utility (Allcott & Gentzkow, 2017). Furthermore, posting countermedia content on social media allows for corrective actions to be taken on a number of levels, as the promotion of these sources of counterknowledge allows for questioning of both the facts and the institutions that have historically been charged with disseminating facts (Noppari et al., 2019).

H3: Trust in the mainstream news media will be negatively associated with sharing countermedia content on social media.

Implications for mainstream news sharing

The basic principle underlying the selective sharing perspective is that people share content that provides personal and social benefits and do not share content that contradicts ideological, attitudinal, and/or identity attributes (Shin & Thorson, 2017). In this study, we have suggested that countermedia exists as an alternative form of knowledge that directly challenges the authority of the traditional press. Building on these two general propositions, H1–H3 stipulated that congruency between countermedia content and social media user features should result in countermedia content sharing being highest among the ideologically extreme, those with low social trust, and those with low levels of trust in the mainstream media. If the foregoing contentions are correct, it should also be the case that the aforementioned variables are either unrelated to mainstream news sharing or associated with it in a different manner (i.e., the associations should be asymmetrical).

H4: The associations between mainstream news sharing and ideological extremity, social trust, and mainstream news trust should be comparatively asymmetrical to the associations between countermedia sharing and ideological extremity, social trust, and mainstream news trust.

Platform differences

In this work, we focus on Facebook and Twitter, two of the most popular social media platforms in the United States for political discussion. It is possible—and perhaps even probable—that countermedia sharing behaviors differ across social media platforms.
Research on fake news has shown that Facebook serves as an especially important facilitator of the spread of fake news (Allcott & Gentzkow, 2017; Guess et al., 2019; Guess et al., 2018). Separately, cross-platform studies suggest that social media sites have different technological and social capabilities and that these factors may, together, result in platform-specific usage behaviors (e.g., Shane-Simpson, Manago, & Gillespie-Lynch, 2018). However, while it may be likely that countermedia content sharing behaviors differ across platforms, the research as it currently stands does not provide enough information to form specific hypotheses about the degree to which associations between the independent variables of primary interest and countermedia content sharing may differ across Facebook and Twitter. Nonetheless, the question is potentially important.

RQ1: Do associations between ideological extremity, social trust, and trust in the mainstream news media and countermedia content sharing differ across Facebook and Twitter?

Method

This study combined digital trace data collected on Facebook and Twitter with self-report data obtained via a survey. Study recruitment was accomplished using Qualtrics. Four sample controls were enforced: an approximate 50/50 gender split1; a requirement that respondents be active users of both Facebook and Twitter2; a requirement that respondents be current U.S. citizens; and a requirement that respondents be 18 years or older. Given the exploratory nature of the current study, we also requested in the recruitment that respondents talk about politics online at least monthly. However, due to method-based limitations, this was not enforced as a sample inclusion requirement.3

Before participating in the study, respondents were informed of the requirements governing participation and provided with a consent form that articulated the parameters of data collection. Regarding the collection of social media data, respondents were told that social media messages “will be collected anonymously, and at no time will the researchers know your identity, or the identities of your friends. The data will solely be used to better understand how you share news on social media.” Upon agreement to the terms of the study, respondents were asked to authorize a custom application that was used to harvest their social media data. After authorizing the application, respondents were piped into the survey environment, which was hosted on Qualtrics' servers. Self-report and social media data were joined using an anonymous identification code assigned by the web application.

The following information was extracted from the Facebook API: mobile_status_update, created_note, shared_story, created_event, wall_post, app_created_story, and published_story. We did not capture newsfeed or friend information. In the case of Twitter, only the content of posted tweets was retained. For both Facebook and Twitter, our data extraction processes allowed us to assess the text and links posted by users. Our institutional review board approved all study procedures. All data collection processes fully conformed with both platforms' terms of service at the time of execution. Data collection occurred between 7 March 2017 and 6 June 2017. Social media data were downloaded on 6 June 2017; the obtained social media data for each user were thus exhaustive from their first post on the platform up until the harvest date.
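To make the linkage step concrete, the join amounts to a simple key match on the app-assigned anonymous ID. The following is a minimal illustrative sketch, not the study's actual pipeline; file and column names are hypothetical:

```python
import pandas as pd

# Hypothetical file and column names. The only key linking the survey
# export to the harvested Facebook/Twitter posts is the anonymous ID
# assigned by the custom web application.
survey = pd.read_csv("survey_responses.csv")  # one row per respondent; includes "anon_id"
posts = pd.read_csv("harvested_posts.csv")    # one row per post; includes "anon_id", "platform", "text"

# An inner join retains only respondents for whom both self-report and
# trace data exist; validate= guards against duplicated survey rows.
merged = posts.merge(survey, on="anon_id", how="inner", validate="many_to_one")
```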
Facebook and Twitter analytic samples

A total of 783 valid responses were collected.4 Two analytic subsamples were created from the overall sample: a Facebook analytic sample and a Twitter analytic sample. For both platforms, we coded content that appeared between 1 August 2015 and 6 June 2017. This allowed us to assess content posted in the period before, during, and after the 2016 U.S. presidential election. In each subsample, we deleted those cases whose platform activity period began after 1 August 2015 to ensure that the study period was consistent for all analyzed users. This resulted in the deletion of 32 respondents from the Facebook sample (n = 751) and 184 cases from the Twitter sample (n = 599). For the reported statistical analyses, we focused on those who provided complete information on the self-report measures of interest, resulting in a final analytic n of 678 for Facebook and 543 for Twitter. In all, 525 respondents were in both the Facebook and Twitter analytic samples.5,6

Measures

Countermedia content sharing

Countermedia sharing frequency was assessed using a list-matching procedure. To develop a comprehensive list of countermedia sites, we first generated a corpus of domains by pooling lists of so-called fake news domains published by About.com, Aloisius, CBS News, The Daily Dot, Fake News Watch, Fake News Checker, Melissa Zimdars, NPR, Snopes Field Guide, The New Republic, and U.S. News and World Report.7 To appear on our list, a site had to appear on at least two of the identified fake news domain lists. We then manually reviewed the resulting roster to ensure that the included sites were predominantly focused on political and/or social events and were not parody outlets (e.g., The Onion, Clickhole). In total, we identified 106 countermedia websites. A Python script was used to extract all social media posts that contained articles from the identified countermedia domains. Content from a total of 62 domains was shared across both platforms (see Supporting Information Appendix A for more details).

To create the final sharing measure, we summed the number of posts containing countermedia links for each user. In the Facebook sample, a total of 1,152 countermedia content shares were identified. The maximum number of pieces of content shared by a single user was 171. The majority of respondents (71.1%) did not share any countermedia items. The overall mean number of shares was 1.70 (SD = 8.54). Among the subsample of users who shared at least one instance of countermedia, the mean number of shares was 5.88 (SD = 15.11). In the Twitter sample, 128 countermedia content shares were identified. No user shared more than 33 countermedia content items. A large majority (95.0%) of respondents did not share any countermedia content. The overall countermedia sharing mean was 0.24 (SD = 1.94). Among those who shared at least one piece of countermedia content, the mean number of shares was 4.74 (SD = 7.49). Density and histogram plots showing the countermedia sharing distributions for both Facebook and Twitter are provided in Supporting Information Appendix B.

Mainstream news content sharing

To measure mainstream news content sharing frequency, we used Vargo and Guo's (2017) list of 1,914 “traditional” news sources. As was the case for countermedia sharing, the final user-level measure was the total number of mainstream news shares occurring between 1 August 2015 and 6 June 2017.
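Both sharing measures are list-matching counts over posted links. A minimal sketch of such a procedure is shown below; the domain set is an illustrative stand-in (the actual lists comprise 106 countermedia domains and 1,914 mainstream domains), and posts are assumed to be available as (user_id, text) pairs:

```python
import re
from collections import Counter
from urllib.parse import urlparse

# Illustrative stand-in for the real domain lists.
COUNTERMEDIA_DOMAINS = {"example-countermedia.com", "another-example.net"}

URL_PATTERN = re.compile(r"https?://\S+")

def domain_of(url: str) -> str:
    """Reduce a URL to its bare host name (lowercased, no leading www)."""
    return urlparse(url).netloc.lower().removeprefix("www.")

def shares_per_user(posts, domains):
    """Count each user's posts that link to at least one listed domain."""
    counts = Counter()
    for user_id, text in posts:
        if any(domain_of(url) in domains for url in URL_PATTERN.findall(text)):
            counts[user_id] += 1
    return counts

# Example: two posts by one user, one of which links to a listed domain.
posts = [(42, "check this http://example-countermedia.com/story"), (42, "hello")]
print(shares_per_user(posts, COUNTERMEDIA_DOMAINS))  # Counter({42: 1})
```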
A total of 7,468 mainstream news shares were observed in the Facebook sample. In all, 512 participants (75.5%) shared at least one mainstream news article (maximum number of observed shares = 456). Across the sample, the mean number of shares was 11.01 (SD = 29.13). Among those who shared at least one mainstream news article, the mean number of shares was 14.60 articles (SD = 32.75). Across the Twitter sample, a total of 4,229 mainstream news shares from 156 respondents (28.7% of the sample) were recorded. The maximum number of observed shares for a single user was 2,750. Among those sharing at least one content instance, the mean number of shares was 27.11 (SD = 22.59). Density plots and histograms for the mainstream news sharing variables are provided in Supporting Information Appendix B.

Ideological extremity

Ideological attitude extremity was measured by assessing participant conservatism using three items, all on 7-point scales where higher scores indicated higher levels of conservatism. These individual items were subsequently recoded onto a 4-point scale in which options at the scale poles were assigned higher numbers (i.e., a 1 or 7 was coded as a 4, a 2 or 6 was coded as a 3, and so on) and then collapsed into a single measure (Facebook sample: M = 2.44, SD = 1.09, α = .95; Twitter sample: M = 2.45, SD = 1.08, α = .94). A compact sketch of this recode appears after the trust measures below.

Social trust

Social trust was measured using six items (all on 7-point scales) that, together, assessed the degree to which the respondent held reciprocal relationships with a variety of social others, including friends, family members, and members of their surrounding community. According to Fukuyama (1996), wide-ranging feelings of mutuality and affinity can be understood as the end product of high levels of social trust. In this sense, our measure tapped perceptions of accumulated social trust with a variety of others, which, as described above, has been shown to be an integral part of participation in the public sphere (Facebook sample: M = 5.29, SD = 1.12, α = .81; Twitter sample: M = 5.32, SD = 1.06, α = .79).

Mainstream news media trust

Trust in the mainstream news media was measured using eight items, all adapted from Kohring and Matthes (2007) and all on 7-point scales. The scale taps four aspects of trust: trust in the media's selectivity of topics; trust in the media's selectivity of facts; trust in the accuracy of depictions found in the mainstream news media; and trust in the journalistic assessments found in the mainstream news media. A single index was formed from the individual items (Facebook sample: M = 4.07, SD = 1.39, α = .94; Twitter sample: M = 4.05, SD = 1.39, α = .94).
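To make the folded extremity recode concrete, a minimal sketch follows (column names are hypothetical, and averaging the three folded items is an assumption, as the text states only that they were collapsed into a single measure):

```python
import pandas as pd

def fold(item: int) -> int:
    """Fold a 7-point item onto a 4-point extremity scale:
    1 or 7 -> 4, 2 or 6 -> 3, 3 or 5 -> 2, 4 -> 1.
    """
    return abs(item - 4) + 1

# Hypothetical conservatism items, each on a 1-7 scale.
df = pd.DataFrame({"con1": [1, 4, 6], "con2": [2, 4, 7], "con3": [1, 3, 7]})
df["extremity"] = df[["con1", "con2", "con3"]].applymap(fold).mean(axis=1)
```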
Control measures

Control factors were divided into three broad categories: political factors, media use factors, and demographic factors. Political interest was measured using four items, all on 7-point scales where higher scores indicated higher levels of political interest. To measure party identification, respondents were asked to select the option that best represented their party affiliation; this measure was subsequently dummy coded, with Democrat set as the reference group. Conservatism was measured using the three items discussed in the context of the ideological extremity measure.

For media use factors, we asked respondents to indicate how often they: actively sought out political content on social media; watched cable news; watched network news; read the newspaper; and read political blogs. All items were placed on 7-point scales (1 = never and 7 = frequently). We also asked respondents to indicate the degree to which they believed that posting political content on social media is appropriate (1 = strongly disagree, 7 = strongly agree). Platform usage intensity was assessed by asking respondents how often they use Facebook and Twitter (1 = never and 7 = frequently). Finally, as a measure of platform activity duration, we calculated the time (in years) between the respondent's first and most recent post. For demographics, the survey assessed respondents' sex and age. Complete wording for all measures (primary and control) is provided in Supporting Information Appendix C. Full descriptive statistics for all control variables are reported in Supporting Information Appendix D.

Modeling approach

The distributions of the content sharing variables created some complications for the selection of appropriate modeling techniques. In the case of Twitter, the exceptionally small number of cases with non-zero countermedia sharing counts (n = 27) raised concerns about power and about our ability to identify a stable and unbiased count model for the data. Accordingly, we used a logistic regression model to assess the relationships between the focal independent variables and the odds of sharing one or more countermedia content items on Twitter. For mainstream news content sharing frequency, we again observed a relatively small number of participants with non-zero counts (n = 156); within this subsample, the five most active users accounted for over 75% of all shared content. The relatively small number of non-zero cases, combined with the presence of outlying cases, again made it difficult to identify a trustworthy modeling approach for the count data; as such, a binary logistic regression model was employed.

In the case of Facebook, we observed a comparatively greater number of cases with non-zero share counts for both the countermedia (n = 196) and mainstream news (n = 512) variables. As such, we were reasonably confident in our ability to model the positive counts in an unbiased manner. Four plausible count modeling approaches were considered: a Poisson model, a negative binomial model, a two-part Poisson logit hurdle model, and a two-part negative binomial logit hurdle model (NBLH; for a full discussion of these models, see Hilbe, 2014). Model fit was evaluated empirically; Supporting Information Appendix E provides further discussion of the considered modeling approaches and describes the model fit evaluation process. A comparative model fit evaluation suggested that the NBLH approach was best suited to the current data. NBLH models partition the overall model into two components: (1) a binary (logistic) model that accounts for zero versus positive counts; and (2) a truncated negative binomial model that addresses all positive counts (Hilbe, 2014). For the Twitter models and the binary component of the Facebook NBLH models, we provide both the logged odds (b) and odds ratios (ORs), the latter of which reflect the odds of sharing one or more instances of countermedia content associated with a one-unit increase in the predictor variable. For the negative binomial component of the NBLH models, we provide the logged counts (b) and incidence rate ratios (IRRs); the IRR coefficients can be interpreted as the rate of change in countermedia content sharing given a one-unit increase in the independent variable under consideration.
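As a rough illustration of this two-part logic, the sketch below fits the hurdle (zero versus any sharing) as a logistic regression and then fits a count model to the positive counts only. Variable names are hypothetical, and the count stage approximates the truncated negative binomial used in the paper with a standard negative binomial, so it is a simplification of the actual NBLH estimation:

```python
import pandas as pd
import statsmodels.api as sm

def fit_hurdle(df: pd.DataFrame, count_col: str, predictors: list):
    """Two-part hurdle sketch: logistic stage plus count stage."""
    X = sm.add_constant(df[predictors])

    # Stage 1: logistic model of zero vs. one-or-more shares.
    logit_fit = sm.Logit((df[count_col] > 0).astype(int), X).fit(disp=False)

    # Stage 2: count model among users with at least one share. The
    # paper's NBLH uses a zero-truncated negative binomial here; a plain
    # negative binomial on the positive counts is a simplification.
    pos = df[df[count_col] > 0]
    nb_fit = sm.NegativeBinomial(pos[count_col], sm.add_constant(pos[predictors])).fit(disp=False)
    return logit_fit, nb_fit

# Exponentiating the coefficients yields the reported ORs (logistic
# stage) and IRRs (count stage), e.g., numpy.exp(logit_fit.params).
```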
In addition to the primary Facebook- and Twitter-based models, several additional models were generated. Specifically, there were several outlying sharers of countermedia in the Facebook sample, so an additional model excluding these cases was estimated. Moreover, we estimated a model examining countermedia content sharing frequency across Facebook and Twitter combined, as we surmised that this model would add context to our results.

Results

H1 predicted that ideological extremity would be positively related to countermedia content sharing. As shown in Table 1, the logistic component of the Facebook model indicated that ideological extremity was positively and statistically significantly related to posting at least one instance of countermedia content on Facebook (b = .34; SE = .10; p < .001; OR = 1.41). In the count component, ideological extremity was positively and statistically significantly associated with the rate at which countermedia content was shared on Facebook (b = .34; SE = .17; p = .044; IRR = 1.41). However, in the Twitter model (see Table 2), we failed to observe a statistically significant relationship between ideological extremity and the odds of sharing one or more instances of countermedia content (b = −.02; SE = .24; p = .929; OR = .98).

Table 1. Logistic and Count Components of the Negative Binomial Logit Hurdle Model Predicting Countermedia Content Sharing on Facebook

                                           Logistic Component          Count Component
Predictor                                  b        SE     OR          b        SE     IRR
Ideological extremity                      0.34**   0.10   1.41        0.34*    0.17   1.41
Social trust                               −0.16†   0.09   0.85        −0.27†   0.14   0.77
News media trust                           −0.27**  0.08   0.76        −0.04    0.10   0.96
Political interest                         0.17     0.12   1.19        −0.02    0.19   0.98
Democrat (0)–Republican (1) contrast       0.26     0.32   1.30        −0.55    0.58   0.58
Democrat (0)–Independent (1) contrast      0.21     0.25   1.23        0.17     0.40   1.19
Democrat (0)–Other (1) contrast            −0.10    0.52   0.90        1.27†    0.73   3.55
Ideological conservatism                   −0.11    0.07   0.90        0.12     0.11   1.12
Use SM for political information           0.07     0.07   1.08        0.02     0.11   1.02
Watch cable TV news                        0.02     0.06   1.02        −0.06    0.09   0.94
Watch network TV news                      −0.09    0.07   0.91        0.03     0.10   1.03
Read newspapers (online or hardcopy)       −0.05    0.06   0.95        −0.22*   0.09   0.80
Read political blogs                       0.07     0.07   1.08        0.38**   0.11   1.46
Acceptable to talk about politics on SM    0.10     0.07   1.10        0.20†    0.12   1.22
Facebook usage intensity                   0.44**   0.12   1.55        −0.38†   0.21   0.69
Facebook activity duration                 −0.02    0.05   0.98        0.00     0.08   1.00
Age                                        0.04**   0.01   1.04        0.08**   0.01   1.08
Sex (1 = female)                           −0.47*   0.20   0.63        0.13     0.28   1.14
Likelihood ratio                           χ² = 131.68, df = 18**      χ² = 100.85, df = 18**
McFadden R²                                .16                         .11

Notes: All variance inflation factors are below 2.75. IRR = incidence rate ratio; OR = odds ratio; SM = social media. †p < .10; *p < .05; **p < .001.
Table 2. Logistic Regression Model Predicting Countermedia Content Sharing on Twitter

Predictor                                  b        SE     OR
Ideological extremity                      −0.02    0.24   0.98
Social trust                               −0.53*   0.21   0.59
News media trust                           0.04     0.17   1.04
Political interest                         0.03     0.27   1.03
Democrat (0)–Republican (1) contrast       0.51     0.73   1.67
Democrat (0)–Independent (1) contrast      0.53     0.62   1.70
Democrat (0)–Other (1) contrast            1.59†    0.91   4.89
Ideological conservatism                   0.16     0.15   1.17
Use SM for political information           0.12     0.16   1.12
Watch cable TV news                        0.08     0.14   1.08
Watch network TV news                      −0.36*   0.15   0.70
Read newspapers (online or hardcopy)       0.05     0.13   1.06
Read political blogs                       0.19     0.15   1.21
Acceptable to talk about politics on SM    0.07     0.17   1.08
Twitter usage intensity                    0.32*    0.15   1.38
Twitter activity duration                  0.06     0.11   1.06
Age                                        0.07**   0.02   1.08
Sex (1 = female)                           −0.80    0.48   0.45
Likelihood ratio                           χ² = 52.77, df = 18**
McFadden R²                                .25

Note: All variance inflation factors are below 2.93. OR = odds ratio; SM = social media. †p < .10; *p < .05; **p < .001.
H2 suggested that social trust would be negatively associated with countermedia content dissemination. In the binary component of the Facebook model, we observed a negative relationship between social trust and countermedia content sharing that approached, but did not meet, the p < .05 criterion for statistical significance (b = −.16; SE = .09; p = .095; OR = .85). A similar effect was observed in the count component of the model (b = −.27; SE = .14; p = .065; IRR = .77). In the Twitter model, we observed a statistically significant, negative relationship between social trust and sharing at least one instance of countermedia content (b = −.53; SE = .21; p = .011; OR = .59).

H3 posited a negative relationship between trust in the mainstream news media and countermedia content sharing on social media. In the binary component of the Facebook model, a negative, statistically significant relationship between trust in the mainstream news media and countermedia content sharing was observed (b = −.27; SE = .08; p < .001; OR = .76). In the count component of the model, however, the relationship between trust in the mainstream media and countermedia content sharing was not significantly different from zero (b = −.04; SE = .10; p = .672; IRR = .96). In the Twitter model, we failed to find a significant relationship between trust in the mainstream news media and the odds of sharing one or more instances of countermedia content (b = .04; SE = .17; p = .826; OR = 1.04).8

Notably, there were several outliers in the Facebook data. Specifically, three users had high countermedia share counts (58, 95, and 172). To assess the degree to which these cases affected our reported results, we removed them and re-estimated the model.
The results of the outlier-excluded model were essentially identical: in the binary component of the model, we observed significant relationships between sharing one or more countermedia content items and both ideological extremity (b = .34; SE = .10; p < .001; OR = 1.41) and media trust (b = −.26; SE = .08; p < .001; OR = .77), while the relationship between sharing at least one countermedia content item and social trust was negative but not significant at the .05 level (b = −.16; SE = .09; p = .096; OR = .86). In the count component of the model, we observed a significant relationship between ideological extremity and countermedia sharing frequency (b = .33; SE = .16; p = .035; IRR = 1.39). The relationships between sharing frequency and both social trust (b = −.25; SE = .13; p = .062; IRR = .79) and trust in the mainstream news media (b = .06; SE = .10; p = .578; IRR = 1.06) were, again, not statistically significant.

An additional NBLH model was generated by combining share totals across platforms. Included in this sample were those whose platform activity began before 1 August 2015 on both platforms (n = 525). In the logistic component, a positive association between ideological extremity and sharing at least one countermedia content item was observed (b = .35; SE = .12; p = .002; OR = 1.43). A negative and statistically significant association was identified between sharing at least one piece of countermedia content and trust in the media (b = −.24; SE = .09; p = .002; OR = .79), while the association between countermedia sharing and social trust was negative but non-significant (b = −.20; SE = .11; p = .083; OR = .82). In the count component of the model, social trust was negatively associated with the countermedia sharing rate (b = −.42; SE = .19; p = .025; IRR = .66). Neither ideological extremity (b = .28; SE = .19; p = .144; IRR = 1.32) nor trust in the mainstream news media (b = .02; SE = .11; p = .888; IRR = 1.02) was significantly related to the criterion variable.

H4 proposed that the independent variables specified in H1–H3 would be comparatively poor predictors of mainstream news sharing. In the logistic component of the Facebook-based model (see Table 3), we failed to find evidence of statistically non-zero relationships between the criterion variable and ideological extremity (b = .12; SE = .10; p = .259; OR = 1.12), social trust (b = −.17; SE = .10; p = .081; OR = .85), or trust in the mainstream news media (b = −.12; SE = .08; p = .138; OR = .89). In the count component of the Facebook model, ideological extremity was associated with the rate at which mainstream news content was shared (b = .18; SE = .09; p = .049; IRR = 1.20). However, neither social trust (b = −.07; SE = .09; p = .440; IRR = .93) nor trust in the mainstream news (b = .01; SE = .07; p = .917; IRR = 1.01) was statistically associated with the rate of sharing mainstream news articles. Within the Twitter sample (see Table 4), none of the identified independent variables was significantly associated with mainstream news sharing. Specifically, null relationships were observed between mainstream news content sharing and ideological extremity (b = .16; SE = .11; p = .127; OR = 1.18), social trust (b = −.14; SE = .10; p = .176; OR = .87), and trust in the mainstream news media (b = .06; SE = .08; p = .442; OR = 1.07).
                                               Logistic component           Count component
Predictor                                      b        SE     OR           b        SE     IRR
Ideological extremity                          0.12     0.10   1.12         0.18*    0.09   1.20
Social trust                                  −0.17†    0.10   0.85        −0.07     0.09   0.93
News media trust                              −0.12     0.08   0.89         0.01     0.07   1.01
Political interest                             0.07     0.11   1.07         0.03     0.09   1.04
Democrat (0) vs. Republican (1) contrast      −0.30     0.31   0.74        −0.12     0.29   0.89
Democrat (0) vs. Independent (1) contrast     −0.10     0.24   0.91         0.15     0.23   1.16
Democrat (0) vs. Other (1) contrast            0.86     0.66   2.37        −0.33     0.39   0.72
Ideological conservatism                       0.09     0.07   1.10        −0.07     0.06   0.93
Use SM for political information               0.02     0.08   1.02         0.14*    0.06   1.15
Watch cable TV news                           −0.09     0.07   0.92        −0.06     0.05   0.94
Watch network TV news                         −0.06     0.07   0.94        −0.11*    0.05   0.89
Read newspapers (online or hardcopy)           0.08     0.06   1.08         0.05     0.05   1.05
Read political blogs                           0.03     0.08   1.03         0.03     0.06   1.03
Acceptable to talk about politics on SM        0.13†    0.07   1.14         0.09     0.06   1.10
Facebook usage intensity                       0.42***  0.08   1.53         0.37***  0.09   1.45
Facebook activity duration                     0.16**   0.05   1.17         0.08*    0.04   1.09
Age                                            0.01†    0.01   1.01         0.02**   0.01   1.02
Sex (1 = female)                               0.64**   0.20   1.89        −0.12     0.17   0.89
Log-likelihood ratio                           χ2 = 78.67, df = 18***       χ2 = 82.60, df = 18***
McFadden R2                                    .10                          .02

Notes: All variance inflation factors are below 2.75. IRR = incidence rate ratio; OR = odds ratio; SM = social media. †p < .10, *p < .05, **p < .01, ***p < .001.

Table 4. Logistic Regression Model Predicting Mainstream News Content Sharing on Twitter

Predictor                                      b        SE      OR
Ideological extremity                          0.16     0.11    1.18
Social trust                                  −0.14     0.10    0.87
News media trust                               0.06     0.08    1.07
Political interest                             0.00     0.12    1.00
Democrat (0) vs. Republican (1) contrast       0.31     0.33    1.36
Democrat (0) vs. Independent (1) contrast     −0.06     0.26    0.94
Democrat (0) vs. Other (1) contrast            0.09     0.56    1.09
Ideological conservatism                      −0.08     0.07    0.92
Use SM for political information               0.09     0.08    1.10
Watch cable TV news                           −0.04     0.07    0.96
Watch network TV news                         −0.12     0.08    0.89
Read newspapers (online or hardcopy)           0.10†    0.06    1.11
Read political blogs                          −0.04     0.07    0.96
Acceptable to talk about politics on SM        0.05     0.08    1.05
Twitter usage intensity                        0.30**   0.07    1.35
Twitter activity duration                     −0.09†    0.05    0.91
Age                                            0.02*    0.01    1.02
Sex (1 = female)                               0.06     0.21    1.06
Log-likelihood ratio                           χ2 = 92.56, df = 18**
McFadden R2                                    .08

Notes: All variance inflation factors are below 2.93. OR = odds ratio; SM = social media. †p < .10; *p < .01; **p < .001.
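As a side note on the collinearity figures reported in the table notes, variance inflation factors of this kind can be computed as in the minimal sketch below, which reuses the hypothetical data frame and column names from the earlier sketch.

```python
# Hedged sketch of the collinearity check reported in the table notes:
# a variance inflation factor for each predictor column.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("facebook_sample.csv")   # hypothetical file, as above
X = sm.add_constant(df[["ideological_extremity", "social_trust",
                        "media_trust", "age"]])
vifs = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
)
print(vifs.drop("const"))   # e.g., confirm all VIFs fall below ~2.9
```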
Follow-up analysis

Studies on fake news have generally shown that those self-identifying as very conservative share the most fake news (Grinberg et al., 2019; Guess et al., 2019; Guess et al., 2018). In our study, however, we failed to find significant relationships between countermedia sharing and conservatism in either the Facebook or Twitter models (Tables 1 and 2). A similar set of results was observed in relation to identification as a Republican. To further explore this potential discrepancy with the extant literature, we rounded the conservatism measure to the nearest whole number and examined the summed number of countermedia content shares across the scale range.
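The tabulation itself is straightforward; a minimal sketch, again assuming hypothetical file and column names, is:

```python
# Round the continuous conservatism score to whole numbers, then compute
# each category's percentage of all countermedia shares.
import pandas as pd

df = pd.read_csv("facebook_sample.csv")              # hypothetical file
by_cat = (df.assign(cons=df["conservatism"].round())
            .groupby("cons")["countermedia_shares"].sum())
share_pct = 100 * by_cat / by_cat.sum()
print(share_pct.round(1))   # per-category share of all countermedia shares
```

Run separately per platform, this computation yields the percentages reported next.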
In the Facebook sample, those scoring a 7 on the conservatism scale accounted for 25.6% of all countermedia shares, the highest per-category percentage observed. In the Twitter sample, those scoring a 7 on the conservatism measure accounted for 32.0% of all countermedia shares, again the highest per-category share percentage. That said, those self-identifying as very liberal (i.e., conservatism score = 1) also shared substantial amounts of countermedia content in both samples: 17.5% on Facebook and 16.4% on Twitter. In the Facebook sample, those scoring at the extreme ends of the ideological spectrum accounted for 43.4% of all countermedia shares, despite representing only 22.9% of the analytic sample. On Twitter, this pattern was nearly identical: 22.8% of the sample was responsible for 48.4% of all countermedia shares.

Results summary

H1 suggested that ideological extremity would be positively related to sharing countermedia content on social media. This hypothesis was partially supported: we observed positive, statistically significant coefficients in both the binary and count components of the Facebook model but a non-significant coefficient in the Twitter model. H2 predicted that social trust would be negatively associated with countermedia content sharing. This hypothesis was also partially supported. In the Twitter model, a negative, statistically significant coefficient was observed. However, in the Facebook model, the association between social trust and content sharing was not statistically significant in either the binary or rate component. H3 predicted a negative relationship between trust in the mainstream media and countermedia sharing. This hypothesis was, again, partially supported. In the Facebook model, a statistically significant, negative association was observed in the logistic component. However, in the rate component and in the Twitter model, this association was not statistically different from zero. H4 suggested that the variables of primary interest in the countermedia models would be comparatively poor predictors of mainstream news sharing. This hypothesis was broadly supported; across the Facebook and Twitter models, only one parameter estimate of interest was statistically associated with mainstream news sharing (ideological extremity was positively associated with mainstream news sharing in the rate component of the Facebook model). The research question concerned cross-platform differences in countermedia content sharing. The results generally indicated that Facebook was much more heavily used for countermedia content sharing and that the variables of primary interest were more consistently related to sharing outcomes on Facebook. Finally, a follow-up analysis generally conformed with prior research by showing that strong conservatives were the most likely to share countermedia content; that said, a substantial number of those who identified as strongly liberal were also frequent sharers of countermedia content.

Discussion

A handful of potentially important insights flow from our findings. First, we believe that our conceptualization of countermedia may be helpful for future research. Fake news, as a meaningful descriptor of real-world phenomena, is fraught with issues. Suggested alternatives, such as misinformation and disinformation, require knowledge of communicator motives that is frequently unknowable.
Countermedia, in contrast, neither invokes potentially problematic frames (e.g., the implication that fake news is a form of "news") nor requires knowledge of the motivations spurring content creators. Instead, the concept speaks to the observable epistemic functionality of communicated content. While the concept certainly needs future refinement, we believe it serves as an important step towards a more nuanced handling of so-called fake news.

Second, our focus on the relationship between content sharing and granular, individual-level political and social factors is noteworthy. The approach used in this study can be contrasted with other quantitative-empirical studies on the topic, which have predominantly focused on the demographic patterns underlying democratically deviant information dissemination (e.g., Grinberg et al., 2019). These studies are useful and important, but they do not directly speak to the nuances of the social and political self. Broadly speaking, the demographic patterns we observed in relation to countermedia sharing were consistent with those observed in recent, large-scale studies using representative samples to probe fake news sharing (Grinberg et al., 2019; Guess et al., 2019), particularly with respect to age. At the same time, our results suggest that a comprehensive understanding of the dissemination of false, misleading, biased, and hyper-partisan content requires looking beyond demographic features and party identification and into the sometimes-complex aspects (e.g., personality traits, socialization factors, cognitive needs) of the political self. It should be noted, however, that the independent variables of central interest did a generally poor job of predicting the rate at which countermedia content was shared in the Facebook model. Specifically, of these three variables, only ideological extremity was significant in both the binary and count components of the model. This suggests, as is perhaps to be expected in light of prior work on countermedia and related forms of information sharing (Grinberg et al., 2019; Guess et al., 2019), that ideological sentiment may play an especially important role in the countermedia identity. More generally, the poor performance of the independent variables of interest suggests that countermedia content may play an establishing, rather than reinforcing, role in online self-articulation.

Third, the present findings have implications for theorization on selective sharing. In their original articulation, Shin and Thorson (2017) focused somewhat narrowly on partisan selective sharing. The rationale was that social media-based content sharing is a "fundamentally social activity" (Shin & Thorson, 2017, p. 236), largely activated by group-based cues (e.g., party identification) that, when combined with dissonance reduction needs and the corrective potential of social media, ultimately elicit behaviors aimed at protecting or enhancing one's in-group. While ideological extremity, social trust, and trust in the mainstream news media all speak, to a certain extent, to the activation of an "us versus them" mentality, they also (and perhaps more directly) speak to factors of the self as a political and social agent. Such a consideration may be especially salient in the current context, given our inability to identify statistical associations between party identification and either mainstream or countermedia sharing.
In this way, our results suggest that selective sharing may be differentially linked to social and personal identity cues on a contextual basis. In some cases, people may be motivated to share content that protects or enhances their perceived in-group. In other cases, people may share information because it articulates something important about their personal identity or allows them to exert and/or maintain personal influence over others. In both instances, the general mechanics are the same: (a) consumed media content that is both attitudinally consistent and deemed to possess high informational, personal, or social utility is (b) shared with one's social media followers as (c) an externally facing, corrective means of presenting some aspect of one's self to others.

Fourth, the fact that the independent variables of primary interest served as relatively poor predictors of mainstream news sharing is notable. In this study, we argued that countermedia attempt to counter the knowledge produced by the mainstream press, and we drew upon extant research suggesting that people are unlikely to share content on social media with which they disagree (Shin & Thorson, 2017). The contextually asymmetric nature of the coefficients for ideological extremity, social trust, and trust in the mainstream press provides tentative support for this line of theorization. At the same time, caution should be applied when interpreting these findings. Many of the coefficient signs in the mainstream sharing models were in the same direction (albeit comparatively weak and non-significant), and, more generally, null effects should not be over-interpreted (e.g., Leppink, O'Sullivan, & Winston, 2017).

Fifth, we observed substantial disparities in countermedia sharing frequencies across platforms. On Facebook, a total of 1,152 countermedia items were shared, while on Twitter, only 128 countermedia share instances were detected. Indeed, one outlying user in the Facebook sample shared more content (172 instances) than all analyzed cases in the Twitter sample combined. This finding comports with recent research (Allcott & Gentzkow, 2017; Guess et al., 2019; Guess et al., 2018) showing that Facebook is a central conduit for the transfer of fake news. One interpretation of this frequency-based disparity may be linked to platform-specific affordances. Due to a combination of service policies (e.g., the so-called "real name policy"; Facebook, 2019), widespread diffusion, "group" spaces, and normative social application, Facebook may be understood as transmitting a greater volume of personal and identity-linked information than many other social media platforms, including Twitter (Cho & Acquisti, 2013). Although prior research on identity performance in computer-mediated contexts is not conclusive on the subject, there is some indication that "rich interactions (i.e., interactions that allow the transmission of cues to identity such as face-to-face) are superior in that they make the interaction more personal" (Tanis & Postmes, 2007, p. 955). It may, therefore, be the case that for some purposes and in some contexts, Facebook simply offers a fuller, richer, and more immediately personalized means of task accomplishment. This conclusion may also explain, at least partially, why we observed different patterns of association between the independent variables of primary interest and countermedia sharing frequencies across platforms.
If it is indeed the case that Facebook possesses greater identity-building potential (e.g., Cho & Acquisti, 2013) and that belief strength is an important part of political and social identity (e.g., Westfall, Van Boven, & Chambers, 2015), it follows that ideological extremity may be an especially important correlate of countermedia sharing on Facebook. Alternately, scholars have previously remarked that the de-identified nature of Twitter opens the platform up to increased levels of trolling and other, related forms of anti-social behavior (e.g., Oz, Zheng, & Chen, 2018). It may, therefore, be the case that social mistrust plays a disproportionately important role in political communication on Twitter. For its part, countermedia content often speaks to feelings of "social alienation" (Noppari et al., 2019, p. 28). Considered together, these two propositions provide a potential explanation for the differentially strong relationship between social mistrust and countermedia sharing on Twitter observed in this study.

This study is subject to limitations. Although we approached the issue of countermedia dissemination from a deductive standpoint, this study is one of the first of its kind and should, thus, be considered exploratory in nature. Relatedly, we recognize that, for some readers, our normatively negative handling of the term countermedia may inadvertently invoke reference to other, normatively positive forms of alternative media. Additionally, we focused on a relatively small number of independent variables; other factors (e.g., news literacy) may also be associated with countermedia dissemination on social media. Relatedly, our measure of social trust simply asked respondents to provide a summary evaluation of their bonded and bridged social ties; other measures may have yielded different results. Additionally, our data collection approach, which combined survey and trace data, may have incurred a systematic sampling bias due to issues related to data privacy perceptions. The degree to which this may or may not affect our results is unknown. Also, the sample employed in this study is not representative of the general population of users on Facebook and Twitter or of the United States as a whole. Another limitation pertains to our use of a domain-level, list-based approach for content identification, which classifies content by its source domain rather than by the substance of any individual article; the sketch below illustrates the kind of matching this implies.
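A minimal, hypothetical illustration of domain-level matching follows; the domain list entries here are placeholders, not the lists used in the study.

```python
# Reduce each shared URL to its host and check membership in a
# countermedia domain list. Placeholder list; naive host handling.
from urllib.parse import urlparse

COUNTERMEDIA_DOMAINS = {"example-countermedia.com"}   # placeholder entries

def is_countermedia(url: str) -> bool:
    host = urlparse(url).netloc.lower().split(":")[0]  # strip any port
    if host.startswith("www."):                        # www.site.com -> site.com
        host = host[4:]
    return host in COUNTERMEDIA_DOMAINS

print(is_countermedia("https://www.example-countermedia.com/story?id=1"))  # True
```

Real-world pipelines typically also normalize subdomains against a public-suffix list; such simplifications are one reason domain-level identification can misclassify individual articles.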
Finally, drawing from the selective sharing approach (Shin & Thorson, 2017), this study assumed that people share content because it is attitudinally consistent and personally relevant. In some cases, however, people may share information that is attitudinally inconsistent as a means of expressing disagreement, expressing humor, or starting a conversation. Because we did not analyze the user-generated text that may have accompanied posted links, we were unable to empirically evaluate the degree to which the sampled respondents may have shared attitudinally inconsistent countermedia content.

To conclude, we believe that our theorizing and empirical work together provide valuable insights into the spread of ideologically extreme and often untrue political information on social media. Future work should build on these findings by replicating them in a larger, more representative sample. Efforts should also be made to identify other potentially influential, individual-level factors associated with the spread of false, biased, and hyper-partisan political information online (e.g., news literacy, cognitive styles, political self-efficacy). It is also important that future research clarify and refine the concept of countermedia, as we believe this approach offers substantial theoretical potential and is, perhaps, a more nuanced and accurate means of understanding so-called fake news.

Supporting Information

Additional Supporting Information may be found in the online version of this article. Please note: Oxford University Press is not responsible for the content or functionality of any supplementary materials supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.

Notes

1. We instituted an approximate 50/50 sex split because our prior work with Qualtrics has shown that samples without quotas can result in pronounced sex imbalances.

2. As it pertains to the active user criteria, we required that all users have a current account with at least 50 pieces of posted content on both platforms. This safeguard was put in place to avoid scenarios in which a user creates an account simply to qualify for the study.

3. When constructing the recruitment language, our goal was to ensure that participants were interested in politics generally, as a sample comprised predominantly of politically ambivalent citizens would be of limited theoretical or practical interest. Self-report data indicated that the sample (and accompanying sub-samples) consisted of politically interested and engaged citizens: the sample average on the political interest scale (range, 1–7) was 5.27 (SD = 1.44). In all, 87.2% of respondents had a political interest score at or above the composite scale center point of 4.00.

4. Of those respondents who engaged with the study materials, 13.5% both met our inclusion criteria and provided valid survey responses.

5. Little's Missing Completely at Random (MCAR; Little, 1988) test was used to assess the degree to which missing data in the Facebook and Twitter samples were random in nature. Tests applied to the Facebook (χ2 = 306.79; df = 314; p = .604) and Twitter samples (χ2 = 286.23; df = 276; p = .323) were non-significant, suggesting that the data were MCAR.

6. All reported descriptive statistics are also derived from the final analytic samples (Facebook, n = 678; Twitter, n = 543).

7. Notably, many of these lists have been criticized for containing media sites that publish distorted, hyper-partisan, or ideologically extreme content rather than outright false information (e.g., Owen, 2017). This criticism, as illustrated above, was one motivating factor in our conceptualization of countermedia.

8. Given the very small number of countermedia shares in the Twitter sample, we estimated an additional logistic regression model that included only the three predictors of primary interest. Additionally, a Firth correction was applied to address potential issues associated with event rarity. In this model, non-significant parameter estimates were observed for ideological extremity and news media trust. The relationship between countermedia sharing status and social trust was negative and statistically significant. Given that the larger, covariate-based model (reported in Table 2) and the smaller, Firth-corrected model yielded identical substantive conclusions, we reported the former to facilitate comparison between the Twitter and Facebook samples.
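To make the correction described in note 8 concrete, the following is a minimal sketch of Firth-type penalized logistic regression, which maximizes the log-likelihood plus half the log determinant of the Fisher information. This is our illustrative reconstruction, not the study's code, and the usage example runs on simulated data rather than study data.

```python
# Firth-penalized logistic regression via a modified Newton iteration.
# X should include an intercept column; y is a 0/1 outcome vector.
import numpy as np

def firth_logit(X, y, max_iter=100, tol=1e-8):
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        pi = 1.0 / (1.0 + np.exp(-(X @ beta)))
        W = pi * (1.0 - pi)                        # Bernoulli variances
        info = X.T @ (W[:, None] * X)              # Fisher information X'WX
        info_inv = np.linalg.inv(info)
        Xw = X * np.sqrt(W)[:, None]
        h = np.einsum("ij,jk,ik->i", Xw, info_inv, Xw)  # hat-matrix leverages
        score = X.T @ (y - pi + h * (0.5 - pi))    # Firth-modified score
        step = info_inv @ score
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Tiny usage example with simulated (not study) data:
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 3))])
true_beta = np.array([-2.0, 0.5, -0.8, 0.0])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-(X @ true_beta)))).astype(int)
print(np.exp(firth_logit(X, y)[1:]))   # odds ratios for the three predictors
```

The penalty shrinks estimates away from the boundary, which is why this approach is commonly recommended for rare events such as the sparse Twitter share counts described above.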
References

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31, 211–236. doi: 10.3386/w23089

Ardèvol-Abreu, A., & Gil de Zúñiga, H. (2017). Effects of editorial media bias perception and media trust on the use of traditional, citizen and social media news. Journalism & Mass Communication Quarterly, 94, 703–724. doi: 10.1177/1077699016654684

Barnidge, M., & Rojas, H. (2014). Hostile media perceptions, presumed media influence, and political talk: Expanding the corrective action hypothesis. International Journal of Public Opinion Research, 26, 135–156. doi: 10.1093/ijpor/edt032

Bobkowski, P. (2015). Sharing the news: Effects of information utility and opinion leadership on news sharing. Journalism & Mass Communication Quarterly, 92, 320–345. doi: 10.1177/1077699015573194

Brants, K. (2013). Trust, cynicism, and responsiveness: The uneasy situation of journalism in democracy. In C. Peters & M. Broersma (Eds.), Rethinking journalism (pp. 27–39). New York, NY: Routledge.

Bruns, A. (2018). Gatewatching and news curation: Journalism, social media, and the public sphere. New York, NY: Peter Lang.

Caprara, G. V., Barbaranelli, C., & Zimbardo, P. G. (1999). Personality profiles and political parties. Political Psychology, 20, 175–197. doi: 10.1111/0162-895X.00141

Cho, D., & Acquisti, A. (2013, June). The more social cues, the less trolling? An empirical study of online commenting behavior. Paper presented at the Twelfth Workshop on the Economics of Information Security, Washington, DC. Retrieved from https://www.econinfosec.org/archive/weis2013/papers/ChoWEIS2013.pdf

Colleoni, E., Rozza, A., & Arvidsson, A. (2014). Echo chamber or public sphere? Predicting political orientation and measuring political homophily in Twitter using big data. Journal of Communication, 64, 317–332. doi: 10.1111/jcom.12084

Edelman. (2017). 2017 Edelman trust barometer. Retrieved from https://www.edelman.com/research/2017-edelman-trust-barometer

Ekström, M., & Westlund, O. (2019). The dislocation of news journalism: A conceptual framework for the study of epistemologies of digital journalism. Media & Communication, 7, 259–270. doi: 10.17645/mac.v7i1.1763

Ellison, N. B., Steinfeld, C., & Lampe, C. (2011). Connection strategies: Social capital implications of Facebook-enabled communication practices. New Media & Society, 13, 873–892. doi: 10.1177/1461444810385389

Facebook. (2019). What names are allowed on Facebook? Retrieved from https://www.facebook.com/help/112146705538576

Fukuyama, F. (1996). Trust: Social virtues and the creation of prosperity. New York, NY: Simon and Schuster.

Greene, S. (1999). Understanding party identification: A social identity approach. Political Psychology, 20, 393–403. doi: 10.1111/0162-895X.00150

Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B., & Lazer, D. (2019). Fake news on Twitter during the 2016 U.S. presidential election. Science, 363, 374–378. doi: 10.1126/science.aau2706

Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5, 1–8. doi: 10.1126/sciadv.aau4586

Guess, A., Nyhan, B., & Reifler, J. (2018). Selective exposure to misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign. Retrieved from https://www.dartmouth.edu/∼nyhan/fake-news-2016.pdf

Habermas, J. (1991). The structural transformation of the public sphere: An inquiry into a category of bourgeois society. Cambridge, MA: MIT Press.

Hallin, D. C. (1989). The uncensored war: The media and Vietnam. Berkeley, CA: University of California Press.

Hannan, J. (2018). Trolling ourselves to death? Social media and post-truth politics. European Journal of Communication, 33, 214–226. doi: 10.1177/0267323118760323

Harambam, J., & Aupers, S. (2014). Contesting epistemic authority: Conspiracy theories on the boundaries of science. Public Understanding of Science, 24, 466–480. doi: 10.1177/0963662514559891

HerdaĞdelen, A., Zuo, W., Gard-Murray, A., & Bar-Yam, Y. (2013). An exploration of social identity: The geography and politics of news-sharing communities in Twitter. Complexity, 19, 10–20. doi: 10.1002/cplx.21457

Higgins, A., McIntire, M., & Dance, G. J. (2016). Inside a fake news sausage factory: "This is all about income." New York Times. Retrieved from https://www.nytimes.com/2016/11/25/world/europe/fake-news-donald-trump-hillary-clinton-georgia.html

Hilbe, J. M. (2014). Modeling count data. New York, NY: Cambridge University Press.

Hylton, W. S. (2017). Down the Breitbart hole. The New York Times Magazine. Retrieved from https://www.nytimes.com/2017/08/16/magazine/breitbart-alt-right-steve-bannon.html

Knobloch-Westerwick, S. (2015). Choice and preference in media use: Advances in selective exposure theory and research. New York, NY: Routledge.

Kohring, M., & Matthes, J. (2007). Trust in news media. Communication Research, 34, 231–252. doi: 10.1177/0093650206298071

Kreiss, D. (2017). The fragmenting of the civil sphere: How partisan identity shapes the moral evaluation of candidates and epistemology. American Journal of Cultural Sociology, 5, 443–459. doi: 10.1057/s41290-017-0039-5

Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., & Zittrain, J. L. (2018). The science of fake news. Science, 359, 1094–1096. doi: 10.1126/science.aao2998

Leppink, J., O'Sullivan, P., & Winston, K. (2017). Evidence against vs. in favour of a null hypothesis. Perspectives on Medical Education, 6, 115–118. doi: 10.1007/s40037-017-0332-6

Lewis, R., & Marwick, A. (2017). Taking the red pill: Ideological motivations for spreading online disinformation. In Understanding and addressing the disinformation ecosystem. Retrieved from https://firstdraftnews.org/wp-content/uploads/2018/03/The-Disinformation-Ecosystem-20180207-v2.pdf

Liang, H. (2018). Broadcast versus viral spreading: The structure of diffusion cascades and selective sharing on social media. Journal of Communication, 68, 525–546. doi: 10.1093/joc/jqy006

Little, R. J. A. (1988). A test of missing completely at random for multivariate data with missing values. Journal of the American Statistical Association, 83(404), 1198–1202. doi: 10.1080/01621459.1988.10478722

Mourão, R. R., & Robertson, C. T. (2019). Fake news as discursive integration: An analysis of sites that publish false, misleading, hyperpartisan and sensational information. Journalism Studies, 20(14), 2077–2095. doi: 10.1080/1461670X.2019.1566871

Narayanan, V., Kelly, J., Kollanyi, B., Neudert, L.-M., & Howard, P. N. (2018). Polarization, partisanship, and junk news consumption over social media in the US. Retrieved from https://arxiv.org/abs/1803.01845

Noppari, E., Hiltunen, I., & Ahva, L. (2019). User profiles for populist counter-media websites in Finland. Journal of Alternative and Community Media, 4, 23–37. Retrieved from https://joacm.org/index.php/joacm/article/view/1138

Oeldorf-Hirsch, A., & Sundar, S. S. (2015). Posting, commenting, and tagging: Effects of sharing news stories on Facebook. Computers in Human Behavior, 44, 240–249. doi: 10.1016/j.chb.2014.11.024

Ohlheiser, A. (2016). This is how Facebook's fake-news writers make money. Washington Post. Retrieved from https://www.washingtonpost.com/news/the-intersect/wp/2016/11/18/this-is-how-the-internets-fake-news-writers-make-money/?noredirect=on&utm_term=.f38e586595b4

Owen, L. H. (2017). Harvard library gets slammed for its earnest fake news guide: Updates from the fake news world. NiemanLab. Retrieved from http://www.niemanlab.org/2017/03/harvard-library-gets-slammed-for-its-earnest-fake-news-guide-updates-from-the-fake-news-world/

Oz, M., Zheng, P., & Chen, G. (2018). Twitter versus Facebook: Comparing incivility, impoliteness, and deliberative attributes. New Media & Society, 20, 3400–3419. doi: 10.1177/1461444817749516

Rojas, H. (2010). "Corrective" actions in the public sphere: How perceptions of media and media effects shape political behaviors. International Journal of Public Opinion Research, 22(3), 343–363. doi: 10.1093/ijpor/edq018

Schlesinger, P. (1990). Re-thinking the sociology of journalism: Source strategies and the limits of media-centrism. In M. Ferguson (Ed.), Public communication: The new imperatives (pp. 61–83). London, England: Sage.

Schudson, M. (2001). The objectivity norm in American journalism. Journalism, 2, 149–170. doi: 10.1177/146488490100200201

Shane-Simpson, C., Manago, A. M., Gaggi, N., & Gillespie-Lynch, K. (2018). Why do college students prefer Facebook, Twitter, or Instagram? Site affordances, tensions between privacy and self-expression, and implications for social capital. Computers in Human Behavior, 86, 276–288. doi: 10.1016/j.chb.2018.04.041

Shin, J., & Thorson, K. (2017). Partisan selective sharing: The biased diffusion of fact-checking messages on social media. Journal of Communication, 67, 233–255. doi: 10.1111/jcom.12284

Shoemaker, P. J., & Vos, T. P. (2009). Gatekeeping theory. New York, NY: Routledge.

Tanis, M., & Postmes, T. (2007). Two faces of anonymity: Paradoxical effects of cues to identity in CMC. Computers in Human Behavior, 23, 955–970. doi: 10.1016/j.chb.2005.08.004

Tong, J. (2018). Journalistic legitimacy revisited: Collapse or revival in the digital age? Digital Journalism, 6, 256–273. doi: 10.1080/21670811.2017.1360785

Torres, T., Gerhart, N., & Negahban, A. (2018). Epistemology in the era of fake news: An exploration of information verification behaviors among social networking site users. ACM SIGMIS Database: The DATABASE for Advances in Information Systems, 49, 78–97. doi: 10.1145/3242734.3242740

Tsfati, Y., & Cappella, J. N. (2003). Do people watch what they do not trust? Exploring the association between news media skepticism and exposure. Communication Research, 30, 504–529. doi: 10.1177/0093650203253371

Usher, N., & Carlson, M. (2018). The midlife crisis of the network society. Media and Communication, 6, 107–110. doi: 10.17645/mac.v6i4.1751

Van Duyn, E., & Collier, J. (2018). Priming and fake news: The effects of elite discourse on evaluations of news media. Mass Communication & Society, 22, 29–48. doi: 10.1080/15205436.2018.1511807

Vargo, C. J., & Guo, L. (2017). Networks, big data, and intermedia agenda setting: An analysis of traditional, partisan, and emerging online U.S. news. Journalism and Mass Communication Quarterly, 94, 1031–1055. doi: 10.1177/1077699016679976

Waisbord, S. (2018). Truth is what happens to news. Journalism Studies, 19, 1866–1878. doi: 10.1080/1461670X.2018.1492881

Wardle, C., & Derakhshan, H. (2017). Information disorder: Towards an interdisciplinary framework for research and policymaking. Retrieved from https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c

Wasilewski, K. (2019). US alt-right media and the creation of the counter-collective memory. Journal of Alternative and Community Media, 4, 77–91. Retrieved from https://joacm.org/index.php/joacm/article/view/1141

Weeks, B. E., & Holbert, R. L. (2013). Predicting dissemination of news content on social media: A focus on reception, friending, and partisanship. Journalism and Mass Communication Quarterly, 90, 212–232. doi: 10.1177/1077699013482906

Weinberg, L. (2019). Fascism, populism, and American democracy. New York, NY: Routledge.

Westfall, J., Van Boven, L., Chambers, J. R., & Judd, C. M. (2015). Perceiving political polarization in the United States: Party identity strength and attitude extremity exacerbate the perceived partisan divide. Perspectives on Psychological Science, 10(2), 145–158. doi: 10.1177/1745691615569849

Ylä-Anttila, T. (2018). Populist knowledge: "Post-trust" repertoires of contesting epistemic authorities. European Journal of Cultural and Political Sociology, 5, 356–388. doi: 10.1080/23254823.2017.1414620

Ylä-Anttila, T., Bauvois, G., & Pyrhönen, N. (2019). Politicization of migration in the countermedia style: A computational and qualitative analysis of populist discourse. Discourse, Context & Media. doi: 10.1016/j.dcm.2019.100326

© The Author(s) 2020. Published by Oxford University Press on behalf of International Communication Association. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com