The Influence of Media Trust and Normative Role Expectations on the Credibility of Fact Checkers

ABSTRACT

Fact-checking has been granted a pivotal role in mitigating the effects of online disinformation, but its effectiveness has nonetheless been questioned (Lee and Shin Citation2019). Like any persuasive communication, fact checkers depend on their recipients perceiving both their messages and them as credible (Lombardi, Seyranian, and Sinatra Citation2014; Lombardi, Nussbaum, and Sinatra Citation2016). This study investigates the perceived credibility of the fact checker as a possible detriment to the effectiveness of fact-checking efforts by means of an online survey-embedded experiment. Results show that the perceived credibility of the fact checker and of fact-checking messages is best explained by normative expectations of the roles of fact checkers and by trust in traditional media. Some users perceive fact checkers as elite power structures in journalism or, in other words, as collaborative-facilitators for state propaganda (Hanitzsch and Vos Citation2018; see also Fawzi Citation2020). Further, low trust in media and politics predicts the perceived credibility of disinformation better than political partisanship. The findings suggest that fact checkers should be more transparent and proactive in communicating their motives and identities. Further implications are discussed.

Introduction

Fake news has become a pervasive trend on the Internet with, as some argue, dire consequences for democratic societies that depend on citizens making elaborate decisions based on the best available information (Powell Citation2000; Lewandowsky et al. Citation2012). Misinformation and disinformation not only influence how people think about a focal issue, event, or person, but also feed media scepticism, fragmentation, and political apathy whilst lowering trust in experts (Weeks Citation2018; Lee and Shin Citation2019). These messages have a self-perpetuating effect as they reach individuals who already mistrust institutions such as traditional media (Jackob Citation2010; Decker and Brähler Citation2016; Shin and Thorson Citation2017), reject a shared understanding of reality (Dahlgren Citation2018) and have faith in intuition rather than experts and evidence (Garrett and Weeks Citation2017). Fact-checking has been granted a pivotal role in mitigating these effects. Its effectiveness has nonetheless been questioned, as fact checks overall do not exert a lasting effect on readers’ beliefs and are subject to manifold limitations (Nyhan and Reifler Citation2015; Lee and Shin Citation2019; Walter and Tukachinsky Citation2020). Like any persuasive message, the effectiveness of fact-checking is bound to its recipients perceiving both its source and the message itself as credible (Lombardi, Seyranian, and Sinatra Citation2014; Lombardi, Nussbaum, and Sinatra Citation2016). The credibility ascribed to an information source, however, is based on recipients’ predispositions, which can lead to “an epistemic circularity, whereby no opposing information is ever judged sufficiently credible to overturn dearly held prior knowledge” (Lewandowsky et al. Citation2012, 119).

Whether recipients always perceive Covid-19 fact checks as credible is questionable considering the growing distrust in traditional or mainstream media in some parts of the public. In Germany, this distrust becomes most visible in the antagonistic friend vs. enemy construction of the term lying press, which refers to the press as an actor that tries to harm the people. Journalists are, in that view, mere distributors of elite propaganda. They lie in order to steer public opinion and facilitate the state’s or other elites’ interests (Voigt Citation2018; Holt and Haller Citation2017; Lilienthal and Neverla Citation2017). In that sense, “[…] traditional distrust of media has turned into an assault on basic Enlightenment premises, eroding shared understandings of reality and compatible discourse” (Dahlgren Citation2018, 20). It would seem illogical for recipients who disregard traditional or mainstream media as “lying press” to resort to fact-checking organizations that disseminate fact-based and mainstream accounts of the severity of Covid-19. In this study, I therefore propose distrust in the media and political institutions as a simple, yet strong, explanation for why corrections fail for some parts of the public. Following from that, I also include the recipients’ perception of fact checkers as mere disseminators of state propaganda in a collaborative-facilitative fashion (Hanitzsch and Vos Citation2018) as a potential detriment to their corrective efforts.

Media Trust

Media trust, like trust in general, functions as a reduction of complexity (Luhmann Citation1989). From a systems theory point of view, trust in the media is particularly important because the media provide the subsystems of a society with information about each other (Kohring Citation2004). Further, media trust also describes “the willingness of the audience to be vulnerable to news content based on the expectation that the media will perform in a satisfactory manner” (Hanitzsch, van Dalen, and Steindl Citation2018, 5). In the face of a pandemic with extreme uncertainties and knowledge gaps between societal systems, the aspect of vulnerability becomes more salient. Not only is the latest knowledge about the virus, necessary measures, and their effects immensely complex, which is why recipients depend on the reliable transfer of knowledge via the media, but the stakes are also particularly high. Vulnerability increases with the risk induced by false media information about the virus, where, for recipients, false can mean both downplaying the severity of the pandemic and overstating it. The latter is what concerns many of those involved in Covid-19 protests, which became increasingly hostile towards journalists over the course of the pandemic (BR Citation2020; Baumgärtner et al. Citation2020; ZDF Citation2020).

It is a long-observed phenomenon that a growing number of people reject the epistemic authority of traditional media and experts and demand further integration and participation in the knowledge order (Neuberger et al. Citation2019; Weingart Citation2009). That this demand can often be justified, for example for marginalized, under-represented, and misrepresented groups, is beyond question (see e.g., Mejia, Beckermann, and Sullivan Citation2018; Awad Citation2011). Nevertheless, epistemic authorities such as epidemiologists or journalists, as trust-intermediaries (Neuberger et al. Citation2019, 177), remain essential for topics such as Covid-19 or vaccination that exceed individual capacity (see e.g., Kata Citation2012; Navin Citation2013).

While overall trust in media is high in Germany (European Commission Citation2017; Reinemann, Fawzi, and Obermaier Citation2017, 81; Hanitzsch, van Dalen, and Steindl Citation2018, 15; Newman et al. Citation2020, 71), large parts of the German population express some suspicion towards the media, and the tendency towards sweeping media criticism is significantly correlated with belief in conspiracy theories (Schultz et al. Citation2017, 252–253; Stempel, Hargrove, and Stempel Citation2007). Growing distrust in media is, therefore, a “self-perpetuating phenomenon” through which already media-sceptical recipients radicalize or immerse themselves in conspiracy ideologies and alternative realities constructed by fringe media (Marwick and Lewis Citation2017, 45; see also Jackob Citation2010; Tsfati Citation2010). Online, diminishing hurdles for many to become speakers (Baym and boyd Citation2012, 326) have led to a new knowledge order in which journalists as trust-intermediaries can be bypassed in a system of self-mass communication and the temporal precedence of knowledge verification becomes variable (Neuberger et al. Citation2019, 177). It is in this environment that alternative representations of reality collide, rendering the respective opposite (in the case of Covid-19, traditional media’s representation) implausible. In this sense, fact checkers in the pandemic are not simply debunking false claims but taking a stand for the same version of reality propagated by governments and traditional media, i.e., from the elites at the top down to the common folk at the bottom. This is likely to make fact checkers untrustworthy for all those who distrust media and politics anyway, while information that deviates from these official narratives becomes more credible simply by challenging media reality.

Hypothesis 1: The more people trust media and politics, the more they perceive fact-checking sources and messages as credible. The less they trust media and politics, the more they perceive disinformation to be credible.

The Nexus of Trust in Media and Politics

I have already touched on the idea of the entanglement of (trust in) media and politics. I argue that the current pandemic and the conspiracist discourse revolving around it have once again shown that the two cannot be separated. This observation is in accordance with empirical research by, for example, Hanitzsch, van Dalen, and Steindl (Citation2018), who conceptualize media trust as being first and foremost linked to trust in politics and ideological polarization (6). Political trust plays a role in media trust in so far as it reflects “a public disenchantment with and widespread sense of disdain for social institutions more generally but for political institutions most particularly” (Hanitzsch, van Dalen, and Steindl Citation2018, 7). The concept of media trust as a “trust nexus”, that is, as a connection between trust in media and other (political) institutions (7), is integral to the conspiracy ideologues’ notion that the media and politics are working together to overthrow the world. It is important to note that, so far, sweeping media critique could most strongly be observed among far-right citizens (Reinemann, Fawzi, and Obermaier Citation2017; Schultz et al. Citation2017; Decker and Brähler Citation2016). A political orientation to the right is also associated with increased gullibility and, accordingly, with increased susceptibility to fake news (Fessler Citation2017; Arendt, Haim, and Beck Citation2019). Arendt, Haim, and Beck (Citation2019) already hypothesized that right-leaning people ascribe higher credibility to fake news in the German context (188). In light of the trust nexus, I argue that they should also perceive fact-checking, that is, information that contradicts disinformation and supports official narratives, as less credible.

Hypothesis 2: People oriented to the right of the political ideology spectrum are more likely to believe fake news and less likely to ascribe credibility to fact-checking sources and messages.

Fact Checker Roles

Research on the roles of fact checkers has so far rather focused on fact checker perspectives (Graves, Nyhan, and Reifler Citation2016; Graves and Cherubini Citation2016; Graves Citation2017; Graves Citation2018), the contested epistemological limitations of fact-checking (Amazeen Citation2015; Uscinski Citation2015), or user evaluations of fact-checking messages, e.g., their usefulness and effect (Brandtzaeg and Følstad Citation2017; Brandtzaeg, Følstad, and Chaparro Domínguez Citation2018; Walter and Tukachinsky Citation2020). I argue that audience views on normative roles are essential to an understanding of why some recipients disregard fact-checking messages as untrustworthy. It does matter, of course, to what extent the epistemology of fact-checking and the interpretative schemes, issue framing, and discourses it operates with and within should reasonably be scrutinized (see e.g., the debate between Uscinski (Citation2015) and Amazeen (Citation2015)). Much more banal, however, is the observation that audience perceptions matter whether rationally justified or not, as reactions to fact-checking on Facebook in general, and to a recent ruling regarding a fact-checking link on Facebook from CORRECTIV (Citation2020) in particular, underscore.

To approach audience perceptions, I will start from self-assessments of journalistic roles. Fact checkers aim to provide the most accurate account of an issue by means of thorough verification (Brandtzaeg, Følstad, and Chaparro Domínguez Citation2018, 1114). Nowadays this verification task could technically be done by anyone. The rapid update cycle of online news and the oversupply of news in the online environment, however, offer “unfavourable conditions for the permanent collection, checking and presentation of news on a voluntary basis” (Neuberger Citation2018, 8). Fact-checking has, therefore, developed as a form of accountability journalism (Amazeen Citation2015, 3) rather than as an individual or user-driven effort, and is mainly based in news organizations or NGO structures (Graves and Cherubini Citation2016, 8). This has an impact on the role self-perception of fact checkers. According to Graves and Cherubini (Citation2016), most European fact checkers define themselves primarily as journalists, activists, academics and technologists.

From these basic overlapping identities, Graves and Cherubini (Citation2016) derive three main fact checker roles: reporters, reformers and experts (12). As in any other journalistic context, every role comes with normative functions. Reporters perceive their task as mainly explanatory and as a service of information provision for citizens. Holding politicians accountable is an important function in this category as well (Graves and Cherubini Citation2016, 12–13). In that regard they are closest to how one could define professional journalists. As a working definition, I apply Hanitzsch and Seethaler’s (Citation2009) approach here, which offers six defining conditions:

Journalism is a professional activity that takes place in organizational structures. In addition, journalism is a “service to society” or to the public, provides current and relevant information, is mainly fact-based and requires a minimum of intellectual independence and autonomy […]. (466)

Reformers understand themselves as activists who reject the narrow framework of journalism. They promote policy change and aim to hold politicians accountable as well as to improve public discourse. Independence from mainstream media organizations is seen as an important indicator of credibility by these fact checkers (Graves and Cherubini Citation2016, 14–16). Finally, experts identify mainly as policy experts or academics and believe politics or economics to be the most suitable background for fact checkers. Fact-checking organizations devoted to this role identify as think tanks rather than as journalists or activists, while often still promoting policy change (Graves and Cherubini Citation2016, 17).

Because these role concepts were derived from self-assessments of fact checkers in Europe, they lack a non-liberal dimension, which is to be expected in audience perceptions considering the rejection of other “mainstream” media by some (see e.g., Lilienthal and Neverla Citation2017; Holt and Haller Citation2017; Voigt Citation2018). It is reasonable to assume that some people perceive fact checkers as part of a propaganda press that functions as the extended arm of the government. Fitting role concepts can be derived from traditional journalism research (Hanitzsch Citation2007; Hanitzsch and Vos Citation2018; Standaert, Hanitzsch, and Dedonder Citation2019). Of interest here is the collaborative-facilitative dimension of journalism by Hanitzsch and Vos (Citation2018), which formalizes the view of state-supporting propagandistic journalism that might be held by some recipients about fact checkers. It

emphasizes an understanding of journalists acting as constructive partners of government and supporting it in its efforts to bring about national development and socio-economic well-being. In such a role, journalists may be defensive of authorities and routinely engage in self-censorship, and they tend to exhibit a paternalistic attitude toward “the people”. (Standaert, Hanitzsch, and Dedonder Citation2019, 5)

This dimension can be subdivided into the facilitator, the collaborator and the mouthpiece (Hanitzsch and Vos Citation2018, 156).

Facilitators assist the government which, in their opinion, offers stability and unity. As a form of development journalism, facilitative journalism focusses on unification and nation building. More relevant in the light of lying press accusations in western democracies are the collaborator and the mouthpiece. The collaborator is part of the state apparatus and defends the government in a propagandistic way. The mouthpiece functions as a channel for the government to disseminate official information; it aims to improve communication between the government and the people and to legitimize the government’s policies (Hanitzsch and Vos Citation2018, 156). I summarize these attributes in a role dimension I call the collaborative facilitator. In sum, I collected four fact checker role dimensions: the reporter, the expert, the reformer and the collaborative facilitator. I expect user perceptions to mostly match the categories identified by Graves and Cherubini (Citation2016). However, I also expect the perception of fact checkers to contain an additional, collaborative-facilitative dimension.

Hypothesis 3: Fact checker roles as perceived by users will match the fact checker self-perception as reporters, reformers and experts. Fact checker role perceptions by recipients will also contain a fourth, collaborative-facilitative, dimension.

Method

In an online survey-embedded experiment, German social media users rated the credibility of a set of six fact checks and their source (dpa, CORRECTIV or the “Facebook Community”) referring to six Covid-19 disinformation posts collected from Facebook. Participants self-selected by clicking on the survey link, which was posted from Monday, 18 May 2020, to Friday, 29 May 2020, in the largest German corona debate/denial groups on Facebook as well as in the researcher’s private feeds. Friends and acquaintances were encouraged via Facebook and Instagram to share the survey link in their Facebook feeds. Participants were incentivised to invite others with the offer of an additional lottery ticket for the €100 participation prize raffle.

Dependent Variables

Trust and credibility are often measured with a simple one-item Likert scale or by means of semantic differentials (Rössler Citation2011; Arendt, Haim, and Beck Citation2019; Seo et al. Citation2019). To achieve accurate measurement while still keeping the message credibility rating task simple, Appelman and Sundar’s (Citation2016) approach of three items, accuracy, authenticity and believability (71), was applied here. Rated on a five-point scale (1 = strongly disagree, 5 = strongly agree), the items achieved excellent internal consistency (Cronbach’s α = .90). Following Appelman and Sundar (Citation2016), source credibility, i.e., the credibility participants ascribe to the fact-checking organization providing the fact check, is most efficiently measured with the four items authoritative, reliable, reputable, and trustworthy (72). Source credibility reached its highest internal consistency (α = .93) without the item authoritative, i.e., with reliable, reputable and trustworthy only. All items were translated into their best-fitting everyday German equivalents.
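The internal consistency statistic used throughout this study can be computed directly from the item ratings. The sketch below applies the standard Cronbach's alpha formula to a three-item Likert scale; the ratings are purely illustrative, not the study's data:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of respondent rows (equal-length item ratings)."""
    k = len(items[0])                                   # number of items
    cols = list(zip(*items))                            # one tuple per item
    item_var = sum(variance(col) for col in cols)       # sum of item variances
    total_var = variance([sum(row) for row in items])   # variance of the sum scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical 1-5 ratings for three message-credibility items
# (accuracy, authenticity, believability) from five respondents
ratings = [
    (5, 4, 5),
    (4, 4, 4),
    (2, 1, 2),
    (3, 3, 2),
    (5, 5, 4),
]
print(round(cronbach_alpha(ratings), 2))  # -> 0.95
```

Alpha rises when the items co-vary strongly relative to their individual variances; values around .90, as reported here, conventionally indicate excellent consistency.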

The roles of fact checkers were adapted from the dimensions of the reporter, the reformer and the expert found by Graves and Cherubini (Citation2016) and from the collaborative-facilitative dimension of journalism derived from Hanitzsch (Citation2007; see also Hanitzsch and Vos Citation2018). While the dimensions do overlap, each role also comes with unique characteristics that can be operationalized as items on a fact checker characteristics scale. Because the field of professional journalism is difficult to demarcate (Vos Citation2018) and participants’ ideas about it can vary, a definition by Hanitzsch and Seethaler was operationalized as a latent measure for professional journalism, which is one main characteristic of the reporter role:

Journalism is (1) a professional activity that takes place in (2) organizational structures. In addition, journalism (3) is a “service to society” or to the public, (4) provides current and relevant information, (5) is mainly fact-based and (6) requires a minimum of intellectual independence and autonomy […]. (Hanitzsch and Seethaler Citation2009, 466)

The attributes of every role category were paraphrased as a set of 21 statements the participants could agree or disagree with on a five-point Likert scale (see Table 2 in the results section).

Independent Variables

The measurement of trust in media and politics was based on the World Values Survey (Inglehart et al. Citation2014) and the European Values Study (European Commission Citation2017). Participants were asked to rate their confidence in several institutions by indicating their agreement with “overall I trust … ” for each item on a five-point Likert scale from “not at all” to “very much”. Political trust was operationalized as confidence in the national and the European parliament, the government, and the traditional political parties. Media trust was measured as trust in traditional print media, their online appearances and public broadcasters. Trust in alternative news media was measured additionally. Trust in media and politics as one concept (trust nexus) reached an excellent internal consistency (α = .96) with the items traditional print media, traditional media online, public broadcasters, national parliament, European parliament, the government and traditional parties.

Other Factors

The political orientation of participants was measured on a ten-point scale from “political left” (1) to “political right” (10) without semantic cues between the extrema. The participants were asked to indicate where on that scale they would position themselves, as is usual for this type of measurement (Nyhan and Reifler Citation2010; Inglehart et al. Citation2014; Arendt, Haim, and Beck Citation2019). Participants were also asked about their familiarity with the topic because topic knowledge can relate to the perceived trustworthiness of a text (Strømsø, Bråten, and Britt Citation2011). Participants were asked how well they were informed about the topic of Covid-19 on a five-point scale from “not at all” to “very well”. Also, the topic of Covid-19 calls for an inclusion of trust in science, which is generally high but displays partisan gaps between the left and right (Funk et al. Citation2020). Trust in science and in conventional medicine, by which I mean medicine taught at university, were included as potential predictors of belief in disinformation and fact checks on Covid-19. Finally, socio-demographic factors were surveyed, including age, gender, education, occupation status and country of residence.

Procedure

Participants were first surveyed on socio-demographic data and then randomly sorted into three groups. On the following pages, every group was presented with the same six disinformation posts, their fact-checking corrections and four real news stories as fillers in random order, with every disinformation post always followed by its correction. Group 1 saw fact checks attributed to CORRECTIV, group 2 was presented with fact checks by the dpa, and the fact checks of group 3 were attributed to the fictitious source “Facebook community”. The participants were asked to rate the credibility of the content displayed on every page and, after the set of stimuli, the credibility of the fact-checking source. The questionnaire was pre-tested by 12 participants whose notes were used to improve the questionnaire where necessary.

The Covid-19 news posts that functioned as fillers were composed in the typical style of a Facebook post, that is, they came with a headline, cover picture and teaser text. However, because disinformation on social media often comes in the format of so-called share-pics, these were mainly used for the disinformation stimuli. Share-pics are an appropriation of meme culture and usually consist of an image, or neutral coloured background, on which a short text message is written. The messages are typically simple and invite sharing, which makes them particularly “spreadable” (Jenkins, Ford, and Green Citation2013). Fact checks (corrective messages) on Facebook distinguish between different degrees of truth (Facebook Citation2020). For this study, only disinformation posts that were labelled “false” or “partly false” were used. Facebook’s definitions for these two categories are displayed in Table 1. Fact checks contained an initial assessment (“Bewertung”), an explanation of what that assessment means, the source of the fact check and a corrective claim. For the sake of internal validity, they were displayed on the page after the respective false claim, which differs from the way they are presented on Facebook (semi-transparent and/or underneath the post). The fact checks used here were all derived from Facebook via CORRECTIV, which provides a list of debunked disinformation about Covid-19 on Facebook (https://correctiv.org/faktencheck/coronavirus/).

Table 1. Facebook’s rating guidelines for false and partly false content.

Sample

Before further processing, data quality was ensured by excluding participants who did not complete the questionnaire, that is, who did not reach the relevant questions on political ideology and media trust, or who filled it out extremely fast according to the degradation time index provided by Leiner’s (Citation2019) survey tool SoSci Survey (values above 100 are conventionally considered suspicious), for instance because they were solely interested in the financial incentive for participation. In total, 76 participants were excluded by this procedure. Most dropouts (25.7%) occurred at the beginning, after the informed consent and data protection information. A noticeable accumulation of dropouts occurred at only one particular point in the questionnaire: 18.9% dropped out at news filler 4, a news post on the number of corona deaths in New York (see appendix). The reasons can only be speculated upon: participants could have felt overwhelmed by the grim news, they might have felt offended by the narrative of virus severity, or they might simply have lost interest in the questionnaire. The final sample included 123 participants (68.3% female), of whom 71.5% were younger than 40 years old (M = 32.82, SD = 14.10). The youngest participant was 18 and the oldest 69. Most participants had a university degree (39.8%) or a high school degree (35%), followed by an advanced technical college entrance qualification (8.1%) and a secondary school leaving certificate (7.3%). Most participants were students (46.3%), followed by employees (31.7%) and self-employed participants (7.3%). On the ten-point political ideology scale, the majority (62.6%) identified at least slightly with the left (M = 3.92, SD = 1.69).
The three fact-checking stimulus groups (CORRECTIV, dpa and Facebook community) did not differ significantly in age (above or below 40 years old), χ²(2, N = 123) = 0.174, p = .916; gender, χ²(4, N = 123) = 2.23, p = .694; education, χ²(12, N = 123) = 16.04, p = .190; occupation, χ²(16, N = 123) = 20.55, p = .196; or political ideology (left vs. conservative/right), χ²(2, N = 123) = 0.42, p = .809.
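Group-balance checks of this kind are chi-square tests of independence between group assignment and each covariate. A minimal sketch, using hypothetical counts rather than the study's data, illustrates the computation for a 2 × 3 table (age group by stimulus group); the closed-form p-value used at the end holds only for two degrees of freedom:

```python
import math

def chi_square_independence(table):
    """Pearson chi-square test of independence for a 2-D count table.
    Returns (chi2, dof)."""
    rows, cols = len(table), len(table[0])
    row_sums = [sum(r) for r in table]
    col_sums = [sum(table[i][j] for i in range(rows)) for j in range(cols)]
    n = sum(row_sums)
    chi2 = 0.0
    for i in range(rows):
        for j in range(cols):
            expected = row_sums[i] * col_sums[j] / n  # expected count under independence
            chi2 += (table[i][j] - expected) ** 2 / expected
    dof = (rows - 1) * (cols - 1)
    return chi2, dof

# Hypothetical age distribution (under 40 / 40+) across the three
# stimulus groups; counts are illustrative, not the study's data.
table = [[30, 29, 29],
         [12, 11, 12]]
chi2, dof = chi_square_independence(table)
p = math.exp(-chi2 / 2)  # chi-square survival function, valid for dof = 2
print(dof, round(chi2, 3), round(p, 3))
```

A small chi-square value and large p, as in the reported checks, mean the covariate is distributed roughly evenly across the randomized groups.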

Results

Fact Checker Roles

I hypothesized user perceptions of fact checker roles to match the role dimensions derived from fact checker self-assessments by Graves and Cherubini (Citation2016), with an additional, collaborative-facilitative dimension. To test this hypothesis, the 21 role-function items were entered into a factor analysis using principal components extraction with varimax rotation, retaining factors with eigenvalues above 1.00; sampling adequacy was good, KMO = .86, χ²(210, N = 121) = 1288.59, p < .001. The resulting four-factor model explained 61.5% of the variance in role perception (see Table 2).

Table 2. Factor loadings for fact checker roles.
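The retention rule used above, keeping factors with eigenvalues greater than 1.00 (the Kaiser criterion), can be illustrated on a small correlation matrix. The matrix below is invented to show two clean item clusters; it is not derived from the study's data:

```python
import numpy as np

# Hypothetical correlation matrix for six role items: two blocks of three
# items that correlate within (r = .5) but not across blocks.
R = np.array([
    [1.0, 0.5, 0.5, 0.0, 0.0, 0.0],
    [0.5, 1.0, 0.5, 0.0, 0.0, 0.0],
    [0.5, 0.5, 1.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0, 0.5, 0.5],
    [0.0, 0.0, 0.0, 0.5, 1.0, 0.5],
    [0.0, 0.0, 0.0, 0.5, 0.5, 1.0],
])

eigenvalues = np.linalg.eigvalsh(R)[::-1]   # sorted, largest first
retained = int(np.sum(eigenvalues > 1.0))   # Kaiser criterion: eigenvalue > 1
explained = eigenvalues[:retained].sum() / eigenvalues.sum()
print(retained, round(float(explained), 2))  # -> 2 0.67
```

Each retained component here captures one item cluster, mirroring how the 21 role items collapsed into four interpretable factors explaining 61.5% of the variance.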

The reporter: The first factor includes 11 items, all related to the role of the reporter identified by Graves and Cherubini (Citation2016) and to the definition of professional journalism by Hanitzsch and Seethaler (Citation2009). Fact checkers are providers of information, offer a service to society and work fact-based. Fact checkers’ independence from both politics and traditional media, as well as their expertise in economics and their status as academics and generally knowledgeable persons, underlines the journalistic ideal of intellectual independence and autonomy. A slightly diminished criticism of the government is implied by reporters wanting to improve communication between the government and the people and mainly relying on official information, where “official” may have been understood as correct or objective by the participants. One of the reporter’s most important functions is to improve public discourse.

The expert: The second factor contains three items related to fact checkers being mainly policy experts who practice fact-checking as a profession and work together in organizational structures. This closely reflects the expert role identified by Graves and Cherubini (Citation2016), who found that many fact-checking organizations were founded first and foremost by professionals from research or policy who organized themselves, initially often without an explicit editorial role or clear distinctions between research and editorial functions (17). The internal consistency of this factor is rather low, indicating that the scale needs improvement in future research.

The reformer: The third factor contains two items and describes fact checkers as promoters of policy change who aim to hold politicians accountable. It closely matches the role of the reformer identified by Graves and Cherubini (Citation2016), who found that fact-checking outlets that see themselves mainly as reformers tend to reject the narrow framework of journalism. Those fact checkers often have protest or NGO backgrounds and want to “involve average people into the process of accountability of officials” (Graves and Cherubini Citation2016, 14). The internal consistency of this factor is acceptable.

The collaborative-facilitator: The fourth factor includes five items, all related to fact checkers being collaborative-facilitators for state propaganda. They function as uncritical collaborators who support the government as facilitators for its agenda, defend the ruling elites and steer public opinion. By disseminating information from the government to the people, they are a mouthpiece for the people in power. They are not independent journalists but part of the state, and as such they legitimize its policies and actions (Hanitzsch and Vos Citation2018, 156). In the face of the accumulating “lying press” accusations by corona protesters, this dimension was to be expected. The internal consistency of this factor is high.

Responses from an open text field confirm that some people see fact checkers as part of a propaganda system. The option was hardly used, but nine participants left revealing comments. One participant wrote:

if you take a look behind the scenes of these fact checkers, you will immediately see who sponsored them and whose opinion they represent. For me totally unbelievable and even dangerous, because they deliberately manipulate the opinion of most people.

Another participant noted that “[…] meanwhile acknowledged scientists are presented as untrustworthy if it does not correspond to the mainstream! And I am not talking about conspiracy theorists!” Yet another participant wrote that fact checkers mainly criticize critics of the mainstream and trample on those beneath them. Two participants expressed less sweeping critique of fact checkers, noting that one should question their independence when they work profit-oriented and that one should be careful to ascribe more truth to fact-checking than to other journalistic content. Finally, one participant asked whether humanity has become “so stupid” that it even needs fact-checking in the first place, and two others merely stated whether they knew fact-checking services.

The reporter role received the greatest support, with 61.8% of participants perceiving it as fitting for fact checkers (M = 3.20, SD = .68). The collaborative facilitator was perceived as fitting by the fewest participants (18%). However, 37.7% of the participants at least latently agree (agreement plus undecided) with that role being fitting for fact checkers. The agreement with each dimension is displayed in Table 3. In conclusion, hypothesis 3 can be supported. Fact checker roles as perceived by users match fact checkers’ self-perceptions as reporters, reformers and experts but also contain a fourth, collaborative-facilitative, dimension. The reporter role is most salient among the participants of this study, and only a minority holds the manifest belief that fact checkers are collaborative-facilitators. Belief and disbelief in fact checkers being reformers or experts are held in approximately equal measure.

Table 3. Agreement (rather & strongly), disagreement (rather & strongly) for fact checker roles with mean (five-point scale) and standard deviation.

Credibility & Media Trust

For the analyses of perceived credibility, the message credibility scale items (α = .90) for the fact-checking messages, the disinformation messages and the real news posts, as well as the source credibility items (α = .93), were each combined into a single variable. Average scores of 4 and 5 were treated as high and very high credibility, averages of 2 and 1 as low or very low credibility. A tendency to lean towards low or high credibility is indicated by deviation from the scale midpoint. The individual items for media trust (α = .93) and trust in politics (α = .95) were combined into one joint variable for trust in media and politics (α = .96). The perceived credibility of the fact-checking messages overall was high, with 75.6% of the participants tending to perceive them as credible (M = 3.74, SD = .79). The perceived credibility of the disinformation posts was low, with 83.7% of the participants tending to rate their credibility as low (M = 2, SD = .83).
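The scale construction described above — averaging multi-item batteries into composite variables and reporting Cronbach's α for internal consistency — can be sketched as follows. This is an illustrative reimplementation with hypothetical data, not the study's actual analysis code, and the classification scheme is a simplification of the cut-offs reported in the text:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of respondent rows, each a list of
    item scores: k/(k-1) * (1 - sum of item variances / variance of
    the row sums). Uses sample variance throughout."""
    k = len(items[0])
    cols = list(zip(*items))
    item_var_sum = sum(variance(col) for col in cols)
    total_var = variance([sum(row) for row in items])
    return k / (k - 1) * (1 - item_var_sum / total_var)

def classify_credibility(score):
    """Simplified version of the text's scheme on a 1-5 scale:
    averages of 4-5 count as (very) high, 1-2 as (very) low,
    otherwise the lean is given by deviation from the midpoint 3."""
    if score >= 4:
        return "high"
    if score <= 2:
        return "low"
    if score > 3:
        return "leans high"
    return "leans low" if score < 3 else "undecided"
```

Perfectly correlated items yield α = 1; real scales such as the α = .90 message credibility battery fall below that.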

The perceived credibility of the fact-checking sources was moderately high, with 56.9% of the participants tending to rate them as credible; 38.2% rated their credibility as high or very high (M = 3.19, SD = 1.15). As displayed in Figure 1, the most credible fact-checking source was the dpa (M = 3.79, SD = 1.01), followed by CORRECTIV (M = 3.15, SD = 1.10). The Facebook community, as a fictitious crowd-sourced fact-checking source, was perceived as least credible (M = 2.64, SD = 1.04).

Figure 1. Perceived credibility of fact-checking sources in percent. Note: Mean values >3 are considered credible and values <3 not credible.


To test hypothesis 2, “people oriented to the right of the political ideology spectrum are more likely to believe fake news and are less likely to assert credibility to fact-checking sources and messages”, a linear regression with fake news credibility as criterion and political ideology as predictor was conducted. With political ideology as a single predictor, the model was significant, F(1, 121) = 26.31, p < .001, R2 = .18: political ideology had a significant positive influence on the perceived credibility of fake news (β = .42, p < .001). The effect of political ideology on the perceived credibility of fact-checking messages is reversed, F(1, 121) = 13.79, p < .001, R2 = .10, with a significant negative influence (β = −.32, p < .001). The same holds for the influence of political ideology (β = −.32, p < .001) on fact-checking source credibility, F(1, 121) = 13.76, p < .001, R2 = .10. In line with hypothesis 2, participants who are oriented to the right of the political ideology spectrum are more likely to believe fake news and less likely to assert credibility to fact-checking sources and messages. Coefficients for model quality can be found in Table 4. However, hypothesis 2 can only be partially supported by the results of the present study: as the more complex models below show, political ideology is no longer significant once trust in media and politics is included.
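For a single predictor, the reported standardized β equals the ordinary-least-squares slope after z-scoring both variables (and hence Pearson's r, with R² as its square). A minimal sketch of that relationship, not the study's own code:

```python
from statistics import mean, stdev

def standardized_beta(x, y):
    """Standardized simple-regression coefficient: the OLS slope of
    z-scored y on z-scored x. With one predictor this equals
    Pearson's r, and the model R^2 is its square."""
    mx, sx = mean(x), stdev(x)
    my, sy = mean(y), stdev(y)
    zx = [(v - mx) / sx for v in x]
    zy = [(v - my) / sy for v in y]
    # least-squares slope: sum(zx*zy) / sum(zx*zx)
    return sum(a * b for a, b in zip(zx, zy)) / sum(a * a for a in zx)
```

On this reading, the reported β = .42 for fake news credibility implies R² = .42² ≈ .18, matching the model statistics above.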

Table 4. Coefficients of model quality for political ideology.

The first aim of this study was to find out to what extent the perceived credibility of fake news and fact-checking can be explained by trust in media and politics. In this sample, overall media trust was high (M = 3.59, SD = 1.15). Trust in politics was similarly strong (M = 3.36, SD = 1.14), as was trust in science (M = 4.47, SD = .74) and conventional medicine (M = 4.01, SD = .79). However, a considerable share (23.6%) of the participants leaned towards distrust in media and politics. Very few people tended to trust alternative media (M = 1.96, SD = 1.05). Approximately one third of the participants trusted and another third distrusted fact-checking websites (M = 2.99, SD = 1.15). All trust scores are displayed in Figure 2.

Figure 2. Trust in institutions in percent.


To test hypothesis 1, “the more people trust media and politics, the more they perceive fact-checking sources and messages as credible; the less they trust media and politics, the more they perceive disinformation to be credible”, a multiple regression analysis was conducted for each of fake news, fact-checking source and fact-checking message credibility. In addition to political ideology, trust in media and politics was entered together with the collaborative-facilitative role dimension of fact-checking as well as trust in science, trust in conventional medicine and trust in alternative media. Age, gender (dummy), education (dummy) and topic familiarity were included as controls.
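Multiple regression with standardized coefficients of the kind reported below can be computed by z-scoring predictors and criterion and solving the normal equations. The sketch below is a self-contained illustration with hypothetical data; the study itself presumably used standard statistical software, and dummy-coded controls would simply enter as 0/1 columns:

```python
from statistics import mean, stdev

def zscore(col):
    m, s = mean(col), stdev(col)
    return [(v - m) / s for v in col]

def standardized_betas(x_cols, y):
    """OLS standardized coefficients via the normal equations
    (Z'Z) b = Z'y on z-scored data; after centering, no intercept
    is needed. x_cols is a list of predictor columns."""
    Z = [zscore(c) for c in x_cols]
    zy = zscore(y)
    k = len(Z)
    A = [[sum(a * b for a, b in zip(Z[i], Z[j])) for j in range(k)]
         for i in range(k)]
    rhs = [sum(a * b for a, b in zip(Z[i], zy)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for i in range(k):
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        rhs[i], rhs[p] = rhs[p], rhs[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            rhs[r] -= f * rhs[i]
    betas = [0.0] * k
    for i in reversed(range(k)):
        betas[i] = (rhs[i] - sum(A[i][j] * betas[j]
                                 for j in range(i + 1, k))) / A[i][i]
    return betas
```

With several correlated predictors, a β that is significant alone (such as political ideology in hypothesis 2) can lose significance once a stronger predictor such as trust in media and politics is added.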

For fact-checking source credibility the model was significant, F(10, 110) = 11.94, p < .001, R2 = .52. Only trust in media and politics (β = .37, p < .001) and trust in conventional medicine (β = .25, p = .004) were significant predictors, while political ideology was not (β = −.14, p = .061). Because the perceived credibility of the Facebook community as a fact-checking source was much lower than that of dpa and CORRECTIV (see Figure 1), the same model was calculated again without the fictitious fact-checking source. The model was significant, F(10, 73) = 9.81, p < .001, R2 = .57. This time, the collaborative-facilitator role (β = −.27, p = .015) was an additional significant predictor, which indicates that this role perception matters more for the perceived credibility of professional fact checkers than of lay fact checkers. Coefficients of model quality can be found in Table 5.

Table 5. Coefficients of model quality for media trust and fact-checking source.

The model for fact-checking message credibility was likewise significant, F(10, 110) = 12.81, p < .001, R2 = .54. Only trust in media and politics (β = .31, p = .001), trust in alternative media (β = −.19, p = .027) and trust in conventional medicine (β = .19, p = .030) were significant predictors, while political ideology was not (β = −.08, p = .264). Without the “Facebook community” the model was significant, F(10, 73) = 10.40, p < .001, R2 = .59. This time, trust in media and politics (β = .25, p = .026), perceiving fact checkers as collaborative-facilitators (β = −.24, p = .026) and trust in conventional medicine (β = .26, p = .011) were significant predictors, while trust in alternative media no longer was (β = −.17, p = .111). Coefficients of model quality can be found in Table 6.

Table 6. Coefficients of model quality for media trust and fact-checking message.

Finally, for fake news credibility the model was significant, F(10, 110) = 13.44, p < .001, R2 = .55. Only trust in media and politics (β = −.42, p < .001), trust in alternative media (β = .24, p = .006) and political ideology (β = .18, p = .013) were significant. As a comparison, the model for real news credibility was significant, F(10, 110) = 3.58, p < .001, R2 = .25, with trust in conventional medicine as the only significant predictor (β = .27, p = .013). However, it needs to be pointed out again that the real news fillers, unlike the fact checks, did not include source cues for credibility, and a comparison of the two was not the aim of this study. Coefficients of model quality can be found in Table 7.

Table 7. Coefficients of model quality for media trust and fake news with comparison real news posts.

In conclusion, hypothesis 1 can be supported by these findings. Trust in media and politics is a strong predictor of the perceived credibility of fact-checking sources, fact-checking messages and fake news. All three models explain between 52% and 55% of the variance in perceived credibility. The more the participants trusted media and politics, the more they perceived fact-checking sources and messages as credible; the less they trusted media and politics, the more they perceived disinformation to be credible. Those who perceive fact checkers as collaborative facilitators place less trust in dpa and CORRECTIV fact checks. Trust in alternative media strengthens the perceived credibility of disinformation, as does a political orientation to the right. The relationship between trust in media and politics and the perceived credibility of fact-checking is displayed in Figure 3. An overview of the hypotheses and their support or rejection can be found in Table 8.

Figure 3. Percentage of perceived fact-checking and disinformation credibility as a function of media trust.


Table 8. Overview of hypotheses and results.

Discussion

The aim of this paper was to achieve a better understanding of trust in fact-checking services on social media. Large parts of the sample in this study view fact checkers as journalists who focus on facts, are independent and first and foremost provide a service to society. Nevertheless, it also became clear that many perceive fact checkers as the extended arm of the government: a propaganda department that collaborates with the political elite and facilitates its interests. People who see fact checkers as collaborative facilitators trust them, and consequently their fact checks, less. This effect is particularly pronounced for professional fact-checking services, which could be explained by a perceived higher entitativity and efficacy, and hence higher threat potential, of the professional fact-checking sources (Clark and Wegener Citation2013, 214). Future research should test this possibility. A perceived closeness of fact checkers to the state could indicate that Facebook’s long-held policy of not subjecting politicians to fact checks has had a negative effect on the perceived objectivity of fact checkers. Another plausible explanation is that distrust in traditional media and politics spills over into fact-checking. The literature has long indicated self-reinforcing spirals of media trust, subsequent media use and political conviction: those who sweepingly distrust the media tend to rely on broad repertoires of media-cynical alternative media and are more likely to believe in conspiracist ideation (Jackob Citation2010; Tsfati Citation2010; Schultz et al. Citation2017).

For most participants, stronger media trust reduced the credibility of disinformation and strengthened trust in fact-checking. Consequently, the most trusted fact-checking source was the renowned news agency dpa, followed by the established fact-checking site CORRECTIV. Distrust in media, however, predicted distrust in fact-checking. Crowd-sourced fact checks, i.e., fact checks that have no connection to mass media, do not solve this trust problem. In the “parallel worlds of knowledge” (Pantenburg, Reichardt, and Sepp Citation2021) of some recipients, the word of fact checkers counts for nothing. These findings thwart the efforts of fact checkers to be perceived as independent actors (Graves and Cherubini Citation2016, 14–16). On the one hand, they benefit from the established reputation of major media brands; on the other hand, as part of the traditional media, they are seen by some as untrustworthy from the outset. While definitive solutions cannot be offered here, there are some implications for fact checkers to consider:

  1. More boundary work will be necessary for fact checkers to establish a clearer understanding of who they are and, more importantly, who they are not. The term “fact-check” risks becoming as inflated as “fake news” if fact checkers do not find ways to distinguish themselves and their debunking work, in the eyes of their audience, from political actors (politicians, think tanks, activists, etc.) appropriating it.

  2. In the long term, fact checkers must build up the necessary reputation to maintain their role as gatewatchers in the networked public. In the medium term, stronger ties to well-known media offer obvious advantages of immediate reputation and should be considered, as long as critical observation of media reporting also remains possible.

  3. Finally, the resistance of parts of the audience to fact checkers’ debunking efforts raises the question of whether these parts of the audience should be viewed as a target group at all. In other words, without any shared understanding of reality, fact checkers have very little common ground to work on, and there might be little for them to gain with recipients who sweepingly distrust them. The normative issues that arise when fact checkers discriminate between tiers of target groups go beyond the scope of this paper. It is, however, a very practical question of where to focus limited resources that fact checkers must answer for themselves. Clear criteria for what one might call debunkworthiness, that is, for when a (false) claim ought to be verified or debunked, could help in that regard. Such criteria should include (next to established criteria of newsworthiness) both the potential harm a claim could cause and some sort of weighted spread indicator based on the target audience.
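A debunkworthiness score of the kind suggested in point 3 could, purely hypothetically, combine those criteria in a single index. The function name, the [0, 1] normalization and the equal weighting below are all illustrative assumptions, not recommendations derived from the study's data:

```python
def debunkworthiness(newsworthiness, potential_harm, spread,
                     audience_overlap):
    """Hypothetical composite score; every input normalized to [0, 1].
    The claim's raw spread is weighted by how much of it reaches the
    fact checker's own target audience; equal weights across the three
    criteria are an illustrative assumption."""
    weighted_spread = spread * audience_overlap
    return (newsworthiness + potential_harm + weighted_spread) / 3
```

Under such a scheme, a widely shared claim circulating entirely outside the fact checker's target audience would contribute little, operationalizing the triage question raised above.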

Like any study, the one presented here is subject to a number of limitations, such as the non-representative sample, an issue that recedes behind the explorative objective of the study; ultimately, it would always be difficult to define a population of recipients of fact checks and disinformation. More important are the concrete problems the convenience snowball sampling strategy brought with it. For example, due to data protection concerns, it was not possible to track how individual participants arrived at the questionnaire. A potentially important control variable is therefore missing: whether a participant belongs to a certain community, for instance that of corona protesters. Alternatively, the study could have been limited to a certain community from the outset. Distortions due to self-recruitment and deliberate manipulation, as the Covid protester scene often practises in popular-science online surveys run by traditional news media, cannot be ruled out that way either. To improve ecological validity, future studies should therefore fall back on unobtrusive measurements and, for example, carry out content analyses in relevant groups, where quite extensive debates on the work of fact checkers do in fact exist. To broaden the scope, upcoming research could additionally test the credibility effects of fact checks with verified information, as only debunked Facebook posts were used as stimuli here. As Luengo and García-Marín (Citation2020) argue, fact-checking can fulfil different functions of de-fusion or re-fusion of citizens and institutions depending on the type of claim to be checked (top-down or bottom-up; debunked or verified).

One could also argue that the roles and functions of fact checkers used here might not be exhaustive as they were deductively derived from self-descriptions of fact checkers and theory on journalism. However, it was not the aim of this study to produce an exhaustive list of perceived fact-checking functions but rather to confirm the self-ascribed roles of fact checkers as found by Graves and Cherubini (Citation2016) and to complement them with an expected negative role perception from journalism theory. Nevertheless, for future studies, it might be beneficial to qualitatively collect recipient perceptions of fact-checking first and to focus more on basic functions (see e.g., Fawzi Citation2020) than on pre-defined roles, in order to achieve an even more nuanced account of fact-checking role perceptions.

Finally, an important limitation lies in the simple operationalization of the trust nexus between media trust and political trust. Future research could adapt the four factors of trust in journalism (trust in topic selectivity, selectivity of facts, accuracy of descriptions, and valuations) by Matthes and Kohring (Citation2003) in order to arrive at more nuanced conclusions about where exactly trust or mistrust in media is located for a particular topic. These locations might differ across societal groups and milieus. In conclusion, the results obtained fulfil mainly one function: they shed light on perspectives that deserve more attention in the future. Trust in media and politics and user perceptions and expectations of fact-checking are decisive factors that could determine the success of a still young profession trying to find its place between traditional news providers and the participatory web.

Supplemental material


Disclosure Statement

No potential conflict of interest was reported by the author(s).

References

  • Amazeen, M. 2015. “Revisiting the Epistemology of Fact-Checking.” Critical Review 27 (1): 1–22. doi:10.1080/08913811.2014.993890.
  • Appelman, A., and S. Sundar. 2016. “Measuring Message Credibility: Construction and Validation of an Exclusive Scale.” Journalism & Mass Communication Quarterly 93 (1): 59–79. doi:10.1177/1077699015606057.
  • Arendt, F., M. Haim, and J. Beck. 2019. “Fake News, Warning Messages, and Perceived Truth Value: Investigating the Differential Susceptibility Hypothesis Related to Political Orientation.” Publizistik 64 (2): 181–204. doi:10.1007/s11616-019-00484-4.
  • Awad, I. 2011. “Latinas/os and the Mainstream Press: The Exclusions of Professional Diversity.” Journalism 12 (5): 515–532. doi:10.1177/1464884911408221.
  • Baumgärtner, M., F. Bohr, R. Höfner, T. Lehmann, A.-K. Müller, S. Röbel, M. Rosenbach, J. Schaible, W. Widemann-Schmidt, and S. Winter. 2020, May 14. Protests in Germany See Fringe Mix with the Mainstream. DER SPIEGEL. https://www.spiegel.de/international/germany/the-corona-conspiracy-theorists-protests-in-germany-see-fringe-mix-with-the-mainstream-a-8a9d5822-8944-407a-980a-d58e9d6b4aec.
  • Baym, N., and D. boyd. 2012. “Socially Mediated Publicness: An Introduction.” Journal of Broadcasting & Electronic Media 56 (3): 320–329. doi:10.1080/08838151.2012.705200.
  • Bayrischer Rundfunk [BR]. 2020, May 6. “ARD-Team auf Demo angegriffen [ARD Team Attacked on Demo].” Bayrischer Rundfunk. https://www.br.de/nachrichten/deutschland-welt/ard-team-auf-demo-angegriffen,RyEu4cc.
  • Brandtzaeg, P., and A. Følstad. 2017. “Trust and Distrust in Online Fact-Checking Services.” Communications of the ACM 60 (9): 65–71. doi:10.1145/3122803.
  • Brandtzaeg, P., A. Følstad, and M. Chaparro Domínguez. 2018. “How Journalists and Social Media Users Perceive Online Fact-Checking and Verification Services.” Journalism Practice 12 (9): 1109–1129. doi:10.1080/17512786.2017.1363657.
  • Clark, J. K., and D. T. Wegener. 2013. “Message Position, Information Processing, and Persuasion: The Discrepancy Motives Model.” In Advances in Experimental Social Psychology, edited by P. Devine, and A. Plant, 189–232. Amsterdam: Elsevier.
  • CORRECTIV- Recherchen für die Gesellschaft. 2020, June 5. “Urteil: OLG Karlsruhe beanstandet nicht Faktencheck von CORRECTIV, sondern nennt einzelne Verknüpfung missverständlich [Ruling: OLG Karlsruhe Does Not Complain About the Fact Check of CORRECTIV, But Names Individual Links Misleadingly].” https://correctiv.org/in-eigener-sache/2020/06/05/urteil-des-olg-karlsruhe/.
  • Dahlgren, P. 2018. “Media, Knowledge and Trust: The Deepening Epistemic Crisis of Democracy.” Javnost 25 (1-2): 20–27. doi:10.1080/13183222.2018.1418819.
  • Decker, O., and E. Brähler. 2016. “Ein Jahrzehnt der Politisierung: Gesellschaftliche Polarisierung und gewaltvolle Radikalisierung in Deutschland zwischen 2006 und 2016 [A Decade of Politicization: Social Polarization and Violent Radicalization in Germany Between 2006 and 2016].” In Die enthemmte Mitte. Autoritäre und rechtsextreme Einstellungen in Deutschland. Die Leipziger Mitte Studie 2016 [The Uninhibited Center. Authoritarian and Extreme Right-Wing Attitudes in Germany. The Leipzig Center Study 2016], edited by O. Decker, J. Kiess, and E. Brähler, 95–136. Psychosozial-Verlag. doi:10.30820/9783837972337-95.
  • European Commission. 2017, November 5. Standard Eurobarometer 88: Media Use in the European Union. Publications Office of the European Union. https://ec.europa.eu/commfrontoffice/publicopinion/index.cfm/ResultDoc/download/DocumentKy/82787.
  • Facebook. 2020. Fact-Checking on Facebook: What Publishers Should Know. Facebook. https://www.facebook.com/business/help/182222309230722.
  • Fawzi, N. 2020. “Objektive Informationsquelle, Watchdog und Sprachrohr der Bürger? Die Bewertung der Gesellschaftlichen Leistungen von Medien Durch die Bevölkerung [Objective Source of Information, Watchdog and Mouthpiece for Citizens? The Public’s Evaluation of the Societal Performance of Media].” Publizistik 65: 187–207. doi:10.1007/s11616-020-00572-w.
  • Fessler, D. P. 2017. “Political Orientation Predicts Credulity Regarding Putative Hazards.” Psychological Science 28: 651–660. doi:10.1177/0956797617692108.
  • Funk, C., A. Tyson, B. Kennedy, and C. Johnson. 2020, September 29. Science and Scientists Held in High Esteem Across Global Publics. Pew Research Center. https://www.pewresearch.org/science/2020/09/29/science-and-scientists-held-in-high-esteem-across-global-publics/.
  • Garrett, R., and B. Weeks. 2017. “Epistemic Beliefs’ Role in Promoting Misperceptions and Conspiracist Ideation.” PLoS ONE 12 (9). doi:10.1371/journal.pone.0184733.
  • Graves, L. 2017. “Anatomy of a Fact Check: Objective Practice and the Contested Epistemology of Fact Checking.” Communication, Culture & Critique 10 (3): 518–537. doi:10.1111/cccr.12163.
  • Graves, L. 2018. “Boundaries Not Drawn.” Journalism Studies 19 (5): 613–631. doi:10.1080/1461670X.2016.1196602.
  • Graves, L., and F. Cherubini. 2016. The Rise of Fact-Checking Sites in Europe. Oxford: Reuters Institute for the Study of Journalism.
  • Graves, L., B. Nyhan, and J. Reifler. 2016. “Understanding Innovations in Journalistic Practice: A Field Experiment Examining Motivations for Fact-Checking.” Journal of Communication 66 (1): 102–138. doi:10.1111/jcom.12198.
  • Hanitzsch, T. 2007. “Deconstructing Journalism Culture: Toward a Universal Theory.” Communication Theory 17 (4): 367–385. doi:10.1111/j.1468-2885.2007.00303.x.
  • Hanitzsch, T., and J. Seethaler. 2009. “Journalismuswelten – Ein Vergleich von Journalismuskulturen in 17 Ländern [Journalism Worlds – A Comparison of Journalism Cultures in 17 Countries].” Medien und Kommunikationswissenschaft 57 (4): 464–483. doi:10.5771/1615-634x-2009-4-464.
  • Hanitzsch, T., A. van Dalen, and N. Steindl. 2018. “Caught in the Nexus: A Comparative and Longitudinal Analysis of Public Trust in the Press.” The International Journal of Press/Politics 23 (1): 3–23. doi:10.1177/1940161217740695.
  • Hanitzsch, T., and T. Vos. 2018. “Journalism Beyond Democracy: A New Look into Journalistic Roles in Political and Everyday Life.” Journalism 19 (2): 146–164. doi:10.1177/1464884916673386.
  • Holt, K., and A. Haller. 2017. “What Does “Lügenpresse” Mean?: Expressions of Media Distrust on PEGIDA’s Facebook Pages.” Tidsskriftet Politik 20 (4): 42–57.
  • Inglehart, R., C. Haerpfer, A. Moreno, C. Welzel, K. Kizilova, and J. Diez-Medrano. 2014. World Values Survey. Retrieved April 27, 2020, from Round Six – Country-Pooled Datafile Version: www.worldvaluessurvey.org/WVSDocumentationW.
  • Jackob, N. 2010. “No Alternatives? The Relationship Between Perceived Media Dependency, Use of Alternative Information Sources, and General Trust in Mass Media.” International Journal of Communication 4 (18): 589–606. doi:10.7146/politik.v20i4.101534.
  • Jenkins, H., S. Ford, and J. Green. 2013. Spreadable Media: Creating Value and Meaning in a Networked Culture. New York: NYU Press.
  • Kata, A. 2012. “Anti-Vaccine Activists, Web 2.0, and the Postmodern Paradigm – An Overview of Tactics and Tropes Used Online by the Anti-Vaccination Movement.” Vaccine 30 (25): 3778–3789. doi:10.1016/j.vaccine.2011.11.112.
  • Kohring, M. 2004. Vertrauen in Journalismus: Theorie und Empirie [Trust in Journalism: Theory and Empirics]. Konstanz: UVK Verlagsgesellschaft.
  • Lee, E., and S. Shin. 2019. “Mediated Misinformation: Questions Answered, More Questions to Ask.” The American Behavioral Scientist 65 (2): 259–276. doi:10.1177/0002764219869403.
  • Leiner, D. J. 2019. SoSci Survey (Version 3.1.06) [Computer software]. https://www.soscisurvey.de; https://www.soscisurvey.de/help/doku.php/en:results:variables.
  • Lewandowsky, S., U. Ecker, C. Seifert, N. Schwarz, and J. Cook. 2012. “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest 13 (3): 106–131. doi:10.1177/1529100612451018.
  • Lilienthal, V., and I. Neverla. 2017. “Lügenpresse”. Anatomie eines politischen Kampfbegriffs [“Lying Press.” Anatomy of a Political Combat Term]. Kiepenheuer & Witsch.
  • Lombardi, D., E. Nussbaum, and G. Sinatra. 2016. “Plausibility Judgments in Conceptual Change and Epistemic Cognition.” Educational Psychologist 51 (1): 35–56. doi:10.1080/00461520.2015.1113134.
  • Lombardi, D., V. Seyranian, and G. Sinatra. 2014. “Source Effects and Plausibility Judgments When Reading About Climate Change.” Discourse Processes: Comprehension and Validation of Text Information 51 (1-2): 75–92. doi:10.1080/0163853X.2013.855049.
  • Luengo, M., and D. García-Marín. 2020. “The Performance of Truth: Politicians, Fact-Checking Journalism, and the Struggle to Tackle COVID-19 Misinformation.” American Journal of Cultural Sociology 8: 405–427. doi:10.1057/s41290-020-00115-w.
  • Luhmann, N. 1989. Vertrauen: Ein Mechanismus der Reduktion Sozialer Komplexität [Trust: A Mechanism for Reduction of Social Complexity]. Stuttgart: Ferdinand Enke.
  • Marwick, A., and R. Lewis. 2017. Media Manipulation and Disinformation Online. Data & Society. https://datasociety.net/library/media-manipulation-and-disinfo-online/.
  • Matthes, J., and M. Kohring. 2003. “Operationalisierung von Vertrauen in Journalismus [operationalization of trust in journalism].” M&K Medien Und Kommunikationswissenschaft 51 (1): 5–23. doi:10.5771/1615-634x-2003-1-5.
  • Mejia, R., K. Beckermann, and C. Sullivan. 2018. “White Lies: A Racial History of the (Post)Truth.” Communication and Critical/Cultural Studies 15 (2): 109–126. doi:10.1080/14791420.2018.1456668.
  • Navin, M. 2013. “Competing Epistemic Spaces: How Social Epistemology Helps Explain and Evaluate Vaccine Denialism.” Social Theory and Practice 39 (2): 241–264. doi:10.5840/soctheorpract201339214.
  • Neuberger, C. 2018, August 31. “Journalismus und Digitalisierung: Profession, Partizipation und Algorithmen; Expertise für die Eidgenössische Medienkommission EMEK [Journalism and Digitization: Profession, Participation and Algorithms; Expertise for the Federal Media Commission EMEK].” LMU Universitätsbibliothek, Open Access LMU. https://epub.ub.uni-muenchen.de/42804/.
  • Neuberger, C., A. Bartsch, C. Reinemann, R. Fröhlich, T. Hanitzsch, and J. Schindler. 2019. “Der Digitale Wandel der Wissensordnung. Theorierahmen für die Analyse von Wahrheit, Wissen und Rationalität in der öffentlichen Kommunikation.” Medien & Kommunikationswissenschaft (M&K) 67: 167–186. doi:10.5771/1615-634X-2019-2-167.
  • Newman, N., R. Fletcher, A. Schulz, and R. K. Nielsen. 2020. Reuters Institute Digital News Report 2020. R. I. Journalism, & O. University. http://www.digitalnewsreport.org/survey/2020/resources-2020/.
  • Nyhan, B., and J. Reifler. 2010. “When Corrections Fail: The Persistence of Political Misperceptions.” Political Behavior 32 (2): 303–330. doi:10.1007/s11109-010-9112-2.
  • Nyhan, B., and J. Reifler. 2015, April 28. Estimating Fact-Checking’s Effects. Evidence from a Long-Term Experiment During Campaign 2014. American Press Institute. https://www.americanpressinstitute.org/wp-content/uploads/2015/04/Estimating-Fact-Checkings-Effect.pdf.
  • Pantenburg, J., S. Reichardt, and B. Sepp. 2021. “Wissensparallelwelten der “Querdenker” [Parallel Worlds of Knowledge of the “Querdenker”].” In Die Misstrauensgemeinschaft der “Querdenker”. Die Corona-Proteste aus kultur- und sozialwissenschaftlicher Perspektive [The Community of Distrust of the “Querdenker”. The Corona Protests from a Cultural and Social Science Perspective], edited by S. Reichardt, 29–66. Frankfurt/Main: Campus Verlag.
  • Powell, G. B. 2000. Elections as Instruments of Democracy: Majoritarian and Proportional Visions. New Haven: Yale University Press.
  • Reinemann, C., N. Fawzi, and M. Obermaier. 2017. “Die Vertrauenskrise der Medien – Fakt Oder Fiktion? Zu Entwicklung, Stand und Ursachen des Medienvertrauens in Deutschland [The “Crisis of Confidence” of the Media – Fact or Fiction? On the Development, Status and Causes of Media Trust in Germany].” In “Lügenpresse”. Anatomie Eines Politischen Kampfbegriffs [“Lying Press.” Anatomy of a Political Combat Term], edited by V. Lilienthal, and I. Neverla, 77–94. Köln: Kiepenheuer & Witsch.
  • Rössler, P. 2011. Skalenhandbuch Kommunikationswissenschaft [Scale Handbook Communication Science]. Wiesbaden: VS Verlag für Sozialwissenschaften.
  • Schultz, T., N. Jackob, M. Ziegele, O. Quiring, and C. Schemer. 2017. “Erosion des Vertrauens Zwischen Medien und Publikum? [Erosion of Trust Between Media and Audience?].” Media Perspektiven 47 (5): 246–259.
  • Seo, H., A. Xiong, and D. Lee. 2019. “Trust It or Not: Effects of Machine-learning Warnings in Helping Individuals Mitigate Misinformation.” Proceedings of Websci ’19: 11th ACM Conference on Web Science. doi:10.1145/3292522.3326012.
  • Shin, J., and K. Thorson. 2017. “Partisan Selective Sharing: The Biased Diffusion of Fact-Checking Messages on Social Media.” Journal of Communication 67 (2): 233–255. doi:10.1111/jcom.12284.
  • Standaert, O., T. Hanitzsch, and J. Dedonder. 2019. “In Their Own Words: A Normative-Empirical Approach to Journalistic Roles Around the World.” Journalism 22 (4): 919–936. doi:10.1177/1464884919853183.
  • Stempel, C., T. Hargrove, and G. Stempel. 2007. “Media Use, Social Structure, and Belief in 9/11 Conspiracy Theories.” Journalism & Mass Communication Quarterly 84 (2): 353–372. doi:10.1177/107769900708400210.
  • Strømsø, H. I., I. Bråten, and M. A. Britt. 2011. “Do Students’ Beliefs About Knowledge and Knowing Predict Their Judgement of Texts’ Trustworthiness?” Educational Psychology 31 (2): 177–206. doi:10.1080/01443410.2010.538039.
  • Tsfati, Y. 2010. “Online News Exposure and Trust in the Mainstream Media: Exploring Possible Associations.” American Behavioral Scientist 54 (1): 22–42. doi:10.1177/0002764210376309.
  • Uscinski, J. E. 2015. “The Epistemology of Fact Checking (is Still Naìve): Rejoinder to Amazeen.” Critical Review 27 (2): 243–252. doi:10.1080/08913811.2015.1055892.
  • Voigt, M. 2018. Die “Lügenpresse” – Ein Nützliches Instrument für den (Rechts-)Populismus ? Politische Kommunikation der AfD. Eine Qualitative Analyse Anhand von Interviews mit Politikern der Alternative für Deutschland. [The “Lügenpresse” – A Useful Tool for (Right-Wing) Populism? Political Communication of the AfD. A Qualitative Analysis Based on Interviews with Politicians of the Alternative for Germany.]. Marburg: Büchner Verlag.
  • Vos, T. 2018. “Journalism.” In Journalism (Handbooks of Communication Science [HoCS]), edited by T. Vos, P. J. Schulz, and P. Cobley, 1–18. Boston, Berlin: De Gruyter Mouton.
  • Walter, N., and R. Tukachinsky. 2020. “A Meta-Analytic Examination of the Continued Influence of Misinformation in the Face of Correction: How Powerful Is It, Why Does It Happen, and How to Stop It?” Communication Research 47 (2): 155–177. doi:10.1177/0093650219854600.
  • Weeks, B. E. 2018. “Media and Political Misperceptions.” In Misinformation and Mass Audiences, edited by B. G. Southwell, E. A. Thorson, and L. Sheble, 140–156. University of Texas Press. doi:10.7560/314555-010
  • Weingart, P. 2009. Die Stunde der Wahrheit? Zum Verhältnis der Wissenschaft zu Politik, Wirtschaft und Medien in der Wissensgesellschaft [The Hour of Truth? On the Relationship of Science to Politics, Business and the Media in the Knowledge Society]. Weilerswist: Velbrück Wiss.
  • Zweites Deutsches Fernsehen [ZDF]. 2020, May 2. Angriff auf heute-Show-Team [Attack on Today Show Team]. ZDF. https://www.zdf.de/nachrichten/panorama/zdf-team-angegriffen-100.html.