Reflecting Party Agendas, Challenging Claims: An Analysis of Editorial Judgements and Fact-checking Journalism during the 2019 UK General Election Campaign

ABSTRACT To counter mis/disinformation, fact-checking organisations are used as sources by journalists to challenge false or misleading statements, especially during election campaigns. But how different fact-checkers editorially construct their analysis and question dubious claims remains under-researched. Drawing on a case study of reporting during the UK's 2019 General Election campaign, we interviewed senior editors and journalists, and conducted a systematic content analysis of 238 fact-checking stories produced by BBC Reality Check and Channel 4 FactCheck, along with an independent fact-checking organisation, Full Fact, in order to critically assess their editorial judgements about the selection of news and use of sources. Our study revealed that fact-checking services at the BBC and Channel 4 were not closely integrated into routine news production, and that the independent fact-checker, Full Fact, questioned claims differently to the broadcasters. We also found that the agenda of broadcast fact-checkers centred on party political agendas and drew on a narrow range of institutional sources to question claims. Overall, we argue that if broadcasters relied more heavily on their fact-checking in routine coverage, beyond election campaigns, they would more effectively counter mis/disinformation, especially if a wider range of expert sources were drawn upon to scrutinise claims.


Introduction
Over recent years, social media platforms and other Internet technologies have made it easier and more common for false or misleading claims about politics and social issues to circulate in the media and in public discourse. This can potentially influence public understanding and lead to misconceptions about issues and policies that have an impact on people's lives. To counter mis/disinformation, news organisations have begun to draw more regularly on fact-checking organisations as sources to challenge false or misleading information and claims (Graves 2016).
Fact-checking is broadly celebrated as a "professional reform movement" in journalism in the fight against the spread of disinformation (Amazeen 2017; Graves and Cherubini 2016). Whilst fact-checking initiatives vary globally, the core aim is to revitalise traditional journalism by holding public figures accountable for spreading disinformation and falsehoods (Graves 2016, 6). Not only can it act as a journalistic mechanism of accountability, it also constitutes a tool to help the public navigate misinformation and falsehoods circulating in high-choice media environments (Kyriakidou et al. 2022; Morani et al. 2022). As such, it is viewed as a central development in restoring journalistic legitimacy and enhancing the quality of news reporting at a time of widespread public mistrust in media and civic society (Cushion et al. 2022a).
Fact-checking journalism is viewed as particularly useful in improving journalistic practices and addressing misinformation during election campaigns (Hughes et al. 2022). Scholarship on fact-checking during elections has predominantly focused on the impact fact-checking has on audiences and its corrective potential (Amazeen et al. 2018; Nyhan and Reifler 2010, 2015a, 2015c; Nyhan et al. 2019), as well as explorations, evaluations and critiques of fact-checking practices (Graves 2016, 2018; Lim 2018; Uscinski and Butler 2013). However, the way fact-checking is employed by journalists in broadcast media in election reporting remains largely unexplored. This includes the internal practices of news broadcasters with separate fact-checking sites, such as BBC Reality Check and Channel 4 FactCheck, as well as the everyday practice of fact-checking in routine reporting itself. More generally, how fact-checking has been embedded in mainstream reporting, as in the case of UK broadcast media, is under-researched (Cushion et al. 2022b). To address this, we combine interviews with some of the most senior editors and journalists in the UK with a systematic content analysis of 238 fact-checking stories produced by broadcasters and a fact-checking organisation, Full Fact, during the 2019 election campaign, in order to explore their judgements about the selection of news and use of sources.

Disinformation, Fact-Checking and Elections
Although fact-checking has long been part of news reporting (Graves 2016), it has recently become a more prominent part of journalism, largely as a response to increasing concerns about the emergence of the disinformation disorder (Bennett and Livingston 2018). Fact-checking has featured prominently in election campaigns, especially in the aftermath of the UK vote to leave the European Union and the US presidential election in 2016 (Lazer et al. 2018; Lewandowsky, Ecker, and Cook 2017). With the aim of determining the accuracy and "truthfulness" of claims made by public actors and publicising instances of misinformation, fact-checking is distinct from traditional journalistic practices, which aim to verify news sources and eliminate inaccuracies in reporting (Amazeen 2019; Graves 2016). This approach of challenging misinformation has been celebrated as a "professional movement" with the potential to revitalise traditional journalism by more effectively and consistently holding public figures accountable for spreading falsehoods (Graves 2016, 6). It has also led to the adoption of fact-checking by legacy media as a distinct practice within their organisations. This has been especially notable in Northern and Western Europe, where fact-checking has been led by legacy newsrooms (Graves and Cherubini 2016, 8). In doing so, embedding fact-checking in mainstream news has blurred the boundaries between what it means to be a journalist and a fact-checker, a point that became evident in our interviews.
However, the distinction is still useful to hold, even in cases where fact-checkers are journalists employed by media organisations. The distinction lies in the editorial choices of fact-checkers, who focus on factual arguments and their dissection rather than the entirety of political debates and statements of opinion (Graves 2017). At the same time, the adoption of explicit fact-checking practices can increase favourable public attitudes towards the media (Amazeen et al. 2018; Barthel, Gottfried, and Lu 2016; Kyriakidou et al. 2022).
Research on fact-checking has focused on the specificities of editorial choices and their outcomes (Uscinski and Butler 2013). For instance, inconsistencies in the practices of reviewing and presenting claims might undermine the usefulness of fact-checking during election campaigns. This can occur when some fact-checkers review several claims in one article, whereas others might split the investigation of one claim across several articles (Graves 2017; Uscinski and Butler 2013). Relatedly, studies have identified different judgement outcomes on the same claim, which might confuse the public and limit public understanding (Marietta, Barker, and Bowser 2015). On the other hand, a review of the two most prominent US sites, Fact Checker and PolitiFact, revealed that the sites rarely fact-check the same statements, making it even more difficult to validate fact-checking outcomes (Lim 2018).
The above research, however, has been predominantly US-centric. In their meta-analysis of the existing literature, Nieminen and Rapeli found that 77% of the published studies on fact-checking (as of 9 April 2018, the point of their search) focussed on the US. Increasingly, steps have been taken to look towards other countries and their fact-checking practices, such as France (Barrera et al. 2017), Norway and Spain (Brandtzaeg et al. 2017), Africa (Cheruiyot and Ferrer-Conill 2018), and, most recently, the UK (Birks 2019a).
To advance these debates, the contribution of this study is twofold. First, it examines the role of fact-checking during elections beyond the US context. This can inform future comparative work on the journalistic practice of fact-checking across different media and political systems. Secondly, there are, to our knowledge, no studies that evaluate fact-checking editorial perspectives and consider their judgements in light of a systematic analysis of the output produced by a news organisation. In our view, more research is needed to understand the editorial aims of fact-checking, the production processes, and how these compare to day-to-day output produced during a set period or election campaign.
Developing a case study of UK political reporting during the 2019 General Election campaign, this article examines how fact-checkers made editorial judgements about the selection of news and the use of sources.

Fact-Checking Elections in the UK

In the US, fact-checking is characterised by the distinction between professional and partisan fact-checkers, and the proximity of the field to academic and non-profit organisations (Graves 2016). By contrast, UK fact-checking services are largely embedded in legacy media and, in particular, public service broadcasters (Graves and Cherubini 2016; Kyriakidou and Cushion 2021). They have adopted fact-checking practices during election periods to address and counter the spread of misinformation during campaigning (Graves and Cherubini 2016). Channel 4's FactCheck was among the first imitators of FactCheck.org (launched in 2003), with its blog covering the 2005 General Election and becoming a permanent feature of its website in 2010 (Graves and Cherubini 2016, 9). The BBC's fact-checking service, Reality Check, began with limited resources in 2015, but during and after the 2016 Brexit campaign it was made permanent with its own editorial team (Samuels 2017). Both the BBC's and Channel 4's services represent prominent fact-checkers in the UK, alongside the UK's largest independent fact-checker, Full Fact, which launched in 2010 as a registered charity (Graves and Cherubini 2016). Whilst other media companies employ fact-checking during election campaigns, what distinguishes these three UK fact-checkers is their consistent fact-checking outside of election periods and their objective to be impartial. For Full Fact, this is due to its independent organisational status, whereas in the case of Reality Check and FactCheck, their impartiality credentials stem from their role within public service broadcasters (Kyriakidou and Cushion 2021).
However, interpretations of how impartiality is achieved in public service broadcasting can be inconsistent with the epistemology of fact-checking. The "he said, she said" approach to reporting, often a symptom of journalists wanting to avoid accusations of bias, can potentially allow misinformation to pass unchallenged (Nyhan 2013). It can create false equivalence between competing claims rather than, as fact-checking journalism aims to do, focusing on what is accurate and explaining where statements are dubious or misleading. At the same time, the temporality of television news does not often allow for the type of analytical and explanatory information provided by fact-checking, and its concern with "truthfulness" is largely focused on the accuracy of statements (Ekström 2002) rather than their factual coherence (Graves 2017). In this context, and given broadcasters' attempts to adopt fact-checking services, how journalists view the role of fact-checking as part of their profession, and how it is employed in television news reporting of elections, become significant questions to explore.

Scholarship on fact-checking during election campaigns in the UK remains fairly limited. In her work reviewing fact-checking in the 2017 UK General Election campaign, Birks (2019a) drew on a content analysis of 176 articles and 232 tweets from FactCheck, Reality Check and Full Fact, and found that factual claims (measures of current circumstances) were most commonly and unproblematically checked. But, she discovered, theoretical claims (predictions) and social facts (definitions) were also tackled, with mixed results. Birks identified that 67% of fact-checks included a clear verdict about the validity of the claims investigated, with the remainder found to be "explainer" articles akin to the news analysis articles found in mainstream news (2019a, 41). In a brief comparison with the 2019 UK General Election, Birks (2019b) found that in both election campaigns the claims investigated mostly aligned with the media agenda, with a focus on claims made by senior political figures in 2019. Important as this study is in providing an overview of election fact-checking in the UK, it does not tell us much about the context of these editorial decisions. We explore editorial judgements by combining a content analysis of election news with an understanding of the production processes gained through our interviews with practitioners.

Data and Method
Two methods were used to explore the role of fact-checking during the 2019 UK election campaign. First, our study examined the perspectives of broadcast journalists on fact-checking. We carried out semi-structured interviews with nine senior editors, journalists and fact-checkers who worked during the 2019 election campaign at four UK broadcasters: the BBC, Channel 4, Channel 5 and Sky News. The aim of the interviews was to understand how the threat of disinformation was handled during the election campaign and the role of fact-checking in broadcast news during that period. The BBC and Channel 4, as discussed earlier, have their own fact-checking services. Sky News ran a regular series of fact-checks during the 2019 election campaign, named Campaign Check (Conway 2019). Given Channel 5's limited resources, it did not have a similar separate fact-checking section.
Table 1 provides a description of interviewees.
The semi-structured interviews offered a reflexive, iterative data-gathering process, as participants' understanding of and attitudes towards mis/disinformation and fact-checking could be explored rather than testing the researchers' preconceived assumptions (Byrne 2004; Bryman 2012). Interviews, lasting approximately an hour, were carried out between January and February 2020. Interview questions asked about the role of fact-checking in addressing disinformation, the processes of fact-checking at the different broadcasters, and how fact-checking services fit into general news reporting during and outside election periods.
To understand the types of fact-checking articles that were published during this period, we complemented the interviews with an analysis of election fact-checking articles on BBC Reality Check, Channel 4 FactCheck, and Full Fact. The fact-checking services of the BBC and Channel 4 represent two different models of public service broadcasting: the BBC as a public service broadcaster and Channel 4 as a commercially-funded public service broadcaster. While one part of the study focussed on how broadcasters were fact-checking, in the content analysis we added a fact-checking organisation, Full Fact, to compare and contrast its approach with coverage by the UK's major broadcasters. Although Full Fact is an independent fact-checking organisation that operates online, the rationale for including it in the sample was to assess any differences between fact-checking produced by professional journalism and a dedicated fact-checking site. Graves and Cherubini (2016, 8) have explored the different types of fact-checking services across Europe, noting that "legacy news media remain the dominant source of political fact-checking". But they also compared the practices of NGO or independent fact-checking services with mainstream media, observing that while they have "different organisational forms, and different self-identified orientations, they share a common commitment to publicly evaluating the truth claims made by powerful actors like politicians and in some cases news media" (Graves and Cherubini 2016, 30). Their analysis, however, relied on individual or group interviews with more than 40 practitioners, site visits to fact-checkers in eight different European countries, and an online survey of 30 organisations. To extend this comparative research approach, we added the UK's main independent fact-checker, Full Fact, to the content analysis study in order to systematically compare the editorial selection and characteristics of its fact-checking with that of legacy news media organisations. In other words, the rationale for including Full Fact in the content analysis was to go beyond relying on interview or observational data and to offer a comparative textual assessment of fact-checking practices across an independent organisation and mainstream broadcasters.

Building on previous scholarship on fact-checking journalism by Birks (2019a, 2019b) and Uscinski and Butler (2013), we developed a content analysis study that would quantify not just the agenda of topics addressed but the nature and character of coverage. For example, we went beyond standard measures of categorising fact-checking articles, such as explainers, and developed original categories according to their specific formats, such as briefs and audience questions. A total of five categories of articles were identified (see Table 2).
Our content analysis examined these sites from the official start of the campaign, 6 November 2019, to Polling Day, 12 December 2019, including weekends. A research team of two coders coded all articles within the election campaign timeframe, generating 238 items from Reality Check, FactCheck and Full Fact. In total, Reality Check published 112 articles, 93 of which were election-related; FactCheck published 23 articles, all of which were election-related; and Full Fact published 123 articles, 97 of which were election-related.

Table 2 Categories of fact-checking articles.

Fact-checking: The article examines the claim(s) critically, challenges them, and attempts to reach a verdict about their accuracy.
Brief: A short piece of five sentences or fewer, like a blurb, directly addressing the issue or claim.
Analysis: The article breaks down a claim by analysing it, but does not challenge the claim(s) explicitly and offers no clear verdict about its accuracy.
Explainer: The article only explains what a specific issue is, and its background, but has no analysis.
Audience question: The article answers a specific fact-checking question sent in by a member of the public. This category was only relevant for Full Fact.

One full article was used as the unit of analysis and coded according to a number of variables, including whether an item was election or non-election related, the election topic category (NHS, Economy, Brexit/EU, etc.), the format/type of article (fact-checking, analysis, explainer, etc.) and the overall outcome of the fact-checking/analysis (challenged, verified, unclear). These variables achieved high inter-coder reliability scores according to Cohen's Kappa coefficient (see Appendix 1).
The content analysis study was designed to develop a nuanced analytical framework that critically examined whether any claims were challenged in each fact-check article. This included investigating the individual claims scrutinised in the articles. However, our analysis went further by considering the kinds of sources cited when examining these claims and evaluating the degree to which claims were challenged or not. Variables included whether each individual article investigated one or multiple claims, the author of the claim (that is, the source making the claim under scrutiny, largely political sources), the sources used to scrutinise or challenge the claim (e.g., politician, government department, think tank, etc.) and the extent to which a claim was challenged (explicit, partial/implicit, validation, no challenge). Again, these variables achieved credible to robust inter-coder reliability scores according to Cohen's Kappa coefficient (see Appendix 1).
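Cohen's Kappa, the reliability measure used above, corrects the raw proportion of coder agreement for the agreement two coders would reach by chance, given how often each coder uses each category. A minimal sketch of the calculation follows; the two codings below are entirely hypothetical and are not the study's data.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' labels on the same items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed proportion of items on which the coders agree
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal label frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical format codings of ten articles by two coders
coder_1 = ["fact-checking", "brief", "analysis", "fact-checking", "explainer",
           "fact-checking", "analysis", "brief", "fact-checking", "explainer"]
coder_2 = ["fact-checking", "brief", "analysis", "fact-checking", "analysis",
           "fact-checking", "analysis", "brief", "fact-checking", "explainer"]
print(round(cohens_kappa(coder_1, coder_2), 2))  # → 0.86
```

Here the coders agree on 9 of 10 items (0.9 raw agreement), but because chance agreement is 0.28, kappa falls to 0.86. By Landis and Koch's commonly cited benchmarks, values above 0.6 indicate substantial agreement, which is why kappa is reported rather than raw agreement alone.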

Findings
Disinformation and the Role of Fact-Checking

Political fact-checking emerged as a new journalistic practice and medium as a reaction to burgeoning disinformation (Fridkin, Kenney, and Wintersieck 2015). Given this relatively new context, we started the interviews with journalists and editors by discussing the concept of disinformation. Our interviewees held a fairly consistent definition of disinformation, interpreting it as the intentional circulation of false information in order to mislead the public, specifically with a political agenda or angle, and especially during election campaigns. The intentions behind disinformation were thought to be persuading the public to cast a favourable vote, distracting from the main issues at hand, or discrediting opponents. Although the presence of disinformation during election campaigning is not something new, according to journalists and editors its current speed and extent meant there was a need for more "alert" journalistic practices, such as fact-checking. Interviewees also acknowledged the speed with which disinformation can now be spread online. Jon Snow, news anchor at Channel 4, defined disinformation and its influence in the following way: (A) I think disinformation is a deliberate effort to divert you from the real issues; (B) to disinform you about various issues; (C) to twist to suit a particular political end. I think all parties are guilty of it, but I think actually what made disinformation more dangerous this time was the social network, which both the parties and individuals used on a scale we've never seen before.
The journalists interviewed also revealed that disinformation in politics and elections was not viewed as a completely new phenomenon. For example, Rupert Carey, BBC Reality Check Editor, stated: "I suppose in terms of the election, it's something that we are used to seeing", while Ben de Pear, Channel 4 News Editor, suggested it is the combination of the speed at which false information spreads and the large number of platforms available that is novel. He also cited a shorter audience attention span as a contributing factor.
Interviewees pointed out that ensuring the accuracy of what is published during elections was especially important today, as more attention is paid to campaign-related information. Glaister, Sky News Producer, explained: "We need to be careful about how we report that and whether we end up being a tool in someone's campaign." Meanwhile, Katie Searle, BBC Westminster Editor, echoed similar sentiments when explaining how there needed to be a balance between fact-checking and news reporting: "You've got to find the middle way between not completely discounting what their campaign line is, but equally not giving it too much airtime or prominence when you think that's quite difficult to stand up." Overall, interviewees acknowledged disinformation was not a new phenomenon, but countering it was far more challenging today, which reinforced the importance of impartiality and accuracy in political coverage.

Practices and Outcomes: Fact-Checking during the UK 2019 General Election
The growing importance and development of fact-checking journalism were evident in all interviews, both during and outside an election campaign. This was associated with the rise of false or misleading information from unreliable sources as well as established actors. For all interviewees, this was why fact-checking was essential for broadcasters. Jon Snow regarded fact-checking as "Fundamental. It's one of the new resources we do have. We have invested in fact-checking and at least then you have an objective truth which you can work to." Rupert Carey explained how the BBC's fact-checking service had expanded to become a routine platform for scrutinising statistics and challenging claims: [BBC Reality Check] was in response to a desire to kick the tyres of stories, not just in an election but all year round. Should we be running this? Should we be leading with this story? Are the assumptions behind this story, or the stats, or the line that's been given to us all by Number 10, robust enough to lead a news bulletin or even be on a news bulletin?
Despite acknowledging the increasingly valued role of fact-checking at the BBC, integrating the Reality Check team's output with other news divisions was regarded as challenging. Paul Royall explained that the morning news bulletins incorporated the least fact-checking input and scrutiny, largely due to time constraints. Rupert Carey and Katie Searle shared the view that although Reality Check articles were usually disseminated internally during BBC News editorial meetings, or informally through private networks, it was difficult to integrate Reality Check more prominently into live broadcast news. Attempting to pursue answers on unclear claims from politicians on air, within a limited time frame, was viewed as challenging, especially when it concerned a campaign message. As Katie Searle put it, "You can't stop every five seconds and say hang on a minute, what we really mean is this. You have to try to the best of your ability, within the structure of the programme, headlines and all the challenges within that, to be very clear about the questions surrounding any claim." There were also concerns that adopting this approach might put politicians off appearing on live television.
On the other hand, integration between online news articles and internal fact-checking articles appeared to be more commonplace. Ben de Pear explained that fact-checks were referred to daily by their journalists during electoral coverage, whilst Rupert Carey pointed out that several online articles regularly linked news reporting and internal fact-checking services during the election, with many including an analysis box from Reality Check. Although fact-checking was viewed as a vital practice, we found varying levels of resources for routine fact-checking across news organisations, even where there were separate fact-checking sites. In newsrooms without separate fact-checking teams, interviewees reported that electoral fact-checking was carried out as part of standard news reporting practice. For instance, Peter Diapre of Sky News explained that social media and election-related news were monitored with the help of colleagues, but "nothing beyond what would be my normal fact-checking as a journalist" was carried out during the election.
However, BBC Reality Check editor, Rupert Carey, revealed that his permanent team of 13 editors and fact-checkers included international fact-checking, sitting alongside colleagues in the Analysis and Research divisions. Known as "Global Reality Check", the team is able to investigate claims beyond the UK, specifically claims that can then be translated into the BBC's 40 language services. By contrast, Channel 4 FactCheck has an editorial team of two: editor Patrick Worrall and researcher Georgina Lee. Responsibility for fact-checking claims and writing fell to the two of them, even during events such as elections. Although Channel 4's investigative scope is global, the size of the team meant that during domestic elections or events it was not a primary focus. The disparity in resources between fact-checking teams contributed to how many (or few) claims they could practically investigate and, as a result, the number of articles published over our sample period. Whilst this does not necessarily have an impact on the quality of Channel 4's fact-checking, FactCheck provided far fewer articles than Reality Check (see Table 3).
As Table 3 shows, Reality Check produced the largest number of fact-checking articles, followed by Full Fact. Out of the 238 articles we examined on Reality Check, FactCheck, and Full Fact during the sample period, 214 fact-checking items related to the 2019 UK General Election campaign. The remaining 24 articles fact-checked claims related to the environment and international news. Across the three sites, we found that Full Fact had the highest proportion of fact-checking articles, whereas Reality Check had the largest proportion of brief-type articles.

Topics Investigated
Overall, a wide range of topics were covered by Reality Check, FactCheck and Full Fact (see Figure 1). But the topics most often fact-checked reflected party political agendas: the parties' manifesto pledges, as well as their policies on health/NHS, Brexit/EU, taxation and economic issues. One manifesto pledge example relating to the National Health Service (NHS) was a campaign promise by Conservative Party leader, Boris Johnson, to build 40 extra hospitals and provide jobs for 50,000 nurses. However, it was revealed that the UK government had committed money to upgrade six hospitals by 2025. Up to 38 other hospitals had received money to plan for building work between 2025 and 2030, but not to actually begin any work. As for the figures for new nurses, it was revealed that they referred to training provided for 31,000 new nurses and successfully encouraging 19,000 existing nurses to stay in their roles (Full Fact 2019). Other popular claim topics investigated included taxation and the economy. Notably, Full Fact was the only site that analysed claims about edited images or stories found on social media. Almost all of these claims, 13 out of 15, were about political candidates being misrepresented in some form, such as an edited video of Labour's Keir Starmer, which appeared to show him unable to answer questions about Labour's Brexit policy stance in a live broadcast interview (5 November 2019). The remaining two stories were political but not party-related, with one investigating whether a photo of a boy lying on the floor of Leeds General Infirmary was doctored (10 December 2019). Given that broadcasters did not engage with social media content, their fact-checking was largely shaped by the parties' political agendas.
Interviewees broadly agreed that selecting which claims to investigate was a difficult part of the fact-checking process. Our content analysis revealed that the most prominent topics selected were the parties' manifestos and their main policy agendas. Or, put differently, the stories appearing on fact-checking sites during the election campaign were largely driven by the agendas of the UK's main political parties. According to Rupert Carey, party manifestos provided valuable fact-checking opportunities. Patrick Worrall described the process of selecting stories as "always looking for a lie to expose." He explained how the Channel 4 team scanned what was being publicly said and announced by politicians and other political actors through official means such as manifestos, blogs, or through their tweets. Once again, we can see how fact-checking centres on party political agendas. The team also considered potential issues that might be "bubbling away … under the radar", not just in politics but anything happening globally, including celebrities. However, perhaps due to our focus on the election campaign, our analysis did not uncover any evidence of this agenda.
According to our content analysis of all fact-checking sites, it was mostly single specific claims that were investigated; articles examining a single claim were twice as common as those analysing multiple claims at once (see Table 4 below). Articles in the "Brief" and "Audience Question" categories were not included in this analysis as they did not challenge or analyse claims in the same format as the other categories (as described in Table 2 above).
Significantly, how complicated a claim is influences the editorial process, including how it is investigated and subsequently presented. As Patrick Worrall pointed out, some claims are "so complicated that it really is just too simplistic to try and say it's straightforwardly a lie or the opposite of that, or even any kind of rating". This can result in fact-checking stories that break down a list of multiple claims to be examined and evaluated in a single article, or where multiple individual claims are analysed and published in separate articles.

Sources
The process of selecting sources for fact-checking stories was viewed by interviewees as broadly similar to the practices of routine reporting. This often entailed looking at data perceived to be authoritative, such as official data published by the UK Government. Carey from BBC Reality Check described the process as the "most straightforward if there are stats involved. We would go to ONS [Office for National Statistics], we go to the OBR [Office for Budget Responsibility], we go to the IFS [Institute for Fiscal Studies] - we tend to use those an awful lot. We would go to obviously individual Government departments. If it's crime, you go to the Home Office for the latest stats. If it's health, NHS Digital." This approach to identifying sources in fact-checking stories can be viewed as reinforcing the relatively narrow selection of elite and authoritative actors long established in journalism studies (Franklin and Carlson 2010). Interviewees cited the time-sensitive nature of their work as the reason they did not tend to deviate from this working pattern when analysing new claims. While describing these sources and experts as "the gold standard" used generally across the BBC, Rupert Carey also expressed some reluctance about engaging with academic research. This was due, he explained, to suspicions about the funding sources of studies and the potential for bias. Both fact-checking editors also highlighted the importance of being journalistically open and transparent, such as making sure sources were hyperlinked to evidence specific points in their investigation.
Our content analysis of sources across single and multiple claim articles revealed a substantial number of sources hyperlinked or quoted in fact-checks. It showed that the types of sources used were generally the same across Reality Check, FactCheck and Full Fact (see Tables 5 and 6), drawing most regularly on UK government departments and politicians. This confirmed the editorial preferences of the fact-checkers we interviewed, demonstrating an overwhelming reliance on data from non-ministerial government departments, public bodies, or statutory agencies (as officially defined by the UK government), such as the House of Commons Library and the Office for National Statistics. These bodies, whilst part of the UK government, work at arm's length from ministers by carrying out regulatory or executive functions (UK Government 2021). The next most common sources were ministerial government departments (such as the Department for Work and Pensions), politicians or political parties, followed by think tanks, journalists or the media, and EU institutions or regulators. Similar results can be found in multiple claim articles, which reflect more complex investigations. We excluded hyperlinks and sources pointing to internal articles (e.g., Reality Check citing other BBC articles). Strikingly, single and multiple claim articles by the independent Full Fact were largely supported by non-ministerial government sources, with 34.4% and 26.5% of their cited sources respectively falling into that category. This was significantly more than Reality Check and FactCheck, which suggests that the sources used to check claims by a dedicated fact-checking organisation are different to those used by journalists working at news media outlets.

Verdicts
Investigating claims made by political figures involved fact-checkers dissecting sentences and trying to explain to audiences the range of possible interpretations of those claims. This meant articles were presented in various formats and with different fact-checking outcomes.
Stories and claims related to Brexit were the third most fact-checked topic across all three fact-checking sites, and the second most fact-checked topic for Reality Check. A story that stood out during the campaign was the promise by Boris Johnson and the Conservative Party to "Get Brexit Done". The slogan, and the associated claim that the Conservatives would be able to deliver Brexit, was splashed across headlines and consistently mentioned in articles reviewing the party's promises. We found the lack of clarity surrounding the statement resulted in different approaches and perspectives by fact-checkers in our interviews and content analysis. Rupert Carey acknowledged the challenges of analysing a broad statement that centres on "investigating opinions rather than facts". But he revealed that the decision to investigate the slogan was based on the fact that it was a soundbite that was going to be used repeatedly during the campaign and a focal part of the Conservative Party's agenda. He conceded that it was much more straightforward to fact-check an event that had already occurred because, "who knows what's going to happen in the future?" A Reality Check article published on 6 November 2019 unpacked the "Get Brexit Done" claim in the context of its potential economic impact. But Rupert Carey admitted that the fact-check was difficult to assess in the long term because it "could be true, it could be false, it could lie in the middle somewhere". Finally, our content analysis examined every claim and assessed whether the verdicts on claims investigated were challenged, verified, or unclear. As discussed above, the interpretation of dubious statements and opinions can make it highly challenging for fact-checkers to evaluate the "truthfulness" of a claim. This can subsequently influence the editorial clarity of an article's verdict. Examining articles classified as fact-checking and analysis across the three sites, we found over two-thirds of articles clearly challenged existing claims (see Table 7). Across Reality Check, FactCheck, and Full Fact, 21%, 17%, and 19% of articles analysed respectively were found to have unclear verdicts. This meant fact-checking articles with unclear verdicts made up approximately one-fifth of the total published across the three sites. These were articles which did not reach a clear outcome after an investigation into the claim. Articles with explanations but without any clear sources or links were also classified as unclear. Finally, articles which verified claims made up approximately 12%, 4%, and 17% for BBC Reality Check, Channel 4 FactCheck, and Full Fact respectively. Verification of claims, in this context, was classified as an investigation into potentially suspicious facts or opinions that were later found to be accurate.
A key fact-checking practice Patrick Worrall emphasised was being even-handed, not only when investigating a claim but also in how findings were presented. This meant moving away from rating verdicts on scales, as many fact-checking sites do in the US. Rupert Carey echoed a reluctance to adopt this type of fact-checking practice: "I'm not particularly a fan of the truthometer … Trump is great for US broadcasts because it's either ten, he's completely wrong, or one or two, he's quite but not completely right". Critically, these nuances make fact-checking statements over statistical facts complicated, which can result in unclear verdicts.

From Reflecting Party Agendas to Challenging Their Claims
This study examined the editorial practices of fact-checking in detail across three sites, focusing on the 2019 UK General Election campaign as a case study. First, we undertook in-depth interviews with nine senior journalists working in newsrooms and internal fact-checking teams, including Reality Check and FactCheck, which uncovered consistent views about the important role fact-checking played in providing accurate and impartial reporting, and in countering the spread of mis/disinformation during electoral coverage. When comparing fact-checking produced by news media at two public service broadcasters with a dedicated fact-checking organisation, Full Fact, we found some similarities but also some major differences. In particular, Full Fact examined social media content for any false or misleading information, whereas the BBC and Channel 4 focused largely on analysing party political claims. This was consistent with the often narrow agenda of broadcasters during election campaigns, which has historically tended to centre on party political debates (Cushion and Thomas 2018). Further comparative research is needed to explore the editorial motivations of fact-checking organisations, including how mainstream media fact-check. However, Singer's (2021) research has provided some insight into the priorities of independent fact-checkers. After interviewing and surveying fact-checking organisations around the world, her analysis suggested that independent fact-checkers "see themselves as addressing perceived shortcomings of legacy media" (Singer 2021, 1943). Among broadcasters, the findings revealed a degree of separation between the fact-checking divisions of the BBC and Channel 4 and their other news reporting departments. Whilst fact-checking information was shared across teams, there appeared to be limited integration between reporters working on the production of routine news and journalists focused on fact-checking stories.
Second, we uncovered similar fact-checking practices across internal fact-checking teams. Our systematic content analysis of 214 articles provided a nuanced overview of how fact-checking was carried out by the three main fact-checking sites in the UK. Building on previous studies by Birks (2019a, 2019b) and Uscinski and Butler (2013), we expanded their analytical framework by introducing new variables, such as the source selection for every claim fact-checked, how each source contributed to the article's argument, and the article's overall verdict. We found a similarity in the topics addressed across all three main UK fact-checking organisations. The top three issues examined - manifestos, Brexit and health - largely reflected the campaign agendas of the main UK political parties, and our analysis of sources revealed a heavy reliance on politicians and political parties, the UK government and non-ministerial departments, particularly in the use of statistical data. We further found that two-thirds of fact-checks challenged claims, but there were differences between Reality Check, FactCheck and Full Fact in how often they did so and in their degree of clarity when explaining a fact-checking verdict. The narrow choice of institutional sources drawn upon by fact-checking sites reinforced previous studies about which actors inform political coverage (Franklin and Carlson 2010), highlighting what editors believe is reliable and credible data to verify claims and challenge false or misleading statements.
However, we found in our analyses that Full Fact, one of the main independent fact-checking organisations in the UK, investigated claims beyond the political parties' agendas, such as falsified media content posted by members of the public and circulating on social media platforms. This was a clear editorial difference between the approach of an independent fact-checker and legacy news media organisations, prompting the need for further assessment of why there were contrasting choices about what should be fact-checked during an election campaign and beyond. Aligned with Birks' (2019a, 2019b) findings on fact-checking during the 2019 UK general election, we found that claims investigated by fact-checkers from the BBC and Channel 4 were generally aligned with UK party political agendas and focused on the interpretation of party manifestos. This was particularly the case on the broadcasters' fact-checking sites, revealing a similarity to election news reporting more generally (Cushion and Thomas 2018; Deacon et al. 2019).
Overall, we would argue that our findings represent a missed opportunity to use fact-checking more effectively to counter mis/disinformation. After all, well-resourced broadcasters have the potential to critically inform the public about unclear or vague political statements by drawing more regularly on their dedicated fact-checking services. The BBC, for example, was criticised by the UK's main regulator, Ofcom (2019), for a reluctance to challenge dubious political claims, with the regulator explaining to broadcasters more generally that rules about impartial reporting should not prevent them from countering false or misleading statements. Ofcom recommended: "They [broadcasters] should feel able to challenge controversial viewpoints that have little support or are not backed up by facts, making this clear to viewers, listeners and readers" (Ofcom 2019: 17). We would argue that fact-checking journalism can embolden journalists to challenge claims if fact-checkers, news reporters and journalists worked more effectively together. Whilst editorial decisions made by fact-checking teams can have an impact on the types of claims and stories investigated, our interviews revealed that internal structures and other practical factors, such as time constraints, impeded the integration of fact-checking findings with mainstream broadcast news. We would argue that if broadcasters relied more heavily on their fact-checking editorial teams, it would not only enhance routine news reporting, it could also strengthen the degree to which journalists hold politicians to account, especially if they drew on a more diverse range of sources to question and scrutinise political claims.

Table 1 .
Interviewee sample and information.

Table 2 .
Categories and definition of fact-checking articles.

Table 3 .
Breakdown of article types across fact-checking sites.

Table 4 .
Single and multiple claims across fact-checking websites.

Table 5 .
Proportion of sources used to scrutinise single claim articles.

Table 6 .
Proportion of sources used to scrutinise multiple claim articles.

Table 7 .
Breakdown of fact-checking and analysis article verdicts. Percentages may not add up to 100% due to rounding.
Appendix 1: Inter-Coder Reliability Scores
Total Article Sample (Election and Non-election): 238
Inter-Coder Reliability Sample: 26 stories across BBC Reality Check, Channel 4 and Full Fact.