Post-Fact Culture

Introduction
Post-fact culture is a socio-political environment in which misinformation influences public opinion more than objective fact.[1] The term “post-fact” is used interchangeably with “post-truth,” “post-truth politics,” and “post-reality.” A post-fact culture is one in which technical communicators must combat misinformation in order to reach their intended audience. Technical communicators who understand how they are positioned within a post-fact culture can better combat skepticism and rejection of their expertise. Researchers in technical communication have only begun to reveal the degree to which a post-fact culture breeds distrust in scholarly and scientific information. This developing area of research seeks solutions to modern-day problems such as:
 * Political misinformation
 * COVID-19 risk communication
 * Climate change denial

Background
The term post-truth, in the sense defined above, first appeared in a political opinion piece in The Nation, a New York-based news publication.[1] The term resurged in popularity in other news media publications during 2012 and 2016. Oxford Languages named post-truth its 2016 word of the year after the term's prevalence soared in United Kingdom and United States politics.[2] Many experts now believe that post-fact culture is here to stay.[5][7][10]

Users traditionally associate news articles with factual information about current events, written with varying levels of bias. The rise of social media as a news source diminished the integrity of news media by allowing articles packed with misinformation to circulate unchecked. Another term that surged in popularity in 2016 was “fake news,” used to describe misinformation that mimics news media in form but not in organizational process or intent.[9]

Fake news on social media creates problems for technical communicators because it diminishes trust in expert sources. Social media sites use algorithms that track users’ clicks and surface similar articles in their feeds.[3] When a user clicks on a fake news article, they continue to see similar articles that may also contain misinformation. Through a process called confirmation bias, users interpret this continuous stream of misinformation as evidence for their existing beliefs. Technical communicators who present factual information to users affected by confirmation bias may also encounter the “backfire effect,” which occurs when facts that contradict strongly held beliefs cause those beliefs to become stronger.[19]
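The feedback loop described above can be sketched as a toy click-based recommender. All names, topics, and articles here are hypothetical, and real platform algorithms are far more complex; the sketch only shows how ranking by past clicks keeps similar content in front of a user:

```python
from collections import Counter

def recommend(articles, click_history, k=3):
    """Rank articles by overlap with topics the user has already clicked.

    articles: list of (title, topics) pairs; click_history: list of topic strings.
    This is a toy model of engagement-driven ranking, not any platform's real algorithm.
    """
    clicked = Counter(click_history)
    # Score each article by how often the user has clicked its topics before.
    scored = sorted(
        articles,
        key=lambda a: sum(clicked[t] for t in a[1]),
        reverse=True,
    )
    return [title for title, _ in scored[:k]]

# A user who has clicked misinformation twice keeps seeing more of it:
feed = [
    ("Vaccine hoax exposed!", ["health-misinfo"]),
    ("Local election results", ["politics"]),
    ("Miracle cure suppressed", ["health-misinfo"]),
]
history = ["health-misinfo", "health-misinfo", "politics"]
print(recommend(feed, history, k=2))
```

Because the score depends only on past engagement, the two misinformation articles outrank the factual one, illustrating how the loop reinforces itself.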

Political Misinformation
Truth in political communications creates trust from constituents. Misinformation perpetuated by news and social media can influence the outcome of political movements and campaigns. Technical communicators who work in this field often work against misinformation and confirmation bias to reach their users.

Brexit Referendum
On June 23, 2016, a majority vote in the UK’s Brexit referendum initiated the nation’s departure from the European Union.[4] The referendum drew 1.2 million previously disengaged voters to the polls, increasing overall turnout. Polls showed a rise in ‘value-based’ decision making, or voting based on personal conviction and emotion rather than factually justified arguments. Research has shown that this emotional voting was fueled to some degree by misinformation obtained from news sources and by distrust in official messages.[5]

2016 US Presidential Campaign
In 2016, US political candidates Donald Trump (Republican) and Hillary Clinton (Democrat) ran for the presidency, resulting in a win for Trump. Research suggests that social media was the second-most popular source of news about the election,[6] and Facebook was the most commonly used social media platform for viewing news articles.[7] A study of 61 million Americans during the 2010 election showed a correlation between news media obtained through Facebook and altered voting behavior.[8] Research also shows that fake news articles outperformed truthful news articles on Facebook's news platform.[18]

Climate Change Denial
Despite prevailing scientific evidence that climate change is human-driven, many still deny this fact.[10] Misinformation about climate change is abundant on social media platforms. Technical communicators who study climate change research the role they play in preventing, or contributing to, climate change denial.

Research suggests that audience analysis is necessary when providing users with information about environmentalism and climate change.[11] Technical communicators who write about environmentalism present evidence-based advice on how users can reduce their environmental impact. These communicators must be aware that their work can have the opposite effect on users who have already been exposed to misinformation: seeing information that directly opposes existing beliefs about climate change may increase confirmation bias or trigger the backfire effect in some users.[10]

COVID-19 Risk Communication
The COVID-19 pandemic created an information crisis for public health officials. Risk communication was integral to preventing the virus’s spread.[12] Technical communicators in public health worked diligently to spread information about the communicability of the virus and the benefits of wearing masks, social distancing, and becoming fully vaccinated.

Misinformation spread throughout news and social media dulled the effectiveness of this risk communication. This challenged technical communicators to share risk prevention strategies while simultaneously correcting the misinformation.

Fact Checking
Fact checking is the act of verifying the correctness of information that is presented as factual. It quickly emerged as a possible solution to misinformation on news and social media. Fact-checking websites such as factcheck.org, politifact.com, and snopes.com claim to offer “evidence-based and contextualized analysis” of sources.[13] Snopes and FactCheck offer rating systems that use different colors, shapes, and symbols to create identifiable graphics for users.[14][15] These visual symbols create a binary of factual and non-factual that users can quickly interpret:

Symbols that represent a source containing truthful claims are often green and round. They resemble a green traffic light, indicating that the user may safely proceed to the source. They also contain check marks, another symbol that creates a positive association with the source.

When a source contains some degree of misinformation, the symbols resemble a red light or stop sign. They often contain an “X” or a crossed-out symbol. These symbols indicate that the user should not proceed to the source.
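The traffic-light binary described above can be illustrated as a simple mapping from verdicts to symbols. The categories and labels below are invented for illustration and do not reproduce Snopes’s or FactCheck’s actual rating scales:

```python
# Hypothetical mapping of fact-check verdicts to traffic-light-style badges.
# These categories are illustrative, not any site's real rating system.
RATING_SYMBOLS = {
    "true": ("green", "circle", "check"),
    "mixed": ("yellow", "triangle", "question"),
    "false": ("red", "octagon", "cross"),
}

def badge(verdict: str) -> str:
    """Render a verdict as the color/shape/mark combination a user sees at a glance."""
    color, shape, mark = RATING_SYMBOLS[verdict]
    return f"{color} {shape} with a {mark} mark"

print(badge("true"))
print(badge("false"))
```

The design choice mirrors the article’s point: by reusing colors and shapes users already know from traffic signals, the rating can be interpreted without reading the underlying analysis.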

Facebook provides links to independent fact-checking websites that are certified by the International Fact-Checking Network. It has also changed its algorithm to display news articles containing misinformation “lower” in users’ news feeds.[16] Social media companies’ work to create factual news environments may help technical communicators spread factual information.[17] However, some users claim that these measures limit individuals’ right to free speech when posts are deleted and users are banned for spreading misinformation.
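The down-ranking approach can be sketched as a simple multiplicative penalty on flagged posts. The penalty factor and the post data are invented for illustration and are not Facebook’s actual weighting:

```python
def rank_feed(posts):
    """Sort posts by engagement, demoting any flagged by fact checkers.

    Each post is a dict with 'title', 'engagement', and 'flagged'.
    The 0.1 penalty is an illustrative assumption, not a real platform parameter.
    """
    def score(post):
        penalty = 0.1 if post["flagged"] else 1.0
        return post["engagement"] * penalty
    return sorted(posts, key=score, reverse=True)

posts = [
    {"title": "Viral false claim", "engagement": 900, "flagged": True},
    {"title": "Verified report", "engagement": 400, "flagged": False},
]
ranked = rank_feed(posts)
print([p["title"] for p in ranked])
```

Note that the flagged post is demoted rather than deleted: it still appears in the feed, just below content that was not flagged, which is the distinction at the heart of the free-speech objection above.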

Dissociative Framing
Research suggests that combining methods of rhetorical dissociation and framing may help technical communicators present new facts to users who already believe misinformation. Rhetorical dissociation involves separating the association of ideas within a system of thought. Framing involves how the writer organizes and presents information. Combining these two strategies has shown promise in countering deeply embedded misinformation.[10]

For example, dissociative framing may help combat deeply rooted misinformation about climate change in rural communities. The method involves separating the individual actions that rural residents can take to reduce environmental harm from the science connecting aspects of their lifestyle to climate change.