Abstract
In contemporary discourse, the aphorism “not all facts are true” captures the complexity surrounding the construction, validation, and interpretation of facts. This article examines the philosophical and social underpinnings that render facts mutable, contested, or even deceptive. Drawing on Kuhn’s paradigm theory, Foucault’s power/knowledge nexus, and social constructivism, alongside empirical studies of misinformation in psychology, the article argues that facts are not self‑evident truths but products of contexts, perspectives, and power relations. Implications for scholarship, media literacy, and democratic deliberation are discussed.
Keywords: facts, truth, social constructivism, power/knowledge, misinformation, paradigms
Facts are traditionally conceived as objective statements that correspond to reality. Yet historical and contemporary examples, ranging from “post‑truth” politics to paradigm shifts in science, demonstrate that facts can be contested, reframed, or refuted. The problem addressed in this article is the assumption that facticity is synonymous with truth, an assumption that obscures the processes by which facts are produced, validated, and sometimes manipulated.
The purpose of this article is to interrogate the relationship between facts and truth by synthesizing philosophical theories and empirical research. Specifically, I pursue two aims: (a) to elucidate how power structures and social processes shape what is accepted as fact, and (b) to examine the psychological mechanisms that enable the persistence of false “facts” even after corrective information becomes available.
It is posited that facts are neither inherently true nor immutable; instead, they are contingent constructs shaped by scientific paradigms, institutional power, and cognitive biases. Recognizing the constructed nature of facts is essential for fostering critical literacy and robust democratic deliberation.
Theoretical Framework
The examination of why “not all facts are true” rests on three interlocking theoretical lenses: social constructivism, power/knowledge, and paradigm theory. Together, these illuminate how facts emerge, solidify, and sometimes fracture. By detailing each framework and its relevance to the article’s central claim, we underscore how facts are contingent products of social processes, institutional forces, and shifting epistemic norms. A contemporary example helps to ground these claims.
Illustrative Example: The Steele Dossier and the Institutionalization of Misinformation
The controversy surrounding the Steele Dossier provides a salient modern illustration of how misinformation can be presented as fact, subsequently institutionalized, and leveraged by intelligence professionals and policymakers as a foundation upon which additional “facts” and truths are constructed. In 2016, former British intelligence officer Christopher Steele authored a collection of memos alleging collusion between the Trump campaign and the Russian government. Commissioned and funded as political opposition research, the dossier contained a mixture of uncorroborated claims, allegations, and speculative intelligence that rapidly entered media narratives and governmental discourse (Horowitz, 2019; Meier, 2020).
Despite its partisan origins and the largely unverified nature of its claims, the dossier was treated by numerous intelligence and law enforcement entities as credible intelligence. Elements within the FBI, Department of Justice, and other intelligence agencies used the dossier as partial justification for obtaining Foreign Intelligence Surveillance Act (FISA) warrants targeting Carter Page, a former Trump campaign adviser (Horowitz, 2019). Thus, misinformation originating in a partisan political context was institutionalized as actionable intelligence through the legitimizing processes of official investigation and judicial oversight.
As investigations progressed, the reliance on the Steele Dossier reinforced a narrative of alleged collusion that shaped political discourse, public perceptions, and media coverage. Key elements within the dossier, despite lacking substantiation, were repeatedly cited by authoritative sources as factual, embedding these claims within the broader public consciousness. Cognitive biases such as confirmation bias and motivated reasoning facilitated the acceptance and persistence of these claims, even as contrary evidence emerged (Lewandowsky et al., 2012; Nyhan & Reifler, 2010). The institutional credibility lent to the dossier by professional intelligence assessments thus contributed significantly to the dissemination and endurance of its claims as perceived truths.
However, subsequent inquiries, notably by Special Counsel Robert Mueller and Department of Justice Inspector General Michael Horowitz, revealed substantial deficiencies, inaccuracies, and a lack of credible sourcing within Steele's allegations (Horowitz, 2019; Mueller, 2019). Many claims within the dossier were either refuted or remained unsubstantiated, revealing the problematic foundation upon which various intelligence and media narratives had been constructed. Despite explicit corrections and clarifications, the dossier’s misinformation had already significantly influenced public discourse, demonstrating how misinformation—once institutionalized—can generate lasting consequences.
Ultimately, the Steele Dossier exemplifies how misinformation, when legitimized through authoritative institutions and reinforced by cognitive biases, can create entrenched falsehoods that persist in public and professional discourse. This case underscores the critical need for intelligence professionals to rigorously interrogate sources, apply structured analytical techniques, and remain vigilant against the cognitive biases and institutional pressures that can transform unverified or politically driven misinformation into accepted "facts."
Paradigm Shifts in Science
Thomas Kuhn’s seminal work demonstrates that the scientific community operates under paradigms, overarching theoretical frameworks that define normal science, including which observations qualify as facts (Kuhn, 1962). Core components are:
Normal Science vs. Revolution: Under a dominant paradigm (e.g., Newtonian physics), “facts” are puzzle‑solutions that fit established theories. Anomalies that cannot be reconciled accumulate until a crisis precipitates a paradigm shift (e.g., Einstein’s relativity), redefining foundational facts such as notions of space and time.
Incommensurability: Successive paradigms often employ distinct concepts and measurement standards, making direct comparison difficult. Thus, what is factual under one paradigm may be scientifically nonsensical under another, highlighting the contingency and episodic overturning of facts.
Community Dynamics: Acceptance of a new paradigm depends on persuasion, rhetorical skill, and generational turnover within the scientific community, rather than purely on empirical evidence. This sociological dimension underscores that scientific facts are as much about consensus-building as experimental data.
Implication for Facticity: Kuhn’s theory supports the argument that facts are not eternal verities but are susceptible to redefinition when the conceptual lens through which data are interpreted changes.
Thomas Kuhn’s (1962) formulation of scientific revolutions fundamentally altered our understanding of how scientific “facts” emerge, stabilize, and sometimes collapse. Under Kuhn’s model, “normal science” operates within a dominant paradigm, a constellation of theories, methods, and standards that guide everyday research. Within this framework, empirical observations are interpreted as puzzle‑solutions that reinforce the prevailing worldview. However, when persistent anomalies, observations that defy the paradigm’s explanatory power, accrue, the scientific community enters a crisis. During this crisis, the rigid boundaries of normal science loosen, allowing alternative theories to gain traction. Eventually, a new paradigm may supplant the old one, redefining which observations count as factual. For instance, the eighteenth‑century phlogiston theory, which posited a fire‑like element released during combustion, seemed incontrovertible until Lavoisier’s oxygen chemistry provided a more coherent explanatory framework. The phlogiston concept fell into obsolescence not because individual measurements were erroneous, but because the interpretive lens shifted: what had been taken as fact (phlogiston release) was reinterpreted under a new paradigm (oxidation) (Kuhn, 1962). Similarly, the Copernican revolution illustrates how a cosmological model once deemed heretical ultimately reconfigured basic astronomical “facts,” such as planetary motion and Earth’s place in the universe. These episodes underscore that scientific facts are epoch‑bound and contingent upon the theoretical commitments of the researcher community (Kuhn, 1962).
Social Construction of Facts
Social constructivism posits that knowledge is not passively discovered but actively constructed through interpersonal interactions, shared language, and cultural practices (Gergen, 1985; Berger & Luckmann, 1966). Under this view, a “fact” gains legitimacy when a community collectively negotiates its meaning:
Communal Validation: Scientific data points, whether a measurement in a laboratory or a statistic in a governmental report, achieve the status of “facts” only after peer discussion, methodological consensus, and institutional endorsement. For example, the reclassification of Pluto from planet to “dwarf planet” in 2006 demonstrates how expert communities renegotiate definitions based on evolving criteria (Morrison, 2007).
Context Dependence: What counts as a fact in one social milieu may be irrelevant or even nonsensical in another. Indigenous knowledge systems, for instance, may recognize ecological relationships that fall outside Western scientific paradigms, yet remain valid within their cultural context (Berkes, 2012).
Implication for Facticity: Recognizing that facts emerge through negotiation highlights their provisional nature: when the sociocultural conditions or interpretive agreements shift, so too can what is taken as factual. This supports the article’s position that facts are contingent, not immutable truths.
Social constructivism, as articulated by Gergen (1985) and Berger and Luckmann (1966), foregrounds the communal dimension of fact‑making. From this vantage, facts are the product of negotiated meanings among individuals and groups, mediated by language, shared practices, and institutional contexts. Scientific data, whether measurement readings or textual analyses, gain their compelling force only when interpreted within a network of assumptions and value commitments. Gergen (1985) argues that the notion of data as transparent “windows” onto reality is illusory; rather, researchers inevitably bring their theoretical predispositions, cultural perspectives, and methodological preferences to bear on their observations. Consider, for example, ecological knowledge among Indigenous communities: relationships between plant species and seasonal cycles are “facts” deeply rooted in centuries of reciprocal engagement with the land, yet they often diverge from Western taxonomies (Berkes, 2012). These divergent factualities are not mutually exclusive but reflect differing epistemic cultures. Social constructivism thus emphasizes that facts are inherently provisional and contingent upon the interpretive frameworks of their producers.
Power and the Production of Truth
Michel Foucault’s framework reveals how regimes of truth are inseparable from power relations: institutions and discursive practices define, enforce, and circulate what is accepted as factual (Foucault, 1972). Key insights include:
Discursive Formation: Discourses, structured ways of talking and thinking, govern which statements are admissible as true. For instance, nineteenth‑century medical discourse classified “hysteria” as a female pathology, embedding social biases into a “medical fact” that persisted for decades (Showalter, 1997).
Institutional Enforcement: Power operates through institutions (e.g., governments, universities, media) that control the production and dissemination of knowledge. Regulatory bodies, peer-review systems, and accreditation standards all act as gatekeepers of factuality, privileging certain methodologies or worldviews over others.
Surveillance and Normalization: Foucault’s notion of the panopticon illustrates how constant observation shapes behavior. In contemporary digital spaces, data analytics and algorithmic curation surveil user behavior, reinforcing particular “truths” (e.g., trending topics) while marginalizing dissenting information (Lyon, 2018).
Implication for Facticity: Since power shapes not only which facts emerge but also which are sustained and which are discredited, the article’s premise follows that facts cannot be presumed neutral or self‑evident; they reflect vested interests and entrenched hierarchies.
Michel Foucault (1972) extends the analysis of facticity by illuminating the inextricable link between power and knowledge. In Foucault’s view, what a society recognizes as “true” is produced and sustained through discursive practices and institutional mechanisms that privilege certain statements while marginalizing others. Discursive formations structure ways of talking about the world, defining the field of permissible inquiry and delimiting which questions can even be asked. For example, nineteenth‑century psychiatric discourse categorized various behaviors as symptoms of “hysteria,” pathologizing women’s emotional expressions and reinforcing broader gender norms (Showalter, 1997). Institutionally, peer‑review systems, accreditation bodies, and professional associations act as gatekeepers, determining which research findings attain the imprimatur of “fact.” Latour and Woolgar’s (1979) anthropological study of a biochemical laboratory vividly documents this process: scientific “facts” are not simply found in nature but are socially negotiated through grant applications, conference presentations, and journal reviews. In contemporary media landscapes, algorithmic curation and data analytics further exemplify Foucault’s panoptic logic, as platform architectures shape which narratives gain visibility and which are silenced (Lyon, 2018). Thus, facts emerge not from a neutral engagement with reality but from a nexus of power relations that define, enforce, and normalize particular truths.
Synthesis and Support for the Position
Together, these frameworks reveal a multifaceted portrait of facticity: social constructivism locates facts in communal negotiation; power/knowledge exposes the role of institutional forces in shaping truth; and paradigm theory charts the episodic overturning of accepted facts. By integrating these lenses, we see that facts are not passive reflections of reality but active constructs, contingent, contested, and ever‑vulnerable to reinterpretation, thus substantiating the central premise that not all facts are true in any absolute sense.
Psychological Persistence of False Facts
While philosophical and sociological lenses highlight the constructed nature of facts, psychological research examines why false or contested facts continue to influence beliefs even after they have been debunked. Lewandowsky, Ecker, Seifert, Schwarz, and Cook (2012) identify the continued influence effect, whereby retracted or corrected information persists in shaping mental models. This phenomenon arises because initial misinformation often integrates more easily into memory structures than subsequent corrections, which may fail to provide a coherent alternative narrative. Moreover, Nyhan and Reifler (2010) demonstrate that corrective feedback can backfire when it threatens an individual’s ideological identity, leading to the backfire effect and an entrenchment of the original misconception. Cognitive biases such as confirmation bias, motivated reasoning, and the illusory truth effect further compound the problem, enabling false facts to remain resilient within belief systems long after their factual underpinnings have been discredited. These findings underscore that altering public understanding requires more than the presentation of corrective data; it demands attention to cognitive architecture and the social contexts in which misinformation flourishes.
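To make the dynamics of the continued influence effect concrete, consider the following minimal sketch. It is an illustration of the pattern Lewandowsky et al. (2012) describe, not a validated cognitive model: all parameter values, function names, and thresholds below are hypothetical. The point it encodes is that a correction subtracts only part of a claim’s encoded strength, less still when no coherent alternative narrative is supplied, so later inferences can continue to draw on the retracted claim.

```python
# Illustrative sketch of the continued influence effect (hypothetical
# parameters, not fitted to data): corrections only partially discount
# an encoded claim, so residual belief keeps shaping later reasoning.

from dataclasses import dataclass

@dataclass
class Belief:
    claim: str
    strength: float          # how firmly the claim is encoded (0.0 to 1.0)
    corrected: bool = False

def apply_correction(belief: Belief, discount: float = 0.4,
                     alternative_given: bool = False) -> None:
    # Corrections rarely reset belief to zero: they remove only part of the
    # encoded strength, and less still when no coherent alternative
    # narrative replaces the retracted claim.
    effective = discount if alternative_given else discount * 0.5
    belief.strength = max(0.0, belief.strength - effective)
    belief.corrected = True

def still_influences(belief: Belief, threshold: float = 0.3) -> bool:
    # A later inference still draws on the claim when residual strength
    # exceeds the threshold, even after an explicit correction.
    return belief.strength > threshold

# The warehouse-fire vignette is the classic stimulus in this literature:
# participants keep citing the retracted cause in later explanations.
b = Belief("fire caused by negligently stored flammables", strength=0.9)
apply_correction(b, alternative_given=False)
print(round(b.strength, 2), still_influences(b))   # 0.7 True
```

The sketch captures, in schematic form, why a bare retraction underperforms a correction paired with an alternative account: the subtraction is smaller when nothing coherent fills the explanatory gap.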
Contingency of Scientific Facts
Scientific facts often appear immutable, as though they are etched into nature itself. However, under Kuhn’s paradigm model, these so-called facts are deeply contingent upon the theoretical frameworks that scientists adopt at any given time (Kuhn, 1962). During periods of “normal science,” researchers operate under a shared paradigm, treating certain observations as settled truths. These observations, whether the trajectory of light in a vacuum or the behavior of heat under the caloric theory, are interpreted through the lens of the prevailing paradigm, bolstering its perceived robustness. Yet this very coherence masks the provisional status of these “truths.” When anomalies, data points that resist explanation within the established paradigm, accumulate, the scientific community faces a crisis: continue to force anomalies into the existing mold, or entertain the unsettling prospect of a new framework.
The transition from one paradigm to another is rarely a linear process driven purely by empirical evidence. Instead, it is a complex sociological phenomenon involving persuasion, funding priorities, and generational turnover within scientific communities. Senior scientists, invested in the status quo, may resist new interpretations, while younger researchers, less encumbered by allegiance to established theories, can more readily embrace novel explanations (Kuhn, 1962). Consequently, the very process of fact-making is influenced by power dynamics and institutional inertia. The phlogiston theory’s collapse in favor of Lavoisier’s oxygen chemistry exemplifies this: once the concept of oxygen offered a more coherent explanatory framework for combustion, the interpretive lens shifted, and the facts were rewritten (Morrison, 2007).
Moreover, the notion of incommensurability highlights that successive paradigms may employ fundamentally different concepts and methodologies, making direct comparison difficult or even impossible. Under Newtonian physics, time and space were absolute; under Einsteinian relativity, they became interwoven dimensions, subject to curvature by mass and energy. This shift not only redefined what counted as a fact about the motion of planets or the behavior of light but also reshaped the very questions scientists asked. Thus, scientific facts are not merely discovered but actively constructed and reconstructed within evolving theoretical landscapes.
Ultimately, acknowledging the contingency of scientific facts serves as a humbling reminder that our current understanding, however robust it may seem, remains open to revision. It encourages a posture of epistemic modesty: to view even the most entrenched scientific assertions as provisional, awaiting potential reconfiguration when new anomalies or conceptual innovations arise. This perspective underscores the article’s central argument that facts are neither self-evident nor eternal, but contingent on the theoretical commitments of scientific communities.
Institutional Power in Truth-Making
Foucault’s power/knowledge framework elucidates how institutions wield authority to shape the contours of truth (Foucault, 1972). Institutions, from universities and professional associations to governments and media conglomerates, function as gatekeepers, determining which claims gain legitimacy as “facts” and which are relegated to the realm of fringe or discredited ideas. This process unfolds through a variety of mechanisms: peer-review protocols, accreditation standards, editorial policies, and regulatory mandates. Each mechanism embodies implicit values and priorities that privilege certain methodologies, topics, and worldviews over others. For instance, randomized controlled trials have become the gold standard in medical research, often sidelining qualitative or observational studies that might offer valuable insights into patient experiences or social determinants of health.
The power of institutions extends beyond formal procedures to include the normative influence exerted through discourse. Discursive formations, structured ways of speaking and thinking, define the boundaries of permissible inquiry. Nineteenth-century psychiatric discourse, for example, pathologized women’s emotional expressions as “hysteria,” embedding gendered biases into the medical facts of the era (Showalter, 1997). Such discourses are not merely descriptive but performative: by naming certain behaviors as pathological, institutions simultaneously regulate those behaviors, shaping individual identities and social norms. This dynamic illustrates how facts can function as instruments of social control, reinforcing existing power hierarchies.
Institutional power operates not only through overt coercion but also through subtler forms of normalization and surveillance. Foucault’s panopticon metaphor captures how constant observation, whether by medical professionals, academic peers, or digital algorithms, conditions individuals to conform to prescribed norms (Lyon, 2018). In contemporary contexts, algorithmic curation on social media platforms amplifies this effect: trending topics and recommended content shape public perceptions of what is newsworthy or factual, while dissenting voices may be algorithmically marginalized. Thus, the institutional production of facts is deeply entangled with mechanisms of surveillance and normalization that extend far beyond the laboratory or lecture hall.
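The amplification dynamic just described can be made concrete with a deliberately simplified sketch. The ranking heuristic below is hypothetical, not any platform’s actual algorithm; it exists only to show the structural point that a score built from prior engagement contains no term for accuracy, so already-popular items crowd low-engagement corrections and dissent out of the visible feed.

```python
# A deliberately simplified sketch of engagement-weighted curation.
# Field names and weights are hypothetical, not any real platform's
# algorithm: the score rewards prior attention and says nothing about
# accuracy, so visibility compounds for whatever is already visible.

from dataclasses import dataclass

@dataclass
class Item:
    topic: str
    shares: int
    likes: int
    recency_hours: float

def engagement_score(item: Item) -> float:
    # Popularity dominates; recency decays the score. Note the bias:
    # there is no input for whether the item is true.
    popularity = 2.0 * item.shares + 1.0 * item.likes
    decay = 0.95 ** item.recency_hours
    return popularity * decay

def curate(feed: list[Item], slots: int = 3) -> list[Item]:
    # Return the top-N items; everything below the cutoff is
    # effectively invisible to most users.
    return sorted(feed, key=engagement_score, reverse=True)[:slots]

feed = [
    Item("viral claim", shares=900, likes=4000, recency_hours=2),
    Item("official correction", shares=40, likes=200, recency_hours=1),
    Item("dissenting analysis", shares=12, likes=90, recency_hours=5),
    Item("celebrity story", shares=700, likes=5000, recency_hours=3),
]
for item in curate(feed):
    print(item.topic)   # the correction and the dissent rarely surface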
Recognizing the institutional dimension of truth-making challenges the assumption of epistemic neutrality. It compels us to interrogate who benefits from particular fact-claims and who is silenced. By exposing the power dynamics that underlie the production and dissemination of facts, we can cultivate a more critical and reflexive stance toward the information we encounter, whether in academic journals, media outlets, or policy debates.
Cognitive Mechanisms of Misinformation
Psychological research reveals that the persistence of false facts is not merely a consequence of institutional manipulation but also a function of human cognitive architecture. The continued influence effect demonstrates that initial misinformation can integrate deeply into mental models, shaping subsequent reasoning even after the information has been corrected (Lewandowsky et al., 2012). Corrections often fail to supplant the original misinformation because they lack the narrative coherence or emotional salience of the initial false claim. In many cases, the correction introduces new cognitive load without offering a compelling alternative narrative, leaving individuals to fall back on the original, more memorable information.
Compounding this, motivated reasoning and confirmation bias lead individuals to selectively attend to information that aligns with their preexisting beliefs and values. When confronted with corrective evidence that challenges core identity-relevant views, such as political ideology or religious convictions, individuals may engage in defensive processing. Research on the backfire effect reveals that, in certain contexts, corrections can paradoxically reinforce the original misconception, as individuals tend to double down on their beliefs to defend their worldview (Nyhan & Reifler, 2010). This dynamic underscores the emotional investment underpinning belief formation: facts are not evaluated solely on evidentiary grounds but also on their congruence with one’s identity and social affiliations.
Moreover, the illusory truth effect suggests that repeated exposure to a statement increases its perceived truthfulness, irrespective of its veracity (Hasher, Goldstein, & Toppino, 1977). In an era of social media, where sensationalized or emotionally charged misinformation can spread rapidly and repeatedly, this cognitive bias poses a significant challenge. Even if fact-checking organizations debunk a false claim, the initial repetition may have already entrenched the misinformation in public consciousness. Addressing these cognitive vulnerabilities requires more than factual corrections; it demands strategies that consider narrative framing, emotional resonance, and engagement with the social contexts that facilitate repeated exposure.
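A toy calculation, again with hypothetical parameters rather than empirically fitted ones, illustrates the asymmetry at work in the illusory truth effect: familiarity compounds with each repetition, while a single debunking exposure removes only part of the accumulated perceived truth.

```python
# Toy illustration of the illusory truth effect (Hasher et al., 1977).
# All parameters are hypothetical: each repetition raises familiarity
# with diminishing returns, which the reader misreads as evidence of
# truth; one debunking exposure cannot undo many repetitions.

def perceived_truth(repetitions: int, debunks: int = 0,
                    gain: float = 0.15, debunk_gain: float = 0.25) -> float:
    # Start from a neutral 0.5; exposure pushes the rating up,
    # corrective exposure pushes it down, both with diminishing returns.
    familiarity = 1.0 - (1.0 - gain) ** repetitions
    correction = 1.0 - (1.0 - debunk_gain) ** debunks
    return max(0.0, min(1.0, 0.5 + 0.5 * familiarity - 0.5 * correction))

print(round(perceived_truth(repetitions=1), 2))               # 0.57: seen once
print(round(perceived_truth(repetitions=10), 2))              # 0.90: viral repetition
print(round(perceived_truth(repetitions=10, debunks=1), 2))   # 0.78: one fact-check
```

Under these assumed values, ten repetitions lift a neutral claim to roughly 0.90, while a single fact-check pulls it back only to about 0.78, still well above the neutral starting point, mirroring the empirical finding that repetition outpaces correction.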
Understanding these cognitive mechanisms underscores the need for educational interventions to extend beyond content delivery. They must also equip individuals with meta-cognitive skills, such as recognizing when a belief feels coherent but lacks evidentiary support, and foster awareness of the emotional and social drivers of misinformation. Only by addressing the underlying cognitive processes can we hope to mitigate the stubborn persistence of false facts.
Implications for Public Discourse
The recognition that facts are constructed, power-laden, and psychologically tenacious carries profound implications for public discourse. First, it underscores the necessity of epistemic literacy, the ability to critically evaluate the provenance, methodology, and framing of fact-claims. Educational curricula should incorporate training in source evaluation, logical reasoning, and awareness of common cognitive biases (Wineburg & McGrew, 2017). Such training empowers individuals to navigate complex information ecosystems, discerning credible evidence from spin or propaganda.
Second, democratic deliberation depends on a shared commitment to truthful discourse. When facts are contested or manipulated by institutional and cognitive forces, achieving informed consensus becomes challenging. Public institutions, including media organizations and governments, bear a responsibility to increase transparency around their knowledge-production processes, disclosing funding sources, methodological limitations, and potential conflicts of interest. Initiatives such as open-access publishing, preregistration of research designs, and transparent data-sharing practices can help rebuild public trust by illuminating the conditional nature of factual claims.
Third, addressing the emotional dimension of misinformation requires empathetic communication strategies. Fact-checkers and educators should pair corrective information with affirming narratives that resonate with audiences’ values, providing coherent alternatives rather than bare negations. Storytelling, visualizations, and analogies can bridge the gap between abstract evidence and lived experience, enhancing the persuasive power of factual corrections.
Finally, fostering a culture of reflexivity where individuals and institutions alike regularly interrogate their assumptions and power positions can promote a healthier information environment. By embracing epistemic humility and acknowledging the provisional nature of facts, societies can cultivate resilient public spheres that are capable of adapting to new evidence, resisting manipulative discourses, and fostering constructive deliberation.
Conclusion
The assertion that "not all facts are true" underscores a fundamental complexity inherent in knowledge creation, dissemination, and reception. Facts, far from being immutable truths, emerge from a confluence of scientific paradigms, institutional frameworks, social interactions, and cognitive biases. Kuhn's (1962) paradigm theory illustrates that scientific facts are provisional and subject to redefinition as interpretative frameworks evolve. Foucault's (1972) power/knowledge analysis highlights how institutions systematically privilege certain forms of knowledge while marginalizing others, thus influencing the establishment of what societies accept as factual. Similarly, social constructivism, as articulated by Gergen (1985), emphasizes the negotiated nature of facts, which are contingent upon cultural and communal interactions.
Moreover, empirical psychological research into misinformation reveals the cognitive mechanisms underlying the stubborn persistence of falsehoods, illustrating the challenges inherent in correcting misinformation once it has been established (Lewandowsky et al., 2012; Nyhan & Reifler, 2010). Collectively, these frameworks and findings expose the vulnerabilities of relying unquestioningly upon "facts" as inherently true. Rather, they advocate for a critical, reflexive, and context-aware approach to engaging with information.
The implications for scholarship, public education, and democratic discourse are profound. Educators, media professionals, policymakers, and researchers must acknowledge the socially constructed, institutionally regulated, and cognitively influenced nature of facts. Promoting epistemic literacy, encouraging citizens to critically evaluate the origins, motivations, and contextual validity of information, is imperative. Without this critical awareness, societies risk perpetuating false narratives, exacerbating polarization, and undermining democratic deliberation. Future research and educational initiatives should thus prioritize developing strategies that integrate critical thinking, media literacy, and an awareness of cognitive biases to foster more robust and resilient informational environments.
Analysis and Basis of Findings
The analysis presented in this article draws upon a synthesis of interdisciplinary theoretical frameworks and empirical research. The foundational arguments are based primarily on:
Philosophical and Sociological Theories: Thomas Kuhn’s paradigm shifts (Kuhn, 1962), Michel Foucault’s power/knowledge framework (Foucault, 1972), and Kenneth Gergen’s social constructivist perspective (Gergen, 1985), which collectively illuminate the contingent and constructed nature of facts.
Empirical Psychological Studies: The persistence of misinformation has been thoroughly documented by researchers such as Lewandowsky et al. (2012) and Nyhan and Reifler (2010), who provide robust evidence of cognitive biases influencing belief retention even after explicit corrections.
Case Studies and Historical Examples: Historical paradigm shifts (e.g., Copernican revolution, phlogiston theory displacement) and institutional practices (e.g., medical classifications of hysteria and homosexuality) offer concrete illustrations of how factual claims are contingent upon evolving theoretical, institutional, and cultural frameworks.
This multifaceted analytical approach ensures a comprehensive exploration of the topic, integrating conceptual depth with empirical rigor to support the article’s central thesis that not all facts can or should be accepted as inherently true without critical examination.
References
Berger, P. L., & Luckmann, T. (1966). The social construction of reality: A treatise in the sociology of knowledge. Anchor Books.
Berkes, F. (2012). Sacred ecology (3rd ed.). Routledge.
Drescher, J. (2015). Out of DSM: Depathologizing homosexuality. Behavioral Sciences, 5(4), 565–575. https://doi.org/10.3390/bs5040565
Foucault, M. (1972). The archaeology of knowledge & the discourse on language (A. M. Sheridan Smith, Trans.). Pantheon Books.
Gergen, K. J. (1985). The social constructionist movement in modern psychology. American Psychologist, 40(3), 266–275. https://doi.org/10.1037/0003-066X.40.3.266
Hasher, L., Goldstein, D., & Toppino, T. (1977). Frequency and the conference of referential validity. Journal of Verbal Learning and Verbal Behavior, 16(1), 107–112. https://doi.org/10.1016/S0022-5371(77)80012-1
Horowitz, M. E. (2019). Review of four FISA applications and other aspects of the FBI’s Crossfire Hurricane investigation. U.S. Department of Justice, Office of the Inspector General. https://www.justice.gov/storage/120919-examination.pdf
Kuhn, T. S. (1962). The structure of scientific revolutions. University of Chicago Press.
Latour, B., & Woolgar, S. (1979). Laboratory life: The construction of scientific facts. Princeton University Press.
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131. https://doi.org/10.1177/1529100612451018
Lyon, D. (2018). The culture of surveillance: Watching as a way of life. Polity Press.
Meier, B. (2020). Spooked: The Trump dossier, Black Cube, and the rise of private spies. HarperCollins.
Morrison, M. (2007). Reassessing the phlogiston paradigm: The legacy of combustion theory. Historical Studies in the Physical and Biological Sciences, 37(2), 289–312. https://doi.org/10.1525/hsps.2007.37.2.289
Mueller, R. S. (2019). Report on the investigation into Russian interference in the 2016 presidential election. U.S. Department of Justice. https://www.justice.gov/storage/report.pdf
Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330. https://doi.org/10.1007/s11109-010-9112-2
Showalter, E. (1997). Hystories: Hysterical epidemics and modern culture. Columbia University Press.
Wineburg, S., & McGrew, S. (2017). Lateral reading: Learning to evaluate internet sources in a digital world. American Educator, 41(3), 4–9.
Author Bio
Dr. Charles M. Russo is a distinguished scholar, author, and practitioner with over 30 years of experience in intelligence analysis, criminal justice, national security, and higher education. His extensive background includes service in military intelligence, federal law enforcement agencies, and academia, where he has dedicated his career to enhancing critical thinking and analytical rigor. Dr. Russo is the author of Precision in Perspective: Critical Thinking for Analytical Minds and numerous articles on intelligence analysis, misinformation, and epistemology. He actively leads workshops and educational initiatives designed to foster greater analytical proficiency among professionals and academics alike. Dr. Russo holds advanced degrees in criminal justice and intelligence studies and continues to influence public discourse through scholarship, teaching, and speaking engagements.
Disclaimer
The views and opinions expressed in this article are solely those of the author and do not necessarily represent those of any organization, institution, or agency with which the author is or has been affiliated. The intent of this article is educational and scholarly, aiming to promote critical reflection and informed discussion.