CST-10: Epistemic Confusion - How AI is Eroding Our Grip on Reality

We examine epistemic confusion and reality-monitoring erosion in the age of AI, defined as a growing inability to discern truth from falsehood. This is the latest entry in our Neural Horizons series on cognitive susceptibilities to AI. We explain how human cognitive vulnerabilities, such as reliance on mental shortcuts and confirmation bias, are exploited by AI "pathologies." These pathologies include AI hallucinations, which generate convincing falsehoods; deepfakes and synthetic media, which make visual and auditory evidence unreliable; chatbot personas, which can emotionally manipulate users; and algorithmic amplification of misinformation, which saturates information environments with false content. The episode highlights how these issues erode trust, pose ethical risks, harm mental health, undermine human agency, and threaten democratic processes by fracturing our shared sense of reality. Ultimately, it emphasizes the urgent need for vigilance and a multi-pronged approach to safeguarding truth in an increasingly AI-driven world.
