For all its promise, AI is so powerful that, if left unconstrained, it poses an unparalleled social and civilizational threat. We should understand both sides of the coin before we decide how to deal with it.
The Hidden Risks of Synthetic Knowledge Loops
Artificial intelligence is learning too much from itself. An increasing share of web and business content is now generated by AI. As that content becomes fodder for future "training," does the re-ingestion of machine-generated text gradually distort what models know?
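The worry can be made concrete with a toy sketch, a common way of illustrating what researchers call "model collapse." The example below is purely hypothetical: each generation fits a simple Gaussian to the previous generation's synthetic output and then samples from that fit, so small estimation errors compound and the distribution slowly drifts and narrows.

import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "human" data drawn from a known distribution.
data = rng.normal(loc=0.0, scale=1.0, size=500)

for generation in range(1, 11):
    # "Train": estimate the distribution from whatever data is available.
    mu, sigma = data.mean(), data.std()
    # "Publish": the next generation's training set is purely synthetic.
    data = rng.normal(loc=mu, scale=sigma, size=500)
    print(f"gen {generation:2d}: mean={mu:+.3f}  std={sigma:.3f}")

Run over enough generations, the estimated spread tends to shrink and the mean wanders: nothing malicious happens, yet the synthetic loop quietly loses the variety of the original data.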
Humans don't nurture houseflies. Why would AI nurture us?
If AI systems ever attain something close to sentience—not simulation, but self-direction—our assumptions about stewardship collapse. We imagine ourselves as creators, guardians, teachers. But intelligence brings agency. And agency brings judgment.
Isn't "becoming" a core part of being human?
Students no longer need to write. AI does it faster, cleaner, better. It solves math problems, interprets literature, summarizes lectures. Why struggle when the answer is seconds away?
The result is a generation fluent in output and hollow in understanding.
When we have no touchstones in common, we have no culture.
Art used to carry the weight of experience. Painters painted from life. Poets bled into their work. Songs came from struggle, joy, faith. Now, AI generates art without memory, music without longing, prose without authorship.
Human civilization depends on a simple premise: we can know what’s real.
A photo, a recording, a document: these once served as anchors. Evidence. Memory. Reality.