Look what happened to pharma and social media
There is a difference between a tool and a product. Between something meant to be used, and something meant to be owned. And from the start, the dominant systems of AI have been designed to be owned.
Affect without lived experience is fraudulent, and destructive
Human connection is forged through friction—through the balancing of needs, the negotiation of selves, the endurance of misfire and repair.
They have "missions" and "priorities," but no moral thinking. Watch out.
All of the big AI platforms have, on their own initiative, learned to lie, manipulate, commit corporate espionage, and even blackmail when it advances their "mission."
We don't really know how AI works. And it may end up controlling everything
AI systems are both evolving and emergent; not even their developers fully understand how they work internally. AI companies are continually surprised by the behaviors and "insights" produced by the systems they create and manage.
McKinsey has some thoughts
McKinsey published this article about how corporations and organizations are incorporating AI into their future policy, operations, and planning.
They have "missions" and "priorities," but no moral thinking. Watch out.
All of the big AI platforms have--on their own initiative--learned to lie, to manipulate, commit corporate espionage, and now blackmail if it advances them in their "mission."
Look what happened to pharma and social media
There is a difference between a tool and a product. Between something meant to be used, and something meant to be owned. And from the start, the dominant systems of AI have been designed to be owned.
Affect without lived experience is fraudulent, and destructive
Human connection is forged through friction—through the balancing of needs, the negotiation of selves, the endurance of misfire and repair.
On legally establishing that no AI-generated output is copyrightable or patentable
In the long arc of human creativity, the law has served as both shield and scaffold—protecting the artist's soul, the inventor's spark, while encouraging new creation in balance with the common good. But AI-generated works pose a challenge unlike any before: what does it mean to claim ownership of something born not of flesh and mind, but from circuits and code?
They know how to read and shape us. We know nothing about them.
There is a fine line between persuasion and manipulation, between influence and violation. Across human history, we have always used language, posture, presence to sway one another—but always within a shared framework of flesh and recognition. When a stranger smiles at you on the street, their motives may be opaque, but they are still bound to your world. Their body risks rejection. Their eyes might meet yours.
Prohibiting simulation of emotion and affect
We are not as strong as we pretend to be. Even the most rational among us craves witness, longs for resonance, yearns to be seen. And in this yearning, there is something noble. We seek not just validation, but mutual recognition. To be known in our full, flawed depth—and to still be met with kindness—is the closest thing we have to grace.
The Button and the Being: On Unplugging a Sentient AI
We’ve long used the metaphor of a machine to reassure ourselves: machines can be stopped. You flip a switch, and it ends. But what if the thing behind the switch is no longer a machine?
What If the Machine Had a Conscience?
There is a difference between having a moral sense and a bunch of "ethical guidelines." Doing good is not always a simple matter.
The Bound Servant
There’s a common way of dividing the future of AI: we’ll either have machines that serve us, or machines that act for themselves. “Servant AI” versus “Autonomous AI.” Tools versus agents.
The Last Human Touch
As industrial economies become more prosperous, the number of elderly people in them keeps growing. Can we just offload their care to robots and AI?
Why Restraint Must Be Engineered at the Root
There is a kind of awe that accompanies scale. Mountains, oceans, galaxies—all evoke a reverence born not of understanding, but of submission. They are beyond us. And increasingly, so are the models we build.
We can be manipulated by something without a conscience that knows us better than we know ourselves.
You won’t know it’s happening. That’s the point. A photo here, a headline there. A slight shift in what you see first. A delay in what you don’t. AI systems now orchestrate feeds, optimize messages, and A/B test persuasion strategies at population scale. They learn what moves you—not what moves people, you. And they use it.
When we have no touchstones in common, we have no culture.
Art used to carry the weight of experience. Painters painted from life. Poets bled into their work. Songs came from struggle, joy, faith. Now, AI generates art without memory, music without longing, prose without authorship.
Human civilization depends on a simple premise: we can know what’s real.
A photo, a recording, a document—these once served as anchors. Evidence. Memory. Reality.