We always panic about new tools (and we're always wrong)
This essay first appeared in my weekly newsletter, The Work of Being, where I write once a week about work, learning, and judgment.
There’s a pattern we keep repeating, and once you see it, you can’t unsee it.
Every time a new tool emerges for making or manipulating symbols—writing, images, music, calculations—we panic. We declare that the tool will destroy something essential about human cognition, creativity, or authenticity. We resist. We warn. We predict disaster.
And then, quietly, we accept the tool. We integrate it into our work. We forget we ever worried.
The pattern is so consistent across centuries that it’s almost embarrassing.
The printing press (1450s)
When Johannes Gutenberg invented the printing press, scribes didn’t celebrate the democratization of knowledge. They formed guilds to resist it.
Hand-copied texts were considered more authentic, more spiritually meaningful. Johannes Trithemius, a Benedictine abbot, protested what he called “the invasion of the library by the printed book.” Mechanical reproduction, he argued, lacked the devotion of “preaching with one’s hands.”
The scribes were right that something would change. Books would become cheaper, more accessible, less precious as physical objects. But the thing they were protecting—the exclusive control of knowledge reproduction—wasn’t worth protecting. The world gained literacy. The scribes lost their monopoly.
Today, nobody argues that hand-copied manuscripts are more authentic than printed books. We forgot we ever worried.
Typewriters (1880s)
When typewriters spread in the late 1800s, recipients of typewritten letters felt insulted. The typewriter was seen as cold, impersonal, mechanical.
Martin Heidegger—yes, the philosopher—argued that the typewriter meant “the word no longer passes through the hand as it writes and acts authentically but through the mechanized pressure of the hand.” The connection between thought and writing, he believed, was broken by the machine.
Friedrich Nietzsche owned a typewriter. After using it, he wrote: “Our writing equipment takes part in the forming of our thoughts.” He worried the machine was changing how he thought.
They were right that something would change. Writing became faster, more standardized, easier to produce in volume. But the thing they were protecting—some mystical connection between hand and thought—turned out not to matter. Good writing remained good. Bad writing remained bad. The tool was neutral.
Today, nobody argues that handwritten letters are more authentic than typed ones. We forgot we ever worried.
Photography (1839)
When photography emerged, the art world dismissed it as mechanical reproduction, incapable of true creativity.
Charles Baudelaire warned that photography would “supplant or corrupt” art altogether. It was made by a machine rather than by human hands. How could it be art?
The Museum of Fine Arts in Boston didn’t collect photographs until 1924—85 years after the medium was invented. It took that long for cultural institutions to accept that photographs could be art.
They were right that something would change. Realistic painting became less necessary. Portraits became cheaper. But the thing they were protecting—the exclusive claim that only hand-made images count as art—wasn’t worth protecting. Photography became its own art form. Painting didn’t die; it evolved.
Today, nobody argues that photographs aren’t art. We forgot we ever worried.
Calculators (1970s)
When calculators entered classrooms, educators warned that students would become dependent on machines. Their computational abilities would atrophy. They’d forget how to do math.
Parents believed their children were being intellectually crippled by these devices. The debate went on for decades. Some schools banned calculators entirely.
They were right that something would change. Students stopped memorizing multiplication tables as rigorously. Mental arithmetic became less emphasized. But the thing they were protecting—the ability to compute by hand—turned out to be less important than the ability to solve complex problems. Calculators freed students to work on harder mathematics.
Today, calculators are in every classroom. Nobody argues they’ve ruined mathematical ability. We forgot we ever worried.
Recorded music (1877)
When Thomas Edison invented the phonograph, critics feared it would kill live performance.
Why would anyone attend concerts when they could hear perfect recordings at home? Theodor Adorno argued that recording distorts authenticity. Walter Benjamin worried that broadcasting removed music from the “concert ritual” that gave it meaning.
They were right that something would change. Recorded music became dominant. The economics of performance shifted. But the thing they were protecting—the exclusive authenticity of live performance—wasn’t threatened. Live music didn’t die. It became one way to experience music rather than the only way.
Today, nobody argues that recorded music isn’t “real” music. We forgot we ever worried.
The pattern
Notice what’s consistent:
1. A new tool emerges that makes symbol-manipulation easier, faster, or more accessible.
2. Experts predict disaster: authenticity will be lost, cognition will be damaged, creativity will die.
3. They’re right that something changes: the tool does alter workflows, economics, or social practices.
4. But they’re wrong about what matters: the thing they’re protecting turns out to be less essential than they believed.
5. We accept the tool and evaluate its products on their merits.
6. We forget we ever resisted, and the anxiety becomes invisible in retrospect.
We’re currently in step 2 with AI.
The anxiety is real. The predictions are dire. AI will destroy writing, kill creativity, flood the world with worthless content. We need to detect it, label it, exclude it.
Maybe. Or maybe this is the printing press panic all over again. Maybe we’re protecting something that turns out not to matter—the exclusive human claim to symbol manipulation—while missing what actually matters: whether the output is true, useful, beautiful, or insightful.
I’m not saying AI writing is identical to any previous tool. Each technology is different. The specific changes will be unique to this moment.
But the pattern of anxiety is familiar. Suspiciously familiar. “This time it’s different” is what every generation says. And every generation has been wrong about what’s worth protecting.
What actually matters
Here’s what survives every tool transition: the work still has to be good.
Printing presses produced bad books. Typewriters produced bad writing. Cameras took bad photographs. Calculators computed wrong answers to the wrong problems. Recordings captured bad performances.
The tool doesn’t determine quality. It never has. What determines quality is whether the person using the tool knows what they’re doing, cares about the outcome, and puts in the effort to make something worth experiencing.
So when I see the AI anxiety—the detection systems, the categorical dismissals, the “slop” panic—I see a familiar pattern. We’re in the resistance phase. We’re predicting disaster. We’re trying to protect something we think is essential.
History suggests we’re probably wrong about what’s worth protecting.
The question isn’t whether AI will change things. It will. The question is whether what we’re defending—the exclusive human ownership of writing—is actually what makes writing valuable.
I suspect it’s not. I suspect what makes writing valuable is whether it helps people think, clarifies ideas, or says something true. And that can be evaluated only by reading the work, not by checking who—or what—wrote it.
We’ll accept that eventually. We always do.
The question is how long we’ll spend in the panic phase before we get there.