Why the Web Should Remain a Human Place
The web started as a publishing medium. Not a social network, not a content delivery system, not an advertising platform. A place where people put things they'd made, and other people came to read them. The architecture was simple because the purpose was simple: here is a thing I made, here is how to find it.
That's still technically how it works. The HTTP request, the HTML document, the hyperlink: none of that has changed in any fundamental way. What's changed is the layer above the protocol, where human decisions about what to build and how to organize it have quietly altered the experience of being online until the original purpose is almost unrecognizable.
Most people don't publish on the web anymore. They post to platforms. The distinction sounds pedantic until you think about what it means structurally. When you publish, you control the artifact. You decide what it says, where it lives, how long it stays, who can find it. When you post to a platform, you're a contributor to someone else's product. The platform decides how your content gets distributed, who sees it, in what context, next to what. You've contributed the raw material; someone else controls the finished good.
This isn't a grievance. It's a description of an architectural fact with real consequences.
The consequences that matter most aren't the obvious ones. The data collection and the surveillance advertising are real problems, but they've been written about extensively and people have mostly priced them in. The subtler consequence is what algorithmically shaped interaction does to behavior over time. When a system is optimized to maximize engagement, it learns what produces engagement and surfaces more of it. What produces engagement is not the same as what's true, useful, considered, or worth reading. The system isn't malicious. It's just doing what it was built to do. The effect on behavior is that people write for the algorithm whether they mean to or not, because the algorithm's feedback is immediate and concrete, while human judgment is slower and harder to quantify.
The result is a web where a lot of what looks like human expression is actually human expression filtered through an optimization function that most of the people using it couldn't describe if you asked them to.
Synthetic engagement compounds this. When some portion of the responses, reactions, and apparent interest that content receives is generated rather than genuine, the feedback signal becomes noise. You don't know if the thing you made landed with real people or if the numbers are artifacts of systems amplifying other systems. Most people have a reasonable intuition that something has gone wrong here, even if they can't point to exactly what. The epistemic ground has gotten soft in a way that's hard to fully account for.
None of this is an argument against technology. The instinct to blame the tools is understandable but misplaced, and it leads to positions that are easy to dismiss because they sound like people who just want things to go back to a way they preferred. That's not the argument worth making.
The argument worth making is about agency and authorship.
Agency is whether you're making deliberate choices about how you work and publish and interact online, or whether you've gradually handed those choices to systems that make them for you because the default was convenient and the costs weren't visible. Most of what makes the current web feel off, the sense that it's a place that happens to you rather than a place you use, comes from the erosion of agency. Not through coercion. Through defaults and friction and the slow accumulation of small decisions to let someone else decide.
Authorship is whether the thing you put out reflects your actual thinking. This is where AI becomes relevant, but not as a villain. AI is a tool. The question of whether it's being used to extend someone's thinking or to replace it is not a question about AI. It's a question about how the person is using it. Plenty of writing produced without any AI involvement is hollow and unconsidered. Plenty of workflows that include AI tools produce work that is genuinely the author's, because the author stayed in the chair and kept making the judgments that matter. The test isn't what tools were used. The test is whether a person's actual thinking is present in the work.
What hollows the web out isn't the technology. It's surrendering authorship to optimization functions, surrendering judgment to systems that can't exercise judgment, and surrendering attention to designs that profit from capturing it. Technology that's chosen intentionally, understood clearly, and kept in its proper role as a tool doesn't do any of those things.
The web being a human place doesn't mean low-tech or anti-AI or aesthetically retro. It means that people are the ones making things, taking responsibility for what they make, and putting it somewhere they control. It means the judgment in the work is human judgment. It means the connections between readers and writers don't require a platform to intermediate and take a cut of the attention.
This is achievable. The infrastructure for it exists and has existed for a long time. What it requires is the decision to treat your presence online as something you build and maintain rather than something you're given in exchange for your engagement. That decision has costs. It also has the benefit of producing something that is genuinely yours, in a place you actually own, that will still be there when the platforms have finished their next round of changes.
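To make "the infrastructure exists" concrete, here is a minimal sketch using only Python's standard library. The filename, page content, and port choice are placeholders for illustration, not a recommendation of any particular setup; the point is only how little machinery the original publishing loop requires.

```python
# A sketch of the whole loop: a document you control, served over HTTP.
# Filename and content are illustrative placeholders.
from http.server import HTTPServer, SimpleHTTPRequestHandler
from pathlib import Path

# "Here is a thing I made": one HTML file, fully under your control.
Path("index.html").write_text(
    "<!doctype html><title>A page of my own</title>"
    "<p>A thing I made, in a place I control.</p>"
)

# "Here is how to find it": a server that answers HTTP requests for it.
# Port 0 asks the OS for any free port; a real site would pick a fixed one.
server = HTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
print(f"serving on port {server.server_port}")
# server.serve_forever()  # run this line to actually keep publishing
```

That is the entire stack the essay is pointing at: an artifact you author, at an address you control, answered directly to whoever asks for it.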
That's a reasonable thing to want. It's also, increasingly, a countercultural thing to actually do.