The Web You’ve Been Given
You open your feed and it’s already too late.
Everything’s on fire—outrage headlines, quote-tweet dunks, and low-effort bait engineered to trigger just enough emotion to keep you scrolling. Somewhere, in a fluorescent-lit office on the other side of the planet, a tired contractor in a moderation warehouse is burning through their trauma threshold so you don’t see the really bad stuff. But everything else? That’s fair game. Rage is engagement. Engagement is profit. And you? You’re the product being sold.
The modern web isn’t built for discourse. It’s a meat grinder designed to harvest attention, wrap it in ads, and upsell your worst instincts. Every platform says it’s “community-first,” but moderation is a black box and the only consistent rule is: don’t touch the ad revenue.
Social media isn’t broken—it’s doing exactly what it was built to do. But we can do better on the open web.
Moderation by Trust, Not Algorithms
Now imagine something radically better. You don’t just report content into the void or hope the algorithm eventually “learns” what you like. Instead, you build a social graph of people you actually trust. Not follows. Not mutuals. Trust.
Your view of the web is shaped by this network. If your friends trust someone, their content floats into your feed. If someone starts spiraling into misinformation, and your trusted network flags it, it drops. Not deleted—just demoted. Still visible if you really want it. But not thrust in your face because it got a hundred angry replies.
Each person in your network carries metadata. Maybe you trust Alex’s takes on economics but block their health posts. Maybe you mute Dana on politics but always want her tech threads. Moderation isn’t one-size-fits-all. It’s contextual. Per-topic. Per-person. And those preferences are portable. You can save them, remix them, even share them.
That’s your Overton window—your lens on the world. But unlike Twitter’s version, it’s transparent and adjustable. You can expand it. Collapse it. Fork someone else’s entirely.
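To make that concrete, here’s one way the per-topic trust model could look in code. This is a minimal sketch, not a spec: the `TrustEdge` name, the topic labels, and the weight scale are all invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class TrustEdge:
    """One person's trust in another, broken down by topic."""
    # Hypothetical per-topic weights in [-1.0, 1.0]; 0 means neutral.
    topics: dict[str, float] = field(default_factory=dict)
    default: float = 0.0

    def weight(self, topic: str) -> float:
        return self.topics.get(topic, self.default)

# Your graph: who you trust, and on what. Contextual, per-topic, per-person.
graph: dict[str, TrustEdge] = {
    "alex": TrustEdge(topics={"economics": 0.9, "health": -1.0}, default=0.3),
    "dana": TrustEdge(topics={"politics": -1.0, "tech": 0.8}, default=0.2),
}

def rank(post_author: str, topic: str) -> float:
    """Score a post through your lens: trusted voices float up,
    flagged ones sink, unknown authors stay neutral (not deleted)."""
    edge = graph.get(post_author)
    return edge.weight(topic) if edge else 0.0

# Alex's economics takes surface; their health posts sink.
assert rank("alex", "economics") > rank("alex", "health")
```

Because the graph is just data, the portability follows for free: serialize it, share it, or fork someone else’s and adjust the weights.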
Discussion That Doesn’t Exhaust You
Discussion That Doesn’t Exhaust You
This doesn’t just change who you see—it changes how you read.
Discussion threads aren’t flat chaos or top-down algorithmic noise. They’re layered. You can instantly see where your network agrees, where it fractures, where it explodes. You can isolate voices you trust and hide the static.
Nothing disappears. Even garbage stays available if you go digging. But the default view is filtered through relevance—not popularity, not virality, and definitely not which troll said the most unhinged thing fastest.
Instead of needing to scroll through 200 comments to find one worth reading, you see what matters first.
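One way to sketch that default view: sort the thread by your network’s trust, and demote (never delete) anything your trusted contacts have flagged. The flag threshold and scoring here are illustrative assumptions, not a prescribed algorithm.

```python
def order_thread(comments, trust, flag_threshold=2):
    """Sort comments by network trust, demoting (not deleting)
    anything flagged by enough of your trusted contacts."""
    def score(c):
        base = trust.get(c["author"], 0.0)
        # Flagged content sinks below the fold but stays reachable.
        penalty = 1.0 if c.get("flags", 0) >= flag_threshold else 0.0
        return base - penalty
    return sorted(comments, key=score, reverse=True)

comments = [
    {"author": "troll", "flags": 5},
    {"author": "alex", "flags": 0},
    {"author": "stranger", "flags": 0},
]
trust = {"alex": 0.9, "troll": 0.0}
ordered = order_thread(comments, trust)
# Relevance first: alex, then the unknown voice, then the buried troll.
```

Nothing in the list is removed; the garbage is simply last, where you have to go digging for it.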
Friction Filters Out the Noise
Want to say something? Maybe you pay to comment. Or to edit. Or to post at all.
This isn’t some crypto bro fantasy where payment equals prestige. It’s a brake. A deterrent. It kills low-effort spam before it starts. Nobody’s going to spend money to drop a “ratio + L” if your network just quietly buries them.
But the door isn’t locked. If someone outside your trust graph shows up, pays a little, and says something meaningful? Your network notices. Trust expands. New nodes enter the graph. Paid participation becomes a way to discover people worth hearing—not a way to game visibility.
It’s friction with purpose. Unlike the platforms, we’re not afraid to slow things down.
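The gatekeeping logic is simple enough to sketch. Assume a small fee for outsiders and a quorum of trusted endorsements to admit new nodes; both the fee amount and the quorum size are placeholder values, not a proposal.

```python
def can_post(author, graph, fee_paid_cents, min_fee_cents=25):
    """Trusted members post freely; outsiders pay a small fee.
    The fee is a brake on spam, not a purchase of visibility."""
    if author in graph:
        return True
    return fee_paid_cents >= min_fee_cents

def maybe_extend_trust(author, graph, endorsements, quorum=3):
    """If enough trusted members endorse an outsider's paid post,
    the outsider enters the graph as a new node."""
    trusted_votes = sum(1 for e in endorsements if e in graph)
    if trusted_votes >= quorum:
        graph.add(author)
    return author in graph

graph = {"alex", "dana"}
assert can_post("alex", graph, fee_paid_cents=0)
assert not can_post("newcomer", graph, fee_paid_cents=0)
assert can_post("newcomer", graph, fee_paid_cents=25)
```

Note what the money does and doesn’t buy: it opens the door, but only your network’s endorsement gets anyone a seat.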
Run Your Own Damn Feed
Moderation doesn’t end at your social graph. This is still the web. If you run a site, you set the rules.
Your domain is sovereign. Maybe you accept filtering input from your social graph. Maybe you build your own moderation presets. Maybe you lock things down entirely. That’s up to you. This isn’t a platform. It’s infrastructure.
When someone gets banned from Twitter, it’s often without warning, without clarity, and without recourse. When you ban someone from your domain, you can show them the exact policy they violated—and they can fork your site and start over. Same content, different norms.
We don’t need fewer websites. We need more moderated by their own communities, not shareholder anxiety.
Look Outside Without Losing Yourself
Of course you’ll want to see beyond your graph. This isn’t a call for infinite echo chambers. But stepping outside your bubble shouldn’t mean being fed algorithmic sludge.
Instead, you can search the wider network. Not based on what’s most clicked, but re-ranked through your lens of trust. LLMs help summarize complex threads from unfamiliar communities before you dive in. You don’t need to wade into every flame war just to see what someone else thinks.
And when you do engage with something messy or difficult, you’ll do it with filters active, context visible, and rage stripped of its algorithmic sugar coating.
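That re-ranking step could be as simple as blending the search engine’s relevance score with your trust weights, with popularity left out of the formula entirely. The blend factor here is an arbitrary assumption for illustration.

```python
def rerank(results, trust, alpha=0.8):
    """Re-rank global search results through your trust lens:
    blend engine relevance with trust, ignoring raw popularity."""
    def score(r):
        return alpha * trust.get(r["author"], 0.0) + (1 - alpha) * r["relevance"]
    return sorted(results, key=score, reverse=True)

results = [
    {"author": "viral_account", "relevance": 1.0},
    {"author": "alex", "relevance": 0.6},
]
trust = {"alex": 0.9}
ranked = rerank(results, trust)
# The trusted voice outranks the merely viral one.
```

Unfamiliar communities still show up in the results; they just arrive ordered by your lens instead of theirs.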
The Web You Deserve
This isn’t nostalgia. We’re not rebuilding the early web. We’re building something the early web couldn’t: a network where moderation is deliberate, distributed, and transparent. Where filters are tools, not censors. Where identity and context matter. Where you’re not sold to advertisers or farmed for dopamine.
A web where your feed is yours. Your rules. Your people. Your trust.
Not a product. Not an experiment. A place to actually think.