Individualism is the flavor AI chatbots can't stomach

1,429 words · Filed in: AI, publishing, web, content strategy

The intersection of multiple winding paths, each lined with intricately carved wooden shelves adorned with various objects.
Experience cannot be overlooked as the open web tombstones. Image made with Loras.dev

AI has drained traffic from reference content, but it leaves a path forward: human-centric experiences, interactivity, and better on-site value.

AI ate the web and changed how people find answers: generic reference and how‑to content is increasingly satisfied inside chatbots, not by the open web.

Why spend time Googling when you can just ask?

Here’s the reality I see in my work: classic reference material—definitions, quick how‑tos, Stack‑Overflow‑style snippets—gets satisfied inside AI tools. People still seek out primary sources for breaking news, official documentation, regulatory text, or papers where provenance matters. They also show up for opinionated takes and personal narratives, where the author’s voice is the point.

There's a lot of doom, but there is hope.

tl;dr#

  • AI commoditizes surface‑level reference; traffic shifts away from source sites.
  • Blocking crawlers won’t fix distribution or prove you’re in the conversation.
  • Win with human‑centric value: interactive, personal, timely, and experiential content.
  • Measure differently; expect opaque referrals and uneven attribution.
  • A hopeful path: community, interaction, and human presence.

In other words, this isn’t about someone “eating your lunch” in the future. For much of the web’s traffic, it’s already eaten. The job changes from “be the best answer everywhere” to “be the best experience where a visit is obviously better.”

Assistants have collapsed many open tabs into one answer box.

Meanwhile, readers still prefer to visit sources for:

  • Timely reporting, original analysis, or opinion.
  • Deep, consequential reference where provenance matters (docs, specs, research).
  • Personal experience and narrative from a trusted author.

Think of it the way retail imploded. Big‑box convenience drew shoppers away from small specialty stores. But here, the “big box” is built from the small shop’s inventory. The synthesis helps the reader, but it doesn’t return attention, credit, or revenue to the maker by default.

If you publish reference or “context” material, you’re not about to get disrupted — you already are. Search experiences increasingly satisfy the task without a click. Studies on AI Overviews and zero‑click behavior broadly show suppressed click‑through rates for commodity intents; some publishers have reported double‑digit declines on affected queries as assistants answer in‑place.

The small shop vs the big box#

The big box wins on convenience: one stop, all the basics. The difference now is that the big box store's “inventory” is your writing. It’s adapted, summarized, and combined with others’ work. Sometimes you’re cited; often you’re not. Even when you are, the click‑through is minuscule and hard to measure.

Not only do you not see the readers; you can't even tell how often your work is used.

For the fortunate creators who don’t depend on ads, the situation is still maddening as relevance becomes invisible. Are you in the conversation? You might be, but the trail is faint.

Aside: why “just block the bots” isn’t a strategy#

You can attempt to opt out of training or scraping. Even if enforcement improves, you still face:

  • Fragmented policies across models and regions
  • No reliable way to prove your work influenced an answer
  • Lost distribution: fewer people discover you in the first place

Content will be consumed. The question is not how to fight change, but how to remain relevant, measurable, and desirable.

Could regulation force contribution back to sources? Maybe in places, sometimes. But any legal framework is likely to be fragmented by jurisdiction, model type, and enforcement reality. Even if compensation shows up, it won’t solve distribution.

Where to invest?#

Prioritize experiences that are hard to reproduce in an answer box:

  • Walkthroughs with live code, data, or media the reader manipulates
  • Opinionated guidance tied to your track record and taste
  • Interactive tools and quizzes that adapt to the reader
  • Behind‑the‑scenes notes: constraints, tradeoffs, and “why,” not just “how”
  • Timely series and updates where recency and accountability matter

Also, tighten the basics so visiting the site is undeniably better:

  • Fast, clean pages with great typography and zero friction
  • Clear information architecture so the “next question” is one click away
  • Descriptive links, accessible images, and tidy Markdown

If the “big box” sells homogenized answers, your “small shop” wins by being unmistakably human: taste, context, and craft. Readers visit when the experience is uniquely yours.

Formats that travel poorly to assistants (that’s good for you)#

  • Explainers with a stance: you’re not just accurate; you’re useful and opinionated
  • Guided choices and tradeoff checklists tailored to a role or constraint
  • Interactive demos, data explorers, and calculators
  • Training modules with progress, certification, or feedback
  • Case studies with artifacts readers can inspect and reuse

Double down on the basics#

When a reader does click through, the on‑site experience has to be delightful:

  • Pages that feel instant; typography that’s pleasant; zero cookie walls
  • Clear “next step” information and related reading that builds momentum
  • Minimal UI friction for subscribing, saving, or sharing
  • Clean Markdown, descriptive links, alt text with meaningful captions

The measurement problem (and bias risk)#

As mentioned earlier, even when an assistant cites you, referral tracking is inconsistent. Sometimes you’ll see a link; often you won’t. Assistants and in‑SERP summaries don’t always pass referral data, so visits land in “direct” or “unknown.” That obscures whether you’re part of the conversation at all.
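To make that concrete, here’s a minimal sketch (in TypeScript) of bucketing incoming visits by likely origin. The hostname lists are illustrative and far from exhaustive, and assistants change referrer behavior often; the honest takeaway is that most assistant traffic hides in “direct”:

```typescript
// Bucket a visit by its referrer. Hostname lists are illustrative only.
type Bucket = "assistant" | "search" | "direct-or-unknown" | "other";

const ASSISTANT_HOSTS = ["chatgpt.com", "chat.openai.com", "perplexity.ai", "gemini.google.com"];
const SEARCH_HOSTS = ["www.google.com", "www.bing.com", "duckduckgo.com"];

function classifyReferrer(referrer: string | null): Bucket {
  if (!referrer) return "direct-or-unknown"; // assistants mostly land here
  let host: string;
  try {
    host = new URL(referrer).hostname;
  } catch {
    return "other"; // malformed referrer string
  }
  if (ASSISTANT_HOSTS.some((h) => host === h || host.endsWith("." + h))) return "assistant";
  if (SEARCH_HOSTS.includes(host)) return "search";
  return "other";
}
```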

Practical ways to get a clearer signal:

  • Watch for branded queries rising while non‑brand falls. That pattern often accompanies more in‑place answering.
  • Track “assisted discovery” proxies: newsletter signups, repeat visitors, and time‑to‑return.
  • Use clearly labeled canonical links, schema, and stable anchors so citations have something durable to point to (a sketch follows this list).
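To make the schema piece concrete, here’s a small sketch that emits schema.org Article JSON‑LD for a post. The `ArticleMeta` shape is my invention; adapt it to whatever metadata your stack already carries:

```typescript
// Emit schema.org Article JSON-LD so citations and crawlers have
// something durable and unambiguous to point at.
interface ArticleMeta {
  title: string;
  author: string;
  canonicalUrl: string;
  published: string; // ISO 8601 date
}

function articleJsonLd(meta: ArticleMeta): string {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Article",
    headline: meta.title,
    author: { "@type": "Person", name: meta.author },
    datePublished: meta.published,
    mainEntityOfPage: meta.canonicalUrl,
  });
}

// Render into <head>, alongside <link rel="canonical" href="...">:
// <script type="application/ld+json">${articleJsonLd(meta)}</script>
```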

There’s also governance risk. Curators (human and algorithmic) can exclude entire classes of sources. There is no guarantee of neutrality, and no recourse if your work is elided from the synthesis. That’s not only a disadvantage; it’s a major risk.

Assume partial opacity. Ship work you’re proud of anyway, and bias your mix toward formats where direct visitation is the obvious win.

Monetization follows the same pattern. When an answer is consumed in‑place, you can’t wrap it in your narrative, your newsletter CTAs, or your on‑page monetization. You’ll see fewer ad impressions and fewer moments to earn a relationship.

What to track#

  • Cohort behavior: do new readers return within 7–14 days? (sketched after this list)
  • Reading depth: are people hitting 2–3 pages per session after their first visit?
  • Track "moments": email signups, RSS, “copy link,” and “add to notes” clicks
  • Mentions in public places (GitHub issues, newsletters) when assistants cite you
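For the cohort bullet, a minimal sketch, assuming you can export a flat log of visitor IDs and timestamps from your analytics store:

```typescript
// Share of first-time visitors who return within `windowDays`.
interface Visit {
  visitorId: string;
  timestamp: Date;
}

function returnRate(visits: Visit[], windowDays = 14): number {
  const byVisitor = new Map<string, Date[]>();
  for (const v of visits) {
    const list = byVisitor.get(v.visitorId) ?? [];
    list.push(v.timestamp);
    byVisitor.set(v.visitorId, list);
  }
  const windowMs = windowDays * 24 * 60 * 60 * 1000;
  let cohort = 0;
  let returned = 0;
  for (const times of byVisitor.values()) {
    times.sort((a, b) => a.getTime() - b.getTime());
    cohort++;
    // Did any later visit land within the window after the first one?
    if (times.some((t) => t > times[0] && t.getTime() - times[0].getTime() <= windowMs)) {
      returned++;
    }
  }
  return cohort === 0 ? 0 : returned / cohort;
}
```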

A path back to a more human web#

This problem has been a long time in the making: web publishing was commoditized when bots overran forums and comments. If browsers shipped an anonymous, privacy‑preserving human verification primitive, we could revive:

  • Low‑friction, high‑quality comments and discussions
  • Lightweight tips and small subscriptions without heavy identity
  • Community presence that rewards showing up as a person

Done right, that’s better than chasing impressions: it rebuilds the social fabric around sites.

We already see the ingredients: privacy‑preserving tokens to attest “a real device, a real user” without revealing identity; evolving standards from the web and privacy communities; platform moves to reduce captchas and automated abuse. If this lands in browsers, we can make comment forms pleasant again, bring back lightweight social presence, and make tip jars and tiny subscriptions practical—no surveillance required.

Imagine if browsers shipped a built‑in, privacy‑respecting “human presence” signal that any site could verify without tracking. Comments become less spammy without forcing logins into ad‑tech silos. “Thanks” buttons can move a few cents without a full billing relationship. Small communities can form around posts again. The web gets a little more like the earlier years—personal, conversational, and fun.
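Nothing like this ships in browsers today, so treat the following as a purely hypothetical sketch: the header name and issuer endpoint are invented for illustration. The point is that the site-side check could be boring and tiny:

```typescript
// Hypothetical: suppose browsers attached a blinded, single-use
// "human presence" token in a request header, redeemable once
// against an issuer that learns nothing about the visitor.
const PRESENCE_HEADER = "sec-human-presence"; // assumed name
const ISSUER_REDEEM_URL = "https://issuer.example/redeem"; // assumed endpoint

async function isLikelyHuman(req: Request): Promise<boolean> {
  const token = req.headers.get(PRESENCE_HEADER);
  if (!token) return false; // no signal: fall back to rate limits, captchas, etc.
  // Redeem the token with the issuer; we learn only "a real person, once."
  const res = await fetch(ISSUER_REDEEM_URL, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ token }),
  });
  return res.ok;
}

// Usage: gate the comment form or the tip jar, never the content itself.
// if (await isLikelyHuman(request)) { acceptComment(...) }
```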

Closing#

LLM assistants are here and useful. The adaptation is to double down on the parts of the web we actually like: human voices, interactive learning, and places that feel good to visit.
