Digital transformation for complex organizations (Parts 1-2)

A practical framework for moving legacy institutions toward measurable, audience-centered digital practice — without throwing out what already works.
Digital transformation isn't blocked by technology. It needs senior leadership, technologists, and your audience to understand each other well enough to move forward.
The symptoms of that disconnect are everywhere. I've seen 200-page PDF reports published with zero tracking or machine readability considerations. I've seen senior leaders ask "Can AI tell us?" believing AI is a magic spice to make sense of content that humans already struggle to find. I've been in rooms where technologists explain what the content and data platform can do — whether that's Drupal, a custom backend, or a database-driven system — and communicators explain what the content should do, and neither realizes they're talking past each other.
This post is a sketch of what I've learned about modernizing digital work inside complex bureaucracies. It's organized in two parts: the groundwork (what transformation actually requires) and the process (what you need to sustain it).
tl;dr
Part 1: The Groundwork — Translation between senior leadership, technologists, and your audience. Use the Content–Action Model to align everyone around audience tasks and measurable outcomes.
Part 2: The Process — Establish a digital value loop: disciplined content → discoverability → actions → measurement → improvement. Measure task completion (not pageviews), make content AI-ready, and use small wins to build momentum.
Part 1: The Groundwork
Governance can be heavy, teams are often split, incentives vary wildly, and "digital transformation" can mean everything and nothing. This section covers the groundwork that makes progress possible.
1. The bridge-builder model
The core work is translation — between senior leadership, technologists, and your audience.
Senior leadership wants clarity, predictability, accountability. They need strategic aims converted into testable digital outputs — things they can report on, defend, and explain to boards.
Technologists want feasibility, requirements, constraints. They need technical jargon translated into choices and trade-offs that non-technical people can actually decide on.
Your audience wants simplicity, relevance, speed. Someone needs to advocate for them — even when no one else in the room is.
In practice, most digital transformation work is making these three groups talk to each other:
- Converting vague "we need better digital presence" into specific deliverables
- Turning "can we use AI?" into "here are three options with trade-offs"
- Building lightweight governance that nudges behavior without paralyzing teams
When stakeholders have competing priorities — and they will — staying focused on goals and measurement creates common ground. You iterate, you measure, you argue with evidence instead of opinions. Compromise happens, but it happens around facts.
This is the part no AI can automate — and probably the most valuable skill in organizational digital work. Whether your title is business analyst, project manager, product owner, or something else entirely, this translation work is what makes transformation possible.
2. Information for success at scale
The mistake many organizations make: they think digital is about publishing. In reality, digital is about helping someone do something.
The Content–Action Model asks two questions of every page, product, and asset:
- What is this content helping someone do?
- How would we know if they did it?
This sounds simple. It isn't.
Most organizational content exists because someone decided to publish it, not because anyone identified an audience need it was meant to serve.
When you apply this model consistently, several things happen:
- Sites get smaller. Content that can't answer these questions gets archived or merged.
- Duplication reduces. You can't justify three pages about the same topic serving the same action.
- Publishing roles clarify. Editors become responsible for specific audience outcomes, not just "their content."
- SEO improves. Search engines reward content that demonstrably serves audience intent.
- Analytics become meaningful. You're measuring actions, not just visits.
I've helped scale this model to organizations with dozens of editors, at institutions like EMBL-EBI. It works because it gives everyone a shared framework for evaluating content decisions, regardless of their specific domain expertise.
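The two questions of the Content–Action Model can be made concrete as a content-inventory check. A minimal sketch (the record fields and example URLs are illustrative, not from any real system):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContentRecord:
    """One row of a content inventory, per the Content-Action Model."""
    url: str
    audience_task: Optional[str]    # what is this content helping someone do?
    success_signal: Optional[str]   # how would we know if they did it?

def audit(records):
    """Split an inventory into pages that pass the model and candidates for archiving or merging."""
    keep, review = [], []
    for r in records:
        (keep if r.audience_task and r.success_signal else review).append(r)
    return keep, review

inventory = [
    ContentRecord("/reports/annual-2023", "download the annual report", "pdf_download event"),
    ContentRecord("/about/history", None, None),  # no identified task: review candidate
]
keep, review = audit(inventory)
print([r.url for r in review])  # pages that can't answer the two questions
```

Run against a full inventory, the `review` list is where sites get smaller: every page that cannot name a task and a signal is a candidate for archiving or merging.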
Part 2: The Process
The groundwork in Part 1 is about human alignment. But alignment alone doesn't produce sustainable change — you also need systems that operationalize the vision. This section covers what to foster and institutionalize.
3. The digital value loop
The future for complex organizations isn't "make more content." It's building a self-reinforcing loop:
- Disciplined content → enables
- Better discoverability (web + search + AI) → drives
- Clear audience actions → generates
- Measurable impact → informs
- Continuous improvement → produces better disciplined content
Each step feeds the next. Organizations that establish this loop, even imperfectly, outpace those still treating digital as an output channel for mirroring PDF documents.
The following sections unpack each element of the loop: what to measure, what to put in place, and how to implement while staying on track.
4. Measuring what matters
The default metrics in many public-sector or research organizations aren't actually meaningful.
Pageviews, impressions, PDF downloads, newsletter "reach" — none of these answer the question: did anyone find what they needed?
Better measurement focuses on:
Findability: Are people reaching this content through the pathways we expect? Internal referrals, search queries, and navigational success tell you whether your information architecture is working.
Task completion: Did they take the action the content was designed for? This could be downloading a resource, submitting a form, or clicking through to a deeper page.
Quality of engagement: Not just time-on-page, but scroll depth, interaction with expandable content, return visits.
Downstream impact: Citations, reuse, machine readability. Does your content propagate through the systems that matter to your mission?
For content that feeds AI systems, "computed reach" — how often your content appears in AI-generated responses — is becoming an important signal. Current approaches: manually query AI systems for your key topics, use emerging monitoring tools that track AI citations, and watch for referral traffic from AI-assisted search. The measurement is directional rather than precise, but ignoring it means flying blind as AI intermediates more discovery.
This is where digital work is heading: measuring how content flows, not just how it sits.
5. AI-ready content infrastructure
Most organizations still publish content like it's 2008. A page gets written, formatted, maybe tagged with a category, and shipped.
The assumption is that humans will find it through search or navigation.
That assumption is breaking down. AI systems now intermediate much of how people discover and consume information. If your content isn't structured for machine consumption, it's increasingly invisible.
What AI-ready means in practice:
- Content is structured, not just visually formatted. Headings, lists, and semantic HTML aren't decoration — they're data.
- Content is accessible. Semantic HTML, ARIA labels, and proper heading hierarchy serve both assistive technologies and AI systems. What helps screen readers helps machine learning models.
- Metadata is semantic, not decorative. Tags and categories should map to controlled vocabularies, not ad-hoc labels.
- Taxonomies are machine-readable. SKOS-aligned concept schemes let AI systems understand relationships between topics.
- Pages emit structured data. JSON-LD schema markup tells search engines and AI systems what your content is, not just what it says.
- Assets generate machine-actionable outputs. PDFs should have metadata. Data should be queryable. Reports should have structured abstracts.
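As one concrete instance of the structured-data point above, a page can emit a JSON-LD block describing what it is. A minimal sketch using schema.org's `Report` type; the field values are placeholders that a CMS would populate from the content record:

```python
import json

# Minimal schema.org description of a report page, serialized as JSON-LD.
report = {
    "@context": "https://schema.org",
    "@type": "Report",
    "name": "Annual Risk Assessment 2023",
    "datePublished": "2023-11-01",
    "inLanguage": "en",
    "about": {"@type": "Thing", "name": "disaster risk reduction"},
    "encoding": {
        "@type": "MediaObject",
        "contentUrl": "https://example.org/reports/2023.pdf",
        "encodingFormat": "application/pdf",
    },
}

# Embed in the page head so search engines and AI systems can parse it.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(report, indent=2)
print(snippet)
```

This is what "telling machines what your content is, not just what it says" looks like: the type, date, topic, and attached PDF are explicit fields rather than things a crawler has to infer.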
This isn't about being innovative. It's about being discoverable in a world where fewer humans reach your website directly.
AI systems read your content even when your audience doesn't. Give them something meaningful to read.
6. Making it work: Implementation patterns
The loop from section 3 is the goal. Here's what I've seen work — and fail — when trying to establish it across complex organizations like UNDRR and EMBL.
Note: I may expand these patterns into detailed guides in future posts — each could become its own case study with templates and examples.
What works
Replace a "traditional content output" mindset with audience+task mindset. The shift from "we need to publish this" to "we need to help our audience do X" changes every content decision downstream.
Establish centralized but flexible governance. A design system that offers multiple integration paths (full adoption, partial adoption, visual-only) spreads faster than mandated standards.
Create KPIs that senior leadership can relate to. Connect technical metrics to organizational outcomes. Page speed affects reach in bandwidth-constrained regions. Structured metadata improves citation rates.
Introduce data literacy gently, but consistently. Not everyone needs to read dashboards, but everyone should understand what "this content achieved X" means and trust the measurement.
Use small wins to prove value. Structured metadata for one content type. Performance improvements on one high-traffic page. These create appetite for larger transformation.
Avoid all-or-nothing thinking; iterate instead. You don't need to solve everything at once. Ship something focused and high-quality that doesn't do everything, gather feedback, and let audience demand guide what comes next. This builds momentum and proves value faster than waiting for the complete solution.
Treat AI as an amplifier, not a replacement. AI tools can speed up content production, improve discoverability, and automate taxonomy tagging. They can't replace human judgment about what matters to your audience.
Examples of what to avoid
Big-bang rebuilds without political buy-in. Platforms live or die on adoption. A technically perfect system that teams won't use is worthless.
Strategy divorced from technical feasibility. The communications team's vision needs to survive contact with your platform's actual capabilities.
Building platforms without a metadata plan. If you don't design for disciplined content from the start, you'll never retrofit it effectively.
Publishing PDFs with zero tracking. If you can't measure whether anyone read it, you can't justify making more of them.
Assuming editors will change workflows without incentives. People do what's easy and rewarded. Make the right behavior both.
Underestimating cost and timeline realities. Transformation always costs more and takes longer than initial estimates. Build the business case around small, demonstrable wins that justify continued investment — not aspirational end-states that may never materialize. (I'll cover business case development, ROI frameworks, and resource planning in a follow-up post.)
Everything here is about making progress in environments where change is slow.
Putting it together#
This framework isn't theoretical. I've used these patterns at UNDRR, EMBL, and across public-sector digital teams.
Part 1 — the groundwork — is about translation: making senior leadership, technologists, and your audience understand each other well enough to move forward. It's the human skill that no platform purchase or consultancy can replace.
Part 2 — the process — is what makes that understanding operational: measurement systems, disciplined content, AI-ready architecture, and implementation patterns that survive contact with organizational reality.
Parts 3 and 4 — the execution — cover the tactical side: how to assess where you are, build the business case, plan resources, and manage risks. That post is coming in a week or two.
Neither part works alone. Brilliant strategy without process produces reports that gather dust. Perfect process without alignment produces platforms that no one adopts.
Digital transformation in complex organizations isn't about buying the right technology. It's about building shared language, lightweight governance, and measurement that connects technical work to organizational outcomes — and then building systems that sustain it.
If you're working through similar challenges — or if you've found different approaches that work — I'd love to hear about it.