
Reverse (Over-) Engineering Is a Hamster Wheel in Reverse

    Inversion of Control

    At Search Central LIVE in Zurich, Nikola Todorovic brought us back to three simple but uncomfortable questions:
    “Is it good for users?”
    “What does good mean?”
    “How do you measure what’s good?”

    Basic questions. Frequently ignored.

    Because the moment a core update rolls out and the graphs twitch, another ritual begins: reverse engineering. We dissect winners and losers, count words, measure keyword density and search for a mathematical formula behind the ranking. We chase the answer to “How do you measure it?” – but in yesterday’s data graveyard, not in today’s user reality.

    The thesis of this piece is uncomfortable but simple:
    Trying to solve the algorithm like an equation is a race against time.
    A hamster wheel. In reverse.
    We optimize metrics until they turn green – then wonder why traffic bleeds out in quality.

    Time to step back.
    Away from reverse (over-)engineering symptoms. Toward understanding causes.


    TL;DR – Key takeaways

    • The reverse-engineering trap: Historical data is no longer a compass. Google changes too fast.
    • Goodhart’s Law: When a proxy becomes the goal, you lose the goal. Metrics diagnose – they don’t direct.
    • Enabler vs. Hacker: Technical SEO is craftsmanship. Trying to “hack” relevance is self-deception.
    • The data pivot: Stop benchmarking the competition to death. Use real user data: CRM, Sales, NPS.
    • Inversion of control: Stop chasing metrics. Build something metrics have to chase.

    The Proxy Dilemma: When the Mirror Becomes More Important Than the Object

    The first problem: the data.

    Google tweaks its ranking mechanics over 1,000 times a year. By the time we collect data to “prove” a ranking factor, the foundation is often outdated.

    The second problem: the logic.

    We optimize for metrics like time on site, word count or keyword density. But these are proxies – approximations of something we can’t measure directly.

    And this is where Goodhart’s Law hits with full force:

    When a measure becomes a target, it ceases to be a good measure.

    If we optimize content for “time on site,” we don’t create better content.
    We create longer scroll paths, confusing UX or text deserts.
    We polish the mirror instead of improving the object reflected in it.
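The mirror-polishing effect is easy to sketch. A toy simulation (all numbers and scoring rules below are invented for illustration): one page optimized for the proxy, one for the reader. The proxy crowns the padded page, even though the concise one answers the same questions in a fifth of the words.

```python
# Toy illustration of Goodhart's Law in content optimization.
# All numbers and scoring rules are invented for demonstration.

def time_on_site(page):
    # Proxy metric: longer pages keep visitors scrolling longer,
    # regardless of whether the extra words help them.
    return page["word_count"] * 0.1  # "seconds per word" (made up)

def user_value(page):
    # The thing we actually care about but can't measure directly:
    # answers delivered per amount of reading required.
    return page["questions_answered"] / (page["word_count"] / 100)

concise = {"name": "concise guide", "word_count": 600, "questions_answered": 5}
padded = {"name": "padded guide", "word_count": 3000, "questions_answered": 5}

# Optimizing the proxy picks the padded page ...
best_by_proxy = max([concise, padded], key=time_on_site)
# ... while the concise page delivers far more value per minute read.
best_by_value = max([concise, padded], key=user_value)

print(best_by_proxy["name"])  # padded guide
print(best_by_value["name"])  # concise guide
```

The moment "time on site" becomes the target, padding becomes the winning strategy – exactly the inversion the law predicts.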


    Enabler vs. Hacker: A Matter of Posture

    Should we ignore the technical layer? Of course not.
    But we need to separate the craft from the illusion.

    1. The Enabler – (Infra)Structure

    Rendering, crawlability, schema, internal linking.
    That’s not trickery. It’s groundwork.
    We need to understand how Google parses content (see entity salience) so substance remains machine-readable.
    Principle: Form follows function.
    Good content deserves the structure that helps it be understood.
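One concrete example of enabling structure: schema.org markup makes the substance of a page explicit to machines. A minimal JSON-LD sketch (the article details are placeholders; the vocabulary – `@context`, `@type`, `headline`, `author`, `about` – is schema.org's):

```python
import json

# Minimal schema.org Article markup, serialized as JSON-LD.
# Property values here are placeholders.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "author": {"@type": "Person", "name": "Jane Doe"},
    # 'about' names the entity the page is actually about --
    # the kind of explicit signal entity-based parsing can use.
    "about": {"@type": "Thing", "name": "Technical SEO"},
}

print(json.dumps(article, indent=2))
```

Embedded in a `<script type="application/ld+json">` tag, this is groundwork, not trickery: it doesn't add value, it keeps existing value legible.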

    2. The Hacker – Content & Orientation

    This is where relevance gets faked.
    Keyword Tetris, artificially stretched text, hollow claims.
    Principle: Function follows form.
    We build facades with no interior.

    Technical SEO enables ranking.
    It doesn’t earn ranking.
    Anyone simulating value will pay for it later – usually in visibility.


    The Escape From the Incest Loop

    If we’re not reconstructing the algorithm – what do we orient ourselves around?

    The classic reflex: “Look at the number one result and do it slightly better.”
    That’s the trap.

    Competition as your primary benchmark breeds sameness.
    A self-replicating mass of mediocre content.

    Real organic growth doesn’t come from imitation.
    It comes from transcending the status quo.

    We need a data pivot – away from algorithmic proxies toward real user reality:

    • What shows up in CRM logs?
    • What do customers ask in sales calls?
    • What does NPS reveal about expectation gaps?

    These data points are messier – but they’re honest.
    And unlike single metrics, they’re stable over time.
    A real customer problem stays a real customer problem – even after twenty core updates.
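Mining those sources doesn't require heavy tooling. A minimal sketch (the ticket data and the crude normalization are invented) that surfaces the most frequent customer questions from support logs as content candidates:

```python
from collections import Counter

# Invented sample of questions pulled from CRM / support tickets.
tickets = [
    "how do I export my data",
    "How do I export my data?",
    "can I export my data",
    "why was my invoice higher this month",
    "how do i export my data",
]

def normalize(question):
    # Crude normalization: lowercase, strip trailing punctuation.
    return question.lower().strip(" ?")

counts = Counter(normalize(q) for q in tickets)

# The most common real-world question is a content brief
# no competitor benchmark would have handed you.
for question, n in counts.most_common(2):
    print(n, question)
```

Messy input, honest output: the question customers actually keep asking survives every core update.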

    Cheap content built to satisfy proxies isn’t a cost saving.
    It’s technical debt.
    And the debt collector always shows up.


    Conclusion: Inversion of Control

    The solution is a shift in perspective.

    Stop chasing the metric. Let the metric chase you.

    When content, context and format meet the real need, proxy metrics follow automatically: dwell time, CTR, shares.
    Not the other way around.

    Often, the real value emerges in the delivery:
    the interactive table for quick comparison,
    the video for complex instruction,
    not the longest block of text.

    If content has substance, it gets found.
    If it helps, it gets kept.
    If it’s precise, it gets shared.

    It’s time to move from the back seat to the wheel.
    Not by rebuilding the engine (the algorithm),
    but by focusing on the road: the user.

    With no over-engineering.

