Designing a competitor comparison content series

Capturing bottom-funnel evaluation searches before buyers reach your brand

The context

Traditional “our product vs competitor” pages capture only people who already know your brand and are actively comparing it to other tools.

But a large portion of evaluation-stage search happens earlier — when buyers are comparing platforms like Asana, Monday, GitHub, and ClickUp without a clear shortlist yet.

This series was designed to test whether the same approach could work for Nulab: intercept brand-agnostic comparison searches and introduce Nulab as an alternative before decisions were locked in.

What I owned

I built and shipped a comparison content series on Learn targeting competitor-vs-competitor searches, including:

  • keyword research to find high-volume, achievable comparison queries

  • a standardized outline to keep pages consistent and scalable

  • publishing across two verticals (project management + software development)

  • performance reporting using organic and AI discovery signals

Pages published

Software development

  • GitLab vs GitHub

  • Git vs GitHub

  • Bitbucket vs GitHub

Project management

  • Asana vs Monday

  • ClickUp vs Monday

  • Wrike vs Asana

What happened (performance summary)

Even as a small pilot, the series proved viable as a new organic acquisition layer.

Organic search impact

  • pages showed sustained organic growth post-indexing

  • monthly traffic stabilized at 100–150+ visits per URL

  • trends suggested real keyword fit rather than launch spikes

Keyword visibility

  • 50+ organic keywords now ranking

  • multiple Top 3 and Top 10 placements

  • coverage across both head and long-tail comparison terms

AI search visibility

These pages began earning citations in AI-driven discovery environments:

  • Google AI Overviews: 38

  • ChatGPT: 3

  • Perplexity: 1

These citations indicated the URLs were beginning to influence AI-generated buying guidance, not just traditional SERPs.

Authority signals

Despite limited link building:

  • 79 backlinks earned

  • referring domains increased

  • authority metrics improved

Why this mattered

This series expanded organic acquisition beyond brand-aware traffic and created a repeatable format for capturing bottom-funnel evaluation search.

Instead of relying only on people searching for Nulab directly, it allowed Nulab to show up earlier in the decision process — when buyers were still comparing other platforms.

Most importantly, it proved that competitor-vs-competitor content can function as a durable acquisition surface, with measurable search visibility and early traction in AI discovery environments.

What this shows about how I work

This project reflects how I approach evaluation-stage content:

  • I look for high-intent search behavior happening upstream of the product

  • I build repeatable, structured formats rather than one-off pages

  • I focus on credibility and usefulness over aggressive positioning

  • I close the loop with evidence — proving viability without turning the work into a roadmap
