
Divergence score

An integer from 0 to 100 that answers one question: how much do outlets disagree about this story?

Higher = more disagreement. Lower = more consensus.

What goes into it

Four signals, combined:

  1. Framing spread. How many distinct framing tags the sources used. An event framed by three outlets as critical, three as pro-action, and three as neutral scores higher than one where all nine used neutral.
  2. Fact agreement. From the fact ledger: the ratio of confirmed claims to disputed and omitted claims across sources. A higher share of disputed or omitted claims pushes the score up.
  3. Sentiment spread. Standard deviation of per-article sentiment. High spread means some outlets are enthusiastic while others are grim.
  4. Bias coverage. Whether the story is covered across the spectrum. A story covered only by one side gets a lower spread score (there is less to disagree about) but is flagged separately in /divergence/gaps.
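Signal 3 is the most mechanical of the four. As a sketch, sentiment spread can be computed as the population standard deviation of per-article sentiment scores (the scores below are hypothetical, on an assumed -1 to 1 scale; the real pipeline's scale and normalization are internal):

```python
from statistics import pstdev

# Hypothetical per-article sentiment scores in [-1, 1].
consensus = [0.2, 0.25, 0.3, 0.22]   # outlets broadly agree in tone
split = [0.9, -0.8, 0.85, -0.7]      # some enthusiastic, others grim

# Population standard deviation: low for consensus, high for a split.
print(round(pstdev(consensus), 3))
print(round(pstdev(split), 3))
```

A split like the second list is what drives this signal up: the mean sentiment is near zero, but individual outlets sit at the extremes.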

The weights are:

Signal              Weight
Framing spread      0.40
Fact agreement      0.30
Sentiment spread    0.15
Bias coverage       0.15

Weights are not configurable in v2. They may be tuned between releases.
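Assuming each signal is normalized to a 0 to 1 range before weighting (an assumption; the normalization itself is not documented here), the combination is a weighted sum scaled to 0-100:

```python
# Weights from the table above. Signal normalization is a sketch,
# not the documented implementation.
WEIGHTS = {
    "framing_spread": 0.40,
    "fact_agreement": 0.30,
    "sentiment_spread": 0.15,
    "bias_coverage": 0.15,
}

def divergence_score(signals: dict[str, float]) -> int:
    """Weighted sum of normalized signals, scaled to an integer 0-100.

    `signals` maps each signal name to a value in [0, 1], where
    higher always means more disagreement.
    """
    raw = sum(WEIGHTS[name] * signals[name] for name in WEIGHTS)
    return round(100 * raw)

print(divergence_score({
    "framing_spread": 0.8,
    "fact_agreement": 0.5,
    "sentiment_spread": 0.6,
    "bias_coverage": 0.4,
}))
```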

Reading the score

Range       Label             What it means
0 - 33      Agreement         Outlets substantially agree on facts and framing
34 - 66     Some divergence   Framing varies, maybe one or two disputed facts
67 - 100    High divergence   Outlets tell meaningfully different stories about the same events

The homepage uses these three colors (green, amber, red) for the divergence pill. The API does not return the label; compute it client-side from the number.
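A minimal client-side mapping from score to label and pill color, following the table above (the function name is illustrative, not part of the API):

```python
def divergence_label(score: int) -> tuple[str, str]:
    """Map a 0-100 divergence score to its label and pill color."""
    if not 0 <= score <= 100:
        raise ValueError(f"score out of range: {score}")
    if score <= 33:
        return ("Agreement", "green")
    if score <= 66:
        return ("Some divergence", "amber")
    return ("High divergence", "red")

print(divergence_label(12))   # ('Agreement', 'green')
```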

What it is not

  • Not a measure of quality. Low divergence does not mean good reporting.
  • Not a measure of truth. The fact ledger is a separate signal.
  • Not a measure of political bias. High divergence on one story does not make any outlet "more biased"; it means the outlet reported a story that others reported differently.