AI Citation Half-Life Reference: Decay Patterns Across ChatGPT, Perplexity, Gemini, and Copilot
Median AI citation half-life is 4.5 weeks across major engines. ChatGPT churns fastest (3.4 weeks); Perplexity persists longest (5.8 weeks); Google's AI surfaces cluster at 4.3-4.8 weeks. Use this reference to size refresh cadence and forecast visibility loss by engine, vertical, and distribution pattern.
TL;DR
- Across 3.5M citation events, the cross-platform median half-life is 4.5 weeks.
- ChatGPT cycles fastest (3.4w); Perplexity holds longest (5.8w); Google AI surfaces sit between 4.3-4.8w.
- Editorially distributed and multi-domain-cited content lasts roughly 2x longer than single-domain publication.
Headline half-life by engine
| Engine | Median half-life | Notes |
|---|---|---|
| ChatGPT | 3.4 weeks | Highest churn; novelty- and trend-weighted retrieval. |
| Perplexity | 5.8 weeks | Smallest trusted-publisher pool; heavy reuse across sessions. |
| Google AI Overviews | ~4.3 weeks | Tied to AI Overview ranking churn. |
| Google AI Mode | ~4.6 weeks | Conversational surface; mid-range. |
| Gemini | ~4.8 weeks | Highest persistence inside Google's AI ecosystem. |
| Microsoft Copilot | ~4.5 weeks (proxy) | No vendor-specific panel published; defaults to cross-platform median. |
| Cross-platform median | 4.5 weeks | All engines combined. |
Source: Scrunch + Stacker, 3.5 million citation events, September 2025 to March 2026.
Copilot was not separately reported in Scrunch's 2026 panel. Treat Copilot as the cross-platform median until per-platform data is published.
URL-level vs brand-level decay
Two tracking layers exist and should not be conflated.
| Layer | Metric | Value |
|---|---|---|
| URL | Median citation lifespan (per URL) | ~0 days |
| URL | Share of URLs that appear once and vanish | ~74% |
| URL | Average citation lifespan (per URL) | 6.8 days |
| Brand | Time for brand mentions to halve | ~31 days |
| Brand | Share of brands halving within 7 days | ~50% |
Source: Trakkr Research, 108,650 distinct citation URLs.
URL-level volatility is extreme: most cited URLs are one-shot. Brand persistence is slower because LLMs swap synonymous URLs from the same publisher across sessions. Budget refresh effort at the brand layer; instrument tracking at the URL layer.
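The two-layer distinction can be instrumented directly. The sketch below assumes a simple observation log of `(day_index, url)` events; the function name and data shape are illustrative, not from any published tooling. It shows why brand lifespans exceed URL lifespans: two one-shot URLs on the same domain yield zero URL-level lifespan but a nonzero brand-level lifespan.

```python
from urllib.parse import urlparse
from collections import defaultdict

def cohort_lifespans(observations):
    """Compute per-URL and per-brand (domain) citation lifespans in days.

    observations: iterable of (day_index, url) tuples, one per sighting of
    a URL in an AI answer. Lifespan = last_seen - first_seen, so a URL
    cited exactly once has lifespan 0 (matching the ~0-day URL median).
    """
    url_days = defaultdict(list)
    brand_days = defaultdict(list)
    for day, url in observations:
        url_days[url].append(day)
        brand_days[urlparse(url).netloc].append(day)  # brand = domain
    span = lambda days: max(days) - min(days)
    return (
        {u: span(d) for u, d in url_days.items()},
        {b: span(d) for b, d in brand_days.items()},
    )

# Engine swaps URL /a for URL /b on the same publisher across sessions:
obs = [(0, "https://pub.example/a"), (20, "https://pub.example/b")]
url_spans, brand_spans = cohort_lifespans(obs)
# url_spans: both one-shot (0 days); brand_spans["pub.example"]: 20 days
```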
Decay by industry vertical
| Vertical | Median half-life |
|---|---|
| Insurance | ~5.0 weeks |
| Finance | ~4.7 weeks |
| Technology | ~4.5 weeks |
| Education | ~4.4 weeks |
| Healthcare | ~4.1 weeks |
| News / current events | < 2 weeks |
Source: Scrunch industry breakdown (March 2026).
Healthcare and news cycle faster because YMYL (Your Money or Your Life) freshness signals carry more weight. Insurance and finance hold longer because canonical regulatory and definitional references are reused.
Decay by content distribution pattern
| Pattern | Half-life | Multiplier vs baseline |
|---|---|---|
| Single-domain publication | 4.5 weeks | 1.0x |
| Editorially syndicated (network distribution) | ~9-10 weeks | ~2.0x |
| Cited in 3+ independent domains | ~10+ weeks | ~2.2x |
| Original research with primary data | ~10+ weeks | ~2.0-2.5x |
Source: Stacker source-decay study; corroborated by the GEO longitudinal study (62% disappearance by month 3 for non-distributed content).
Distribution is the single biggest lever on persistence. The same article published once on an owned blog and again through editorial syndication can show a 2x difference in citation lifespan.
Source-type composition by engine
Half-life interacts with the source mix each engine prefers.
| Source type | ChatGPT | Perplexity | Google AI | Claude |
|---|---|---|---|---|
| News / publisher | 38% | 42% | 46% | 51% |
| Topical authority / niche | 31% | 35% | 28% | 24% |
| Academic / research | 18% | 12% | 15% | 16% |
| Government / institutional | 13% | 11% | 11% | 9% |
Source: SparkToro / Gumshoe AI citation analysis, 2026.
Engines weighted toward news (Claude, Google AI) show faster URL-level turnover than the headline engine half-life implies. ChatGPT and Perplexity give topical-authority sources the highest share, so niche content has more leverage there than on Claude.
Sticky-citation profile
Citations that survive a full 6-month tracking window share four traits, per a GEO longitudinal study (n=500+ citations):
- Updated within the last 30 days (freshness signal).
- 2,000+ words of structured, comparative content.
- Original data, datasets, or quantitative findings.
- Cited by 2+ independent domains on the same topic.
Only 18% of citations met all four traits and persisted across 6 months. 62% disappeared by month 3.
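The four traits can be expressed as a simple audit check. This is a hypothetical sketch: the field names (`days_since_update`, `word_count`, and so on) are an assumed schema for a content-inventory record, not from the study itself.

```python
def sticky_traits(page):
    """Evaluate the four persistence traits from the GEO longitudinal
    study against a content-inventory record (assumed dict schema)."""
    return {
        "fresh": page["days_since_update"] <= 30,       # updated in last 30 days
        "depth": page["word_count"] >= 2000,            # structured, comparative
        "original_data": page["has_original_data"],     # primary data/findings
        "multi_domain": page["citing_domains"] >= 2,    # independent citations
    }

def is_sticky_candidate(page):
    """True only when all four traits hold (the 18%-survivor profile)."""
    return all(sticky_traits(page).values())

page = {"days_since_update": 14, "word_count": 2400,
        "has_original_data": True, "citing_domains": 3}
# is_sticky_candidate(page) -> True; drop citing_domains to 1 -> False
```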
How citation half-life is measured
Citation half-life studies generally follow this method:
- Define a query panel covering the topic universe (often 1,000-10,000 queries).
- Run the panel against each AI engine on a fixed cadence (daily or weekly).
- Parse the cited URLs and domains from each response.
- For each citation in cohort C at time T0, track presence across subsequent runs.
- Compute the time T_half where 50% of cohort C's citations no longer appear.
Two cohorts matter: the URL cohort (tracks specific links) and the brand cohort (tracks any URL on a domain). Half-life numbers should always specify which.
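The method above reduces to a small survival computation. The sketch below assumes citations are represented as sets of keys per panel run (URLs for the URL layer, domains for the brand layer); one definitional choice is hedged in a comment, since published panels vary on it.

```python
def citation_half_life(cohort, runs):
    """Estimate T_half for a T0 cohort across subsequent panel runs.

    cohort: set of citation keys observed at T0 (URLs or domains,
            depending on which layer you are measuring).
    runs:   list of sets, one per later run, each holding the keys
            observed in that run.
    Returns the 1-based run index where survival first drops to 50%
    or below, or None if the cohort never halves in the window.
    """
    n = len(cohort)
    for t, observed in enumerate(runs, start=1):
        surviving = len(cohort & observed)
        # Note: this counts presence in the current run only. Some panels
        # instead require presence in every run so far; both definitions
        # appear in the literature, so report which one you use.
        if surviving <= n / 2:
            return t
    return None

cohort = {"a", "b", "c", "d"}
runs = [{"a", "b", "c"}, {"a", "b"}, {"a"}]
# citation_half_life(cohort, runs) -> 2 (50% gone by the second run)
```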
How to read these numbers
- A half-life of 4.5 weeks means 50% of a cohort's citations disappear in that window. It does not mean every citation lasts 4.5 weeks; most disappear quickly, and a minority persists for months.
- Half-life is per platform. A page can be sticky in Perplexity and ephemeral in ChatGPT for the same query.
- Numbers above are medians across topics. Time-sensitive subjects (news, prices, sports scores) decay 2-4x faster.
- Half-life is not the same as ranking. A citation can drop because the engine swapped to a fresher source, not because the page lost authority.
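For planning, a half-life implies a first-order survival curve. The sketch below assumes simple exponential decay, which the data only loosely supports (real cohorts are heavier-tailed: most citations vanish fast while a minority persists), so treat it as a forecasting approximation rather than a per-citation prediction.

```python
def surviving_fraction(weeks_elapsed, half_life_weeks):
    """Estimated fraction of a citation cohort still appearing after
    `weeks_elapsed`, assuming exponential decay: 0.5 ** (t / t_half).

    Approximation only: observed cohorts decay faster than exponential
    early on and slower late, per the sticky-citation profile above.
    """
    return 0.5 ** (weeks_elapsed / half_life_weeks)

# At the cross-platform median (4.5 weeks):
#   surviving_fraction(4.5, 4.5) -> 0.5   (by definition)
#   surviving_fraction(9.0, 4.5) -> 0.25  (two half-lives)
```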
Limitations of current data
- The Scrunch / Stacker dataset is the largest published panel as of April 2026 (3.5M events) but is biased toward English-language content.
- Per-platform numbers for Microsoft Copilot are not separately published.
- Industry breakdowns are coarse (six verticals) and do not yet capture sub-vertical variance.
- Brand-level decay (Trakkr) and URL-level decay (Scrunch) use different cohorts and should not be compared directly.
FAQ
Q: What does AI citation half-life mean?
Half-life is the time it takes for 50% of a cohort of citations earned at time T0 to no longer appear in AI answers. It is a population statistic, not a per-citation lifespan, and is always reported per platform and per cohort layer (URL or brand).
Q: Why does ChatGPT have the shortest half-life?
ChatGPT recomputes retrieval aggressively and weights novelty and trending sources highly, so the source set churns faster. Perplexity, by contrast, draws from a narrower pool of trusted publishers and reuses them across sessions.
Q: Is Microsoft Copilot's half-life the same as the cross-platform median?
There is no published per-platform half-life for Copilot as of April 2026. Until Microsoft or an independent panel publishes vendor-specific numbers, plan with the cross-platform median of 4.5 weeks.
Q: How does brand authority change citation half-life?
Citations carried by trusted editorial outlets and syndicated networks last roughly 2x longer than citations on a single owned domain. Earned media and multi-domain coverage are the strongest persistence levers identified in the published research.
Q: How should I use these numbers operationally?
Pick the engine your program depends on most, set refresh cadence at roughly half its half-life (about every 10-14 days for a ChatGPT-first program; about every 3 weeks for a Perplexity-first program), and instrument both URL-level and brand-level tracking. See the Citation Half-Life Refresh Cadence Framework for a full schedule.
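The rule of thumb above (refresh at roughly half the half-life) can be tabulated from the panel data. A minimal sketch; the dict keys are arbitrary labels, and the Copilot figure is the cross-platform proxy, not a measured value.

```python
# Median half-lives in weeks (Scrunch + Stacker panel, Sep 2025 - Mar 2026).
# "copilot" is a proxy: no per-platform number is published.
HALF_LIFE_WEEKS = {
    "chatgpt": 3.4,
    "perplexity": 5.8,
    "google_ai_overviews": 4.3,
    "google_ai_mode": 4.6,
    "gemini": 4.8,
    "copilot": 4.5,
}

def refresh_cadence_days(engine, fraction=0.5):
    """Suggested refresh interval in days: a set fraction (default half)
    of the engine's median citation half-life."""
    return round(HALF_LIFE_WEEKS[engine] * 7 * fraction)

# refresh_cadence_days("chatgpt")    -> 12  (matches the 10-14 day guidance)
# refresh_cadence_days("perplexity") -> 20  (about every 3 weeks)
```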
Q: Will half-life numbers stay stable in 2026 and beyond?
Probably not. As engines tune retrieval and as more publishers join measurement panels, half-life numbers will shift. Re-validate against a current panel quarterly.
Related Articles
GEO Topical Decay Framework: When and How AI Citations Fade by Content Type
A content-type-aware framework for predicting AI citation decay across guides, references, comparisons, and news, with refresh cadence triggers grounded in 17M+ citations.
AI Citation Patterns: How AI Engines Cite Sources (2026)
Reference of how ChatGPT, Perplexity, Google AI Overviews, Google AI Mode, Gemini, Microsoft Copilot, and Claude attribute sources in 2026 — with platform-specific optimization tactics.
Citation Half-Life Refresh Cadence Framework: Platform-Specific Update Schedules for AI Search
Citation half-life refresh cadence framework with platform-specific update schedules for ChatGPT, Perplexity, Google AI Mode, and Gemini in 2026.