Collection: Definition
Type: Definition
Version: 1.0
Stabilization: 2026-05-13
Published: 2026-05-13
Updated: 2026-05-13
Source substitution
Source substitution occurs when an AI-mediated answer relies on, cites, or appears to validate a secondary source where a canonical or governing source should have controlled the claim.
The substitution can be visible (the wrong URL is cited outright) or silent (a third-party source structures the answer while the official source is merely displayed, or ignored entirely). Correcting it usually requires stronger canonical surfaces, clearer internal routing, and reduced ambiguity across external sources.
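The visible/silent distinction can be sketched as a minimal classifier. This is an illustrative assumption only: the function name and the three URL fields (`canonical_url`, `cited_urls`, `structuring_url`) are hypothetical, not part of any published schema.

```python
# Hypothetical sketch: classifying source substitution for a single claim.
# Field names are illustrative assumptions, not a defined schema.

def classify_substitution(canonical_url, cited_urls, structuring_url):
    """Return 'none', 'visible', or 'silent'.

    canonical_url:   the governing source that should control the claim
    cited_urls:      the sources the answer actually cites
    structuring_url: the source whose content shaped the answer
    """
    if structuring_url == canonical_url:
        # The canon structures the answer: no substitution.
        return "none"
    if canonical_url not in cited_urls:
        # A secondary source is cited in place of the canon.
        return "visible"
    # The canon is displayed, but a third party structured the answer.
    return "silent"
```

For example, an answer that cites the official specification yet is shaped by a blog summary would classify as "silent", matching the case where the official source is merely displayed.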