Hallucination Correctors API Definition
The Hallucination Correctors API enables users to automatically detect and correct factual inaccuracies, commonly referred to as hallucinations, in generated summaries or responses. By comparing a user-provided summary against one or more source documents, the API returns a corrected version of the summary with minimal necessary edits.
Use this API to validate and improve the factual accuracy of summaries generated by LLMs in Retrieval Augmented Generation (RAG) pipelines, ensuring that the output remains grounded in trusted source content. If the hallucination correction model (HCM) does not detect a hallucination, it preserves the original summary.
Hallucination Correctors Request and Response Details
To correct a potentially hallucinated summary, send a POST request to /v2/hallucination_correctors. The request body must include the following parameters:
- summary: The generated text to evaluate and potentially correct.
- documents: An array of one or more source documents containing the factual information that the summary should be based on. Each document object contains:
  - text: The full content of the source document.
Example request
This example provides a summary about a historical event.
{
  "summary": "The Treaty of Versailles was signed in 1920, officially ending World War I. It was primarily negotiated by France, Britain, Italy, and Japan.",
  "documents": [
    {
      "text": "The Treaty of Versailles was signed on June 28, 1919. The United States played a major role, represented by President Woodrow Wilson."
    }
  ]
}
Example response
The response corrects the original summary.
{
  "original_summary": "The Treaty of Versailles was signed in 1920, officially ending World War I. It was primarily negotiated by France, Britain, Italy, and Japan.",
  "corrected_summary": "The Treaty of Versailles was signed in 1919, officially ending World War I. It was primarily negotiated by France, Britain, Italy, and the United States."
}
If the input summary is accurate, the corrected_summary matches the original_summary.
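The request and response flow above can be sketched as a small Python client. This is a minimal sketch, not a definitive implementation: the x-api-key header name is an assumption, so consult Vectara's authentication documentation for the exact scheme your account uses.

```python
import json
import urllib.request

API_URL = "https://api.vectara.io/v2/hallucination_correctors"


def build_request(summary, documents):
    """Build the JSON request body: a summary plus a list of source texts."""
    return {
        "summary": summary,
        "documents": [{"text": doc} for doc in documents],
    }


def correct_summary(summary, documents, api_key):
    """POST the summary and its source documents; return the corrected summary.

    The "x-api-key" header name is an assumption for illustration; check the
    Vectara authentication docs for your account's actual auth scheme.
    """
    body = json.dumps(build_request(summary, documents)).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    # If no hallucination was detected, corrected_summary equals the input.
    return result["corrected_summary"]
```

A caller would compare the returned corrected_summary against the original to decide whether any edits were made.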
Error responses
- 400 Bad Request – The request body was malformed or contained invalid parameters.
- 403 Forbidden – The user does not have permission to perform factual consistency evaluation.
REST 2.0 URL
Hallucination Correctors Endpoint Address
Vectara exposes an HTTP endpoint for the Hallucination Correctors:
https://api.vectara.io/v2/hallucination_correctors
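A request to this endpoint might look like the following curl sketch, reusing the Treaty of Versailles example from above. The x-api-key header name and the VECTARA_API_KEY environment variable are assumptions for illustration; use whatever authentication scheme your account is configured with.

```shell
# Sketch only: the x-api-key header is an assumed auth mechanism.
curl -X POST https://api.vectara.io/v2/hallucination_correctors \
  -H "Content-Type: application/json" \
  -H "x-api-key: $VECTARA_API_KEY" \
  -d '{"summary": "The Treaty of Versailles was signed in 1920.", "documents": [{"text": "The Treaty of Versailles was signed on June 28, 1919."}]}'
```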