Answer
Get an LLM answer to a question informed by Exa search results. Fully compatible with OpenAI's chat completions endpoint (see the OpenAI compatibility docs).
/answer performs an Exa search and uses an LLM (GPT-4o-mini) to generate either:
- A direct answer for specific queries (e.g. "What is the capital of France?" would return "Paris")
- A detailed summary with citations for open-ended queries (e.g. "What is the state of AI in healthcare?" would return a summary with citations to relevant sources)
The response includes both the generated answer and the sources used to create it. The endpoint also supports streaming (set stream to true), which returns tokens as they are generated.
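As a sketch (assuming the endpoint is hosted at https://api.exa.ai/answer and accepts a JSON body with a query field, as described under Body below), a minimal non-streaming request using only the Python standard library might look like:

```python
import json
import urllib.request

API_KEY = "YOUR_EXA_API_KEY"          # placeholder; substitute your real key
URL = "https://api.exa.ai/answer"     # assumed endpoint URL

def build_answer_request(query: str, stream: bool = False) -> urllib.request.Request:
    """Build a POST request for the /answer endpoint (illustrative helper,
    not an official client)."""
    body = json.dumps({"query": query, "stream": stream}).encode("utf-8")
    return urllib.request.Request(
        URL,
        data=body,
        headers={"x-api-key": API_KEY, "Content-Type": "application/json"},
        method="POST",
    )

req = build_answer_request("What is the capital of France?")
# urllib.request.urlopen(req) would send the request; the response JSON
# contains the generated answer plus the sources used to create it.
```

Alternatively, because the endpoint is OpenAI-compatible, the OpenAI SDK can be pointed at Exa's base URL instead of constructing requests by hand.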
Authorizations
API key can be provided either via x-api-key header or Authorization header with Bearer scheme
Body
query (string, required, minimum length 1)
The question or query to answer.

stream (boolean)
If true, the response is returned as a server-sent events (SSE) stream.

text (boolean)
If true, the response includes full text content in the search results.

model (string)
The search model to use for the answer. exa passes one query to our search model, while exa-pro also passes 2 expanded queries to our search model.
Available options: exa, exa-pro
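Putting the body fields together (a sketch; the field names follow the descriptions above), a request body that selects the exa-pro model with streaming and full text enabled could be serialized like this:

```python
import json

# Example /answer request body (illustrative): all optional fields set explicitly.
body = {
    "query": "What is the state of AI in healthcare?",  # required, min length 1
    "stream": True,      # return tokens as a server-sent events stream
    "text": True,        # include full text content in the search results
    "model": "exa-pro",  # "exa" (one query) or "exa-pro" (adds 2 expanded queries)
}
payload = json.dumps(body)  # JSON string to send as the POST body
```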