Get an LLM answer to a question informed by Exa search results. /answer performs an Exa search and uses an LLM to generate either:
- a direct answer for specific queries, or
- a detailed summary with citations for open-ended questions.
The response includes both the generated answer and the sources used to create it. The endpoint also supports streaming (set stream to true), which returns tokens as they are generated.
Alternatively, you can use the OpenAI-compatible chat completions interface.
Have feedback on /answer? Reach out to [email protected] and we can help tailor the endpoint to your use case.

The API key can be provided either via the x-api-key header or the Authorization header with the Bearer scheme.
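As a minimal sketch, a request to /answer can be issued with plain HTTP using the x-api-key header described above. The base URL https://api.exa.ai and the exact response shape are assumptions here, not confirmed by this page.

```python
import json
import urllib.request

def build_answer_request(api_key, query, stream=False, text=False):
    """Assemble the POST request for /answer.

    The base URL is an assumption; the query, stream, and text
    parameters match the ones documented below.
    """
    url = "https://api.exa.ai/answer"
    headers = {
        "x-api-key": api_key,  # or: "Authorization": f"Bearer {api_key}"
        "Content-Type": "application/json",
    }
    payload = {"query": query, "stream": stream, "text": text}
    return url, headers, payload

url, headers, payload = build_answer_request(
    "YOUR_API_KEY", "What is the latest valuation of SpaceX?"
)
# To actually send it (requires a valid key):
# req = urllib.request.Request(url, data=json.dumps(payload).encode(),
#                              headers=headers, method="POST")
# with urllib.request.urlopen(req) as resp:
#     body = json.load(resp)  # expected to contain the answer and its sources
```

The request is left unsent here so the sketch stays runnable without credentials.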
query: The question or query to answer. Example: "What is the latest valuation of SpaceX?"
stream: If true, the response is returned as a server-sent events (SSE) stream.
text: If true, the response includes the full text content in the search results.
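When stream is true, the body arrives as an SSE stream of "data:" lines. The sketch below shows a minimal parser for that framing; the JSON payload shape and the "[DONE]" terminator are assumptions for illustration, not guaranteed by this page.

```python
import json

def parse_sse_lines(lines):
    """Yield JSON payloads from the 'data: ...' lines of an SSE stream.

    Blank keep-alive lines are skipped; the '[DONE]' sentinel is an
    assumption borrowed from common streaming APIs.
    """
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            data = line[len("data:"):].strip()
            if data and data != "[DONE]":
                yield json.loads(data)

# Synthetic events (not real /answer output):
sample = [
    'data: {"content": "SpaceX"}',
    '',
    'data: {"content": " was valued..."}',
    'data: [DONE]',
]
chunks = list(parse_sse_lines(sample))
```

In practice the lines would come from iterating over the HTTP response body of a stream=true request.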