Get an LLM answer to a question informed by Exa search results. /answer performs an Exa search and uses an LLM to generate either a direct answer for specific queries or a detailed summary with citations for open-ended queries.
The response includes both the generated answer and the sources used to create it. The endpoint also supports streaming (set stream=True), which returns tokens as they are generated.
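As a minimal sketch of the request described above (the endpoint URL and the exact body field names are assumptions, not confirmed by this page), the JSON body could be built like this:

```python
API_URL = "https://api.exa.ai/answer"  # assumed endpoint URL


def build_answer_request(query: str, stream: bool = False) -> dict:
    # Minimal JSON body: the question plus the optional streaming flag.
    return {"query": query, "stream": stream}


payload = build_answer_request("What is the capital of France?")
# Sending it (requires the `requests` package and a real API key):
# requests.post(API_URL, json=payload,
#               headers={"x-api-key": "YOUR_EXA_API_KEY"})
```

With stream=True the response arrives as incremental tokens rather than a single JSON object, so the client would read the response body incrementally instead of parsing it in one call.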
Alternatively, you can use the OpenAI-compatible chat completions interface.
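A sketch of the OpenAI-compatible route, assuming the standard chat completions request schema applies; the model identifier here is an assumption:

```python
def build_chat_completions_request(question: str, stream: bool = False) -> dict:
    # Standard OpenAI chat completions body: a list of
    # {"role", "content"} messages plus the streaming flag.
    return {
        "model": "exa",  # assumed model identifier
        "messages": [{"role": "user", "content": question}],
        "stream": stream,
    }
```

With the official openai Python client, this body would map onto client.chat.completions.create(...) once the client's base_url points at Exa's API.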
The API key can be provided either via the x-api-key header or via the Authorization header with the Bearer scheme.
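A small sketch of the two accepted header forms:

```python
def auth_headers(api_key: str, use_bearer: bool = False) -> dict:
    # Either header form authenticates the request; send one, not both.
    if use_bearer:
        return {"Authorization": f"Bearer {api_key}"}
    return {"x-api-key": api_key}
```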
OK: The response is of type object.