- Authentication reads the OPENAI_API_KEY environment variable, since the API is OpenAI-compatible.
- Fixed a bug in agent() where the final assistant message was not appended to the returned history when the agent loop exited without further tool calls. This affected all providers but was most visible with non-Claude models.
- Removed the "local" provider and the chat_local() / list_local_models() exports. Direct llama.cpp inference via the localLLM package is no longer supported; use provider = "ollama" instead.
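Dropping chat_local() means existing scripts need a small migration to the Ollama provider. A minimal sketch in R, assuming a chat() entry point that accepts a provider argument (the function name and signature are illustrative assumptions, not confirmed by this changelog):

```r
# Before (no longer supported): direct llama.cpp inference
# chat_local("Summarize this text", model = "llama-3-8b")

# After: route the same request through a running Ollama server.
# `chat()` and its arguments are assumptions; check the package docs.
reply <- chat(
  "Summarize this text",
  provider = "ollama",  # replaces the removed "local" provider
  model    = "llama3"   # any model pulled into your local Ollama
)
```

Note that Ollama must be installed and serving locally (e.g. via `ollama serve`) before the "ollama" provider can respond.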