View source: R/provider-cortex.R
chat_cortex_analyst: R Documentation
Chat with the LLM-powered Snowflake Cortex Analyst.
chat_cortex_analyst() picks up the following ambient Snowflake credentials:
- A static OAuth token defined via the SNOWFLAKE_TOKEN environment variable.
- Key-pair authentication credentials defined via the SNOWFLAKE_USER and SNOWFLAKE_PRIVATE_KEY (which can be a PEM-encoded private key or a path to one) environment variables.
- Posit Workbench-managed Snowflake credentials for the corresponding account.
- Viewer-based credentials on Posit Connect. Requires the connectcreds package.
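For example, key-pair authentication can be configured by setting the relevant environment variables before the chat object is created. This is a sketch; the account identifier, user, and key path below are hypothetical placeholders:

```r
# Hypothetical values; substitute your own account, user, and key location.
Sys.setenv(
  SNOWFLAKE_ACCOUNT = "myorg-myaccount",
  SNOWFLAKE_USER = "me@example.com",
  SNOWFLAKE_PRIVATE_KEY = "~/.ssh/snowflake_rsa_key.p8"
)
chat <- chat_cortex_analyst(
  model_file = "@my_db.my_schema.my_stage/model.yaml"
)
```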
Unlike most comparable model APIs, Cortex does not take a system prompt. Instead, the caller must provide a "semantic model" describing available tables, their meaning, and verified queries that can be run against them as a starting point. The semantic model can be passed as a YAML string or via reference to an existing file in a Snowflake Stage.
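A minimal sketch of passing the semantic model inline as a YAML string via model_spec (the model, table, and column names below are hypothetical):

```r
# A hypothetical semantic model supplied inline rather than from a Stage.
spec <- "
name: orders_model
tables:
  - name: orders
    description: One row per customer order.
"
chat <- chat_cortex_analyst(model_spec = spec)
```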
Note that Cortex does not support multi-turn conversations, so it will not remember previous messages. Nor does it support registering tools; attempting to do so results in an error.
See chat_snowflake() to chat with more general-purpose models hosted on
Snowflake.
chat_cortex_analyst(
account = snowflake_account(),
credentials = NULL,
model_spec = NULL,
model_file = NULL,
api_args = list(),
echo = c("none", "text", "all")
)
account: A Snowflake account identifier. Defaults to snowflake_account(), which reads the SNOWFLAKE_ACCOUNT environment variable.

credentials: A list of authentication headers to pass into the request, a function that returns them, or NULL (the default) to use the ambient credentials described above.

model_spec: A semantic model specification as a YAML string, or NULL when model_file is supplied instead.

model_file: Path to a semantic model file stored in a Snowflake Stage, or NULL when model_spec is supplied instead.

api_args: Named list of arbitrary extra arguments appended to the body of every chat API call.

echo: One of "none" (the default; emit no output), "text" (echo text output as it streams in), or "all" (echo all input and output). Note this only affects the chat() method.
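Extra request body fields can be threaded through api_args. As a sketch, the field name below is hypothetical and would be passed verbatim to the Cortex Analyst API:

```r
# "some_option" is a hypothetical body field, included as-is in every request.
chat <- chat_cortex_analyst(
  model_file = "@my_db.my_schema.my_stage/model.yaml",
  api_args = list(some_option = TRUE)
)
```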
Returns a Chat object.
Other chatbots:
chat_bedrock(),
chat_claude(),
chat_databricks(),
chat_deepseek(),
chat_gemini(),
chat_github(),
chat_groq(),
chat_ollama(),
chat_openai(),
chat_openrouter(),
chat_perplexity()
chat <- chat_cortex_analyst(
model_file = "@my_db.my_schema.my_stage/model.yaml"
)
chat$chat("What questions can I ask?")