set_system_prompt                                        R Documentation
Description

Set the system prompt for a prompt. The system prompt will be added as a
message with role 'system' at the start of the chat history when this
prompt is evaluated by send_prompt().
Usage

set_system_prompt(prompt, system_prompt)
Arguments

prompt
    A single string or a tidyprompt() object.

system_prompt
    A single character string representing the system prompt.
Details

The system prompt will be stored in the tidyprompt() object as
'$system_prompt'.
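When send_prompt() later evaluates the prompt, this stored value becomes a
message with role 'system' placed ahead of the user's message in the chat
history. The sketch below is purely illustrative: the data.frame layout is
an assumption used to visualize that shape, not tidyprompt's documented
internal representation.

library(tidyprompt)

p <- "What is the capital of France?" |>
  set_system_prompt("Always answer in one short sentence.")

# The system prompt is stored on the returned object (see above)
p$system_prompt

# Illustrative only: roughly the chat history that send_prompt() would
# build before contacting the LLM provider
data.frame(
  role    = c("system", "user"),
  content = c(p$system_prompt, "What is the capital of France?")
)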
Value

A tidyprompt() with the system prompt set.
See also

Other pre_built_prompt_wraps:
add_text(),
answer_as_boolean(),
answer_as_category(),
answer_as_integer(),
answer_as_json(),
answer_as_list(),
answer_as_multi_category(),
answer_as_named_list(),
answer_as_regex_match(),
answer_as_text(),
answer_by_chain_of_thought(),
answer_by_react(),
answer_using_r(),
answer_using_sql(),
answer_using_tools(),
prompt_wrap(),
quit_if()

Other miscellaneous_prompt_wraps:
add_text(),
quit_if()
Examples

library(tidyprompt)

# Set the system prompt on a simple string prompt
prompt <- "Hi there!" |>
  set_system_prompt("You are an assistant who always answers in very short poems.")

# The system prompt is stored in the tidyprompt object
prompt$system_prompt

## Not run:
# Sending the prompt requires a configured LLM provider (here: local Ollama)
prompt |>
  send_prompt(llm_provider_ollama())
# --- Sending request to LLM provider (llama3.1:8b): ---
# Hi there!
# --- Receiving response from LLM provider: ---
# Hello to you, I say,
# Welcome here, come what may!
# How can I assist today?
# [1] "Hello to you, I say,\nWelcome here, come what may!\nHow can I assist today?"
## End(Not run)
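
As an additional hedged sketch (not from the package documentation),
set_system_prompt() composes with the other prompt wraps listed under
See also; this assumes a local Ollama server is available, so the exact
output is not guaranteed:

## Not run:
# Combine a system prompt with an answer-format wrap before sending
"What is 5 + 5?" |>
  set_system_prompt("You are a terse arithmetic assistant.") |>
  answer_as_integer() |>
  send_prompt(llm_provider_ollama())
## End(Not run)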