View source: R/answer_as_text.R
answer_as_text    R Documentation

Make LLM answer as a constrained text response
Usage

answer_as_text(
  prompt,
  max_words = NULL,
  max_characters = NULL,
  add_instruction_to_prompt = TRUE
)
Arguments

prompt
  A single string or a tidyprompt() object

max_words
  (optional) Maximum number of words allowed in the response. If specified,
  responses exceeding this limit will fail validation

max_characters
  (optional) Maximum number of characters allowed in the response. If specified,
  responses exceeding this limit will fail validation

add_instruction_to_prompt
  (optional) Add an instruction to the prompt text asking for a reply within
  the constraints. Set to FALSE for debugging whether extractions and
  validations are working as expected (without the instruction, the answer
  should fail the validation function, initiating a retry)
Value

A tidyprompt() with an added prompt_wrap() which will ensure that the LLM
response conforms to the specified constraints
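A minimal sketch (not from the package documentation) of how the arguments combine; the prompt text below is an illustrative assumption, and nothing is sent to an LLM until send_prompt() is called:

# Constrain the reply to at most 100 characters
prompt <- "Summarise what a tokenizer does" |>
  answer_as_text(max_characters = 100)

# For debugging extraction/validation, omit the added instruction;
# the reply is then expected to fail validation and trigger a retry
debug_prompt <- "Summarise what a tokenizer does" |>
  answer_as_text(max_characters = 100, add_instruction_to_prompt = FALSE)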
See Also

Other pre_built_prompt_wraps:
add_text(), answer_as_boolean(), answer_as_integer(), answer_as_json(),
answer_as_list(), answer_as_named_list(), answer_as_regex_match(),
answer_by_chain_of_thought(), answer_by_react(), answer_using_r(),
answer_using_sql(), answer_using_tools(), prompt_wrap(), quit_if(),
set_system_prompt()

Other answer_as_prompt_wraps:
answer_as_boolean(), answer_as_integer(), answer_as_json(), answer_as_list(),
answer_as_named_list(), answer_as_regex_match()
Examples

## Not run:
"What is a large language model?" |>
  answer_as_text(max_words = 10) |>
  send_prompt()
# --- Sending request to LLM provider (llama3.1:8b): ---
# What is a large language model?
#
# You must provide a text response. The response must be at most 10 words.
# --- Receiving response from LLM provider: ---
# A type of AI that processes and generates human-like text.
# [1] "A type of AI that processes and generates human-like text."
## End(Not run)
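A hedged variant (not part of the package's own examples) using the character limit instead of the word limit; the prompt text and the availability of a configured LLM provider are assumptions, and the model's reply will vary:

## Not run:
# Constrain the reply length by characters rather than words
"Name one use case for text embeddings" |>
  answer_as_text(max_characters = 60) |>
  send_prompt()
## End(Not run)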