Man pages for tidyprompt
Prompt Large Language Models and Enhance Their Functionality

add_msg_to_chat_history - Add a message to a chat history
add_text - Add text to a tidyprompt
answer_as_boolean - Make LLM answer as a boolean (TRUE or FALSE)
answer_as_integer - Make LLM answer as an integer (between min and max)
answer_as_json - Make LLM answer as JSON (with optional schema)
answer_as_key_value - Make LLM answer as a list of key-value pairs
answer_as_list - Make LLM answer as a list of items
answer_as_named_list - Make LLM answer as a named list
answer_as_regex_match - Make LLM answer match a specific regex
answer_as_text - Make LLM answer as a constrained text response
answer_by_chain_of_thought - Set chain of thought mode for a prompt
answer_by_react - Set ReAct mode for a prompt
answer_using_r - Enable LLM to draft and execute R code
answer_using_sql - Enable LLM to draft and execute SQL queries on a database
answer_using_tools - Enable LLM to call R functions
chat_history - Create or validate 'chat_history' object
chat_history.character - Method for 'chat_history()' when the input is a single string
chat_history.data.frame - Method for 'chat_history()' when the input is a 'data.frame'
chat_history.default - Default method for 'chat_history()'
construct_prompt_text - Construct prompt text from a tidyprompt object
df_to_string - Convert a dataframe to a string representation
extract_from_return_list - Function to extract a specific element from a list
get_chat_history - Get the chat history of a tidyprompt object
get_prompt_wraps - Get prompt wraps from a tidyprompt object
is_tidyprompt - Check if object is a tidyprompt object
llm_break - Create an 'llm_break' object
llm_feedback - Create an 'llm_feedback' object
llm_provider-class - LlmProvider R6 Class
llm_provider_google_gemini - Create a new Google Gemini LLM provider
llm_provider_groq - Create a new Groq LLM provider
llm_provider_mistral - Create a new Mistral LLM provider
llm_provider_ollama - Create a new Ollama LLM provider
llm_provider_openai - Create a new OpenAI LLM provider
llm_provider_openrouter - Create a new OpenRouter LLM provider
llm_provider_xai - Create a new XAI (Grok) LLM provider
llm_verify - Have LLM check the result of a prompt (LLM-in-the-loop)
persistent_chat-class - PersistentChat R6 class
prompt_wrap - Wrap a prompt with functions for modification and handling...
quit_if - Make evaluation of a prompt stop if LLM gives a specific...
r_json_schema_to_example - Generate an example object from a JSON schema
send_prompt - Send a prompt to a LLM provider
set_chat_history - Set the chat history of a tidyprompt object
set_system_prompt - Set system prompt of a tidyprompt object
skim_with_labels_and_levels - Skim a dataframe and include labels and levels
tidyprompt - Create a tidyprompt object
tidyprompt-class - Tidyprompt R6 Class
tidyprompt-package - tidyprompt: Prompt Large Language Models and Enhance Their...
tools_add_docs - Add tidyprompt function documentation to a function
tools_get_docs - Extract documentation from a function
user_verify - Have user check the result of a prompt (human-in-the-loop)
vector_list_to_string - Convert a named or unnamed list/vector to a string...
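
As a quick orientation to how these pieces fit together, here is a minimal usage sketch (not taken from the package documentation; it assumes a locally running Ollama instance and uses argument names suggested by the help topics above, which may differ in detail):

    library(tidyprompt)

    # Build a prompt, constrain the reply to an integer between min and max,
    # and send it to a local Ollama LLM provider
    "What is 2 + 2?" |>
      answer_as_integer(min = 0, max = 100) |>
      send_prompt(llm_provider_ollama())

The pattern generalizes: wrap a prompt with one or more answer_* or answer_using_* functions, then pass it to send_prompt() together with an llm_provider_* object; see the individual help pages for the exact signatures.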