test_that("LLM can exit evaluation via quit_if", {
  skip_test_if_no_ollama()

  response <- "what is the favorite number of the dog of my step father who moved to the north pole?" |>
    set_system_prompt("you are an assistant who does not guess things") |>
    answer_as_integer() |>
    quit_if() |>
    send_prompt(llm_provider_ollama(), verbose = FALSE)

  expect_true(is.null(response))
})
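For context, a minimal usage sketch of the same chain outside the test harness. It assumes a local Ollama instance is reachable and that the package providing these helpers (tidyprompt, per the function names above) is attached; as the test expects, send_prompt() returns NULL when quit_if() lets the model decline to answer.

library(tidyprompt)  # assumed package providing the helpers used in the test

response <- "what is the favorite number of the dog of my step father who moved to the north pole?" |>
  set_system_prompt("you are an assistant who does not guess things") |>
  answer_as_integer() |>
  quit_if() |>
  send_prompt(llm_provider_ollama(), verbose = FALSE)

# quit_if() allows the model to exit evaluation; send_prompt() then returns NULL
if (is.null(response)) {
  message("The model declined to answer.")
} else {
  print(response)
}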