chat_ollama: Chat with Ollama

View source: R/providers.R

chat_ollama R Documentation

Chat with Ollama

Description

Convenience wrapper for chatting with local models served by 'Ollama'.

Usage

chat_ollama(prompt, model = "llama3.2", ...)

Arguments

prompt

Character. The user message to send.

model

Character. Name of a locally installed Ollama model (e.g., "llama3.2", "mistral").

...

Additional parameters passed on to the underlying Ollama API request.
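
Exactly which parameters the Ollama API accepts through ... is not documented here; a minimal sketch, assuming extra arguments are forwarded as Ollama generation options (temperature and num_ctx are standard Ollama option names, but whether chat_ollama() accepts them this way is an assumption):

```r
## Not run: 
# Requires a local Ollama server with the model pulled.
chat_ollama(
  "Summarise the Ollama project in one sentence.",
  model = "llama3.2",
  temperature = 0.2,  # lower values give more deterministic output
  num_ctx = 4096      # context window size, in tokens
)

## End(Not run)
```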

Value

The assistant's response as a character string, or a list when conversation history is in use. See chat for details.
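
Because the return type depends on whether history is in use, callers that may run in either mode can branch on the result; a sketch (requires a local Ollama server):

```r
## Not run: 
res <- chat_ollama("What is machine learning?")
if (is.character(res)) {
  cat(res)   # plain string reply
} else {
  str(res)   # history mode returns a list; see ?chat for its structure
}

## End(Not run)
```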

Examples

## Not run: 
chat_ollama("What is machine learning?")
chat_ollama("Explain Docker", model = "mistral")

## End(Not run)

llm.api documentation built on April 16, 2026, 5:08 p.m.