persistent_chat-class {tidyprompt} | R Documentation |
PersistentChat R6 class
Description
A class for managing a persistent chat with a large language model (LLM).
While 'tidyprompt' is primarily focused on automatic interactions with LLMs
through send_prompt() using a tidyprompt object with prompt_wrap(), this class
may be useful for having a manual conversation with an LLM. (It may
specifically be used to continue a chat history which was returned by
send_prompt() with return_mode = "full".)
Public fields
chat_history
A chat_history() object
llm_provider
A llm_provider object
Methods
Public methods
persistent_chat-class$new()
persistent_chat-class$chat()
persistent_chat-class$reset_chat_history()
persistent_chat-class$clone()
Method new()
Initialize the PersistentChat object
Usage
persistent_chat-class$new(llm_provider, chat_history = NULL)
Arguments
llm_provider
A llm_provider object
chat_history
(optional) A chat_history() object
Returns
The initialized PersistentChat object
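For illustration, a minimal sketch of initializing the chat with a pre-built
history (this assumes, as elsewhere in 'tidyprompt', that chat_history()
accepts a data frame with 'role' and 'content' columns):
history <- chat_history(
  data.frame(
    role = c("system", "user"),
    content = c("You are a concise assistant.", "Hello!")
  )
)
chat <- `persistent_chat-class`$new(llm_provider_ollama(), history)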
Method chat()
Add a message to the chat history and get a response from the LLM
Usage
persistent_chat-class$chat(msg, role = "user", verbose = TRUE)
Arguments
msg
Message to add to the chat history
role
Role of the message
verbose
Whether to print the interaction to the console
Returns
The response from the LLM
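As a sketch of non-interactive use, the reply can be captured without printing
the exchange by setting verbose = FALSE:
chat <- `persistent_chat-class`$new(llm_provider_ollama())
reply <- chat$chat("Name one city in Twente.", verbose = FALSE)
reply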
Method reset_chat_history()
Reset the chat history
Usage
persistent_chat-class$reset_chat_history()
Returns
NULL
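For example, to discard the conversation so far and start on a new topic:
chat$reset_chat_history()
chat$chat("Let's start over; what is the capital of the Netherlands?")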
Method clone()
The objects of this class are cloneable with this method.
Usage
persistent_chat-class$clone(deep = FALSE)
Arguments
deep
Whether to make a deep clone.
Examples
# Create a persistent chat with any LLM provider
chat <- `persistent_chat-class`$new(llm_provider_ollama())
## Not run:
chat$chat("Hi! Tell me about Twente, in a short sentence?")
# --- Sending request to LLM provider (llama3.1:8b): ---
# Hi! Tell me about Twente, in a short sentence?
# --- Receiving response from LLM provider: ---
# Twente is a charming region in the Netherlands known for its picturesque
# countryside and vibrant culture!
chat$chat("How many people live there?")
# --- Sending request to LLM provider (llama3.1:8b): ---
# How many people live there?
# --- Receiving response from LLM provider: ---
# The population of Twente is approximately 650,000 inhabitants, making it one of
# the largest regions in the Netherlands.
# Access the chat history:
chat$chat_history
# Reset the chat history:
chat$reset_chat_history()
# Continue a chat from the result of `send_prompt()`:
result <- "Hi there!" |>
answer_as_integer() |>
send_prompt(return_mode = "full")
# --- Sending request to LLM provider (llama3.1:8b): ---
# Hi there!
#
# You must answer with only an integer (use no other characters).
# --- Receiving response from LLM provider: ---
# 42
chat <- `persistent_chat-class`$new(llm_provider_ollama(), result$chat_history)
chat$chat("Why did you choose that number?")
# --- Sending request to LLM provider (llama3.1:8b): ---
# Why did you choose that number?
# --- Receiving response from LLM provider: ---
# I chose the number 42 because it's a reference to Douglas Adams' science fiction
# series "The Hitchhiker's Guide to the Galaxy," in which a supercomputer named
# Deep Thought is said to have calculated the "Answer to the Ultimate Question of
# Life, the Universe, and Everything" as 42.
## End(Not run)