#' Ask Large Language Model
#'
#' Note: See also `clearChatSession`.
#'
#' @param question The question to ask the Large Language Model.
#' @param PERPLEXITY_API_KEY Perplexity API key.
#' @param modelSelection Model choice. Default is `mistral-7b-instruct`.
#' @param systemRole Role for the model. Default is: "You are a helpful
#' assistant."
#' @param maxTokens The maximum number of completion tokens returned by the API.
#' @param temperature The amount of randomness in the response,
#' valued between 0 inclusive and 2 exclusive. Higher values are more random,
#' and lower values are more deterministic. Set either temperature or top_p.
#' @param top_p Nucleus sampling threshold, valued between 0 and 1 inclusive.
#' @param top_k The number of tokens to keep for highest top-k filtering,
#' specified as an integer between 0 and 2048 inclusive.
#' If set to 0, top-k filtering is disabled.
#' @param presence_penalty A value between -2.0 and 2.0.
#' Positive values penalize new tokens based on whether they appear in the text
#' so far, increasing the model's likelihood to talk about new topics.
#' Incompatible with frequency_penalty.
#' @param frequency_penalty A multiplicative penalty greater than 0.
#' Values greater than 1.0 penalize new tokens based on their existing
#' frequency in the text so far, decreasing the model's likelihood to repeat
#' the same line verbatim. A value of 1.0 means no penalty.
#' @param proxy Proxy to use for the request. Default is NULL (no proxy).
#'
#' @examples
#' \dontrun{
#' AskMe("What do you think about large language models?")
#' }
#'
#' @return A character value with the response generated by the Large Language Model.
#'
#' @export
#'
AskMe <- function(question,
PERPLEXITY_API_KEY = Sys.getenv("PERPLEXITY_API_KEY"),
modelSelection = c(
"mistral-7b-instruct",
"mixtral-8x7b-instruct",
"codellama-70b-instruct",
"sonar-small-chat",
"sonar-small-online",
"sonar-medium-chat",
"sonar-medium-online"
),
systemRole = "You are a helpful assistant.",
maxTokens = 265,
temperature = 1,
top_p = NULL,
top_k = 100,
presence_penalty = 0,
frequency_penalty = NULL,
proxy = NULL) {
  # Validate the model choice against the allowed set
  modelSelection <- match.arg(modelSelection)
  # Make an API request to Perplexity.AI using API_Request() function
  chatResponse <- API_Request(
    question, PERPLEXITY_API_KEY, modelSelection, systemRole,
    maxTokens, temperature, top_p, top_k,
    presence_penalty, frequency_penalty, proxy
  )
# Parse the response using responseParser() and store the result
chatResponse <- responseParser(chatResponse)
# Returns output to console and clipboard
return(responseReturn(chatResponse))
}
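
# The parameter documentation above states two mutual exclusions: set either
# temperature or top_p (not both), and presence_penalty is incompatible with
# frequency_penalty. A minimal sketch of how such constraints could be checked
# before the API call is shown below; checkSamplingArgs is a hypothetical
# helper, not part of this package.

```r
# Hypothetical validator for the mutually exclusive sampling parameters
# described in the AskMe() documentation.
checkSamplingArgs <- function(temperature = NULL, top_p = NULL,
                              presence_penalty = NULL,
                              frequency_penalty = NULL) {
  if (!is.null(temperature) && !is.null(top_p)) {
    stop("Set either temperature or top_p, not both.")
  }
  if (!is.null(presence_penalty) && presence_penalty != 0 &&
      !is.null(frequency_penalty)) {
    stop("presence_penalty is incompatible with frequency_penalty.")
  }
  invisible(TRUE)
}

# Valid: temperature alone
checkSamplingArgs(temperature = 1)
# Invalid: both sampling controls set (would raise an error)
# checkSamplingArgs(temperature = 1, top_p = 0.9)
```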