---
title: "Install ollama and ellmer"
author: "Prof. Eric A. Suess"
format:
  html:
    embed-resources: true
editor: visual
---
Today we will install [Ollama](https://ollama.com) to access LLMs locally. We will also install the R package [ellmer](https://ellmer.tidyverse.org) to access the models from RStudio.
# Ollama
To install Ollama, run the one-line command provided on the [Ollama](https://ollama.com) website. The command is run in the Terminal (see the tab to the right of the Console in RStudio).
> curl -fsSL https://ollama.com/install.sh \| sh
# Ellmer
Install the R package ellmer and load it.
```{r}
# install.packages("ellmer")  # run once if not already installed
library(ellmer)
```
Next we will download an [Ollama model](https://ollama.com/library) using a function from the R package ollamar.
```{r}
library(ollamar)
ollamar::pull("llama3.2")
```
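To confirm the download from R, the ollamar package also provides `list_models()`, which queries the local Ollama server for the models it has available. This is a sketch; it assumes the Ollama server is running, and the exact shape of the returned table may vary by ollamar version.

```{r}
#| eval: false
# Ask the local Ollama server which models have been pulled
ollamar::list_models()
```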
Alternatively, run the following Ollama command in the Terminal to download the llama3.2 model and chat with it there.
> ollama run llama3.2
To see your downloaded models, run the following in the Terminal:
> ollama list
Run ellmer's `chat_ollama()` in a Quarto notebook R chunk or in the Console.
```{r}
chat <- chat_ollama(model = "llama3.2")
chat$chat("Tell me three jokes about Snoopy")
```
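You can also steer the model's behavior with the `system_prompt` argument to `chat_ollama()`. A minimal sketch, assuming llama3.2 has been pulled as above (the prompt text is just an illustration):

```{r}
#| eval: false
# A system prompt shapes every response in the conversation
chat <- chat_ollama(
  model = "llama3.2",
  system_prompt = "You are a concise assistant. Answer in one sentence."
)
chat$chat("What is a large language model?")
```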
# ShinyChat
We will build a small chat app with shinychat. Install and load the shinychat package.
```{r}
#| eval: false
library(shiny)      # shinyApp(), observeEvent()
library(ellmer)     # chat_ollama()
library(shinychat)  # chat_ui(), chat_append()

ui <- bslib::page_fluid(
  chat_ui("chat")
)

server <- function(input, output, session) {
  chat <- chat_ollama(model = "llama3.2")

  observeEvent(input$chat_user_input, {
    stream <- chat$stream_async(input$chat_user_input)
    chat_append("chat", stream)
  })
}

shinyApp(ui, server)
```