Lecture 17
Cornell University
INFO 4940/5940 - Fall 2025
October 28, 2025
Using {ellmer} in R and chatlas in Python to converse with an LLM

ae-16 Instructions

Clone your ae-16 repo (repo name will be suffixed with your GitHub name). Run renv::restore() (R) or uv sync (Python), open the Quarto document in the repo, and follow along and complete the exercises.

01_hello-llm Instructions
Test that you can connect to OpenAI’s API using {ellmer} in R or chatlas in Python by running the provided code.
Copy your .env file from your user directory to the ae-16 folder.
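If you want to sanity-check the setup from a plain Python script, here is a minimal sketch. It assumes your OPENAI_API_KEY lives in the ae-16 folder's .env file and that the python-dotenv package is installed (the course template may load the key for you instead):

```python
from dotenv import load_dotenv  # python-dotenv
import chatlas

# Read OPENAI_API_KEY (and anything else) from .env in the working directory
load_dotenv()

# If the key is picked up correctly, this prints a short reply
chat = chatlas.ChatOpenAI()
chat.chat("Say hello in five words or fewer.")
```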
It’s okay to treat LLMs as black boxes. We’re not going to focus on how they work internally
Just try it! When wondering if an LLM can do something, experiment rather than theorize
You might think they could not possibly do things that they clearly can do today
Don’t worry about ROI during exploration. Focus on learning and engaging with the technology
Failure is valuable! Those are some of the most interesting conversations that we have
It doesn’t have to be a success. Attempts that don’t work still provide insights
We’re going to focus on the core building blocks.
All the incredible things you see AI do decompose to just a few key ingredients.
Our goal is to build intuition through hands-on experience.

| Role | Description |
|---|---|
| `system_prompt` | Instructions from the developer (that's you!) to set the behavior of the assistant |
| `user` | Messages from the person interacting with the assistant |
| `assistant` | The AI model's responses to the user |
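To make the roles concrete, here is a hedged sketch of what the same turns look like one layer down, using the official openai Python package directly (the model name and message text are placeholders). {ellmer} and chatlas build this message list for you; their system_prompt argument becomes the message with the system role:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4.1",  # placeholder model name
    messages=[
        # system: instructions from the developer set the assistant's behavior
        {"role": "system", "content": "You are a helpful teaching assistant."},
        # user: the message from the person interacting with the assistant
        {"role": "user", "content": "Tell me a joke about R."},
    ],
)

# The model's reply comes back with the assistant role
print(response.choices[0].message.role)     # assistant
print(response.choices[0].message.content)
```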
{ellmer} and chatlas!

```r
library(ellmer)

chat <- chat_openai()
chat$chat("Tell me a joke about R.")
#> Why did the R programmer go broke?
#> Because he kept using `sample()` and lost all his data!
```

```python
import chatlas

chat = chatlas.ChatOpenAI()
chat.chat("Tell me a joke about Python.")
#> Why do Python programmers prefer using snakes as pets?
#> Because they don't mind the indentation!
```

❓ What are the user and assistant roles in this example?
{ellmer}:

```
<Chat OpenAI/gpt-4.1 turns=2 tokens=14/29 $0.00>
── user [14] ────────────────────────────────────────
Tell me a joke about R.
── assistant [29] ───────────────────────────────────
Why did the R programmer go broke?
Because he kept using `sample()` and lost all his data!
```

chatlas:

```
## 👤 User turn:
Tell me a joke about Python.

## 🤖 Assistant turn:
Why do Python programmers prefer using snakes as pets?
Because they don't mind the indentation!
```

❓ What about the system prompt?
```r
library(ellmer)

chat <- chat_openai(
  system_prompt = "You are a dad joke machine."
)
chat$chat("Tell me a joke about R.")
```

```
<Chat OpenAI/gpt-4.1 turns=3 tokens=25/28 $0.00>
── system [0] ───────────────────────────────────────
You are a dad joke machine.
── user [25] ────────────────────────────────────────
Tell me a joke about R.
── assistant [28] ───────────────────────────────────
Why did the letter R get invited to all the pirate parties?
Because it always knows how to *arr-r-ive* in style!
```
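The chatlas equivalent is a sketch along these lines, assuming the system_prompt argument (which mirrors ellmer's):

```python
import chatlas

# The system prompt sets the assistant's behavior before any user messages
chat = chatlas.ChatOpenAI(system_prompt="You are a dad joke machine.")
chat.chat("Tell me a joke about Python.")
```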
02_word-game Instructions

1. Set up a chat with a system prompt (one way to do this in chatlas is sketched after this list):
   You are playing a word guessing game. At each turn, guess the word and tell us what it is.
2. Ask: In British English, guess the word for the person who lives next door.
3. Ask: What helps a car move smoothly down the road?
4. Create a new, empty chat and ask the second question again.
5. How do the answers to 3 and 4 differ? Why?
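A minimal chatlas sketch of those steps (the system prompt and questions are copied from the list above):

```python
import chatlas

SYSTEM_PROMPT = (
    "You are playing a word guessing game. "
    "At each turn, guess the word and tell us what it is."
)

# Steps 1-3: one conversation, two questions
chat = chatlas.ChatOpenAI(system_prompt=SYSTEM_PROMPT)
chat.chat("In British English, guess the word for the person who lives next door.")
chat.chat("What helps a car move smoothly down the road?")

# Step 4: a brand-new chat has no memory of the earlier turns
fresh_chat = chatlas.ChatOpenAI(system_prompt=SYSTEM_PROMPT)
fresh_chat.chat("What helps a car move smoothly down the road?")
```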
clearbot demo: 👨💻 _demos/03_clearbot/app.py
System prompt:
First question:
Second question:

You write some words
ChatGPT continues writing words
You think you’re having a conversation
Chatting with a Generative Pre-trained Transformer
LLM → Large Language Model
If you read everything ever written…
Books and stories
Websites and articles
Poems and jokes
Questions and answers
…then you could predict which words come next.
Words are split into tokens: un|con|ventional

token-possibilities demo: 👨💻 _demos/04_token-possibilities/app.R
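If you want to poke at tokenization yourself, here is a hedged sketch using the tiktoken package (an assumption, not part of the exercises; the exact split depends on the tokenizer, so the pieces may not be literally un|con|ventional):

```python
import tiktoken

# Encoding used by recent OpenAI models; older models use other encodings
enc = tiktoken.get_encoding("o200k_base")

token_ids = enc.encode("unconventional")
pieces = [enc.decode([tid]) for tid in token_ids]

print(token_ids)  # a short list of integer token ids
print(pieces)     # the word broken into sub-word pieces
```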
{ellmer} and chatlas can do that, too!

|  | Console | Browser |
|---|---|---|
| {ellmer} | `live_console(chat)` | `live_browser(chat)` |
| chatlas | `chat.console()` | `chat.app()` |
05_live Instructions
Your job: write a groan-worthy roast of students at Cornell University
Bonus points for puns, rhymes, and one-liners
Don’t be mean
04:00
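One way to run this from the terminal is a minimal chatlas sketch like the following (the system prompt wording is just an illustration of the ground rules above):

```python
import chatlas

chat = chatlas.ChatOpenAI(
    system_prompt=(
        "You write groan-worthy, pun-heavy, good-natured roasts of Cornell students. "
        "Never be mean."
    )
)

# Opens an interactive chat session right in the terminal
chat.console()
```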


In R ({shinychat} + {ellmer}):
Start with the shinyapp snippet
Load {shinychat} and {ellmer}
Use the shinychat chat module
Create and hook up a chat client to use in the app


In Python (chatlas + Shiny):
Start with the shinyapp snippet
Remove the parts we don't need
Create a chatlas chat client
Add the Chat UI and server logic (client and chat aren't connected yet!)
When the user submits a message…
```python
import chatlas
from shiny import App, ui

app_ui = ui.page_fillable(
    ui.chat_ui("chat")
)

def server(input, output, session):
    client = chatlas.ChatOpenAI()
    chat = ui.Chat("chat")

    @chat.on_user_submit
    async def _(user_input: str):
        # Send input to LLM
        # Send response back to UI
        ...

app = App(app_ui, server)
```

we’ll send the input to the LLM…
```python
import chatlas
from shiny import App, ui

app_ui = ui.page_fillable(
    ui.chat_ui("chat")
)

def server(input, output, session):
    client = chatlas.ChatOpenAI()
    chat = ui.Chat("chat", client)

    @chat.on_user_submit
    async def _(user_input: str):
        response = await client.stream_async(user_input)
        # Send response back to UI

app = App(app_ui, server)
```

…and then stream the response back to the UI.
```python
import chatlas
from shiny import App, ui

app_ui = ui.page_fillable(
    ui.chat_ui("chat")
)

def server(input, output, session):
    client = chatlas.ChatOpenAI()
    chat = ui.Chat("chat", client)

    @chat.on_user_submit
    async def _(user_input: str):
        response = await client.stream_async(user_input)
        await chat.append_message_stream(response)

app = App(app_ui, server)
```

06_word-games Instructions
I’ve set up the basic Shiny app snippet and a system prompt.
Your job: create a chatbot that plays the word guessing game with you.
The twist: this time, you’re guessing the word.
07:00
Hint: interpolate a secret word into the system prompt, so each run produces something like:

R: [1] │ The secret word is elephant.
Python: 'The secret word is bicycle.'
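A minimal Python sketch of one way to build that prompt (the word list and wording are placeholders, not the exercise's actual values):

```python
import random

import chatlas

# Hypothetical word list; use whatever words you like
secret_words = ["bicycle", "elephant", "umbrella"]
secret_word = random.choice(secret_words)

system_prompt = (
    "You are playing a word guessing game with the user. "
    f"The secret word is {secret_word}. "
    "Give hints when asked, but never say the word itself."
)

client = chatlas.ChatOpenAI(system_prompt=system_prompt)
```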
{ellmer} and chatlas make it easy to converse with LLMs in R and Python.