Chat with language models in Emacs.
- Everything is text! Kill and yank all over the chat buffer.
- Supports OpenAI, Claude, Replicate, Groq, and Ollama.
- Include context from other open buffers, files, and websites.
- Save and load chat sessions locally.
- Streaming text output.
To use the cloud APIs, the corresponding API keys must be set in the environment.
Ollama runs locally and doesn't need an API key.
If the variables aren't loaded automatically in Emacs, you can set them with
(setenv <provider> <api-key>)
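For example, assuming the conventional OpenAI variable name (check your provider's documentation for the exact name your setup expects), a minimal sketch:

```elisp
;; Hypothetical example: both the variable name and the key value
;; are placeholders -- substitute the ones your provider uses.
(setenv "OPENAI_API_KEY" "sk-...")
```

Evaluate the form (e.g. with C-x C-e) before starting a chat session.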
M-x aichat
starts a new chat session.
A chat session looks like this:
## SYSTEM:
You are a helpful assistant.
## USER:
Write a haiku about Uddevalla.
## ASSISTANT:
Here is a haiku about Uddevalla:
Harbor town by sea,
Steep hills and ancient churches,
Uddevalla's charm.
Hit M-<return>
to send the chat session to the AI model.
By using the magic <ai-context>
tag, you can include any file on the file system, or any website, in the prompt.
## USER:
<ai-context>https://bohuslansspelmansforbund.se/aktuellt/evenemang/</ai-context>
What's the next folk music event in Bohuslän?
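A file on disk can be included the same way; the path below is just a hypothetical example:

## USER:
<ai-context>~/notes/todo.org</ai-context>
Summarize my todo list.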
Keybindings:
- M-<return>: Send the chat buffer to the AI and get an assistant response
- C-; u: Insert the string "USER:"
- C-; s: Insert the string "SYSTEM:"
- C-; a: Insert the string "ASSISTANT:"
- C-; f: Insert an <ai-context> tag with the name of a file on disk
- C-; b: Insert an <ai-context> tag with the filename of an open buffer
- C-; h: Insert an <ai-context> tag with a website URL
- C-; m: Change AI model
- C-; c: Copy the current code block (delimited by triple backticks)
- C-g: Interrupt generation while text is being generated