Again, credits go to jackMort/ChatGPT.nvim for these awesome features:
- Interactive Q&A: Engage in interactive question-and-answer sessions with the powerful gpt model (OGPT) using an intuitive interface.
- Persona-based Conversations: Explore various perspectives and have conversations with different personas by selecting prompts from Awesome ChatGPT Prompts.
- Code Editing Assistance: Enhance your coding experience with an interactive editing window powered by the gpt model, offering instructions tailored for coding tasks.
- Code Completion: Enjoy the convenience of code completion similar to GitHub Copilot, leveraging the capabilities of the gpt model to suggest code snippets and completions based on context and programming patterns.
- Customizable Actions: Execute a range of actions utilizing the gpt model, such as grammar correction, translation, keyword generation, docstring creation, test addition, code optimization, summarization, bug fixing, code explanation, Roxygen editing, and code readability analysis. Additionally, you can define your own custom actions using a JSON file.
For a comprehensive understanding of the plugin's functionality, you can watch the plugin showcase video.
- Cleaned-up documentation
- The original functionality of ChatGPT.nvim, running against Ollama
- Custom settings per session
- Add/remove settings as Ollama request options
- "Settings" renamed to "Parameters"
- Additional windows for the Template and System prompts
- Query and select models from your Ollama server
- Support model creation on the fly
Change the model by opening the Parameters panel (default: Ctrl-o), Tab your way to the model field, then press Enter on it to change it. It lists all the models available on your Ollama server.
In the same panel, parameters can be added and deleted with the "a" and "d" keys, respectively.
- Additional actions can be added via the config options.
- Running `OGPTRun` shows a Telescope picker.
- For `type="chat"` and `strategy="display"`, "r" and "a" can be used to replace the highlighted text or append after the highlighted text, respectively. Otherwise, "Esc" or "Ctrl-c" exits the popup.
OGPT is a Neovim plugin that allows you to effortlessly use the Ollama API, generating natural language responses directly within the editor in response to your inquiries.
- Make sure you have `curl` installed.
- Have a local instance of Ollama running.
A custom Ollama API host can be set with the configuration option `api_host_cmd` or with an environment variable called `$OLLAMA_API_HOST`. This is useful if you run Ollama remotely.
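As a sketch of the remote-host setup, assuming `api_host_cmd` takes a shell command whose output is used as the host (both the address and the exact shape of the option are assumptions here, so check the default config):

```lua
require("ogpt").setup({
  -- Run a shell command and use its stdout as the Ollama API host.
  -- Here it simply echoes a fixed remote address (example address only).
  api_host_cmd = "echo -n http://192.168.1.20:11434",
})
```

Alternatively, export `$OLLAMA_API_HOST` in your shell before starting Neovim.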
```lua
-- Packer
use({
  "huynle/ogpt.nvim",
  config = function()
    require("ogpt").setup()
  end,
  requires = {
    "MunifTanjim/nui.nvim",
    "nvim-lua/plenary.nvim",
    "nvim-telescope/telescope.nvim"
  }
})
```
```lua
-- Lazy
{
  "huynle/ogpt.nvim",
  event = "VeryLazy",
  config = function()
    require("ogpt").setup()
  end,
  dependencies = {
    "MunifTanjim/nui.nvim",
    "nvim-lua/plenary.nvim",
    "nvim-telescope/telescope.nvim"
  }
}
```
OGPT.nvim comes with the following defaults; you can override them by passing a config table as the `setup` parameter.
The plugin exposes the following commands:
`OGPT` opens an interactive chat window using the mistral:7b model.
`OGPTActAs` opens a prompt selection from Awesome ChatGPT Prompts to be used with the mistral:7b model.
`OGPTRun edit_with_instructions` opens an interactive window to edit the selected text or the whole buffer using the deepseek-coder:6.7b model; you can change this in your config options. The Ollama response will be extracted for its code content, and if it doesn't contain any code block, it will fall back to the full response.
You can map it using the Lua API, e.g. using which-key.nvim
:
```lua
local ogpt = require("ogpt")
local wk = require("which-key")
wk.register({
  p = {
    name = "OGPT",
    e = {
      function()
        ogpt.edit_with_instructions()
      end,
      "Edit with instructions",
    },
  },
}, {
  prefix = "<leader>",
  mode = "v",
})
```
`OGPTRun [action]` runs specific actions; see the actions.json file for a detailed list. Available actions are:
- grammar_correction
- translate
- keywords
- docstring
- add_tests
- optimize_code
- summarize
- fix_bugs
- explain_code
- roxygen_edit
- code_readability_analysis (see demo)
All of the above actions use the mistral:7b model.
It is possible to define custom actions with a JSON file. See actions.json for an example. The paths of custom action files can be set in the config (see the `actions_paths` field in the config example above).
An example custom action may look like this (`#` marks comments):
```json
{
  "action_name": {
    "type": "chat", # or "completion" or "edit"
    "opts": {
      "template": "A template using possible variables: {{filetype}} (neovim filetype), {{input}} (the selected text) and {{argument}} (provided on the command line)",
      "strategy": "replace", # or "display" or "append" or "edit"
      "params": { # parameters according to the official Ollama API
        "model": "mistral:7b", # or any other model supported by `"type"` in the Ollama API
        "stop": [
          "```" # a string used to stop the model
        ]
      }
    },
    "args": {
      "argument": {
        "type": "string",
        "optional": "true",
        "default": "some value"
      }
    }
  }
}
```
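Once a custom actions file exists, it can be registered through the `actions_paths` field mentioned above; a minimal sketch, where the file path is only an example:

```lua
require("ogpt").setup({
  -- Point the plugin at one or more custom action files.
  actions_paths = { vim.fn.stdpath("config") .. "/ogpt-actions.json" },
})
```

The action can then be invoked with `:OGPTRun action_name`, with `{{argument}}` supplied on the command line as described in the template.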
The `edit` strategy shows the output side by side with the input, available for further editing requests. For now, the `edit` strategy is implemented for the `chat` type only.
The `display` strategy shows the output in a floating window.
`append` and `replace` modify the text directly in the buffer with "a" or "r".
When using OGPT, the following keybindings are available:
- `<C-Enter>` [Both] Submit.
- `<C-y>` [Both] Copy/yank last answer.
- `<C-o>` [Both] Toggle settings window.
- `<Tab>` [Both] Cycle over windows.
- `<C-f>` [Chat] Cycle over modes (center, stick to right).
- `<C-c>` [Both] Close chat window.
- `<C-u>` [Chat] Scroll up chat window.
- `<C-d>` [Chat] Scroll down chat window.
- `<C-k>` [Chat] Copy/yank code from last answer.
- `<C-n>` [Chat] Start new session.
- `<C-d>` [Chat] Draft message (create message without submitting it to server).
- `<C-r>` [Chat] Switch role (switch between user and assistant role to define a workflow).
- `<C-s>` [Both] Toggle system message window.
- `<C-i>` [Edit Window] Use response as input.
- `<C-d>` [Edit Window] View the diff between left and right panes and use diff-mode commands.
When the settings window is opened (with `<C-o>`), settings can be modified by pressing Enter on the related config. Settings are saved across sessions.
Add these to your which-key plugin mappings for convenient keybindings:
```lua
c = {
  name = "OGPT",
  e = { "<cmd>OGPTRun edit_with_instructions<CR>", "Edit with instruction", mode = { "n", "v" } },
  c = { "<cmd>OGPTRun edit_code_with_instructions<CR>", "Edit code with instruction", mode = { "n", "v" } },
  g = { "<cmd>OGPTRun grammar_correction<CR>", "Grammar Correction", mode = { "n", "v" } },
  t = { "<cmd>OGPTRun translate<CR>", "Translate", mode = { "n", "v" } },
  k = { "<cmd>OGPTRun keywords<CR>", "Keywords", mode = { "n", "v" } },
  d = { "<cmd>OGPTRun docstring<CR>", "Docstring", mode = { "n", "v" } },
  a = { "<cmd>OGPTRun add_tests<CR>", "Add Tests", mode = { "n", "v" } },
  o = { "<cmd>OGPTRun optimize_code<CR>", "Optimize Code", mode = { "n", "v" } },
  s = { "<cmd>OGPTRun summarize<CR>", "Summarize", mode = { "n", "v" } },
  f = { "<cmd>OGPTRun fix_bugs<CR>", "Fix Bugs", mode = { "n", "v" } },
  x = { "<cmd>OGPTRun explain_code<CR>", "Explain Code", mode = { "n", "v" } },
  r = { "<cmd>OGPTRun roxygen_edit<CR>", "Roxygen Edit", mode = { "n", "v" } },
  l = { "<cmd>OGPTRun code_readability_analysis<CR>", "Code Readability Analysis", mode = { "n", "v" } },
},
```
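The table above is only a fragment, so it needs to be passed to which-key; a sketch of the registration, assuming a `<leader>` prefix (the prefix choice is an assumption):

```lua
local wk = require("which-key")
wk.register({
  c = {
    name = "OGPT",
    g = { "<cmd>OGPTRun grammar_correction<CR>", "Grammar Correction", mode = { "n", "v" } },
    -- ...the remaining entries from the table above...
  },
}, { prefix = "<leader>" })
```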
First of all, thank you to the author of jackMort/ChatGPT.nvim for creating a seamless framework to interact with LLMs in Neovim!
THIS IS A FORK of the original ChatGPT.nvim that supports Ollama (https://ollama.ai/), which allows you to run LLMs completely locally.
Ollama is still in its infancy, so there are numerous open pull requests to expand its capabilities. One of them is to conform to the OpenAI API - ollama/ollama#991. Because of this, this repo is a hacked-together solution for me to test the local LLMs I have running.
THIS PLUGIN MAY NOT LAST VERY LONG, depending on the state of Ollama.