
vim-chatgpt's People

Contributors

3ximus, armanschwarz, codercooke, duan-jm, fictorial, joaoantoniomaruti, konfekt, nakhan98, slonoed, whille


vim-chatgpt's Issues

Temperature Value

The default temperature value of 0.7 seems high.
It would be nice if this could be changed optionally (e.g. from the vimrc via let g:chat_gpt_temperature).
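A minimal vimrc sketch of what this could look like, assuming a g:chat_gpt_temperature option as proposed above (the option name and the value are illustrative):

  " lower temperature for more deterministic answers
  let g:chat_gpt_temperature = 0.2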

Want to chat in my mother tongue, but want code comments in English.

Thank you for the wonderful software.

By the way, I would like to converse with ChatGPT in Japanese (my mother tongue), but in the source code it produces I would like the comments to always be written in English.

I've considered adding the following:

diff --git a/chatgpt.vim b/chatgpt.vim
index 5cbc81b..b721474 100755
--- a/chatgpt.vim
+++ b/chatgpt.vim
@@ -45,6 +45,10 @@ if !exists("g:chat_gpt_lang")
   let g:chat_gpt_lang = ''
 endif
 
+if !exists("g:chat_gpt_comment_lang")
+  let g:chat_gpt_comment_lang = ''
+endif
+
 " Function to show ChatGPT responses in a new buffer
 function! DisplayChatGPTResponse(response, finish_reason, chat_gpt_session_id)
   call cursor('$', 1)
@@ -97,7 +101,9 @@ def chat_gpt(prompt):
   model = str(vim.eval('g:chat_gpt_model'))
   temperature = float(vim.eval('g:chat_gpt_temperature'))
   lang = str(vim.eval('g:chat_gpt_lang'))
+  comment_lang = str(vim.eval('g:chat_gpt_comment_lang'))
   resp = lang and f" And respond in {lang}." or ""
+  resp += comment_lang and f" But code comments must be in {comment_lang} always." or ""
   systemCtx = {"role": "system", "content": f"You are a helpful expert programmer we are working together to solve complex coding challenges, and I need your help. Please make sure to wrap all code blocks in ``` annotate the programming language you are using. {resp}"}
 
   try:

I've posted this as it might be useful to others.
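With the patch applied, the intended vimrc usage would look roughly like this (the value strings are just examples):

  let g:chat_gpt_lang = 'Japanese'
  let g:chat_gpt_comment_lang = 'English'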

Regards,
Atsushi

Custom prompts without the "Given the following code snippet" fragment.

It would be useful if there were a way to specify custom prompts without the hardcoded "Given the..." sentence appended at the end. For example, a prompt like this would work better without it:

let g:chat_gpt_custom_prompts = {'grammar': 'Suggest ways to improve the grammar of this sentence. Style: accessible yet academic.'}

Can't find openai module

I'm running Vim on macOS and installed openai with pip. If I open python3 and import openai, the module is imported correctly.
However, Vim fails when importing openai:

Error: openai module not found. Please install with Pip and ensure equality of the versions given by :!python3 -V, and :python3 import sys; print(sys.version)
Error detected while processing /Users/user/.vim/bundle/vim-chatgpt/plugin/chatgpt.vim:
line 20:
Traceback (most recent call last):
  File "<string>", line 6, in <module>
ModuleNotFoundError: No module named 'openai'
line 34:
Traceback (most recent call last):
  File "<string>", line 1, in <module>
NameError: name 'openai' is not defined. Did you mean: 'open'?

Is there any constraint on running the plugin from a conda environment?
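One generic way to narrow this down (not specific to this plugin) is to compare the interpreter embedded in Vim with the one pip installed openai into, and install openai against the former if they differ:

  " inside Vim: which Python the :python3 interface actually uses
  :python3 import sys; print(sys.executable)
  " then install openai against exactly that interpreter (the path below is a placeholder)
  :!<python-path-shown-above> -m pip install openai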

Issue with text rendering

Apologies for the bug report, but I couldn't figure this one out. What is up with the text rendering of the headings? "^A>>>User:^A"?

I've checked and tried a few things with my font settings and nothing seems to make a difference. Image attached.


Breaks using the leader key for button remaps or breaks being able to use netrw

I'm a tad new to Vim and have been using Neovim to get this working. I installed this through my plugin manager, and I have my leader key plus p+v set to open netrw. For some reason, when I attempt to do this, it says the netrw command no longer exists. I'm not sure what about this plugin causes it to break, but after I remove the plugin, netrw works fine again.
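A quick, generic way to check whether the mapping has been overridden, or whether netrw itself was prevented from loading (neither check is specific to vim-chatgpt), is:

  " show where the current definition of the mapping came from
  :verbose nmap <leader>pv
  " netrw is skipped entirely if something sets this guard variable
  :echo exists('g:loaded_netrwPlugin')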

Wrap too long lines

When displaying the prompt, the lines are too long. What would you say to adding text wrapping, something like

  setlocal buftype=nofile bufhidden=hide noswapfile nobuflisted wrap nolist linebreak breakat=\ 

in DisplayChatGPTResponse?

Error at vim start up

I got the following errors at Vim startup:

Error from .../.vim/pack/enabled/start/vim-chatgpt/plugin/chatgpt.vim :
line   21 :
Traceback (most recent call last):
  File "<string>", line 2, in <module>
vim.error: Vim:E121: Variable not defined : g:chatgpt_venv_path
line   26 :
Traceback (most recent call last):
  File "<string>", line 1, in <module>
NameError: name 'os' is not defined
line   21 :
Traceback (most recent call last):
  File "<string>", line 2, in <module>
vim.error: Vim:E121: Variable not defined : g:chatgpt_venv_path
line   26 :
Traceback (most recent call last):
  File "<string>", line 1, in <module>
NameError: name 'os' is not defined

I did export CHAT_GPT_KEY in my .zshrc.
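A possible workaround, assuming the startup code reads g:chatgpt_venv_path unconditionally, is to define the variable in the vimrc before the plugin is loaded; whether an empty value is acceptable to the plugin is an assumption here:

  " hypothetical: point at the virtualenv the plugin should use, or leave empty
  let g:chatgpt_venv_path = ''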

trouble using the plugin

Hi, I have followed the instructions to install the plugin on macOS. I have copied the chatgpt.vim file into

~/.vim/plugged/vim-chatgpt
MacBook-Air@Momo: cs plugin/
total 8
-rw-r--r-- 1 Momo 5101 Apr  7 14:03 chatgpt.vim

Also, I have edited my .vimrc file to include the path to the openai executable:

MacBook-Air@Momo: pip3 show openai
Name: openai
Version: 0.27.4
Summary: Python client library for the OpenAI API
Home-page: https://github.com/openai/openai-python
Author: OpenAI
Author-email: [email protected]
License: None
Location: /usr/local/lib/python3.9/site-packages
Requires: aiohttp, requests, tqdm
Required-by: 

but when I run nvim and ask something, I get the error

E492: Not an editor command: Ask 'how to split the screen in vim'

It seems I have not activated this plugin. However, I have in my .vimrc

filetype plugin indent on

so I don't know what I am missing.
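For what it's worth, a minimal sketch of installing via vim-plug instead of copying the file by hand (the repository path below is an assumption; adjust it to the actual upstream):

  call plug#begin('~/.vim/plugged')
  " hypothetical repository path for the plugin
  Plug 'CoderCookE/vim-chatgpt'
  call plug#end()

followed by :PlugInstall, so that the plugin is actually registered and sourced by vim-plug at startup.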

Persistent session is not valid markdown

Hi, I absolutely love this plugin and use it almost every day in my work.

I've noticed, though, that when using a persistent session the resulting text is not valid Markdown, due to the user and assistant syntax, which causes my syntax highlighting to break.

[screenshot: 2023-10-05 08:08:49]

Is there some reason for using these symbols? If both were prefixed with e.g. >, the output would be valid Markdown.

[screenshot: 2023-10-05 08:11:57]

Python 3 support is required for ChatGPT plugin

Does anyone know why I am getting this?

I have python installed:
(base) ➜ ~ python
Python 3.10.9 | packaged by conda-forge | (main, Feb 2 2023, 20:26:08) [Clang 14.0.6 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.

and python3
(base) ➜ ~ python3
Python 3.10.9 | packaged by conda-forge | (main, Feb 2 2023, 20:26:08) [Clang 14.0.6 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.

Both come from

(base) ➜ ~ which python3
/Users/tag/miniforge3/bin/python3

Thanks
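One generic thing worth checking here is whether the Vim binary itself was compiled with the Python 3 interface; the conda python3 on the shell PATH does not matter if Vim lacks +python3:

  :echo has('python3')

A result of 0 means Vim was built without Python 3 support, regardless of which python3 the shell finds.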

api key doesn't update with vimrc

Changing the key inside the vimrc with let g:openai_api_key='your-api-key-here' doesn't update the key.
The old key persists even if the plugin is re-installed.
The only way to update the key is to exit Vim and change the environment variable from the terminal.

Persistent session does not seem to be aware of previous conversation

When using a persistent session, I would expect it to be aware of the previous conversation, but that does not seem to be the case (see screenshot). It would be helpful if it could work like the browser version, where one can make follow-up prompts to a previous answer.

[screenshot: 2023-10-05 08:29:01]

Cost Effective Model Selection

[screenshot: 2023-07-02 10:49:41]

In general, I typically use a 4K context. However, there are cases where I need a longer 16K context. In such cases, I have two options:

  1. Opt for a model with a larger context limit (16K) without the need to change the model name every time. This ensures I can consistently send/receive longer outputs, albeit at a higher cost.
  2. Stick with the 4K context and switch to a different model when a longer context is needed. While this option is more cost-effective, it requires a bit of additional time to make the model change.

The system provides an error message indicating the context limit (e.g., "Error: This model's maximum context length is 4097 tokens. However, you requested X tokens (Y in the messages, Z in the completion). Please reduce the length of the messages or completion.").

Based on this message, or through an automated token count, the plugin could select the alternative model automatically.
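A rough sketch of what such automatic selection could look like on the Vim script side; everything here is illustrative (the 4-characters-per-token estimate, the threshold, and the hard-coded model names are assumptions, not existing plugin behaviour):

  " pick a larger-context model when the prompt looks too big for the default
  function! s:ChooseModel(prompt) abort
    " crude token estimate: roughly 4 characters per token
    let l:estimated_tokens = strlen(a:prompt) / 4
    if l:estimated_tokens > 3000
      return 'gpt-3.5-turbo-16k'
    endif
    return 'gpt-3.5-turbo'
  endfunction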

^A character in buffer window using neovim

When I run :Ask hello I see this in my buffer:

^A>>>User:^A


hello

^A>>>Assistant:^A
Hello! How can I assist you today?

Any idea what's causing these ^A control characters to appear and what their purpose is? I'm using Neovim (the plugin works great otherwise; I'm not sure if the intention was to support Neovim).
