icarecti / chatgpt_macro_for_texstudio

License: MIT

chatgpt_macro_for_texstudio's Introduction

🤖 ChatGPT for TeXstudio

Enhance your TeXstudio experience with the power of AI! These macros leverage OpenAI's technology to provide intelligent suggestions and improvements to your LaTeX documents. Watch this video to see it in action:

ChatGPT_Macro_TeXstudio.mp4

🧠 How does it work

The ChatGPT Macro for TeXstudio is a user-friendly integration that connects TeXstudio with OpenAI's API. The first macro, ChatGPT, sends selected text from your document to a Python script, which interacts with the API and processes the response. The response text is inserted directly into your editor, creating an intuitive and interactive ChatGPT experience. The second macro, ChatGPT-PromptLibrary, offers a collection of predefined prompts accessible through a dropdown menu, allowing you to easily apply them to any selected text.
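For orientation, here is a minimal sketch of what such a bridge script does. It is not the repository's actual openai_python_script.py; it assumes the pre-1.0 openai Python package (which provides openai.ChatCompletion), and the API key and system message are placeholders.

import sys
import openai

openai.api_key = "sk-..."  # placeholder: your own API key

# The macro passes the selected text as command-line arguments.
prompt = " ".join(sys.argv[1:])

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant and an expert LaTeX editor."},
        {"role": "user", "content": prompt},
    ],
)

# Whatever is printed to stdout is inserted into the document by the macro.
print(response["choices"][0]["message"]["content"], end="", flush=True)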

🚀 Getting Started

Note: this macro was developed and tested on Ubuntu 22.04, but it should also run on Windows and macOS.

Follow these simple steps to set up the ChatGPT Macro for TeXstudio:

Prerequisites

1. Install the latest version of TeXstudio

Make sure you're using TeXstudio version 4.5.2rc1 or higher. To check your version, go to "Help" -> "About TeXstudio."

If you need to update, download the latest version from the TeXstudio release page.

For Linux users, download the *.AppImage file, make it executable (chmod +x filename), and run it.

2. Install Python and the OpenAI Python library

Install Python from the official website.

Install the openai Python library. Open a terminal and run pip install openai.

3. Obtain an OpenAI API key

Create an account at openai.com and get your API key from the OpenAI API Keys page. It will only be shown once, so save it somewhere for the next step.

Step 1: Set up the Python script

  • Download the openai_python_script.
    • by clicking on raw -> Save as... (Ctrl + S)
  • Make it executable.
  • Open the script and insert your OpenAI API key.
  • Note the absolute file path of the script.

Step 2: Import the macro into TeXstudio

  • Download both macros ChatGPT.txsMacro and ChatGPT-PromptLibrary.txsMacro.
    • by clicking on raw -> Save as... (Ctrl + S)
    • save both files as ChatGPT.txsMacro and ChatGPT-PromptLibrary.txsMacro (don't add a file ending like .txt)
  • Import them into TeXstudio.
    • Macros -> Edit Macros... -> Import
  • Edit both macros:
    • Macros -> Edit Macros...
    • Update the script_path variable with the absolute file path of the Python script you noted in Step 1.
    • Verify that the Python path is correct (type which python3 in the terminal and paste the result into the macro). The sanity check below lets you test both paths before running the macro.
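Sanity check (a sketch, not part of the project; both paths are placeholders for your own): running the script once by hand, the same way the macro invokes it, quickly reveals a wrong path or a missing API key.

import subprocess

python_path = "/usr/bin/python3"                   # output of `which python3`
script_path = "/home/you/openai_python_script.py"  # absolute path from Step 1

# Invoke the script exactly as the macro does: interpreter, script, selected text.
result = subprocess.run(
    [python_path, script_path, "Reply with the single word OK."],
    capture_output=True, text=True,
)
print(result.stdout or result.stderr)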

Step 3: Enjoy the ChatGPT Macro

Now you're all set! Highlight any text in your document and run the macros using the shortcuts Shift+F1 and Shift+F2, or by clicking on them in the Macros menu. The first time you execute the macros, TeXstudio asks "Do you trust this script?"; if you answer "Yes, allow all calls it will ever make", this message will not be shown again.

βš™οΈ Advanced

Stop a Running Script

If you have executed a macro and want to stop it (because the response is too long or not what you expected), just click on Macros -> Stop ChatGPT or Stop ChatGPT PromptLib. These two menu entries are generated dynamically when the macros are executed and are not visible if the macros have never been run.

Screenshot of the menu:

Change the parameters in the Python script

Within the Python script, you can modify various parameters to fine-tune the generated response (the sketch after this list shows where they appear in a typical API call):

  • system message: The system message determines the behavior of the assistant. By default, ChatGPT uses "You are a helpful assistant."; for this macro, it has been changed to "You are a helpful assistant and an expert LaTeX editor. You only return valid LaTeX. Everything you return is directly inserted into a LaTeX document and interpreted as LaTeX code."
  • model: The model is set to gpt-3.5-turbo. If you have access to GPT-4, you can switch it to gpt-4.
  • max_tokens: This parameter sets the maximum length of the response. The total token limit for a single request with gpt-3.5-turbo is 4,000 (approximately 6 pages of text), including the input. If your input consists of 3,000 tokens, the response can only be 1,000 tokens long. By default, max_tokens is set to 3,000, meaning your input can be at most about 1,000 tokens (roughly 1.5 pages of text). If you use gpt-4, then I recommend max_tokens=5000.
  • temperature: see the official documentation.
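For reference, this is roughly where those four parameters sit in a Chat Completions call (a hedged sketch assuming the pre-1.0 openai package; the temperature value is only an example):

import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",   # switch to "gpt-4" if you have access
    max_tokens=3000,         # upper bound on the length of the response
    temperature=0.7,         # example value; higher = more varied output
    messages=[
        {"role": "system", "content": "You are a helpful assistant and an expert LaTeX editor."},
        {"role": "user", "content": "Selected text from the editor goes here."},
    ],
)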

πŸ“ Roadmap

  • add a prompt library
  • add the functionality to abort a running call
  • make max_tokens dynamic, depending on the length of the input (a possible approach is sketched after this list)
  • improve prompts in the prompt library
  • use any selected text as input (even special characters)
  • include feedback about tokens used / money spent
  • parse errors and the finish reason
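One possible approach to the dynamic max_tokens item, sketched here only as an illustration (the 4-characters-per-token rule of thumb and the 4,000-token budget are assumptions, not the project's code):

def dynamic_max_tokens(prompt, total_budget=4000, reserve=200):
    # Very rough heuristic: about 4 characters per token for English text.
    estimated_input_tokens = len(prompt) // 4
    # Leave the remaining budget (minus a safety margin) for the response.
    return max(256, total_budget - estimated_input_tokens - reserve)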

💪 Contribute

Do you have ideas on how to improve the macros, or tips on how to make them run on other systems? Don't hesitate to:

  • create an issue
  • open a pull request

📚 FAQ

❓ The response of ChatGPT has no spaces. Why is that?

A: If your TeXstudio version is older than 4.5.2rc1, spaces are removed by TeXstudio while it reads the response. This issue was resolved in version 4.5.2rc1.

❓ Why is it so slow in generating text?

A: When you create an account at OpenAI, you receive a free credit that expires after a few months. Using only this free credit results in slower response times. Adding a payment option to your OpenAI account significantly improves the response time, as demonstrated in the introduction video above.

❓ How can I add my own prompt to the prompt-library?

A: Adding your own prompt is a breeze! Just follow these simple steps:

  1. Navigate to Macros > Edit Macros...
  2. Click on ChatGPT-PromptLibrary.
  3. Add a line in the following format:

{ promptOption: "text that will be displayed in the dropdown", basePrompt: "command that will be sent to ChatGPT" }
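For example, a hypothetical entry for a summarising prompt could look like this (both strings are placeholders you would adapt):

{ promptOption: "Summarize selection", basePrompt: "Summarize the following LaTeX text, keeping all commands and environments intact:" }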

❓ How expensive is it to use this macro?

A: The macro itself is completely free! However, OpenAI does charge a small fee for each request made to their API. The costs are quite minimal, so you can easily generate a large amount of content without breaking the bank.

To give you an idea, the current pricing for the gpt-3.5-turbo model is $0.002 per 1,000 tokens. You can check the most up-to-date pricing information on the OpenAI Pricing page.

💡 Examples of Costs

  • Generate the entire Harry Potter book series (7 books, 2,200 pages) for just $3.
  • Create 100 pages of text (including input) for a mere 10 cents.
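As a quick sanity check on the second figure: with the rule of thumb above that about 4,000 tokens correspond to roughly 6 pages, 100 pages amount to somewhere around 65,000–70,000 tokens, which at $0.002 per 1,000 tokens works out to a bit over 10 cents.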

chatgpt_macro_for_texstudio's People

Contributors: icarecti, rebexter

Forkers: rebexter, sibovg

chatgpt_macro_for_texstudio's Issues

Problem with unicode character

Hi there,

thanks for providing this nice macro for TeXstudio. I use it in combination with LuaLaTeX, where Unicode characters are allowed. This sometimes leads to ChatGPT answers containing Unicode characters that cannot be printed. The error message looks like this:
Traceback (most recent call last):
  File "D:\Programme\TeXstudio Settings\openai_python_script.py", line 39, in <module>
    print(content, end='', flush=True)
  File "C:\Program Files\Python39\lib\encodings\cp1252.py", line 19, in encode
    return codecs.charmap_encode(input,self.errors,encoding_table)[0]
UnicodeEncodeError: 'charmap' codec can't encode character '\u0394' in position 0: character maps to <undefined>

Is there anything you can do about it? Unfortunately, I did not figure out a way to resolve this.
Cheers
Niklas
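
A common workaround for this class of Windows console-encoding errors, offered here only as a general suggestion and not as something the project ships, is to force UTF-8 output in the script before anything is printed:

import sys

# Force UTF-8 on stdout so characters such as '\u0394' survive on Windows
# consoles that default to cp1252 (reconfigure() is available since Python 3.7).
sys.stdout.reconfigure(encoding="utf-8", errors="replace")

Setting the environment variable PYTHONUTF8=1 before launching TeXstudio has a similar effect.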

Not an issue: How to make it work with Windows

Just follow these instructions: https://github.com/icarecti/chatgpt_macro_for_texstudio

But then change the imported ChatGPT macro, by replacing line 31 with this line (TeXstudio: Macros -> Edit)
cmd = system(command + "' > /dev/null");
This line was cmd = system("bash -c 'exec " + command + "' > /dev/null"); and there's no bash on my Windows machine.

Note: all paths to Python and the script use forward slashes, like so:

// #########################################################################
// Set the path to Python3 and the Python script
var python_path = "C:/Users/User/AppData/Local/Programs/Python/Python311/python.exe"; // *** edit this line **
var script_path = "h:/dev/openAI/openai_python_script.py"; // *** edit this line **
// #########################################################################

sys.argv[1] gets split into single words

Hi there,

thanks for the nice interface to ChatGPT in TeXstudio. I had the problem that the message got split up into multiple sys.argv elements with one word each. I think it occurred when I was using dots or question marks in the message. This resulted in a wrong prompt, as only the first element/first word was used.

I am not very experienced in Python, but with promt = " ".join(sys.argv[1:]) in openai_python_script.py, the problem was solved. I don't know if this may create other issues, but so far there were none.

Cheers
Niklas
