
openai's People

Contributors

chookity-pokk, dependabot[bot], gruns, jcs090218, jczuurmond, lilactown, rjtk


openai's Issues

Cannot install openai

Hi there,

when I tried to use chatgpt, it said there was no openai package. Later I found that chatgpt depends on this repo. But it seems we cannot install this package with use-package?

Furthermore, is it possible to use an API key like this? Typically we don't want to expose our key, but the way this package hides the key seems a little difficult for me (sorry, I am not that familiar with Emacs)...
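One common pattern, independent of this package, is to keep the key out of your config entirely and read it from Emacs's built-in auth-source library (e.g. from ~/.authinfo.gpg). A minimal sketch, assuming a matching entry exists in your auth-source file:

```elisp
;; ~/.authinfo.gpg line (illustrative):
;;   machine api.openai.com login apikey password sk-...
(require 'auth-source)

(setq openai-key
      (auth-source-pick-first-password :host "api.openai.com"))
```

This way the key never appears in your init file or custom file.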

Thanks!
Songpeng

MELPA inclusion / gptai

I think it would be great if this could at some point be published in MELPA.

Also, there is gptai on MELPA, and it might make sense to develop the Emacs implementation jointly. I haven't looked into the details yet to compare the openai and gptai packages. I am considering writing a Spacemacs layer for one of the two.

Suggestions

I have some suggestions for the generic API package. They are a bit drastic and depart from how this package is currently organized, but here they are.

  1. Remove the defcustoms. The purpose of a generic API module is to expose all the functionality through request interfaces and some generic submit function.
  2. Expose the responses as classes or structures, since their shapes are known. That will make working with results much simpler.
  3. openai-key should not be a defcustom. Many users commit their custom file, and they could accidentally commit their tokens.
  4. Merge all the files into one. They are too small to make much sense on their own.
  5. Expose all the constants as named constants.

For inspiration, see https://github.com/emacs-lsp/lsp-mode/blob/master/lsp-protocol.el

Since the OpenAI API is small, everything can be hand-written rather than generated with complicated macros, unlike LSP, which has hundreds of interfaces.

After we discuss this, I'll be happy to start working on it and provide some pull requests.
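Suggestion 2 could be sketched roughly as follows; the struct and field names are illustrative, based on the shape of a chat completion response, not an existing API:

```elisp
(require 'cl-lib)

;; One struct per known response shape (names are illustrative).
(cl-defstruct openai-chat-choice
  index message finish-reason)

(defun openai--parse-chat-choice (alist)
  "Build a struct from one entry of the response's `choices' array."
  (make-openai-chat-choice
   :index         (alist-get 'index alist)
   :message       (alist-get 'message alist)
   :finish-reason (alist-get 'finish_reason alist)))
```

Callers would then use accessors like `openai-chat-choice-message' instead of digging through nested alists.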

Add option for api-key header for Azure OpenAI deployment

#16 allows you to use bearer tokens with Azure OpenAI, but I would like to use a long-lived token that Azure provides. These tokens are not normal bearer tokens passed via the Authorization header; they are instead passed via an api-key header.

Allowing this to be configured (globally and/or per request) would be helpful.
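A minimal sketch of what this could look like; `openai-auth-header-style' is a hypothetical option, not something the package provides today:

```elisp
(defun openai--auth-header (key)
  "Return the auth header alist for KEY, depending on the deployment."
  (pcase openai-auth-header-style        ; hypothetical defcustom
    ('azure-api-key `(("api-key" . ,key)))
    (_              `(("Authorization" . ,(concat "Bearer " key))))))
```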

Awkward behavior due to openai-completion-select-insert calling (forward-paragraph)

If you use openai-completion-select-insert on a selected paragraph, the response is inserted after the following paragraph instead of directly after it.

This happens because openai-completion-select-insert calls (forward-paragraph) before insertion.
Of course, it would be equally awkward if, after selecting only part of a paragraph and sending it to openai, the result appeared in the middle of that paragraph, so simply removing (forward-paragraph) would not be good either.

I do not know an elegant solution to this issue. Perhaps openai-completion-select-insert could take an optional argument telling it not to skip to the next paragraph? Similar issues may arise from the newline characters that are added (in some use cases these might not be desirable). If one could pass optional arguments such as "prefix" and "suffix" (defaulting to "\n") and "position" (defaulting to "(goto-char end) (forward-paragraph)"), users could then use, for example, advice-add to fine-tune the placement of the response.
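As a sketch of that last idea, the placement could be factored into a function-valued option; the name and signature here are imagined, not part of the package:

```elisp
(defcustom openai-completion-insert-position-function
  (lambda (_start end) (goto-char end) (forward-paragraph))
  "Function called with the region bounds to position point before insertion.
Set or advise this to fine-tune where the response lands."
  :type 'function
  :group 'openai)
```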

A way to select which model?

I tried looking around in the code and think that running openai-retrieve-model lets you select which model to use? I'm not totally sure, though. If that is correct, I can open a PR adding that info to the README.
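For what it's worth, judging from the function signatures quoted elsewhere on this page (e.g. openai-chat's `(model "gpt-3.5-turbo")` keyword), the request functions take the model as a keyword argument, so something like the following may be what you want; treat it as a guess until confirmed:

```elisp
(openai-chat messages #'my-callback :model "gpt-4")
```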

Adding a file to the request does not work

openai-image-variation-prompt does not currently work: it just passes the filename as a parameter instead of attaching the image itself. I tried to fix this with the code below:

(openai-request "https://api.openai.com/v1/images/variations"
    :type "POST"
    :headers `(("Content-Type"  . "application/json")
               ("Authorization" . ,(concat "Bearer " openai-key)))
    :data (json-encode
           `(("n"               . ,openai-image-n)
             ("size"            . ,openai-image-size)
             ("response_format" . ,openai-image-response-format)
             ("user"            . ,openai-user)))
    :files `(("image" . ,image))
    :parser 'json-read
    :success (cl-function
              (lambda (&key data &allow-other-keys)
                (funcall callback data))))

However, this returns an error from OpenAI:

Invalid Content-Type header (application/json), expected multipart/form-data. (HINT: If you're using curl, you can pass -H 'Content-Type: multipart/form-data')

So it looks like the API doesn't accept form data mixed with JSON-encoded data (and you can't JSON-encode images). I tried switching to form encoding only:

(openai-request "https://api.openai.com/v1/images/variations"
  :type "POST"
  :headers `(("Authorization" . ,(concat "Bearer " openai-key)))
  :data `(("n"               . "1")
          ("size"            . ,openai-image-size)
          ("response_format" . ,openai-image-response-format)
          ("user"            . ,openai-user))
  :files `(("image" . ,image))
  :parser 'json-read
  :success (cl-function
            (lambda (&key data &allow-other-keys)
              (funcall callback data))))

But that produced a different error from OpenAI:

400 - Additional properties are not allowed ('--form image' was unexpected)

I am not very familiar with HTTP, so I am stuck on how to get this working. Any help would be appreciated.

Usernames with special characters generate a 400 error

How to reproduce:

  • Set your username to your email address, including an @.
  • Run chatgpt, dall-e, or anything interactive; you will get a 400 error.
  • Observe that the error does not make it clear that the username is the problem.

Expected behavior:
It would be best to see an informative error message when a disallowed username is used.
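A sketch of the kind of pre-flight check the package could run; the function name and the accepted character set are assumptions, not the API's documented rule:

```elisp
(defun openai--validate-user (user)
  "Signal a readable error if USER will likely be rejected by the API."
  (when (and user (string-match-p "[^A-Za-z0-9_-]" user))
    (user-error "`openai-user' (%s) contains characters the API may reject"
                user)))
```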

Streaming response support

For large responses, it would be nice to provide a streaming API. I'm not sure that the current request function API is ideal for this, though.

I'm fairly new to elisp, but perhaps either a lambda that is expected to be called multiple times with parts of the response, or a stream type that is common among elisp packages?
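One imagined shape for the lambda approach, sketched below; none of these keywords exist in the package today:

```elisp
(openai-chat messages #'on-done
             :stream t
             ;; Hypothetical: called once per server-sent-event chunk.
             :stream-callback (lambda (delta) (insert delta)))
```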

Add option to have user defined `params` for the OpenAI request

Why?

To use the Azure-hosted OpenAI REST API, the api-version has to be provided in the query string:

POST https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/completions?api-version=2022-12-01

At the moment, I can use this library by overriding the openai-request macro, like so:

  (defmacro openai-request (url &rest body)
    "Wrapper for `request' function.

  The URL is the url for `request' function; then BODY is the arguments for rest."
    (declare (indent 1))
    `(progn
       (setq openai-error nil)
       (request ,url
         :params '(("api-version" . "2023-03-15-preview"))
         :error (cl-function
                 (lambda (&key response &allow-other-keys)
                   (setq openai-error response)
                   (openai--handle-error response)))
         ,@body)))

I would like it if the library supported this feature.

What?

A more generic approach would be:

  1. Create a variable:
(defcustom openai-request-parameters '()
  "The parameters for the OpenAI request."
  :type 'list
  :group 'openai)

(Maybe name the variable openai-user-defined-request-parameters to emphasize that the library might add other parameters.)

  2. Then pass the parameters to all OpenAI requests, for example in the chat function:
(cl-defun openai-chat ( messages callback
                        &key
                        (base-url openai-base-url)
                        (params openai-request-parameters)
                        (content-type "application/json")
                        (key openai-key)
                        org-id
                        (model "gpt-3.5-turbo")
                        temperature
                        top-p
                        n
                        stream
                        stop
                        max-tokens
                        presence-penalty
                        frequency-penalty
                        logit-bias
                        (user openai-user))
  "Send chat request.

Arguments MESSAGES and CALLBACK are required for this type of request.  MESSAGES
is the conversation data.  CALLBACK is the execution after the request is made.

Arguments BASE-URL, PARAMS, CONTENT-TYPE, KEY, ORG-ID and USER are global
options; however, you can overwrite the value by passing it in.

The rest of the arguments are optional; please see the OpenAI API reference page
for more information.  Arguments here refer to MODEL, TEMPERATURE, TOP-P, N,
STREAM, STOP, MAX-TOKENS, PRESENCE-PENALTY, FREQUENCY-PENALTY, and LOGIT-BIAS."
  (openai-request (concat base-url "/chat/completions")
    :type "POST"
    :params params   ; no `,' or `@' needed: the macro splices BODY into `request' as-is
    :headers (openai--headers content-type key org-id)
    :data (openai--json-encode
           `(("model"             . ,model)
             ("messages"          . ,messages)
             ("temperature"       . ,temperature)
             ("top-p"             . ,top-p)
             ("n"                 . ,n)
             ("stream"            . ,stream)
             ("stop"              . ,stop)
             ("max_tokens"        . ,max-tokens)
             ("presence_penalty"  . ,presence-penalty)
             ("frequency_penalty" . ,frequency-penalty)
             ("logit_bias"        . ,logit-bias)
             ("user"              . ,user)))
    :parser 'json-read
    :complete (cl-function
               (lambda (&key data &allow-other-keys)
                 (funcall callback data)))))
  3. I think the macro passes the parameter automagically to the underlying request:
(defmacro openai-request (url &rest body)
  "Wrapper for `request' function.

The URL is the url for `request' function; then BODY is the arguments for rest."
  (declare (indent 1))
  `(progn
     (setq openai-error nil)
     (request ,url
       :error (cl-function
               (lambda (&key response &allow-other-keys)
                 (setq openai-error response)
                 (openai--handle-error response)))
       ,@body)))
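With such a defcustom in place, an Azure deployment could then be configured with a plain setq:

```elisp
(setq openai-request-parameters
      '(("api-version" . "2023-03-15-preview")))
```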
