emacs-openai / openai
Elisp library for the OpenAI API
License: GNU General Public License v3.0
Hi there,
When I try to use the chatgpt package, it says there is no openai. I later found that it depends on this repo, but it seems we cannot install this package with use-package?
Furthermore, is it possible to use the API key like this? Typically we don't want to expose our key, but the way this package hides the key seems a little bit difficult for me (sorry, I am not that familiar with Emacs)...
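One common approach (a sketch, not something this package mandates) is to keep the key out of your config entirely, storing it in `~/.authinfo.gpg` and loading it with the built-in auth-source library:

```elisp
;; ~/.authinfo.gpg line (assumed layout; adjust host/login to taste):
;;   machine api.openai.com login apikey password sk-...
(require 'auth-source)

;; Look up the first password stored for this host and use it as the key.
(setq openai-key
      (auth-source-pick-first-password :host "api.openai.com"))
```

That way the encrypted authinfo file holds the secret and nothing sensitive lands in your init file.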
Thanks!
Songpeng
I think it would be great if this could at some point be published on MELPA.
Also, on MELPA there is gptai, and it might make sense to develop the Emacs implementation jointly. I haven't looked into the details yet to compare the openai and gptai packages. I am considering writing a Spacemacs layer for one of the two.
I tried a few different ways but couldn't make it work. Maybe OpenAI has changed something about the endpoint? I keep getting invalid_request_error.
I have some suggestions for the generic API package. They are a bit drastic and depart from how this package is organized, but here they are.

- Drop the `defcustom`s. The purpose of a generic API module is to expose all the functionality through request interfaces and some generic submit function.
- `openai-key` should not be a `defcustom`. Many users commit their custom file, and they could accidentally commit their tokens.
- For inspiration, see https://github.com/emacs-lsp/lsp-mode/blob/master/lsp-protocol.el. Since the openai API is small, things can be hand-written rather than generated with complicated macros, unlike LSP, which has hundreds of interfaces.

After we discuss this, I'll be happy to start working on it and provide some pull requests.
OpenAI has recently released ChatGPT to beta; implementing https://platform.openai.com/docs/api-reference/chat is the next step.
#16 allows you to use bearer tokens with Azure OpenAI, but I would like to use a long-lived token that Azure provides. These tokens are not normal bearer tokens passed via the `Authorization` header; they are instead passed via an `api-key` header.
Allowing this to be configured (global config and/or per-request) would be helpful.
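As a sketch of what such a configuration could look like (the names here are hypothetical, not part of the package), the header style could be selected by a variable and consulted when building request headers:

```elisp
;; Hypothetical knob: 'bearer for api.openai.com, 'api-key for Azure tokens.
(defvar my/openai-auth-style 'bearer
  "How to pass the token: the symbol `bearer' or `api-key'.")

(defun my/openai-auth-header (key)
  "Return the auth header cons cell for KEY per `my/openai-auth-style'."
  (pcase my/openai-auth-style
    ('bearer  (cons "Authorization" (concat "Bearer " key)))
    ('api-key (cons "api-key" key))))
```

A per-request override could then be a keyword argument that shadows the variable for one call.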
If you use openai-completion-select-insert on a selected paragraph, then the response will be inserted after the following paragraph instead of directly after the paragraph.
This happens because openai-completion-select-insert calls (forward-paragraph) before insertion.
Of course it is equally awkward if after selecting only part of a paragraph and sending it to openai, the result appears in the middle of the paragraph, so just removing (forward-paragraph) would not be good.
I do not know an elegant solution to this issue. Perhaps it is possible to pass an optional argument to openai-completion-select-insert not to skip to the next paragraph? Similar issues may arise due to newline characters being added (in some use cases these might not be desirable). If one could pass optional arguments such as "prefix", "suffix" (which default to "\n") and "position" (which defaults to "(goto-char end) (forward-paragraph)"), then this would allow users to use for example advice-add to fine-tune the placement of the response.
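Until such optional arguments exist, one workaround along those lines is advice. This is an untested sketch that assumes the jump really does come from `forward-paragraph` inside the command, and simply suppresses it for the duration of the call:

```elisp
(require 'cl-lib)

;; Temporarily make `forward-paragraph' a no-op while the command runs,
;; so the response is inserted right where point is.
(defun my/no-paragraph-skip (orig-fun &rest args)
  (cl-letf (((symbol-function 'forward-paragraph) #'ignore))
    (apply orig-fun args)))

(advice-add 'openai-completion-select-insert :around #'my/no-paragraph-skip)
```

This is coarser than proper "prefix"/"suffix"/"position" arguments, but it shows how `advice-add` can fine-tune placement today.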
I tried looking around in the code and think that running `openai-retrieve-model` lets you select which model to use? I'm not totally sure, though. If that is correct, I can open a PR adding that info to the README.
According to this documentation, an organization can be provided in the case of multiple organizations.
Also, it seems that the user does not need to be specified, only the key.
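Per the OpenAI API reference, the organization is passed in an `OpenAI-Organization` header alongside the key. A header-building sketch (the variable and function names here are hypothetical, not the package's API) could look like:

```elisp
(defvar openai-organization nil
  "Optional OpenAI organization ID; nil omits the header.")

(defun my/openai-headers (key)
  "Build request headers from KEY and `openai-organization'."
  (append
   `(("Content-Type"  . "application/json")
     ("Authorization" . ,(concat "Bearer " key)))
   (when openai-organization
     `(("OpenAI-Organization" . ,openai-organization)))))
```

With the organization optional and the user dropped, only the key would remain mandatory.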
`openai-image-variation-prompt` does not currently work: it just passes the filename as a parameter instead of attaching the image itself. I tried to fix this with the code below:
(openai-request "https://api.openai.com/v1/images/variations"
  :type "POST"
  :headers `(("Content-Type" . "application/json")
             ("Authorization" . ,(concat "Bearer " openai-key)))
  :data (json-encode
         `(("n" . ,openai-image-n)
           ("size" . ,openai-image-size)
           ("response_format" . ,openai-image-response-format)
           ("user" . ,openai-user)))
  :files `(("image" . ,image))
  :parser 'json-read
  :success (cl-function
            (lambda (&key data &allow-other-keys)
              (funcall callback data))))
However, this returns an error from OpenAI:
Invalid Content-Type header (application/json), expected multipart/form-data. (HINT: If you're using curl, you can pass -H 'Content-Type: multipart/form-data')
So it looks like the endpoint doesn't accept form data mixed with a JSON-encoded body (and you can't JSON-encode images anyway). I tried switching to form encoding only, using this:
(openai-request "https://api.openai.com/v1/images/variations"
  :type "POST"
  :headers `(("Authorization" . ,(concat "Bearer " openai-key)))
  :data `(("n" . "1")
          ("size" . ,openai-image-size)
          ("response_format" . ,openai-image-response-format)
          ("user" . ,openai-user))
  :files `(("image" . ,image))
  :parser 'json-read
  :success (cl-function
            (lambda (&key data &allow-other-keys)
              (funcall callback data))))
But that got a different error from OpenAI:
400 - Additional properties are not allowed ('--form image' was unexpected)
I am not very familiar with HTTP so I am stuck on how to get this working. Any help would be appreciated.
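For reference, per the OpenAI API documentation this endpoint wants a pure multipart/form-data request: every field, not just the image, is sent as a form part, and there is no JSON body at all. In curl it looks like this:

```
curl https://api.openai.com/v1/images/variations \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -F image=@image.png \
  -F n=1 \
  -F size="1024x1024" \
  -F response_format="url"
```

So the likely direction for a fix is to drop both the `Content-Type: application/json` header and the `:data` alist, and pass all fields through the multipart mechanism instead; how exactly `request.el` expresses plain (non-file) form fields alongside `:files` is worth checking in its documentation.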
How to reproduce:
Expected behavior:
It would probably be best to see an informative error message if one uses a username that is disallowed.
For large responses, it would be nice to provide a streaming API. I'm not sure that the current request function API is ideal for this, though.
I'm fairly new to elisp, but perhaps either a lambda that was expected to be called multiple times with parts of the response or a stream type that is common amongst elisp packages?
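As a sketch of the lambda-per-chunk idea (hypothetical names throughout; a real implementation would also need to parse OpenAI's `data:` server-sent-event lines), a process filter on a curl subprocess delivers the response piecewise:

```elisp
(defun my/openai-stream (url payload on-chunk)
  "POST PAYLOAD to URL, calling ON-CHUNK with each piece of the response.
Assumes PAYLOAD already contains \"stream\": true."
  (make-process
   :name "openai-stream"
   :command (list "curl" "-sN" url
                  "-H" (concat "Authorization: Bearer " openai-key)
                  "-H" "Content-Type: application/json"
                  "-d" payload)
   ;; The filter fires every time curl emits output, i.e. per chunk.
   :filter (lambda (_proc chunk) (funcall on-chunk chunk))))
```

Whether this fits the current `openai-request` macro is another question, since `request.el`'s callbacks fire once per request rather than per chunk.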
As a follow-up on #14, when using the Azure OpenAI Service REST API, support the Azure Active Directory Authentication.
I want to use an OpenAI endpoint from Azure. I expect this can be implemented by adding a new variable `baseurl` that by default is "https://api.openai.com/v1". Then, use the `baseurl` in all the requests. Users may override the `baseurl` with another endpoint, like the one on Azure.
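A minimal sketch of that variable (using a hypothetical `openai-base-url` name; this is not the package's current API):

```elisp
(defcustom openai-base-url "https://api.openai.com/v1"
  "Base URL for API requests; override for Azure or other endpoints."
  :type 'string
  :group 'openai)

;; Every call site would then build its URL from it, e.g.:
;; (openai-request (concat openai-base-url "/completions") ...)
```

An Azure user would then only need to `setq` this to their resource's endpoint.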
To use the Azure-hosted OpenAI REST API, the `api-version` has to be provided in the query:

POST https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/completions?api-version=2022-12-01
At the moment, I can use this library by overriding the `openai-request` macro, like so:
(defmacro openai-request (url &rest body)
  "Wrapper for `request' function.
The URL is the url for `request' function; then BODY is the arguments for rest."
  (declare (indent 1))
  `(progn
     (setq openai-error nil)
     (request ,url
       :params '(("api-version" . "2023-03-15-preview"))
       :error (cl-function
               (lambda (&key response &allow-other-keys)
                 (setq openai-error response)
                 (openai--handle-error response)))
       ,@body)))
I would like it if the library supported this feature.
A more generic approach would be:
(defcustom openai-request-parameters '()
  "The parameters for the OpenAI request."
  :type 'list
  :group 'openai)
(Maybe name the variable `openai-user-defined-request-parameters` to emphasize that the library might add other parameters.)
(cl-defun openai-chat ( messages callback
                        &key
                        (base-url openai-base-url)
                        (params openai-request-parameters)
                        (content-type "application/json")
                        (key openai-key)
                        org-id
                        (model "gpt-3.5-turbo")
                        temperature
                        top-p
                        n
                        stream
                        stop
                        max-tokens
                        presence-penalty
                        frequency-penalty
                        logit-bias
                        (user openai-user))
  "Send chat request.

Arguments MESSAGES and CALLBACK are required for this type of request.
MESSAGES is the conversation data.  CALLBACK is the execution after the
request is made.

Arguments BASE-URL, PARAMS, CONTENT-TYPE, KEY, ORG-ID and USER are global
options; however, you can overwrite the value by passing it in.

The rest of the arguments are optional; please see the OpenAI API reference
page for more information.  Arguments here refer to MODEL, TEMPERATURE,
TOP-P, N, STREAM, STOP, MAX-TOKENS, PRESENCE-PENALTY, FREQUENCY-PENALTY,
and LOGIT-BIAS."
  (openai-request (concat base-url "/chat/completions")
    :type "POST"
    :params params  ; evaluated normally (outside the backquote), so no `,' or `@' prefix is needed
    :headers (openai--headers content-type key org-id)
    :data (openai--json-encode
           `(("model"             . ,model)
             ("messages"          . ,messages)
             ("temperature"       . ,temperature)
             ("top_p"             . ,top-p)
             ("n"                 . ,n)
             ("stream"            . ,stream)
             ("stop"              . ,stop)
             ("max_tokens"        . ,max-tokens)
             ("presence_penalty"  . ,presence-penalty)
             ("frequency_penalty" . ,frequency-penalty)
             ("logit_bias"        . ,logit-bias)
             ("user"              . ,user)))
    :parser 'json-read
    :complete (cl-function
               (lambda (&key data &allow-other-keys)
                 (funcall callback data)))))
(defmacro openai-request (url &rest body)
  "Wrapper for `request' function.
The URL is the url for `request' function; then BODY is the arguments for rest."
  (declare (indent 1))
  `(progn
     (setq openai-error nil)
     (request ,url
       :error (cl-function
               (lambda (&key response &allow-other-keys)
                 (setq openai-error response)
                 (openai--handle-error response)))
       ,@body)))