henomis / lingoose
🪿 LinGoose is a Go framework for building awesome AI/LLM applications.
Home Page: https://lingoose.io
License: MIT License
Hi! Is it possible to use LinGoose with LocalAI, which has an OpenAI-compatible API?
If so, could you provide a simple example? As of last week, LocalAI has API-key support.
PR: #23
Tasks
New(input interface{}, outputDecoderFn DecoderFn, template string)
import (
	"bytes"
	texttemplate "text/template"
)

type Decoder interface {
	Decode(interface{}) error
}

type OutputHandler func(string) Decoder

type Template struct {
	Input          interface{}
	Output         interface{}
	OutputHandler  OutputHandler
	Template       string
	templateEngine *texttemplate.Template
}

func New(
	input interface{},
	output interface{},
	outputHandler OutputHandler,
	template string,
) (*Template, error) {
	// validate input struct using a go struct validator
	// validate and parse the template
	templateEngine, err := texttemplate.New("prompt").Parse(template)
	if err != nil {
		return nil, err
	}

	return &Template{
		Input:          input,
		Output:         output,
		OutputHandler:  outputHandler,
		Template:       template,
		templateEngine: templateEngine,
	}, nil
}

func (p *Template) Format() (string, error) {
	var output bytes.Buffer
	if err := p.templateEngine.Execute(&output, p.Input); err != nil {
		return "", err
	}
	return output.String(), nil
}
type Llm struct{}

func (l *Llm) Completion(promptTemplate *Template) (interface{}, error) {
	// render the prompt
	prompt, err := promptTemplate.Format()
	if err != nil {
		return nil, err
	}
	_ = prompt // call llm(prompt) -> output

	var output string           // llm output
	var llmResponse interface{} // llm response

	// decode the output into the template's Output value
	if err := promptTemplate.OutputHandler(output).Decode(promptTemplate.Output); err != nil {
		return nil, err
	}
	return llmResponse, nil
}
func (l *Llm) Chat(c *chat.Chat) interface{} {
	// build the chat prompt
	messages := c.ToMessages()
	_ = messages // call llm(messages) -> output
	// add the response message to the chat messages?
	return nil
}
type Pipeline struct{}

func (p *Pipeline) Run(llm *Llm, prompt *Template) (interface{}, error) {
	if _, err := llm.Completion(prompt); err != nil {
		return nil, err
	}
	return prompt.Output, nil
}
Hi henomis,
Thanks for making this cool project.
I am hesitating over which framework to use: a Go version or a Python one.
Is the only reason to choose a Go framework that it's more friendly to Gophers?
Are there any other strong reasons for choosing a Go framework?
Any insight would be much appreciated!
map[string]Loader
func (s *Index) IsEmpty() (bool, error) {
...
err := s.load()
...
}
func (s *Index) SimilaritySearch(ctx context.Context, query string, opts ...option.Option) (index.SearchResponses, error) {
...
err := s.load()
...
}
The invocation of the load() function is very subtle and easily leads to repeated calls, as in the example above.
We'd better make sure it's loaded only once.
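One way to guarantee that, sketched below with sync.Once; the field and method names are illustrative, and the actual fix is in the PR below (requires "sync" in the imports):

type Index struct {
	loadOnce sync.Once // guards the one-time load
	loadErr  error
}

// load reads the stored vectors; the goal is to call it at most once.
func (s *Index) load() error {
	// ...
	return nil
}

// ensureLoaded triggers load() a single time, even when IsEmpty and
// SimilaritySearch both call it, possibly concurrently.
func (s *Index) ensureLoaded() error {
	s.loadOnce.Do(func() {
		s.loadErr = s.load()
	})
	return s.loadErr
}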
PR: #19
Hello, does it support function-calling mode?
Hi! Thank you for this library, very excited to give it a try. When trying to build your quickstart, I run into these issues on go build (Go 1.21).
./main.go:18:8: undefined: index.NewSimpleVectorIndex
./main.go:19:27: undefined: index.NewSimpleVectorIndex
./main.go:19:127: undefined: index.WithTopK
All other packages seem to be correctly installed. Only "index" is preventing the build. Any idea what could be wrong here?
Hello, where to add the openai API key in the readme example?
Add support for the Hugging Face Inference API
llm/huggingface/huggingface.go
hi @henomis To be honest, I really like this project. In the world of Golang it is indeed a very impressive project, and it has helped us solve many problems, such as memory and vector queries.
I have encountered a problem.
Is your feature request related to a problem? Please describe.
The Chat(ctx context.Context, prompt *chat.Chat)
API doesn't support image input, even though I extended the Prompt
interface; I found it didn't work.
By analyzing the code, I discovered that the issue lies in the implementation of the Chat API (https://github.com/henomis/lingoose/blob/main/llm/openai/openai.go#L301). In the function buildMessages(), the user's message is uniformly formatted as a string (https://github.com/henomis/lingoose/blob/main/llm/openai/openai.go#L446). If the user's data is a JSON array, it is also formatted as a string. As a result, when GPT responds, it produces gibberish instead of the intended output.
Here is a specific demonstration of the issue:
The correct json:
{
  "model": "gpt-4-vision-preview",
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "What’s in this image?"
        },
        {
          "type": "image_url",
          "image_url": {
            "url": "https://.../some.jpg"
          }
        }
      ]
    }
  ]
}
The json generated by lingoose
{
  "model": "gpt-4-vision-preview",
  "messages": [
    {
      "role": "user",
      "content": "[{\"type\": \"text\",\"text\": \"What’s in this image?\" },{ \"type\": \"image_url\", \"image_url\": {\"url\": \"https://.../some.jpg\"}} ]"
    }
  ]
}
Describe the solution you'd like
The corresponding data structures are already supported in the foundational library(https://github.com/sashabaranov/go-openai/blob/master/chat.go#L85), so an upgrade is required to support the corresponding data fields.
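For reference, a minimal sketch of building such a message with go-openai's multi-part content types (assuming a version of the library recent enough to include MultiContent):

package main

import openai "github.com/sashabaranov/go-openai"

// visionMessage builds a user message whose content is a list of typed
// parts (text + image URL) rather than one JSON-encoded string.
func visionMessage() openai.ChatCompletionMessage {
	return openai.ChatCompletionMessage{
		Role: openai.ChatMessageRoleUser,
		MultiContent: []openai.ChatMessagePart{
			{Type: openai.ChatMessagePartTypeText, Text: "What's in this image?"},
			{
				Type: openai.ChatMessagePartTypeImageURL,
				ImageURL: &openai.ChatMessageImageURL{
					URL: "https://.../some.jpg",
				},
			},
		},
	}
}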
PR: #15
Describe the bug
When using qapipeline.WithPrompt, it returns a new QAPipeline instead of modifying the current one; later, calling Run or Query panics with a nil pointer dereference because the LLMEngine is nil.
To Reproduce
res, err := qapipeline.New(openaiClient).
	WithPrompt(chatConv).
	WithIndex(a.index).
	Query(context.Background(), query, option.WithTopK(1))
Expected behavior
The WithPrompt method should modify the current instance and return that one.
Lingoose version: 0.0.12
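A minimal sketch of that expected behavior, with hypothetical field names: each With... method mutates the receiver and returns it, so the LLM engine set by New survives the chain:

// QAPipeline mirrors the idea only; fields are illustrative.
type QAPipeline struct {
	llmEngine interface{} // set by New; must survive WithPrompt
	prompt    interface{}
	index     interface{}
}

func (q *QAPipeline) WithPrompt(p interface{}) *QAPipeline {
	q.prompt = p
	return q // same instance, not a copy
}

func (q *QAPipeline) WithIndex(i interface{}) *QAPipeline {
	q.index = i
	return q
}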
LinGoose needs better technical documentation.
The current documentation can be found at https://lingoose.io/docs/
There are a lot of examples in the examples/ directory that can be used as a code base for writing the documentation.
Long term plan
pipeline.Llm.LlmMode
use i, ok := llm.(interface{ Completion() })
to determine the LLM type. Remove "llm" from the struct fields.

Currently, in the doRequest function of the HuggingFace LLM, HTTP requests are made using http.DefaultClient. While this works for most scenarios, there is a need for more flexibility when it comes to customizing the behavior of the HTTP client.
I would like to request an enhancement that allows users to specify their own HTTP client when making requests through the library. This feature would provide users with the ability to configure custom settings for the HTTP client, such as timeouts, custom transport options, or any other client-specific configurations.
One possible implementation approach could involve modifying the doRequest function to accept an http.Client as an argument. This change would allow users to pass their own pre-configured HTTP client when making requests, as follows:
func (h *HuggingFace) doRequest(ctx context.Context, jsonBody []byte, model string, httpClient *http.Client) ([]byte, error) {
// Use the provided httpClient for making the request.
// ...
}
By making this modification, users would have the flexibility to tailor the HTTP client to their specific requirements.
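Alternatively, a functional option on the constructor would avoid threading the client through every doRequest call; a sketch with assumed names (the real HuggingFace struct differs):

import "net/http"

// HuggingFace mirrors the idea only; the real struct differs.
type HuggingFace struct {
	httpClient *http.Client
}

type Option func(*HuggingFace)

// WithHTTPClient lets callers supply a pre-configured *http.Client
// (custom timeouts, transport, proxy, ...). Defaults to http.DefaultClient.
func WithHTTPClient(c *http.Client) Option {
	return func(h *HuggingFace) { h.httpClient = c }
}

func New(opts ...Option) *HuggingFace {
	h := &HuggingFace{httpClient: http.DefaultClient}
	for _, o := range opts {
		o(h)
	}
	return h
}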
I understand that OpenAI does not allow shared API keys, but I have been unable to find in any example where to put an API key. For example, the quickstart in the docs gives no instructions on where to put my key, despite saying that this is literally it. I also haven't been able to locate anything at all about OpenAI API keys in the documentation.
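For what it's worth, the examples appear to rely on an environment variable rather than an in-code parameter; a minimal check, assuming the OpenAI client falls back to OPENAI_API_KEY the way go-openai's default configuration does:

package main

import (
	"fmt"
	"os"
)

func main() {
	// Assumption: the client reads OPENAI_API_KEY from the environment,
	// as go-openai's default configuration does. In a shell:
	//   export OPENAI_API_KEY=sk-...
	if os.Getenv("OPENAI_API_KEY") == "" {
		fmt.Fprintln(os.Stderr, "OPENAI_API_KEY is not set")
		os.Exit(1)
	}
}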
Tasks
- a := pkg.New().WithValue().WithSome() #49
- t := NewTube(llm, decoder).WithMemory(name, memory); in general, extend with With... methods to set optional parameters. #50
- WithSplitter(textsplitter): textsplitter must be an interface inside loaders. The result should be something like: loader.NewPDFToTextLoader("/usr/bin/pdftotext", "./kb").WithSplitter(textsplitter.NewRecursiveCharacterTextSplitter(2000, 200)).Load() #54

Tasks
Is your feature request related to a problem? Please describe.
Sometimes you do want to remove a collection/payload from a vector database; these methods are exposed by the vector databases' APIs.
Describe the solution you'd like
Add a method to the index interface to allow for deletion of documents
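A minimal sketch of what that extension could look like; the method set and types below are assumptions for illustration, not lingoose's actual interface (requires "context" in the imports):

// Data and SearchResponses stand in for the real index types.
type Data struct{ ID, Content string }
type SearchResponses []struct {
	ID    string
	Score float64
}

type Index interface {
	Add(ctx context.Context, items []Data) error
	SimilaritySearch(ctx context.Context, query string) (SearchResponses, error)
	Delete(ctx context.Context, ids []string) error // proposed: remove documents
	Drop(ctx context.Context) error                 // proposed: remove the whole collection
}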
Tasks
Describe the bug
When trying to run the Knowledge base example (https://github.com/henomis/lingoose/blob/main/examples/embeddings/knowledge_base/main.go) I got an error about the github.com/henomis/lingoose/index/vectordb/jsondb package.
% go mod tidy
go: finding module for package github.com/henomis/lingoose/index/vectordb/jsondb
go: example.com/lingoosedb imports
github.com/henomis/lingoose/index/vectordb/jsondb: module github.com/henomis/lingoose@latest found (v0.0.11), but does not contain package github.com/henomis/lingoose/index/vectordb/jsondb
load() in simpleVectorIndex #123

Can vectordb add support for in-memory storage instead of files?
Describe the bug
When using function calling along with streaming, function calls seem to be ignored as the model passes the function arguments in chunks.
To Reproduce
Steps to reproduce the behavior:
true
Expected behavior
Function-call chunks should be collected and the call made once the stream is complete.
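A sketch of that collection step, assuming go-openai-style streaming deltas (the Recv loop and delta fields are from that library; the wrapper function is hypothetical):

package main

import (
	"errors"
	"io"
	"strings"

	openai "github.com/sashabaranov/go-openai"
)

// collectFunctionCall drains the stream and stitches the streamed
// function-call name/argument chunks back together, so the function
// is invoked only once, after the stream completes.
func collectFunctionCall(stream *openai.ChatCompletionStream) (name, args string, err error) {
	var nameB, argsB strings.Builder
	for {
		resp, recvErr := stream.Recv()
		if errors.Is(recvErr, io.EOF) {
			break
		}
		if recvErr != nil {
			return "", "", recvErr
		}
		if len(resp.Choices) == 0 {
			continue
		}
		if fc := resp.Choices[0].Delta.FunctionCall; fc != nil {
			nameB.WriteString(fc.Name)      // name usually arrives in the first chunk
			argsB.WriteString(fc.Arguments) // arguments arrive in pieces
		}
	}
	return nameB.String(), argsB.String(), nil
}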
As titled, would prefer to use a local LLM instead of OpenAI's GPT. I arrived here via this tutorial/introduction to RAG;
Tasks
Tasks
Learning Knowledge Base
tasks:
Fun fact: I started integrating the official Qdrant client, which uses gRPC. To avoid huge imports, a custom qdrant-go package has been developed, based on my restclientgo (which is already imported in lingoose).
PR: #17
Tasks
- New constructors: if they contain initialization of default values, consider unexporting the struct (easy if all the struct properties are unexported). #35
- String() in prompt #39