semanser / codel

2.0K 24.0 149.0 15.88 MB

✨ Fully autonomous AI Agent that can perform complicated tasks and projects using terminal, browser, and editor.

Home Page: https://discord.gg/uMaGSHNjzc

License: GNU Affero General Public License v3.0

Go 39.79% JavaScript 1.37% HTML 0.47% TypeScript 57.74% Dockerfile 0.62%
agent ai autonomous-agents devin openai bot llms ollama llama2

codel's People

Contributors

eltociear, fardeem, jondwillis, luckrnx09, nilbot, semanser


codel's Issues

Allow working on a flow even when the initial task is done

It should be possible to continue prompting after the initial task is done. This would be useful when the user isn't satisfied with the results or wants to make further improvements.

This code probably needs to be modified:

func processDoneTask(db *database.Queries, task database.Task) error {
	flow, err := db.UpdateFlowStatus(context.Background(), database.UpdateFlowStatusParams{
		ID:     task.FlowID.Int64,
		Status: database.StringToPgText("finished"),
	})
	if err != nil {
		return fmt.Errorf("failed to update task status: %w", err)
	}

	subscriptions.BroadcastFlowUpdated(task.FlowID.Int64, &gmodel.Flow{
		ID:       uint(flow.ID),
		Status:   gmodel.FlowStatus("finished"),
		Terminal: &gmodel.Terminal{},
	})

	return nil
}
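One hedged way to sketch the change: instead of treating "finished" as a terminal state, a new user message could transition the flow back to an in-progress state. The names below (`ResumeFlow`, the status constants) are hypothetical illustrations, not codel's actual API:

```go
package main

import "fmt"

// FlowStatus is a simplified stand-in for the flow status values in the issue.
type FlowStatus string

const (
	StatusInProgress FlowStatus = "in_progress"
	StatusFinished   FlowStatus = "finished"
)

// ResumeFlow is a hypothetical helper: it reopens a finished flow so the user
// can keep prompting, and rejects resuming flows in any other state.
func ResumeFlow(current FlowStatus) (FlowStatus, error) {
	if current != StatusFinished {
		return current, fmt.Errorf("flow is %q; only finished flows can be resumed", current)
	}
	return StatusInProgress, nil
}

func main() {
	next, err := ResumeFlow(StatusFinished)
	fmt.Println(next, err) // in_progress <nil>
}
```

In `processDoneTask` terms, the backend would then need to keep the task queue alive for the flow rather than tearing it down when the status flips to "finished".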

Add a token limiter for each task

The user should be able to limit the number of tokens that can be spent on a particular task. A progress bar should show how many tokens are left after each iteration.
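A minimal sketch of such a limiter, assuming a simple per-task budget tracked in memory (`TokenBudget` and its methods are hypothetical names, not part of codel):

```go
package main

import (
	"errors"
	"fmt"
)

// TokenBudget is a hypothetical per-task limiter: the user sets Limit when
// creating the task, and each agent iteration calls Spend with the number of
// tokens the LLM call consumed.
type TokenBudget struct {
	Limit int
	Used  int
}

var ErrBudgetExceeded = errors.New("token budget exceeded")

// Spend records token usage, refusing the spend if it would exceed the limit.
func (b *TokenBudget) Spend(tokens int) error {
	if b.Used+tokens > b.Limit {
		return ErrBudgetExceeded
	}
	b.Used += tokens
	return nil
}

// Remaining is what a UI progress bar would render after each iteration.
func (b *TokenBudget) Remaining() int {
	return b.Limit - b.Used
}

func main() {
	b := &TokenBudget{Limit: 1000}
	if err := b.Spend(400); err != nil {
		fmt.Println(err)
	}
	fmt.Println(b.Remaining()) // 600
}
```

When `Spend` fails, the task loop could stop and surface the exceeded budget to the user instead of issuing further LLM calls.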

openai api error

Thank you for doing a great job. When using codel, I ran into a 500 server error:

2024/04/02 00:30:33 Processing command 4 of type input
2024/04/02 00:30:33 Getting next task
2024/04/02 00:30:38 Failed to get response from OpenAI error, status code: 500, message: Internal server error

Although it does not look like a codel error, the linked page says that misusing the API can be the problem. Can you investigate further?

Docker startup & missing image

I whipped up a startup script:

https://gist.github.com/tluyben/888408b9105b059d7993df77ed38f7ad

It appears to work well; however, the following error always appears:

failed to spawn container: %!w(*fmt.wrapError=&{Error creating container: Error response from daemon: No such image: debian:latest {0xc0008052c0}})

I saw this error had already been reported, and I thought an easy-start Docker setup would help others test faster.

Always use absolute path when writing files

We always expect an absolute path to the file that should be written:

dir := filepath.Dir(path)
err = dockerClient.CopyToContainer(context.Background(), container, dir, archive, types.CopyToContainerOptions{})

So this should be specified in the prompt.

Otherwise, it can sometimes lead to problems like this:

https://discord.com/channels/1221753114441420811/1221753115443855374/1224461580251430932
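Besides instructing the model in the prompt, the backend could defensively normalize relative paths before the `CopyToContainer` call instead of failing on them. A minimal sketch, where `resolvePath` and the `containerWorkdir` parameter are hypothetical illustrations rather than existing codel code:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// resolvePath anchors a relative path at the container's working directory,
// so a model emitting "main.py" still produces a usable absolute path.
// Absolute paths pass through unchanged.
func resolvePath(path, containerWorkdir string) string {
	if filepath.IsAbs(path) {
		return path
	}
	return filepath.Join(containerWorkdir, path)
}

func main() {
	fmt.Println(resolvePath("main.py", "/workspace"))    // /workspace/main.py
	fmt.Println(resolvePath("/tmp/a.txt", "/workspace")) // /tmp/a.txt
}
```

The prompt instruction is still worth adding; the guard just keeps a stray relative path from derailing the whole flow.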

Please add support for LiteLLM to this project

This project is pretty great, but we need more options for using different LLMs. You don't have to build a solution that supports 100+ LLMs yourself: LiteLLM is another FOSS project that can do this for you.

Project link: https://github.com/BerriAI/litellm

Adding LiteLLM would be a big win, since many users would easily be able to use many more LLMs. The integration only needs three parameters from the user (base URL, model name, API key), and it queries and returns results using the general OpenAI API structure. Many big projects have started adding support for LiteLLM; the maintainers are pretty responsive if you have questions, and I can also share my own experience using it with other projects such as Flowise.

Feature req: Please integrate apipie.ai

Users want access to as much AI as they can get. They don't want to manage 50 accounts; they want the fastest AI and the cheapest AI, and you can provide all of that for them with this update.

In addition to, or in place of, integrating with any aggregators, please integrate APIpie so devs can access them all from one place/subscription. It also provides:

  • The most affordable, reliable, and fastest AI available
  • One API to access ~500 models and growing
  • Language, embedding, voice, image, vision, and more
  • Global AI load balancing; route queries based on price or latency
  • Redundancy for major models, providing the greatest uptime possible
  • Global reporting of AI availability, pricing, and performance

It's the same API format as OpenAI: just change the domain name and your API key, and enjoy a plethora of models without changing any of your code other than how you handle the models list.

This is a win-win for everyone: any new AIs from any providers will be automatically integrated into your stack with this one integration, not to mention all the other advantages.

error on backend

I updated the repo, added a backend/.env and a frontend/.env.local

When I try to create a new task in the UI, I get this message on the backend console:

2024/03/25 05:37:48 failed to send message to channel: %!w(*fmt.wrapError=&{failed to send to the channel: connection not found for id 7 0xc000a014b0})
2024/03/25 05:37:48 failed to process input: %!w(*fmt.wrapError=&{failed to spawn container: failed to send to the channel: connection not found for id 7 0xc0009d81e0})

How do I fix this? I'm on Arch Linux.

client version 1.44 is too new

Hi,

I tried to run Codel on my Docker server and it says that the Docker client is too new. Is there a way to bypass this, or do I have to wait it out?


Here's my compose :

name: codel-ai
services:
    codel:
        image: ghcr.io/semanser/codel:latest
        environment:
            - OLLAMA_MODEL=llama2
            - GIN_MODE=release
        ports:
            - 3010:8080
        volumes:
            - /var/run/docker.sock:/var/run/docker.sock
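If the backend constructs its Docker client from the environment (an assumption about codel's internals; the official Go SDK's `FromEnv` option does honor `DOCKER_API_VERSION`), pinning an older API version in the compose file may work around the mismatch until version negotiation is added:

```yaml
services:
    codel:
        environment:
            - OLLAMA_MODEL=llama2
            - GIN_MODE=release
            # Assumption: the backend builds its Docker client from the
            # environment, which honors this variable. Pin an API version
            # your Docker daemon supports.
            - DOCKER_API_VERSION=1.43
```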

error loading page: error resolving url: Get "http://127.0.0.1:9222/json/version"

Steps to replicate:

  1. Start Codel
  2. Enter prompt "write a script to get weather in C in Bangkok"
  3. It freezes with the following console error:
2024/04/15 07:54:00 Processing command 3 of type browser
2024/04/15 07:54:00 Trying to get content from https://www.google.com/search?q=free+weather+API
2024/04/15 07:54:00 failed to process browser: failed to get content: error loading page: error resolving url: Get "http://127.0.0.1:9222/json/version": dial tcp 127.0.0.1:9222: connect: connection refused
2024/04/15 07:54:00 Waiting for a task

Opening the URL above manually works fine.

can't connect to db in `./backend` folder

$ go run .

2024/03/23 05:22:11 /home/ettinger/src/apps/codel/backend/main.go:31
[error] failed to initialize database, got error failed to connect to `host=localhost user=postgres database=ai-coder`: dial error (dial tcp 127.0.0.1:5432: connect: connection refused)
2024/03/23 05:22:11 failed to connect database: failed to connect to `host=localhost user=postgres database=ai-coder`: dial error (dial tcp 127.0.0.1:5432: connect: connection refused)
exit status 1

program error when I type a new message in the browser

like this:

[SUCCESS] Generate outputs
$ vite

VITE v5.1.6 ready in 2097 ms

➜ Local: http://localhost:5173/
➜ Network: use --host to expose
16:01:43 [vite] http proxy error: /graphql
AggregateError [ECONNREFUSED]:
at internalConnectMultiple (node:net:1116:18)
at afterConnectMultiple (node:net:1683:7)
16:02:09 [vite] http proxy error: /graphql
AggregateError [ECONNREFUSED]:
at internalConnectMultiple (node:net:1116:18)
at afterConnectMultiple (node:net:1683:7)
16:02:12 [vite] http proxy error: /graphql
AggregateError [ECONNREFUSED]:
at internalConnectMultiple (node:net:1116:18)
at afterConnectMultiple (node:net:1683:7)
16:02:14 [vite] http proxy error: /graphql
AggregateError [ECONNREFUSED]:
at internalConnectMultiple (node:net:1116:18)
at afterConnectMultiple (node:net:1683:7)

Unsupported Endpoint Request for LM Studio Model

While attempting to use the LM Studio model, it was identified that the model only supports the following endpoints:

  • /v1/chat/completions
  • /v1/embeddings
  • /v1/models

However, an error occurred when executing a request:
2024/05/12 04:35:34 failed to process input: failed to get message summary: Unexpected endpoint or method. (POST /v1/api/chat)

Please verify that the requested endpoint matches the supported endpoints of the LM Studio server, to ensure proper handling of requests and prevent unexpected errors.

implement ngrok tunneling

Implements ngrok tunneling to expose local development servers to the internet, enhancing the project's capability for remote access and testing.

  • Updates backend/.env.example: Adds ngrok configuration options including NGROK_AUTH_TOKEN and NGROK_TUNNEL_NAME to allow developers to specify their ngrok auth token and desired tunnel name.
  • Modifies backend/executor/terminal.go:
    • Imports the ngrok package to utilize its tunneling features.
    • Initializes ngrok in the TerminalName function using a placeholder auth token, aiming to create or retrieve an ngrok tunnel based on the flow ID. This replaces the previous logic of simply formatting a terminal name string.
    • Adjusts the TerminalName function to return the ngrok tunnel's public URL, enabling the use of ngrok tunnel URL for terminal connections.
  • Enhances README.md:
    • Adds instructions for setting up ngrok tunneling, including obtaining an auth token and configuring the .env file with ngrok details.
    • Guides on starting the ngrok tunnel to expose the local development server to the internet, facilitating external access and sharing.

For more details, open the Copilot Workspace session.
