Discord AI chatbot using Ollama
Home Page: https://github.com/jmorganca/ollama
Describe the feature
Process .txt files from Discord messages and send them as part of the request to Llama
Why do you think this feature should be implemented?
Discord has a 2000-character message limit; anything longer is converted into a .txt file attachment. It would be great if the bot were able to process .txt files and send their contents as part of the request to Llama.
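A rough sketch of what this could look like with discord.js v14 and Node 18+'s global fetch. The helper name and the size cap are my own assumptions, not part of the bot's actual code:

```javascript
// Sketch: collect the contents of any .txt attachments on a message so
// they can be appended to the prompt sent to the model. Assumes
// discord.js v14 (message.attachments is a Collection) and Node 18+
// for global fetch(). collectTextAttachments is a hypothetical helper.
async function collectTextAttachments(message, maxBytes = 64 * 1024) {
  const parts = [];
  for (const attachment of message.attachments.values()) {
    const isText =
      attachment.name?.endsWith(".txt") ||
      attachment.contentType?.startsWith("text/plain");
    if (!isText || attachment.size > maxBytes) continue; // skip binary/huge files
    const res = await fetch(attachment.url);
    if (!res.ok) continue;
    parts.push(`--- ${attachment.name} ---\n${await res.text()}`);
  }
  return parts.join("\n\n");
}
```

The returned string could then be appended to the user's message text before the request is handed to the model.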
Additional context
I encountered this while trying to process code, specifically the code of this bot's bot.js, using the Discord integration of Ollama.
Followed the setup instructions here.
Ran npm i, then npm start.
Got this error:
[Shard Manager] [INFO] Loading
[Shard #0] [INFO] Created shard
/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:1132
error: new Error("Used disallowed intents")
^
Error: Used disallowed intents
at WebSocketShard.onClose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:1132:18)
at connection.onclose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:676:17)
at callListener (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/ws/lib/event-target.js:290:14)
at WebSocket.onClose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/ws/lib/event-target.js:220:9)
at WebSocket.emit (node:events:514:28)
at WebSocket.emitClose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/ws/lib/websocket.js:260:10)
at TLSSocket.socketOnClose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/ws/lib/websocket.js:1272:15)
at TLSSocket.emit (node:events:526:35)
at node:net:337:12
at TCP.done (node:_tls_wrap:657:7)
Node.js v20.9.0
/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/discord.js/src/sharding/Shard.js:178
reject(new DiscordjsError(ErrorCodes.ShardingReadyDied, this.id));
^
Error [ShardingReadyDied]: Shard 0's process exited before its Client became ready.
at Shard.onDeath (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/discord.js/src/sharding/Shard.js:178:16)
at Object.onceWrapper (node:events:629:26)
at Shard.emit (node:events:514:28)
at Shard._handleExit (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/discord.js/src/sharding/Shard.js:439:10)
at ChildProcess.emit (node:events:514:28)
at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
code: 'ShardingReadyDied'
}
Node.js v20.9.0
ollama/discord bot/discord-ai-bot via v20.9.0 on ☁️ [user]@gmail.com
❯ node:events:492
throw er; // Unhandled 'error' event
^
Error [ERR_IPC_CHANNEL_CLOSED]: Channel closed
at new NodeError (node:internal/errors:406:5)
at target.send (node:internal/child_process:754:16)
at Client.<anonymous> (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/discord.js/src/sharding/ShardClientUtil.js:43:19)
at Client.emit (node:events:514:28)
at WebSocketManager.<anonymous> (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/discord.js/src/client/websocket/WebSocketManager.js:271:19)
at WebSocketManager.emit (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@vladfrangu/async_event_emitter/dist/index.cjs:282:31)
at WebSocketShard.<anonymous> (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:1173:51)
at WebSocketShard.emit (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@vladfrangu/async_event_emitter/dist/index.cjs:290:37)
at WebSocketShard.onClose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:1056:10)
at connection.onclose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:676:17)
Emitted 'error' event on process instance at:
at node:internal/child_process:758:35
at process.processTicksAndRejections (node:internal/process/task_queues:77:11) {
code: 'ERR_IPC_CHANNEL_CLOSED'
}
It seems Ollama will hit Discord's message size limit fairly regularly. It could be worth looking into the feasibility of chunking responses over multiple messages.
[Shard #0] [DEBUG] #study-room3 - boneitis-dev: tell me a dirty joke.
[Shard #0] [DEBUG] Response: Why did the chicken cross the road? To prove to the world that chickens can be just as smart as pigs.
[Shard #0] [DEBUG] #study-room3 - boneitis-dev: tell me another, but much more NSFW.
[Shard #0] [DEBUG] Response: Once upon a time there was a little boy who <..snip..original message was about 2800 characters..>
[Shard #0] [ERROR] DiscordAPIError[50035]: Invalid Form Body
content[BASE_TYPE_MAX_LENGTH]: Must be 2000 or fewer in length.
at handleErrors (/home/llamabeast/discord-ai-bot/node_modules/@discordjs/rest/dist/index.js:687:13)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async SequentialHandler.runRequest (/home/llamabeast/discord-ai-bot/node_modules/@discordjs/rest/dist/index.js:1072:23)
at async SequentialHandler.queueRequest (/home/llamabeast/discord-ai-bot/node_modules/@discordjs/rest/dist/index.js:913:14)
at async _REST.request (/home/llamabeast/discord-ai-bot/node_modules/@discordjs/rest/dist/index.js:1218:22)
at async TextChannel.send (/home/llamabeast/discord-ai-bot/node_modules/discord.js/src/structures/interfaces/TextBasedChannel.js:162:15)
at async handleMessage (file:///home/llamabeast/discord-ai-bot/src/bot.js:300:24)
at async Timeout._onTimeout (file:///home/llamabeast/discord-ai-bot/src/bot.js:107:16)
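A minimal sketch of the chunking idea: split a reply into pieces under Discord's 2000-character limit, preferring newline boundaries. The function name is hypothetical, and code-block fences spanning chunks are not handled:

```javascript
// Sketch: split a long model response into chunks that each fit within
// Discord's 2000-character message limit, breaking at the last newline
// before the limit when possible, otherwise hard-cutting.
function chunkResponse(text, limit = 2000) {
  const chunks = [];
  let remaining = text;
  while (remaining.length > limit) {
    // try to break at the last newline before the limit
    let cut = remaining.lastIndexOf("\n", limit);
    if (cut <= 0) cut = limit; // no newline found: hard cut
    chunks.push(remaining.slice(0, cut));
    remaining = remaining.slice(cut).replace(/^\n/, ""); // drop the break newline
  }
  if (remaining.length > 0) chunks.push(remaining);
  return chunks;
}
```

Each chunk could then be sent with its own channel.send() call, in order.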
Describe the bug
If you change the SYSTEM string in the .env file, the bot will add that string to the end of its responses in Discord after a couple of prompts.
To Reproduce
Steps to reproduce the behavior:
- In .env, change SYSTEM="you are a cat that likes fish"
- On Discord, @ the bot with a question. It will respond appropriately, as if it were a cat, as expected.
- After the 2nd or 3rd interaction, the bot will start adding the SYSTEM string to the end of its responses
Example:
User: "@catbot what are you"
Catbot: "I am a cat, meow.you are a cat that likes fish"
Expected behavior
Bot doesn't add system string to end of responses
Device
Additional context
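One possible cause is the SYSTEM string being concatenated into the conversation text itself. A sketch of an alternative that keeps it as a separate system-role message via Ollama's /api/chat endpoint, so it cannot leak into replies. The URL, model name, and helper name here are placeholders, not the bot's actual code:

```javascript
// Sketch: pass the SYSTEM prompt as a dedicated "system" message to
// Ollama's /api/chat endpoint instead of prepending it to the prompt
// text. Assumes Node 18+ (global fetch) and a local Ollama instance.
async function chat(history, userMessage) {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    body: JSON.stringify({
      model: "llama2", // placeholder model name
      stream: false,
      messages: [
        { role: "system", content: process.env.SYSTEM },
        ...history, // prior { role, content } turns
        { role: "user", content: userMessage },
      ],
    }),
  });
  return (await res.json()).message.content;
}
```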
Describe the feature
Currently the bot is incapable of pinging people despite frequently trying.
Currently, before giving the message to the chatbot, you pre-process Discord tags like <#1082142594567516160> and <@280411966126948353> into #channel and @user. I suggest adding a step before sending the reply that converts the other way around.
Why do you think this feature should be implemented?
The bot could ping members who are relevant to the conversation and produce clickable channel mentions.
Additional context
I already implemented this on my fork; I would open a PR, but I have made a ton of other changes that you probably don't want.
Here are the relevant lines: https://github.com/CodeF53/sage/blob/006d5db453dc0119cc31d707390389a16f6b461f/src/aiRespond.ts#L127-L150. They should be copy-pasteable into your codebase.
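For reference, a minimal sketch of the reverse mapping, independent of the fork linked above. It matches names greedily against the guild's caches (discord.js v14 assumed) and leaves unknown references untouched:

```javascript
// Sketch: convert plain #channel and @user references in the model's
// reply back into Discord's <#id>/<@id> mention syntax using the
// guild's channel and member caches. Greedy name matching; references
// that resolve to nothing are left as-is.
function encodeMentions(guild, text) {
  let out = text.replace(/#([\w-]+)/g, (match, name) => {
    const channel = guild.channels.cache.find((c) => c.name === name);
    return channel ? `<#${channel.id}>` : match;
  });
  out = out.replace(/@([\w.]+)/g, (match, name) => {
    const member = guild.members.cache.find(
      (m) => m.user.username === name || m.displayName === name
    );
    return member ? `<@${member.id}>` : match;
  });
  return out;
}
```

Note that members must be in the cache for this to resolve; a more robust version would fall back to guild.members.fetch().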
npm start
start
node src/index.js
/home/user/ollama_discord_bot/node_modules/discord.js/src/client/BaseClient.js:29
userAgentAppendix: options.rest?.userAgentAppendix
^
SyntaxError: Unexpected token '.'
at wrapSafe (internal/modules/cjs/loader.js:915:16)
at Module._compile (internal/modules/cjs/loader.js:963:27)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1027:10)
at Module.load (internal/modules/cjs/loader.js:863:32)
at Function.Module._load (internal/modules/cjs/loader.js:708:14)
at Module.require (internal/modules/cjs/loader.js:887:19)
at require (internal/modules/cjs/helpers.js:74:18)
at Object.<anonymous> (/home/lee/ollama_discord_bot/node_modules/discord.js/src/index.js:6:22)
at Module._compile (internal/modules/cjs/loader.js:999:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1027:10)
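For what it's worth, a SyntaxError: Unexpected token '.' on options.rest?.userAgentAppendix points at the optional chaining operator, which Node.js only parses from v14 onward (and discord.js v14 itself requires Node 16.11 or newer), so this error usually just means the runtime is too old. A quick check:

```shell
# Optional chaining (?.) needs Node.js 14+; discord.js v14 needs 16.11+.
# If this prints something older, upgrade Node before running npm start.
node --version
```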
Hi, sorry for the naive question. I got this working and it's really cool. I'm trying to figure out what the benefit is of using a local LLM (Ollama) for a (non-local) Discord channel. Can you shed some light on this?
Describe the bug
Everything works perfectly, but for some reason the bot does not send requests from the Discord channel when Ollama and discord-ai-bot are running under WSL.
To Reproduce
Expected behavior
Using the same token and files outside of WSL works flawlessly.
Device
Can we please have a Docker image for this? I have no idea how to do it and I can't seem to find the appropriate resources. Docker would be helpful because the container can autostart on reboot, so you can use your desktop without always keeping a terminal window open.
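In the absence of an official image, a minimal Dockerfile is fairly straightforward. This is only a sketch, assuming the repo's standard layout (a package.json with an npm start script) and that Ollama runs outside the container, reachable via whatever host variable the bot's .env expects:

```dockerfile
# Minimal sketch; assumes package.json defines "start" and that
# configuration comes from an .env file passed at run time.
FROM node:20-slim
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
CMD ["npm", "start"]
```

Running it with docker run -d --restart unless-stopped --env-file .env <image> gives the autostart-on-reboot behavior without an open terminal window.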
The SYSTEM prompt is echoed back to the user with no context, even when asked not to.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
The bot should use the SYSTEM prompt only as instructions on how to act, not repeat it in replies.
Device
Sorry for the question; I know it's probably a simple solution, but I really have no clue where to start with the Discord API.
I would like to run it in a Discord chat freely, so it can reply to all the messages given.
Is this possible?
If messages are still pending, the reset/clear command will not clear them, but they should be cleared.