sunner / chatall

Stars: 14.2K · Watchers: 117 · Forks: 1.5K · Size: 85.51 MB

Concurrently chat with ChatGPT, Bing Chat, Bard, Alpaca, Vicuna, Claude, ChatGLM, MOSS, 讯飞星火, 文心一言 and more, discover the best answers

Home Page: http://chatall.ai

License: Apache License 2.0

JavaScript 60.76% Vue 39.04% HTML 0.19% Shell 0.02%
bingchat chatgpt electron-app vuejs3 vuetify3 desktop-app linux macos windows electron

chatall's Issues

Would it be possible to add a setting to disable auto-scroll to the bottom?

It is quite annoying when one wants to read content already output by one bot while another is still streaming, since the view keeps scrolling to the bottom erratically. Right now, one basically has to wait until all bots have finished responding before starting to read.

Ideally this would be configurable, and the page would scroll to the bottom once when the message is sent, so the bot outputs come into view.
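A common pattern for this (a sketch only; `shouldAutoScroll` is a hypothetical helper, not ChatALL's actual code) is to auto-scroll only when the user is already near the bottom of the message list:

```javascript
// Sketch: only auto-scroll when the user is already near the bottom.
// If the user has scrolled up to read, leave the viewport alone.
function shouldAutoScroll(scrollTop, clientHeight, scrollHeight, threshold = 40) {
  // Distance between the current viewport bottom and the content bottom.
  const distanceFromBottom = scrollHeight - (scrollTop + clientHeight);
  return distanceFromBottom <= threshold;
}

// In an update handler one might then write:
// if (shouldAutoScroll(el.scrollTop, el.clientHeight, el.scrollHeight)) {
//   el.scrollTop = el.scrollHeight;
// }
```

Combined with sending an initial scroll-to-bottom when a message is submitted, this gives the behavior the issue asks for without a setting, though a toggle could still be exposed.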

Could this be packaged to run in Docker?

On a Phytium FT2000 machine running Kylin (银河麒麟) OS, installing the arm64 build fails; it seems the npm version is too old, and the OS won't let me install the required dependency packages. Could you provide a container image for ChatALL, or a web service I can deploy to my own server? Thanks!

After clicking "clear chat history", the input box can no longer be typed into

Describe the bug
ChatALL runs normally, but regardless of the current state, once the "clear chat history" button is clicked, the input box no longer accepts any input.

To Reproduce
See the attached screen recording. No other applications were open except CMD (launching the model), Google Chrome (showing the Gradio web page), the screen recorder, and ChatALL.

20230515_ChatALL_error.mp4

Expected behavior
I cannot yet figure out the cause. Could it be a conflict with some system process, or with the Sogou input method?

Desktop (please complete the following information):
Windows 11 x64, 32 GB RAM, Python 3.10.7.

"LMSYS (claude-v1)" has too low a writing limit: is there a way to extend the token limit and lengthen the responses?

I am testing the program and it is truly an exciting project; congratulations to all the developers. I was reading the source code to understand the project and see whether I can contribute in the future (even though JavaScript is a new language for me). I understand how the various AIs communicate, but I cannot find where and how "LMSYS (claude-v1)" is implemented; could you kindly point me to the relevant source? Also, "LMSYS (claude-v1)" seems to be an excellent AI, but its writing limit is too low. Is there a way to raise the token limit and lengthen the responses so they are not cut off too soon?

[BUG] When asking questions in Japanese, GPT-4 translates the question into English instead of answering it


As the title suggests, when I ask a question in Japanese, GPT-4 does not answer the question but instead translates the Japanese text of the question into English.

I am using the Mac version of the ChatALL app, and the "Language" setting in the settings screen is set to "Automatic." The language setting for macOS is Japanese. There is no "Japanese" option in the "Language" choices. Could this be the cause of the issue?

Screenshots

SS 2023-05-17 12 03 52

Desktop (please complete the following information):

  • OS and version: macOS Big Sur 11.7.6
  • App Ver: 1.17.8

LMSYS (chatglm-6b) source

Love the repo!

Is LMSYS (chatglm-6b) an API? Where do the results come from? Would love to use the model more.

DeprecationWarning: Invalid 'main' field in '/workspace/ChatALL/dist_electron/package.json' of 'background.js'. Please either fix that or report it to the module author

ChatALL (main) $ npm run electron:serve

[email protected] electron:serve
vue-cli-service electron:serve

INFO Starting development server...

DONE Compiled successfully in 5385ms 6:02:11 PM

App running at:

Note that the development build is not optimized.
To create a production build, run npm run build.

⠧ Bundling main process...

DONE Compiled successfully in 602ms 6:02:11 PM

File Size Gzipped

dist_electron/index.js 825.06 KiB 182.23 KiB

Images and other types of assets omitted.
Build at: 2023-05-16T18:02:12.009Z - Hash: 929d4d11ab29d3db836f - Time: 602ms

INFO Launching Electron...
update-electron-app config looks good; aborting updates since app is in development mode
(node:4323) [DEP0128] DeprecationWarning: Invalid 'main' field in '/workspace/ChatALL/dist_electron/package.json' of 'background.js'. Please either fix that or report it to the module author
(Use electron --trace-deprecation ... to show where the warning was created)

http://localhost:8080 is blank.

Describe the bug
Opening http://localhost:8080 in a browser shows a blank page.

Expected behavior
Being able to use the app in the browser.

Desktop (please complete the following information):

  • OS and version: Windows Server 2022 Datacenter Evaluation 21H2

[FEAT] Baidu 文心一言 (ERNIE Bot) only supports login; there is no API...

Is your feature request related to a problem? Please describe.
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

Describe the solution you'd like
A clear and concise description of what you want to happen.

Describe alternatives you've considered
A clear and concise description of any alternative solutions or features you've considered.

Additional context
Add any other context or screenshots about the feature request here.

Not disabling Electron's sandbox

Why is the --no-sandbox flag being used? ChatALL works fine with the sandbox enabled.

image

That's the desktop file inside the AppImage.
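If the flag is indeed carried by the .desktop file bundled in the AppImage (as the screenshot suggests), removing it would be a one-line change along these lines (the Exec line below is an assumed reconstruction, not the actual file contents):

```
[Desktop Entry]
Name=ChatALL
# Before (assumed): Exec=chatall --no-sandbox %U
Exec=chatall %U
Type=Application
```

Keeping Chromium's sandbox enabled is generally preferable for security unless something in the app demonstrably breaks with it.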

Error calling a local ChatGLM-6B through Gradio

The command-line window returns the following:
Traceback (most recent call last):
File "C:\Python\Python311\Lib\site-packages\gradio\routes.py", line 395, in run_predict
output = await app.get_blocks().process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\site-packages\gradio\blocks.py", line 1191, in process_api
inputs = self.preprocess_data(fn_index, inputs, state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\site-packages\gradio\blocks.py", line 1027, in preprocess_data
self.validate_inputs(fn_index, inputs)
File "C:\Python\Python311\Lib\site-packages\gradio\blocks.py", line 1014, in validate_inputs
raise ValueError(
ValueError: An event handler (predict) didn't receive enough input values (needed: 6, got: 1).
Check if the event handler calls a Javascript function, and make sure its return value is correct.
Wanted inputs:
[textbox, chatbot, slider, slider, slider, state]
Received inputs:
["广州这个城市怎么样?"]
(The same ValueError traceback repeats for each subsequent prompt, e.g. "你好!".)
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "C:\Python\Python311\Lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 254, in run_asgi
result = await self.app(self.scope, self.asgi_receive, self.asgi_send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Python\Python311\Lib\site-packages\fastapi\applications.py", line 276, in __call__
await super().__call__(scope, receive, send)
File "C:\Python\Python311\Lib\site-packages\starlette\applications.py", line 122, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Python\Python311\Lib\site-packages\starlette\middleware\errors.py", line 149, in __call__
await self.app(scope, receive, send)
File "C:\Python\Python311\Lib\site-packages\starlette\middleware\cors.py", line 76, in __call__
await self.app(scope, receive, send)
File "C:\Python\Python311\Lib\site-packages\starlette\middleware\exceptions.py", line 79, in __call__
raise exc
File "C:\Python\Python311\Lib\site-packages\starlette\middleware\exceptions.py", line 68, in __call__
await self.app(scope, receive, sender)
File "C:\Python\Python311\Lib\site-packages\fastapi\middleware\asyncexitstack.py", line 21, in __call__
raise e
File "C:\Python\Python311\Lib\site-packages\fastapi\middleware\asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "C:\Python\Python311\Lib\site-packages\starlette\routing.py", line 718, in __call__
await route.handle(scope, receive, send)
File "C:\Python\Python311\Lib\site-packages\starlette\routing.py", line 341, in handle
await self.app(scope, receive, send)
File "C:\Python\Python311\Lib\site-packages\starlette\routing.py", line 82, in app
await func(session)
File "C:\Python\Python311\Lib\site-packages\fastapi\routing.py", line 289, in app
await dependant.call(**values)
File "C:\Python\Python311\Lib\site-packages\gradio\routes.py", line 518, in join_queue
if blocks.dependencies[event.fn_index].get("every", 0):
~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^
TypeError: list indices must be integers or slices, not str
ERROR: closing handshake failed
Traceback (most recent call last):
File "C:\Python\Python311\Lib\site-packages\websockets\legacy\server.py", line 248, in handler
await self.close()
File "C:\Python\Python311\Lib\site-packages\websockets\legacy\protocol.py", line 766, in close
await self.write_close_frame(Close(code, reason))
File "C:\Python\Python311\Lib\site-packages\websockets\legacy\protocol.py", line 1232, in write_close_frame
await self.write_frame(True, OP_CLOSE, data, _state=State.CLOSING)
File "C:\Python\Python311\Lib\site-packages\websockets\legacy\protocol.py", line 1205, in write_frame
await self.drain()
File "C:\Python\Python311\Lib\site-packages\websockets\legacy\protocol.py", line 1194, in drain
await self.ensure_open()
File "C:\Python\Python311\Lib\site-packages\websockets\legacy\protocol.py", line 935, in ensure_open
raise self.connection_closed_exc()
websockets.exceptions.ConnectionClosedError: sent 1000 (OK); no close frame received
(The "Exception in ASGI application" and "closing handshake failed" tracebacks above then repeat for each subsequent request.)

The ChatALL window first reported a returned error; later the server seemed to refuse connections, and ChatALL reported: "the connection was closed by the server".

The settings are shown below:
image

The parameters should be correct.

The error shown in the ChatALL UI:
image

In the end, the connection was refused by the server.

Machine environment:
Lenovo P52 laptop, Windows 11 x64, fully updated.
32 GB RAM, NVIDIA P3200 GPU with 6 GB VRAM.
Python 3.11.3, Torch 2.0+cu117, the original ChatGLM-6B model, quantized at load time via .half().quantize(4).cuda().

Chatting through Gradio's own web page has always worked fine, see below:
image

I suspect there is something incomplete in how ChatALL calls the service; please take a look.

Also, ChatGLM supports API access; why doesn't ChatALL use the more stable API instead?

One more issue: after clicking the broom icon in ChatALL to clear the chat history, the input box can no longer be typed into.
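The TypeError above ("list indices must be integers or slices, not str") happens inside `blocks.dependencies[event.fn_index]`, which suggests the client sent `fn_index` as a string where Gradio expects an integer index into its dependency list. A minimal sketch of building a type-safe queue-join message (the exact message shape is an assumption based on the traceback; the real Gradio websocket protocol may carry more fields):

```javascript
// Hypothetical sketch of a Gradio queue-join message. The key point is that
// fn_index must be a number (an index into blocks.dependencies), not a string.
function buildJoinMessage(fnIndex, sessionHash) {
  if (!Number.isInteger(fnIndex)) {
    throw new TypeError("fn_index must be an integer, got " + typeof fnIndex);
  }
  return JSON.stringify({ fn_index: fnIndex, session_hash: sessionHash });
}
```

The ValueError earlier ("needed: 6, got: 1") likewise points at a protocol mismatch: the local Gradio app's predict handler expects [textbox, chatbot, slider, slider, slider, state], while the client sent only the prompt text, so the client and the Gradio app's input signature need to agree.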

Could the newly released ChatGPT-4 plugin system be supported?

ChatGPT Plus has just opened up its plugin system, which provides web access and other plugin capabilities. When ChatALL calls GPT-4, is there any way to make use of these plugins?
image

Add a "continue" button to ChatALL, so each bot's conversation can be continued individually.

(Issue template left unfilled.)

Bard Error

Describe the bug
Bard Error

TypeError: Cannot read properties of null (reading 'atValue')

input - claude vs vicuna vs alpaca vs chatglm

image

[Feature Request] Search for specific text in the chat history by pressing `Ctrl/Cmd + F`

Thanks for creating this app! 🏅

One thing I missed was being able to press Ctrl/Cmd + F to look for a specific text in the chat history.
This is especially useful when you have asked a batch of questions and want to get back to a specific question/response.

As a workaround, I've been using the text search function from the Elements tab in the Developer Tools.

Could LLMs that require neither a VPN nor an account application be supported? For ordinary users, accounts are hard to obtain.

(Issue template left unfilled.)

Help compiling into an exe

I am a Delphi programmer and have never worked with JS.
I have Visual Studio 2019 and Code Runner.
I tried to run main.js without success; I get a lot of errors.
Could you please tell me how to compile it to an exe, or how to run the JS?
I am probably doing something wrong, as I do not know what I am doing.
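For an Electron/Vue project like this, the app is not run file-by-file; the whole project is installed and built with npm. A sketch of the usual workflow, assuming the standard vue-cli-plugin-electron-builder setup (the repo's own `electron:serve` script, shown in another issue above, suggests this; the `electron:build` script name is an assumption):

```shell
# Clone the repository and enter it
git clone https://github.com/sunner/ChatALL.git
cd ChatALL

# Install dependencies (requires Node.js; npm ships with it)
npm install

# Run in development mode
npm run electron:serve

# Package a distributable (produces an .exe installer on Windows),
# assuming a standard vue-cli-plugin-electron-builder "electron:build" script
npm run electron:build
```

Visual Studio is not needed; Node.js and npm are the only toolchain requirements.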

Dark mode

Currently only a light mode is offered; a dark mode would also be nice.
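Since the app is built on Vuetify 3, a dark mode could likely lean on Vuetify's built-in themes. A sketch (the `nextTheme` helper is hypothetical; the commented lines show how it might be wired into a component using Vuetify 3's `useTheme` composable):

```javascript
// Sketch: toggle between Vuetify 3's built-in "light" and "dark" themes.
function nextTheme(current) {
  return current === "dark" ? "light" : "dark";
}

// In a Vuetify 3 component, this could be applied as:
// import { useTheme } from "vuetify";
// const theme = useTheme();
// theme.global.name.value = nextTheme(theme.global.name.value);
```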

Migrate to Typescript

What do you think about it? It would greatly improve the dev experience and the maintainability of the project.

How can the chat history be read?

Where is the chat history file stored, and what kind of database is it? Can it be read directly with Python code?

[BUG] Copying from LMSYS (claude-v1) includes paragraph tags, e.g. <p>copied content</p>

(Issue template left unfilled.)
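A possible fix for the copy bug above (a sketch only; not necessarily how ChatALL implements copying) is to strip the markup, or read the element's plain text, before writing to the clipboard:

```javascript
// Sketch: strip HTML tags before writing to the clipboard, so copied text
// doesn't carry <p>...</p> wrappers. A regex suffices for simple markup;
// in a browser, reading element.innerText is usually the cleaner route.
function stripHtmlTags(html) {
  return html.replace(/<[^>]*>/g, "");
}

// e.g. navigator.clipboard.writeText(stripHtmlTags(messageHtml));
```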

[Feature Request] Multiple chats

Is your feature request related to a problem? Please describe.
I can have only one chat at a time.

Describe the solution you'd like
Like ChatGPT, support multiple chat sessions, so I can discuss different things in each.

Bard cannot be used after logging in

As the title says:
Bard shows as logged in within the app,
but when trying to select Bard it still shows as not logged in.
I have confirmed the proxy is connected; all other AIs work, e.g. ChatGPT and new Bing.
