webapp-conversation's Introduction

Dify Cloud · Self-hosting · Documentation · Enterprise inquiry

README in English 简体中文版自述文件 日本語のREADME README en Español README en Français README tlhIngan Hol README in Korean README بالعربية Türkçe README README Tiếng Việt

Dify is an open-source LLM app development platform. Its intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production. Here's a list of the core features:

1. Workflow: Build and test powerful AI workflows on a visual canvas, leveraging all the following features and beyond.

2. Comprehensive model support: Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found here.

3. Prompt IDE: Intuitive interface for crafting prompts, comparing model performance, and adding additional features such as text-to-speech to a chat-based app.

5. RAG Pipeline: Extensive RAG capabilities that cover everything from document ingestion to retrieval, with out-of-the-box support for text extraction from PDFs, PPTs, and other common document formats.

5. Agent capabilities: You can define agents based on LLM Function Calling or ReAct, and add pre-built or custom tools for the agent. Dify provides 50+ built-in tools for AI agents, such as Google Search, DALL·E, Stable Diffusion and WolframAlpha.

6. LLMOps: Monitor and analyze application logs and performance over time. You can continuously improve prompts, datasets, and models based on production data and annotations.

7. Backend-as-a-Service: All of Dify's offerings come with corresponding APIs, so you can effortlessly integrate Dify into your own business logic.
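Because everything is API-first, your own backend can drive a Dify app over plain HTTP. A minimal sketch, assuming the published `/v1/chat-messages` endpoint; the base-URL constant, helper name, and placeholder key below are illustrative, not part of Dify's SDK:

```typescript
// Sketch: building a request to Dify's chat-messages API from your own backend.
// DIFY_API_BASE and buildChatRequest are illustrative names.
const DIFY_API_BASE = "https://api.dify.ai/v1"; // or your self-hosted API URL

interface ChatPayload {
  inputs: Record<string, string>;
  query: string;
  user: string;
  response_mode: "blocking" | "streaming";
}

// Assemble the URL and fetch options separately so the request shape is easy to inspect.
function buildChatRequest(apiKey: string, payload: ChatPayload) {
  return {
    url: `${DIFY_API_BASE}/chat-messages`,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(payload),
    },
  };
}

// Usage (needs a real app API key):
// const { url, options } = buildChatRequest(process.env.DIFY_API_KEY!, {
//   inputs: {}, query: "Hello", user: "user-123", response_mode: "blocking",
// });
// const res = await fetch(url, options);
```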

Feature comparison

| Feature | Dify.AI | LangChain | Flowise | OpenAI Assistants API |
| --- | --- | --- | --- | --- |
| Programming Approach | API + App-oriented | Python Code | App-oriented | API-oriented |
| Supported LLMs | Rich Variety | Rich Variety | Rich Variety | OpenAI-only |
| RAG Engine | ✅ | ✅ | ✅ | ✅ |
| Agent | ✅ | ✅ | ❌ | ✅ |
| Workflow | ✅ | ❌ | ✅ | ❌ |
| Observability | ✅ | ✅ | ❌ | ❌ |
| Enterprise Features (SSO/Access control) | ✅ | ❌ | ❌ | ❌ |
| Local Deployment | ✅ | ✅ | ✅ | ❌ |

Using Dify

  • Cloud
    We host a Dify Cloud service for anyone to try with zero setup. It provides all the capabilities of the self-deployed version, and includes 200 free GPT-4 calls in the sandbox plan.

  • Self-hosting Dify Community Edition
    Quickly get Dify running in your environment with this starter guide. Use our documentation for further references and more in-depth instructions.

  • Dify for enterprise / organizations
    We provide additional enterprise-centric features. Log your questions for us through this chatbot or send us an email to discuss enterprise needs.

    For startups and small businesses using AWS, check out Dify Premium on AWS Marketplace and deploy it to your own AWS VPC in one click. It's an affordable AMI offering with the option to create apps with custom logo and branding.

Staying ahead

Star Dify on GitHub and be instantly notified of new releases.

Quick start

Before installing Dify, make sure your machine meets the following minimum system requirements:

  • CPU >= 2 cores
  • RAM >= 4GB

The easiest way to start the Dify server is to run our docker-compose.yml file. Before running the installation command, make sure that Docker and Docker Compose are installed on your machine:

cd docker
cp .env.example .env
docker compose up -d

After running, you can access the Dify dashboard in your browser at http://localhost/install and start the initialization process.

If you'd like to contribute to Dify or do additional development, refer to our guide to deploying from source code.

Next steps

If you need to customize the configuration, please refer to the comments in our .env.example file and update the corresponding values in your .env file. Additionally, you might need to adjust the docker-compose.yml file itself, such as changing image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run docker compose up -d. You can find the full list of available environment variables here.

If you'd like to configure a highly-available setup, there are community-contributed Helm Charts and YAML files which allow Dify to be deployed on Kubernetes.

Using Terraform for Deployment

Azure Global

Deploy Dify to Azure with a single click using terraform.

Contributing

For those who'd like to contribute code, see our Contribution Guide. At the same time, please consider supporting Dify by sharing it on social media and at events and conferences.

We are looking for contributors to help with translating Dify to languages other than Mandarin or English. If you are interested in helping, please see the i18n README for more information, and leave us a comment in the global-users channel of our Discord Community Server.

Contributors

Community & contact

  • Github Discussion. Best for: sharing feedback and asking questions.
  • GitHub Issues. Best for: bugs you encounter using Dify.AI, and feature proposals. See our Contribution Guide.
  • Discord. Best for: sharing your applications and hanging out with the community.
  • Twitter. Best for: sharing your applications and hanging out with the community.


Security disclosure

To protect your privacy, please avoid posting security issues on GitHub. Instead, send your questions to [email protected] and we will provide you with a more detailed answer.

License

This repository is available under the Dify Open Source License, which is essentially Apache 2.0 with a few additional restrictions.

webapp-conversation's People

Contributors

crazywoola · crazywoolala · eltociear · goocarlos · iamjoel · jingjingxinshang · jzongkvo · npm-ued · yefori-go · yoyocircle · zxhlyh


webapp-conversation's Issues

How can I build this app as a static site with client-side rendering?

I want to compile this app into a static web page, strip out its built-in API routes, and replace them with my own backend API.

That way the frontend never has to send app_id and api_key; that information stays on my own backend server, which decides whether to allow each frontend request based on its settings.

But after adding output: 'export' to next.config.js, npm run build fails:

Build error occurred
Error: Page "/api/conversations/[conversationId]/name" is missing "generateStaticParams()" so it cannot be used with "output: export" config.

After removing the app/api directory, the build still fails:
Generating static pages (0/4) [== ]a [Error]: Page with dynamic = "error" couldn't be rendered statically because it used cookies.

Could you advise on how to implement this? Thanks.
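An answer sketch for the question above: with `output: 'export'`, Next.js pre-renders everything at build time, so API routes under `app/api` cannot be exported at all, and any remaining dynamic segment must enumerate its params via `generateStaticParams()`; code that calls `cookies()` likewise forces dynamic rendering and must move client-side. A minimal illustration (the route path is hypothetical, not this repo's actual structure):

```typescript
// app/chat/[conversationId]/page.tsx (illustrative path)
// With a fully static export there is no server at request time, so every
// value the dynamic segment can take must be listed at build time.
export function generateStaticParams(): { conversationId: string }[] {
  return [{ conversationId: "default" }];
}
```

In practice this means replacing the repo's API routes with calls to your own backend, which also keeps app_id and api_key off the client, as the question intends.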

Coding Error - Word Case

code (associated file):

```
const openstatement = {
  id: `${Date.now()}`,
  content: calculatedIntroduction,
  isAnswer: true,
  feedbackDisabled: true,
  isOpeningStatement: isShowPrompt,
}
if (calculatedIntroduction)
  return [openStatement]   // do you mean openstatement?
```
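The report is correct: the constant is declared as `openstatement` but returned as `openStatement`, which would throw a ReferenceError. A self-contained sketch of the fix, with the surrounding component state reduced to stand-in parameters:

```typescript
// Minimal reproduction of the fix: use one consistent camelCase identifier.
interface ChatItem {
  id: string;
  content: string;
  isAnswer: boolean;
  feedbackDisabled: boolean;
  isOpeningStatement: boolean;
}

function buildOpeningList(calculatedIntroduction: string, isShowPrompt: boolean): ChatItem[] {
  const openingStatement: ChatItem = {
    id: `${Date.now()}`,
    content: calculatedIntroduction,
    isAnswer: true,
    feedbackDisabled: true,
    isOpeningStatement: isShowPrompt,
  };
  if (calculatedIntroduction)
    return [openingStatement]; // declared and returned under the same name
  return [];
}
```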

Issue with JSON on Vercel

Hello,

When pushing the repo to Vercel, I can type a chat message, but it seldom delivers an answer; mostly a JSON error appears:

"SyntaxError: Unterminated string in JSON at position 129"

This error pops up almost every time I type something. Why is that? Please help me.

Feature Request: Display Opening Questions in UI

Description

Currently, the front-end application using Next.js does not display the opening questions that are provided by the backend as JSON. While the opening text message is displayed correctly, the buttons for the opening questions are not present in the UI, even though they exist in the JSON data.

Expected Behavior

The UI should display the opening question buttons as provided in the JSON data from the backend. Each button should be clickable and should function according to the logic defined in the application.

Actual Behavior

The UI currently displays only the opening text message. The opening question buttons, though present in the JSON data, are not visible or interactive in the UI.

JSON Data Example

Here is an example of the JSON data received from the backend:

```
// Response of route http://localhost:3000/api/parameters
{ ...
  suggested_questions: [
    "ХЗТТ-д ямар хүн үйлчлүүлэх боломжтой вэ?",
    "ХЗТТ гэж юу вэ?",
    "ХЗТТ нь ямар үйлчилгээ үзүүлдэг вэ?",
    "Холбоо барих мэдээлэл",
    "Салбаруудын жагсаалт харах"
  ]
... }
```

Steps to Reproduce

  1. Configure the Dify backend endpoint and secret keys.
  2. Launch the front-end application.
  3. Start a new conversation.
  4. Observe the opening text message displayed in the UI.
  5. Check the JSON response from the backend and note the presence of opening questions.
  6. Observe that the opening question buttons are not displayed in the UI.

Proposed Solution

Modify the front-end code to render the opening question buttons as specified in the JSON data. Each button should be styled consistently with the rest of the UI and should be clickable, triggering the appropriate actions.
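A sketch of the proposed fix, assuming the payload shape shown above; the helper and the component names (`appParams`, `handleSend`) are placeholders for whatever the app actually uses:

```typescript
interface AppParameters {
  opening_statement?: string;
  suggested_questions?: string[];
}

// Normalize the /parameters payload so the UI can always map over an array,
// even when the backend omits suggested_questions entirely.
function getOpeningQuestions(params: AppParameters): string[] {
  return Array.isArray(params.suggested_questions) ? params.suggested_questions : [];
}

// In the chat component (JSX sketch):
// {getOpeningQuestions(appParams).map(q => (
//   <button key={q} onClick={() => handleSend(q)}>{q}</button>
// ))}
```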

Additional Information

Framework: Next.js
Issue Found in Version: [0.1.0]
Browser: [Microsoft Edge]

No response in the site chat, but the reply shows up in Dify's logs

Chats on the site deployed to Vercel often get no response, yet the generated reply is visible in Dify's logs, so I'm not sure where things go wrong. The Vercel log is as follows:
RangeError: Incorrect locale information provided
at Intl.getCanonicalLocales ()
at CanonicalizeLocaleList (/var/task/.next/server/chunks/183.js:60105:17)
at match (/var/task/.next/server/chunks/183.js:60365:108)
at getLocaleOnServer (/var/task/.next/server/app/page.js:359:63)
at LocaleLayout (/var/task/.next/server/app/page.js:373:20)
at X (/var/task/.next/server/chunks/183.js:61941:13)
at Na (/var/task/.next/server/chunks/183.js:62109:21)
at Array.toJSON (/var/task/.next/server/chunks/183.js:61869:20)
at stringify ()
at da (/var/task/.next/server/chunks/183.js:61383:9)
[Error: An error occurred in the Server Components render. The specific message is omitted in production builds to avoid leaking sensitive details. A digest property is included on this error instance which may provide additional details about the nature of the error.] {
digest: '2702072224'
}
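The trace above shows `Intl.getCanonicalLocales` throwing a RangeError inside `getLocaleOnServer`, which happens when the incoming locale value is not a structurally valid language tag. A hedged workaround sketch (the function and fallback below are our own, not the repo's actual code): validate before canonicalizing and fall back instead of crashing the Server Components render.

```typescript
const FALLBACK_LOCALE = "en"; // illustrative default

// Return a canonical locale tag, or the fallback if the input is missing or
// malformed (Intl.getCanonicalLocales throws RangeError on tags like "*").
function safeCanonicalLocale(candidate: string | undefined): string {
  if (!candidate) return FALLBACK_LOCALE;
  try {
    const [canonical] = Intl.getCanonicalLocales(candidate);
    return canonical ?? FALLBACK_LOCALE;
  } catch {
    return FALLBACK_LOCALE;
  }
}
```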

Has anyone encountered a GET /serviceWorker.js 404 issue?

Problem description

When I call the APIs on the public network, the terminal still logs GET /serviceWorker.js 404 in 102ms, but the project works.
When I deploy Dify on the company's intranet and call the API, the terminal logs the same GET /serviceWorker.js 404 in 102ms error and the browser displays a 500 App is unavailable error.

Error displayed on the terminal

Attempt to solve

When I deployed Dify on the company's intranet and called the API, I printed the error messages.
I shut down my proxy server and firewall, but that didn't help.

I also tested the /v1/chat-messages and /v1/conversations endpoints in Apifox and everything worked, but in the browser it still showed a 500 App is unavailable error.

There was no error message in the console. Checking the network tab, I found that the responses for all four requests were empty.

Can anyone who has encountered this problem help me? I don't know how to fix it.

Local environment configuration

Node.js: 20.9.0
OS: Windows_NT x64 10.0.19045

Local run fails; dependency imports must be edited by hand

error - ./node_modules/.pnpm/[email protected][email protected]/node_modules/ahooks/lib/utils/lodash-polyfill.js:14:0
Module not found: ESM packages (lodash-es) need to be imported. Use 'import' to reference the package instead. https://nextjs.org/docs/messages/import-esm-externals

Errors like this appear right after installing the packages, as soon as I run npm run dev locally.

For example, the underlying code references the lodash-es library, e.g. node_modules\.pnpm\[email protected][email protected]\node_modules\ahooks\lib\useReactive\index.js, which contains var _lodashEs = require("lodash-es");

This is what throws the error; the cause and fix are described in this link:
Module not found: ESM packages (lodash-es) need to be imported. Use 'import' to reference the package instead. (official fix)

In other words, change require to import.

Still, developers shouldn't have to hand-edit dependencies after installing them. I hope this gets fixed, or that there's a better suggestion.
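A config-level alternative to hand-editing node_modules, sketched under the assumption of Next.js 13.1 or later: ask Next to transpile the packages that consume the ESM-only lodash-es from CommonJS (on older Next versions the next-transpile-modules package plays the same role).

```javascript
// next.config.js (sketch; verify against your Next.js version)
/** @type {import('next').NextConfig} */
const nextConfig = {
  // Transpile ahooks (and its lodash-es usage) so the ESM-only import resolves.
  transpilePackages: ['ahooks', 'lodash-es'],
}

module.exports = nextConfig
```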

Embedding in an Iframe is not possible

The standard dify-web Docker container allows embedding the chatbot in an iframe on a website.
Since I needed to customize the frontend, I followed the instructions and deployed the webapp-conversation frontend on Vercel.
It runs perfectly when I open the Vercel URL directly. But as soon as I embed it in an iframe and try to send a second message, I always get an error saying the chatbot is still responding.

Already tested:

  • Using a reverse proxy in front of Vercel
  • Using browser dev tools to look for possible CORS errors etc.
  • Using other browsers: Chrome, Safari -> not working; Firefox -> working

500 App is unavailable

I've set up the .env.local file correctly, but after running npm run dev
I get a 500 App is unavailable error on localhost:3000.
I'm not sure where I went wrong with the configuration.
Can I see any logs?

ReferenceError: AbortSignal is not defined

next start -p 4000

ready - started server on 0.0.0.0:4000, url: http://localhost:4000
info  - Loaded env from /Users/macbook/Desktop/arthrv.com/arthrv/.env.local
ReferenceError: AbortSignal is not defined
    at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/request.js (/Users/macbook/Desktop/arthrv.com/arthrv/node_modules/next/dist/compiled/@edge-runtime/primitives/fetch.js:7430:7)
    at __require (/Users/macbook/Desktop/arthrv.com/arthrv/node_modules/next/dist/compiled/@edge-runtime/primitives/fetch.js:14:50)
    at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/index.js (/Users/macbook/Desktop/arthrv.com/arthrv/node_modules/next/dist/compiled/@edge-runtime/primitives/fetch.js:7813:46)
    at __require (/Users/macbook/Desktop/arthrv.com/arthrv/node_modules/next/dist/compiled/@edge-runtime/primitives/fetch.js:14:50)
    at Object.<anonymous> (/Users/macbook/Desktop/arthrv.com/arthrv/node_modules/next/dist/compiled/@edge-runtime/primitives/fetch.js:11638:28)

Bot Replies Not Displaying

When deploying on Vercel, there is no reply from the bot. When I reload, the reply is displayed, and the following error appears in Vercel's logs:

```
    at consumeUint8ArrayReadableStream (/var/task/node_modules/next/dist/compiled/edge-runtime/index.js:1:15045)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async sendResponse (/var/task/node_modules/next/dist/server/send-response.js:42:34)
    at async doRender (/var/task/node_modules/next/dist/server/base-server.js:968:25)
    at async cacheEntry.responseCache.get.incrementalCache.incrementalCache (/var/task/node_modules/next/dist/server/base-server.js:1159:28)
    at async /var/task/node_modules/next/dist/server/response-cache/index.js:99:36
```

Is it possible to use this webapp with an external application?

Is it possible to use this webapp with an external application? If so, any pointers are appreciated.

All I need is to send each incoming message to the external app as a webhook, and to receive messages from the external app via an API call.

Thanks in advance

"500 App is unavailable" when fetching empty conversations

I pulled the repository to my local machine and set the required environment variables. After I started the project with
npm run dev
it starts and all the requests are 200 OK, but the whole screen is a big 500 App is unavailable.
I reverse-engineered the issue, and it seems to happen due to an empty body in the /conversations/ API response. (Screenshots of the error, the error logging, and the likely source of the problem omitted.)
The screenshots show that the error is raised here:
const isNotNewConversation = conversations.some(item => item.id === _conversationId)
But the error gets swallowed, and setAppUnavailable() is called instead.
The error reads "TypeError: Cannot read properties of undefined (reading 'some') at eval (index.tsx:228:52)", meaning the conversations list is undefined at that moment.
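A defensive sketch of the fix this report points at: treat a missing or empty `/conversations` body as an empty list before calling `.some()`, so an empty response renders a fresh chat instead of tripping setAppUnavailable(). The function name is illustrative, not the repo's actual code:

```typescript
interface Conversation {
  id: string;
}

// Guard against an undefined conversations list (e.g. an empty response body)
// before probing it with .some().
function isNotNewConversation(
  conversations: Conversation[] | undefined,
  conversationId: string,
): boolean {
  return (conversations ?? []).some(item => item.id === conversationId);
}
```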

Can username/password login be added?

Without username/password login, I don't dare open this up to colleagues outside the company (it has to stay internal). I hope multi-user support can be added, for example using a SQLite database. One more question: if two apps are created on the Dify backend, do I really have to run two webapp-conversation instances and host two sites? Thanks.

Change theme

How can I change the theme of the webapp in general?

Deployed with Docker: errors reported in the background while the app is running

RangeError: Incorrect locale information provided
at Intl.getCanonicalLocales ()
at CanonicalizeLocaleList (/app/.next/server/chunks/757.js:60105:17)
at match (/app/.next/server/chunks/757.js:60365:108)
at getLocaleOnServer (/app/.next/server/app/page.js:359:63)
at LocaleLayout (/app/.next/server/app/page.js:373:20)
at X (/app/.next/server/chunks/757.js:61941:13)
at Na (/app/.next/server/chunks/757.js:62109:21)
at Array.toJSON (/app/.next/server/chunks/757.js:61869:20)
at stringify ()
at da (/app/.next/server/chunks/757.js:61383:9)
[Error: An error occurred in the Server Components render. The specific message is omitted in production builds to avoid leaking sensitive details. A digest property is included on this error instance which may provide additional details about the nature of the error.] {
digest: '4288037568'
}

Local run error: /app/api/messages/ missing conversation_id

Can anyone help? Thank you very much!

(Vercel is ok.)

error - AxiosError: Request failed with status code 400
at settle (webpack-internal:///(sc_server)/./node_modules/axios/dist/node/axios.cjs:1628:16)
at IncomingMessage.handleStreamEnd (webpack-internal:///(sc_server)/./node_modules/axios/dist/node/axios.cjs:2506:21)
at IncomingMessage.emit (node:events:531:35)
at endReadableNT (node:internal/streams/readable:1696:12)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21)
at Axios.request (webpack-internal:///(sc_server)/./node_modules/axios/dist/node/axios.cjs:3148:49)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async ChatClient.sendRequest (webpack-internal:///(sc_server)/./node_modules/dify-client/index.js:78:24)
at async GET (webpack-internal:///(sc_server)/./app/api/messages/route.ts:13:23)
at async eval (webpack-internal:///(sc_server)/./node_modules/next/dist/server/future/route-modules/app-route/module.js:244:37) {

data: {
  code: 'invalid_param',
  message: 'Missing required parameter in the query string',
  params: 'conversation_id'
}

It seems onData() in handleSend in index.tsx is not triggered when the app is deployed locally.
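The 400 above comes from forwarding a history request without the conversation_id the upstream endpoint requires. A hedged sketch of the kind of guard that avoids it (names are illustrative, not the repo's actual route code): short-circuit when the id is absent, since a brand-new chat has no history to fetch.

```typescript
// Build the upstream messages query, or return null when there is no
// conversation yet (avoiding the invalid_param 400 from the API).
function buildMessagesQuery(conversationId: string | null, user: string): string | null {
  if (!conversationId)
    return null;
  const params = new URLSearchParams({ conversation_id: conversationId, user });
  return `/v1/messages?${params.toString()}`;
}
```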

App is unavailable

It shows a 500 | App is unavailable error on the first run, and no error appears in the console log. What's happening?

How do I use the generationConversationName endpoint?

```
export const generationConversationName = async (id: string) => {
  return post(`conversations/${id}/name`, { body: { auto_generate: true } })
}
```

This is the endpoint as it originally ships. I want to update the conversation list by passing a name property; I added name to the params and the body, but it doesn't work. Could someone help me figure out what's going wrong?
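As far as the published rename API goes, the endpoint accepts either an explicit name or auto_generate, so passing a name means dropping auto_generate rather than sending both. A sketch of that change (here `post` is a stand-in for the repo's request helper, injected as a parameter so the logic is testable; this is an assumption about the API's behavior, not this repo's code):

```typescript
type PostFn = (url: string, init: { body: Record<string, unknown> }) => Promise<unknown>;

// Rename a conversation: use the caller-supplied name when given,
// otherwise ask the backend to auto-generate one.
function renameConversation(post: PostFn, id: string, name?: string) {
  const body = name ? { name, auto_generate: false } : { auto_generate: true };
  return post(`conversations/${id}/name`, { body });
}
```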
