ai/rsc with langchain · ai-chatbot · 5 comments · OPEN

rogerodipo commented on June 26, 2024 6
ai/rsc with langchain

from ai-chatbot.

Comments (5)

nikohann commented on June 26, 2024 8

> Hey @nikohann , Great! Could you share the code example as soon as you have a chance to? Even if it's just the outline? I'm up against a deadline, and this would help me out a lot. Thanks.

There are a couple of serious bugs, but I think you will find them.

https://js.langchain.com/docs/expression_language/streaming#event-reference

I have used streamEvents, streaming the output as JSON from function calling.

// The imports below follow the ai-chatbot template; the '@/…' paths and
// the `AI` provider type are project-specific, so adjust them to your setup
// (`AI` is the createAI provider usually defined alongside this action).
import * as React from 'react'
import { nanoid } from 'nanoid'
import { createStreamableValue, getMutableAIState } from 'ai/rsc'
import { ChatOpenAI } from '@langchain/openai'
import { ChatPromptTemplate } from '@langchain/core/prompts'
import { BotMessage } from '@/components/stocks/message'
import { runAsyncFnWithoutBlocking } from '@/lib/utils'

async function submitUserMessage(content: string) {
  'use server'

  const aiState = getMutableAIState<typeof AI>()

  // Record the user's message in the shared AI state.
  aiState.update({
    ...aiState.get(),
    messages: [
      ...aiState.get().messages,
      {
        id: nanoid(),
        role: 'user',
        content
      }
    ]
  })

  // LangChain: prompt piped into a streaming chat model.
  const prompt = ChatPromptTemplate.fromMessages([
    [
      "system",
      "You are a helpful assistant. Be positive and speak about unicorns."
    ],
    ["human", "{input}"],
  ]);

  const llm = new ChatOpenAI({
    modelName: "gpt-4-0125-preview",
    streaming: true,
    temperature: 0.4,
  });

  const chain = prompt.pipe(llm);

  let textStream: undefined | ReturnType<typeof createStreamableValue<string>>
  let textNode: undefined | React.ReactNode

  // Kick off the LangChain stream without blocking the action's return,
  // so the streamable UI node can be handed to the client immediately.
  runAsyncFnWithoutBlocking(async () => {
    if (!textStream) {
      textStream = createStreamableValue('')
      textNode = <BotMessage content={textStream.value} />
    }

    const response = chain.streamEvents(
      { input: content },
      { version: "v1" }
    )

    // update() replaces the streamable value, so the chunks must be
    // accumulated; passing a single chunk would show only the latest token.
    let accumulated = ''

    for await (const event of response) {
      const eventType = event.event;

      if (eventType === "on_chain_stream") {
        accumulated += event.data.chunk.content;
        textStream.update(accumulated);
      } else if (eventType === "on_llm_end") {
        const message = event.data.output.generations[0][0].text;

        textStream.done();

        // Record the finished assistant message in the AI state.
        aiState.done({
          ...aiState.get(),
          messages: [
            ...aiState.get().messages,
            {
              id: nanoid(),
              role: 'assistant',
              content: message
            }
          ]
        })
      }
    }
  })

  return {
    id: nanoid(),
    display: textNode
  }
}
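One of the bugs worth calling out: `createStreamableValue().update()` replaces the current value rather than appending to it, so each streamed chunk has to be folded into an accumulator before updating, or the UI shows only the latest token. A dependency-free sketch of the pattern (`fakeChunks` and `collectSnapshots` are illustrative names, standing in for the streamEvents chunks and the `update()` calls):

```typescript
// Stand-in for the async iterator of chunks that streamEvents yields.
async function* fakeChunks(): AsyncGenerator<string> {
  yield 'Unicorns '
  yield 'are '
  yield 'great!'
}

// Returns every snapshot that would be passed to textStream.update():
// each one is the full text so far, not just the newest chunk.
async function collectSnapshots(): Promise<string[]> {
  const snapshots: string[] = []
  let accumulated = ''
  for await (const chunk of fakeChunks()) {
    accumulated += chunk        // accumulate, don't overwrite
    snapshots.push(accumulated) // what update() would receive
  }
  return snapshots
}
```

The same effect can be had with the streamable value's `append()` method, but accumulating explicitly also gives you the final text for the AI state in one place.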

nikohann commented on June 26, 2024

I have used the submitUserMessage method with LangChain's streamEvents. I may provide a code example later.

rogerodipo commented on June 26, 2024

Hey @nikohann ,
Great! Could you share the code example as soon as you have a chance to? Even if it's just the outline? I'm up against a deadline, and this would help me out a lot.
Thanks.

AmmarByFar commented on June 26, 2024

> There are a couple of serious bugs, but I think you will find them. […]

Yeah, I think I'm running into some really strange bugs. This works totally fine when running locally, but as soon as I push it to production it stops working. For some reason production doesn't seem to be streaming the results...

Not sure what's going on.
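A common cause when streaming works locally but not after deployment is that the route gets statically optimized, the serverless function times out mid-stream, or a proxy buffers the response. This is a guess at the cause, not a confirmed fix, but two standard Next.js route segment config exports are often the first thing to check:

```typescript
// Route segment config for the page/route hosting the server action.
// These are standard Next.js exports; whether they help depends on the
// deployment (proxy buffering can also be the culprit).
export const dynamic = 'force-dynamic' // opt out of static rendering
export const maxDuration = 60          // allow longer serverless execution (seconds)
```

If the route is already dynamic, the next thing to rule out is response buffering in whatever sits in front of the app (CDN, reverse proxy).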

elvenking commented on June 26, 2024

@AmmarByFar @nikohann Hi guys, have you figured out a stable solution? Thank you
