
Wasm support · async-openai · 11 comments · closed

64bit commented on June 19, 2024
Wasm support


Comments (11)

64bit commented on June 19, 2024

Thank you 😌

Given that reqwest supports wasm, I would like to have wasm support too.

I'm not very familiar with the wasm ecosystem, but it seems tokio has work in progress for it: tokio-rs/tokio#4827

Perhaps a feature flag to switch between tokio and wasm as an initial starting point for wasm support?
Would love your input/ideas on implementation.
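
For illustration, a feature-flag layout along these lines could look roughly like the sketch below in Cargo.toml; the feature names and dependency selections are hypothetical, not the crate's actual manifest.

```toml
# Hypothetical sketch of a feature-gated manifest, not async-openai's real Cargo.toml.
[features]
default = ["native"]      # native builds pull in tokio
native = ["dep:tokio"]    # tokio only for non-wasm targets
wasm = []                 # wasm builds rely on reqwest's wasm backend

[dependencies]
reqwest = { version = "0.11", default-features = false, features = ["json"] }
tokio = { version = "1", features = ["fs", "io-util"], optional = true }
```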


Boscop commented on June 19, 2024

Yes, tokio could be an optional feature.
Wasm doesn't need the tokio runtime (and it wouldn't be desirable because of the bloat), but reqwest works on wasm as well.
I'm not sure which other deps are only needed outside of wasm.
If there are more deps that aren't needed on wasm, you could put them all under one feature named "runtime".
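
As an illustration, gating code on such a feature could look roughly like this; the module, feature name, and helper below are hypothetical:

```rust
// Hypothetical sketch: compile tokio-backed file helpers only when a
// "runtime" feature is enabled on a non-wasm target.
#[cfg(all(feature = "runtime", not(target_arch = "wasm32")))]
mod file_ops {
    use tokio::fs::File;
    use tokio::io::AsyncReadExt;

    /// Read a local file into memory, e.g. for multipart uploads.
    pub async fn read_bytes(path: &str) -> std::io::Result<Vec<u8>> {
        let mut buf = Vec::new();
        File::open(path).await?.read_to_end(&mut buf).await?;
        Ok(buf)
    }
}
```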


ifsheldon commented on June 19, 2024

I am maintaining the code for wasm support and am trying to stabilize the wasm target(s) in main. So perhaps we can discuss a more detailed plan here?

Current State

wasm32-unknown-unknown is already working. See the example openai-web-app. And to my knowledge, wasi support should just work since wasm32-unknown-unknown is the bare minimum.

Implementation Plan (not complete)

If you have something in mind, please make a comment or help out with the implementation.

Tracking List:

  • Wasi example(s) on AWS/Cloudflare:
    • Compiles to wasm32-wasi target
    • Cloudflare example #178
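
For reference, calling the API from a wasm32-unknown-unknown app looks roughly like the sketch below (as in the openai-web-app example). The constructor, builder, and logging calls follow the crate's public API and common wasm crates, but names can differ between versions, and key configuration is simplified.

```rust
// Rough sketch of a browser-side (wasm32-unknown-unknown) call.
// Assumes wasm-bindgen-futures and web-sys (with its "console" feature).
use async_openai::{types::CreateCompletionRequestArgs, Client};
use wasm_bindgen_futures::spawn_local;

pub fn ask(prompt: String) {
    // No tokio on wasm: hand the future to the browser's event loop instead.
    spawn_local(async move {
        let client = Client::new(); // configure the API key per your crate version's config API
        let request = CreateCompletionRequestArgs::default()
            .model("text-davinci-003")
            .prompt(prompt)
            .build()
            .expect("valid request");
        match client.completions().create(request).await {
            Ok(response) => web_sys::console::log_1(&format!("{:?}", response.choices).into()),
            Err(err) => web_sys::console::error_1(&format!("{err}").into()),
        }
    });
}
```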


cosmikwolf commented on June 19, 2024

I would love to be able to implement a different async solution.

I am building a new app and was trying to make it a non-async app that uses this crate by blocking on the async calls with futures::executor::block_on, and then I realized the tokio requirement.

I think futures-rs would be a great choice for async, as it is also compatible with no_std environments. I would love to have no_std async access to the OpenAI API.

https://github.com/rust-lang/futures-rs

My use case is personal devices that connect to the OpenAI API for voice-to-text and then to ChatGPT.

I think wasm devs would also appreciate using this crate.

I may try to help here at some point in the near future. I am currently making a bot to make code upgrades automatically, using your library, so maybe I will point it in this direction to test it out...
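
For context, the pattern described above looks like the sketch below. Note that it currently falls over with async-openai because reqwest's connector expects a running tokio reactor, which is the tokio requirement mentioned; the async body here is a placeholder.

```rust
// Driving an async call from synchronous code with futures' lightweight
// executor instead of a full tokio runtime. With async-openai today the
// real request would fail because reqwest needs a tokio reactor.
use futures::executor::block_on;

async fn answer() -> String {
    // placeholder for e.g. client.completions().create(request).await
    "hello".to_string()
}

fn main() {
    let reply = block_on(answer());
    println!("{reply}");
}
```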


64bit commented on June 19, 2024

Hi @cosmikwolf

It seems that support for a different async executor should be a separate issue?
Is that somehow related to WASM as well?


Doordashcon commented on June 19, 2024

Hello @64bit, I have little experience with the WASM architecture but would like to pick this up in the coming week.


64bit commented on June 19, 2024

Thank you @cosmikwolf and @Doordashcon for offering to contribute!

I'll let you guys coordinate on this thread.

To consider this resolved we should have at least one working example for AWS Lambda or Cloudflare (or both if you're feeling adventurous :))


ifsheldon commented on June 19, 2024

+1. I skimmed through the code searching for tokio, and most of the usage relates to files. So I guess the easiest first step is to gate file-related ops behind a feature with an optional tokio dependency. Those who want to upload/download audio/images would have to wait for a while, but text-only functions should just work on wasm, I guess?

Update:
except this one (and only this one), I guess:

pub(crate) async fn stream<O>(
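
One way that streaming helper could be made target-agnostic, sketched on the assumption that it spawns a background task to forward events (the actual function may be organized differently):

```rust
// Hypothetical sketch: choose a spawner per target so the streaming helper
// does not hard-depend on the tokio runtime.
#[cfg(not(target_arch = "wasm32"))]
fn spawn<F>(fut: F)
where
    F: std::future::Future<Output = ()> + Send + 'static,
{
    tokio::spawn(fut);
}

#[cfg(target_arch = "wasm32")]
fn spawn<F>(fut: F)
where
    F: std::future::Future<Output = ()> + 'static,
{
    wasm_bindgen_futures::spawn_local(fut);
}
```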


64bit commented on June 19, 2024

Getting started without streaming and file support, but testable through examples, would still be a good first step!


ifsheldon commented on June 19, 2024

Hi all! If you can help test #120 and/or try it on wasm, that would be great.


64bit commented on June 19, 2024

Updates from the release notes for 0.17.0:

WASM support lives in the experiments branch. To use it, please pin directly to a git sha in your Cargo.toml. Any discussion or issues related to WASM are welcome in #102. Any WASM-related PRs are welcome against the experiments branch.
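
Pinning to a specific commit in Cargo.toml looks like this; the sha below is a placeholder, and the repository URL is assumed from context.

```toml
[dependencies]
# Replace <commit-sha> with the actual commit from the experiments branch.
async-openai = { git = "https://github.com/64bit/async-openai", rev = "<commit-sha>" }
```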

