
Comments (5)

MikeInnes commented on May 5, 2024

RE this and #23: if this is easy to do in either TensorFlow or MXNet, then it should be pretty easy to add to Flux. Could you perhaps give an example of how that looks? If it's as simple as a function call, it's very easy to add and I can help you with that.


commented on May 5, 2024

Well, basically what I am looking for are these two packages:
https://github.com/simonster/Lasso.jl
https://github.com/nignatiadis/SmoothingSplines.jl
but accelerated with something like TensorFlow or MXNet running on GPUs.

As far as I know there are no calculations going on in there other than basic linear algebra, so as long as that is possible in those libraries, I guess it should be possible to do here. I don't know how much work it would be to implement, though.

But I did find this reference:
https://github.com/nfmcclure/tensorflow_cookbook,
where sections 6-8 in chapter 3 cover the gist of the first package.
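To give an idea of what I mean by basic linear algebra: the core of the lasso is just a soft-thresholded least-squares update. A rough sketch in plain Julia (my own illustrative code, not taken from Lasso.jl) would be something like:

# Rough sketch of lasso coordinate descent using only basic linear algebra.
# X is the design matrix, y the response, λ the regularisation strength.
soft_threshold(z, γ) = sign(z) * max(abs(z) - γ, 0)

function lasso_cd(X, y, λ; iters = 100)
    n, p = size(X)
    β = zeros(p)
    for _ in 1:iters, j in 1:p
        r = y - X * β + X[:, j] * β[j]                        # partial residual excluding feature j
        β[j] = soft_threshold(X[:, j]' * r / n, λ) / (X[:, j]' * X[:, j] / n)
    end
    return β
end

Every step in there is a matrix-vector product or a broadcast, which is exactly the kind of operation TensorFlow and MXNet provide.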


commented on May 5, 2024

By the way, I'd love to help out with this, but I'm not quite sure how to get started. Could you perhaps give me some pointers on how this could be integrated into Flux?


MikeInnes commented on May 5, 2024

Sure. I should really write some docs on this stuff, but here are the basics. At its core, you just want to write a function like:

@net f(x, y) = x .* y      # define the model as a plain Julia function
fm = mxnet(f)              # convert it to run on the MXNet backend
fm(rand(5), rand(5))       # call it like a normal function

with the inputs replaced by your real inputs, and the .* replaced by whatever linear algebra implements the algorithm you want.

All Flux needs to know is how to turn that linear algebra into a TensorFlow/MXNet graph. This is listed here for MXNet; you can see that we're just saying, given the .* function and two existing graph nodes (mx.Symbols or TensorFlow.Tensors), how to use MXNet's API to add that op to the graph. So it's pretty simple.

So all you need to do is (1) write the algorithm you want in pure Julia, using linear algebra that TensorFlow supports (it doesn't matter if it's slow), and (2) add @net, see where the conversion to TensorFlow fails, and add support for operations as required.
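Concretely, step (1) could be something like the soft-threshold step from the lasso sketch above, and step (2) is just wrapping it the same way as the snippet earlier. This is only a sketch; whether these particular broadcasts convert out of the box is exactly what step (2) would tell you:

# Sketch only: (1) the operation in plain Julia broadcasting / linear algebra,
# (2) wrapped with @net and handed to a backend, as in the earlier snippet.
@net soft_threshold(z, γ) = sign.(z) .* max.(abs.(z) .- γ, 0)

st = mxnet(soft_threshold)
st(randn(10), 0.1)

If the conversion complains about an unsupported operation, that's the point where you add a definition mapping it onto the MXNet/TensorFlow graph.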

Let me know if you get stuck anywhere and I can try to get you going with this.


MikeInnes commented on May 5, 2024

Closing this for now, as it's from pre-Julia-backend times. If you need custom functions like this, it'd probably be sensible to build them on CuArrays.
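A minimal sketch of what that looks like, assuming CuArrays and a CUDA-capable GPU are available (broadcasts and BLAS calls on CuArrays run on the device, so the linear-algebra steps discussed above carry over directly):

using CuArrays                       # GPU arrays; cu(...) moves data to the device

X = cu(randn(10_000, 200))
y = cu(randn(10_000))
β = cu(zeros(200))

r = y .- X * β                                   # matrix-vector product runs on the GPU
β .= sign.(β) .* max.(abs.(β) .- 0.1f0, 0f0)     # soft-threshold step as a fused GPU broadcast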

