fluxml / flux.jl
Relax! Flux is the ML library that doesn't make you tensor
Home Page: https://fluxml.ai/
License: Other
Thanks for Flux, it looks great!
I had to clone the Flux and DataFlow git repos, but then Pkg.test("Flux") errors:
julia> Pkg.test("Flux")
INFO: Testing Flux
WARNING: could not import Juno.errmsg into Interpreter
WARNING: could not import Juno.errtrace into Interpreter
Test Summary: | Pass Total
Batching | 3 3
Basics: Error During Test
Test threw an exception of type DimensionMismatch
Expression: all(isapprox.(m(x),((m.W.x)' * x,(m.V.x)' * x)))
DimensionMismatch("arrays could not be broadcast to a common size")
in _bcs1(::Array{Float64,1}, ::Array{Float64,1}) at ./broadcast.jl:47
in _bcs at ./broadcast.jl:40 [inlined]
in broadcast_shape(::Tuple{Array{Float64,1},Array{Float64,1}}, ::Tuple{Array{Float64,1},Array{Float64,1}}) at ./broadcast.jl:34
in broadcast_t(::Function, ::Type{T}, ::Tuple{Array{Float64,1},Array{Float64,1}}, ::Vararg{Tuple{Array{Float64,1},Array{Float64,1}},N}) at ./broadcast.jl:228
in macro expansion; at /Users/feldt/.julia/v0.5/Flux/test/basic.jl:50 [inlined]
in macro expansion; at ./test.jl:674 [inlined]
in anonymous at ./<missing>:?
in include_from_node1(::String) at ./loading.jl:488
in include_from_node1(::String) at /Applications/Julia-0.5.1.app/Contents/Resources/julia/lib/julia/sys.dylib:?
in include_from_node1(::String) at ./loading.jl:488
in include_from_node1(::String) at /Applications/Julia-0.5.1.app/Contents/Resources/julia/lib/julia/sys.dylib:?
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
in _start() at /Applications/Julia-0.5.1.app/Contents/Resources/julia/lib/julia/sys.dylib:?
Thank you for this package; I like it a lot.
I was wondering how one would write the char-rnn example using the MXNet backend.
This is the code using the TensorFlow backend, which is pretty straightforward:
m = tf(unroll(model, nunroll))
The mxnet() function needs some additional arguments, which I don't understand in this example.
It would be great to
Coverage.jl
Hello,
is there a simple way to use segmented_sum in a model? I would like to use Flux to implement multiple-instance learning problems, as described in
Using Neural Network Formalism to Solve Multiple-Instance Problems.
Thank you very much for the help.
Tomas
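For context, by segmented_sum I mean an operation along these lines (a minimal sketch with names of my own, not Flux's API): it pools per-instance feature columns into per-bag features according to a segment index.

```julia
# Hypothetical sketch of a segmented sum for multi-instance learning:
# each column of X is an instance; segments[i] says which bag column i
# belongs to, and the output has one summed column per bag.
function segmented_sum(X::AbstractMatrix, segments::AbstractVector{<:Integer})
    nbags = maximum(segments)
    out = zeros(eltype(X), size(X, 1), nbags)
    for (i, s) in enumerate(segments)
        out[:, s] .+= X[:, i]
    end
    return out
end

X = [1.0 2.0 3.0; 4.0 5.0 6.0]  # 2 features × 3 instances
segmented_sum(X, [1, 1, 2])     # instances 1–2 form bag 1, instance 3 forms bag 2
```

A bag-level classifier would then run on the pooled columns.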
I am trying to run the MNIST example and get the following error when training the model.
On worker 2:
The Python TensorFlow package could not be imported. You must install Python TensorFlow before using this package.
in init at /home/user/.julia/v0.5/TensorFlow/src/py.jl:12
in #12 at /home/user/.julia/v0.5/TensorFlow/src/TensorFlow.jl:137
in #649 at ./multi.jl:1428
in run_work_thunk at ./multi.jl:1001
in run_work_thunk at ./multi.jl:1010 [inlined]
in #617 at ./event.jl:68
in #remotecall_wait#631(::Array{Any,1}, ::Function, ::Function, ::Base.Worker) at multi.jl:1095
in remotecall_wait(::Function, ::Base.Worker) at multi.jl:1086
in #remotecall_wait#634(::Array{Any,1}, ::Function, ::Function, ::Int64) at multi.jl:1105
in remotecall_wait(::Function, ::Int64) at multi.jl:1105
in eval(::Module, ::Any) at boot.jl:234
in load_python_process() at TensorFlow.jl:131
in gradients(::TensorFlow.Tensor, ::Array{Any,1}) at core.jl:1211
in compute_gradients(::TensorFlow.train.GradientDescentOptimizer, ::TensorFlow.Tensor, ::Void) at train.jl:48
in #minimize#1(::Void, ::Void, ::String, ::Function, ::TensorFlow.train.GradientDescentOptimizer, ::TensorFlow.Tensor) at train.jl:40
in #train!#11(::Int64, ::Float64, ::Flux.TF.##13#15, ::TensorFlow.train.GradientDescentOptimizer, ::Function, ::Flux.TF.Model, ::Array{Tuple{Array{Float64,1},Array{Int64,1}},1}, ::Array{Tuple{Array{Float64,1},Array{Int64,1}},1}) at model.jl:78
in (::Flux.#kw##train!)(::Array{Any,1}, ::Flux.#train!, ::Flux.TF.Model, ::Array{Tuple{Array{Float64,1},Array{Int64,1}},1}, ::Array{Tuple{Array{Float64,1},Array{Int64,1}},1}) at <missing>:0
in include_string(::String, ::String) at loading.jl:441
in include_string(::String, ::String, ::Int64) at eval.jl:28
in include_string(::Module, ::String, ::String, ::Int64, ::Vararg{Int64,N}) at eval.jl:32
in (::Atom.##53#56{String,Int64,String})() at eval.jl:40
in withpath(::Atom.##53#56{String,Int64,String}, ::String) at utils.jl:30
in withpath(::Function, ::String) at eval.jl:46
in macro expansion at eval.jl:57 [inlined]
in (::Atom.##52#55{Dict{String,Any}})() at task.jl:60
I'm probably getting a bit ahead of things here, since I'm using master of TensorFlow.jl (I'm on 0.6 and the latest release doesn't work). TensorFlow itself seems to be working just fine on master.
Anyway, I get the following error when attempting to train the MNIST example:
ERROR: LoadError: Tensorflow error: Status: You must feed a value for placeholder tensor 'placeholder_8' with dtype float
[[Node: placeholder_8 = Placeholder[_class=[], dtype=DT_FLOAT, shape=[], _device="/job:localhost/replica:0/task:0/gpu:0"]()]]
[[Node: placeholder_8/_17 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/cpu:0", send_device="/job:localhost/replica:0/task:0/gpu:0", send_device_incarnation=1, tensor_name="edge_3_placeholder_8", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/cpu:0"]()]]
Stacktrace:
[1] check_status at /home/expandingman/.julia/v0.6/TensorFlow/src/core.jl:402 [inlined]
[2] run(::TensorFlow.Session, ::Array{TensorFlow.Port,1}, ::Array{Any,1}, ::Array{TensorFlow.Port,1}, ::Array{Ptr{Void},1}) at /home/expandingman/.julia/v0.6/TensorFlow/src/run.jl:100
[3] run(::TensorFlow.Session, ::Array{TensorFlow.Tensor{Float32},1}, ::Dict{Any,Any}) at /home/expandingman/.julia/v0.6/TensorFlow/src/run.jl:169
[4] run(::TensorFlow.Session, ::TensorFlow.Tensor{Float32}, ::Dict{Any,Any}) at /home/expandingman/.julia/v0.6/TensorFlow/src/run.jl:187
[5] back!(::Flux.TF.Exec, ::Array{Float32,2}, ::Array{Float64,2}) at /home/expandingman/.julia/v0.6/Flux/src/backend/tensorflow/model.jl:44
[6] macro expansion at /home/expandingman/.julia/v0.6/Flux/src/training.jl:36 [inlined]
[7] macro expansion at /home/expandingman/.julia/v0.6/Juno/src/progress.jl:128 [inlined]
[8] macro expansion at /home/expandingman/.julia/v0.6/Flux/src/training.jl:15 [inlined]
[9] macro expansion at /home/expandingman/.julia/v0.6/Juno/src/progress.jl:128 [inlined]
[10] #train!#119(::Array{##3#4,1}, ::Int64, ::Float64, ::Function, ::Function, ::Flux.TF.Model, ::Array{Tuple{Array{Float64,1},Array{Int64,1}},1}) at /home/expandingman/.julia/v0.6/Flux/src/training.jl:29
[11] (::Flux.#kw##train!)(::Array{Any,1}, ::Flux.#train!, ::Flux.TF.Model, ::Array{Tuple{Array{Float64,1},Array{Int64,1}},1}) at ./<missing>:0
[12] include_from_node1(::String) at ./loading.jl:552
[13] include(::String) at ./sysimg.jl:14
while loading /home/expandingman/src/test_flux.jl, in expression starting on line 22
As I suspect this is mostly the fault of my insisting on using TensorFlow master as a result of being on 0.6, I'll probably look into patching this myself until the 0.6 ecosystem matures.
Update: it seems that placeholder_8 is a placeholder that gets added to the TensorFlow graph object but is disconnected from the actual graph (i.e. not used by any operations). Again, this is in the basic MNIST example. Apparently it is the placeholder for the gradients.
It would be useful to have a standard dropout layer. It should be easy to implement.
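For what it's worth, the standard trick is inverted dropout; a minimal standalone sketch (not a proposed Flux API) would be:

```julia
# Inverted dropout sketch: during training, zero each activation with
# probability p and rescale the survivors by 1/(1 - p), so the expected
# value of each activation is unchanged; at test time it is the identity.
struct Dropout
    p::Float64
end

function (d::Dropout)(x; training::Bool = true)
    training || return x
    mask = rand(size(x)...) .> d.p   # keep each entry with probability 1 - p
    return x .* mask ./ (1 - d.p)
end

drop = Dropout(0.5)
drop(ones(4, 4))                    # surviving entries become 2.0
drop(ones(4, 4), training = false)  # identity at test time
```

The rescaling means no extra correction is needed at inference time.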
Using the latest master on v0.6.0-rc2:
julia> using Flux; @net f(x) = x .* x; f_mxnet = mxnet(f);
julia> f_mxnet([1])
1-element Array{Float32,1}:
1.0
julia> f_mxnet(1)
signal (11): Segmentation fault
while loading no file, in expression starting on line 0
Segmentation fault (core dumped)
Hi, I get this error:
UndefVarError: unpack not defined
in makesession(::Flux.Unrolled) at /home/dom/.julia/v0.5/Flux/src/backend/tensorflow/recurrent.jl:13
in tf(::Flux.Unrolled) at /home/dom/.julia/v0.5/Flux/src/backend/tensorflow/recurrent.jl:28
in tf(::Flux.Unrolled, ::Vararg{Flux.Unrolled,N}) at /home/dom/.julia/v0.5/Flux/src/backend/backend.jl:10
when trying to run this section of code from the Char RNN example
model = Chain(
Input(N),
LSTM(N, 256),
LSTM(256, 256),
Affine(256, N),
softmax)
m = tf(unroll(model, nunroll))
@time Flux.train!(m, Xs, Ys, η = 0.1, epoch = 1)
I thought it may be due to a recent TensorFlow API change, since I'm running TensorFlow 1.0.1 and unpack was renamed to unstack in TensorFlow 1.0. I tried changing unpack to unstack in recurrent.jl and graph.jl, but it didn't seem to work.
I am not sure if I should be using Flux.jl with 0.5 yet, but:
julia> using Flux
INFO: Recompiling stale cache file /Users/viral/.julia/lib/v0.5/Media.ji for module Media.
WARNING: Method definition require(Symbol) in module Base at loading.jl:345 overwritten in module Juno at /Users/viral/.julia/v0.5/Requires/src/require.jl:12.
ERROR: LoadError: LoadError: Unsupported expression MacroTools.flatten in @>
in include_from_node1(::String) at ./loading.jl:488
in include_from_node1(::String) at /Users/viral/Desktop/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?
in include_from_node1(::String) at ./loading.jl:488
in include_from_node1(::String) at /Users/viral/Desktop/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?
in eval(::Module, ::Any) at ./boot.jl:234
in eval(::Module, ::Any) at /Users/viral/Desktop/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?
in require(::Symbol) at ./loading.jl:415
in require(::Symbol) at /Users/viral/Desktop/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?
while loading /Users/viral/.julia/v0.5/Flux/src/compiler/code.jl, in expression starting on line 7
while loading /Users/viral/.julia/v0.5/Flux/src/Flux.jl, in expression starting on line 12
In the Sequences section of the Recurrent docs it says:
You can get this behaviour more generally with the Over wrapper.
m = Over(Dense(10,5))
m(seq) # returns a new Seq of length 10
But I'm not sure what that's equivalent to. At first I thought it might be saying that it's the same as the last m model above, m = Flux.Recur(rnn, h), but now I'm thinking that it takes a model designed for a single timestep and makes it Seq-aware. In that case, though, I'd expect all the built-in layers in Flux to be Seq-aware by default, so you'd only need Over if you defined your own layer function.
Up to that point the docs do a great job of introducing new features as sugar for things we've already seen, but that one felt like a bit of a jump.
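To make the question concrete, my current reading of Over is something like this sketch (my own guess, not the actual Flux source):

```julia
# Guessed semantics of Over: wrap a single-timestep model so that calling
# it on a sequence applies the model independently at each timestep.
struct Over{T}
    model::T
end

(o::Over)(seq) = map(o.model, seq)

double(x) = 2 .* x
m = Over(double)
m([[1.0], [2.0], [3.0]])  # applies double to each timestep in turn
```

If that's right, Over is just a per-timestep map, with no state carried between steps (unlike Recur).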
Using broadcasting operators on 0.6 gives deprecation warnings, and they soon won't work at all once the .+ etc. function objects are removed. We also need a more generic way to handle arbitrary f.(xs) applications.
I suggest that DataFlow lower any broadcast f.(xs...) to Broadcast(f)(xs...), where Broadcast(f) is simply a wrapper around f. Calls to Broadcast can be appropriately overloaded, both in Julia code and in conversions to backends, and made to generate dot calls again when lowered back to syntax.
DataFlow now just creates explicit broadcast calls as part of desugaring.
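Concretely, the wrapper I have in mind is just a callable struct (a sketch of the proposal, assuming this design; note the name would shadow Base's exported Broadcast module if defined at top level):

```julia
# Sketch of the proposed lowering target: f.(xs...) becomes
# Broadcast(f)(xs...), where Broadcast(f) is a thin callable wrapper
# around f that backends can dispatch on or rewrite.
struct Broadcast{F}
    f::F
end

(b::Broadcast)(xs...) = broadcast(b.f, xs...)

Broadcast(+)([1, 2], [10, 20])  # same result as [1, 2] .+ [10, 20]
```

Backends can then overload calls involving Broadcast objects, and syntax printing can turn them back into dot calls.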
I had to install Juno.jl to use Flux.jl. Is this expected?
I am trying the example in docs/src/examples/char-rnn.md, up to the line
sequences((onehot(Float32, char, alphabet) for char in chars), nunroll)
but the sequences function doesn't exist in Flux. Could you please help?
See FluxML/NNlib.jl#9.
I'm getting the following error, code and versions to follow:
UndefVarError: stack not defined
in graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.Split, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:42
in interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.Split, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:58
in imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.Split, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/compiler/interp.jl:26
in (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.Split, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.Split, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:42
in graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Input{1}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:41
in interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Input{1}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:58
in imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Input{1}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/compiler/interp.jl:26
in (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Input{1}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Input{1}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:42
in interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:60
in imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/compiler/interp.jl:26
in (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:42
in graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:41
in interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:58
in imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/compiler/interp.jl:26
in (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:42
in interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:60
in imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/compiler/interp.jl:26
in (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:42
in graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:41
in interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:58
in imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/compiler/interp.jl:26
in (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:42
in interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:60
in imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/compiler/interp.jl:26
in (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:42
in graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:41
in interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:58
in imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/compiler/interp.jl:26
in (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:80
in (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:12
in interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:42
in interpret(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.IVertex{Any}, ::DataFlow.IVertex{Any}, ::Vararg{IVertex,N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:49
in interpret(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.IVertex{Any}, ::TensorFlow.Tensor{Float32}, ::Vararg{TensorFlow.Tensor{Float32},N}) at /home/peter/.julia/v0.5/DataFlow/src/interpreter.jl:52
in interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Chain, ::DataFlow.IVertex{Any}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:60
in tograph(::Flux.Chain, ::TensorFlow.Tensor{Float32}, ::Vararg{TensorFlow.Tensor{Float32},N}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/graph.jl:66
in #makesession#3(::TensorFlow.Session, ::Function, ::Flux.Chain, ::Array{TensorFlow.Tensor{Float32},1}) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/model.jl:11
in (::Flux.TF.#kw##makesession)(::Array{Any,1}, ::Flux.TF.#makesession, ::Flux.Chain, ::Array{TensorFlow.Tensor{Float32},1}) at ./<missing>:0
in #makesession#4(::TensorFlow.Session, ::Function, ::Flux.Chain, ::Int64) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/model.jl:17
in tf(::Flux.Chain) at /home/peter/.julia/v0.5/Flux/src/backend/tensorflow/model.jl:20
in tf(::Flux.Chain, ::Vararg{Flux.Chain,N}) at /home/peter/.julia/v0.5/Flux/src/backend/backend.jl:10
After executing this code:
using Flux, MNIST
using Flux: accuracy
data = [(trainfeatures(i), onehot(trainlabel(i), 0:9)) for i = 1:60_000]
train = data[1:50_000]
test = data[50_001:60_000]
m = Flux.Chain(
Input(784),
Affine(128), relu,
Affine( 64), relu,
Affine( 10), softmax)
model = Flux.tf(m) # Convert to TensorFlow
This is using Julia 0.5.2, Flux 0.2.1, and TensorFlow 0.6.3.
The following script produces a segfault; it seems to be related to show on some Flux object. I have run into it a couple of times in different situations.
https://gist.github.com/baggepinnen/5da18d7abecec112f532269d24a127d0
I was browsing the code and noticed that while the LSTMCell constructor takes an init argument, all of the layers are initialized with initn() instead of init():
Flux.jl/src/layers/recurrent.jl
Lines 65 to 68 in a60a754
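To illustrate the point, a hedged sketch of what the fix might look like (the field names and exact constructor shape here are illustrative only, not the actual code in recurrent.jl):

```julia
# Hypothetical sketch: thread the `init` keyword through to all weights
# instead of hard-coding `initn`, so a user-supplied `init` is honoured.
function LSTMCell(in::Integer, out::Integer; init = initn)
  Wx = init(out, in)   # currently created with initn(out, in)
  Wh = init(out, out)  # so a custom `init` is silently ignored
  b  = init(out)
  return (Wx, Wh, b)   # placeholder; the real constructor builds an LSTMCell
end
```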
Hi Mike,
Any plans to include the ability to create models with priors and distributions? Inference would then need facilities to sample from the posterior, such as MCMC, variational methods, etc.
On latest master with Julia 0.6.0-rc2:
julia> using Flux
INFO: Precompiling module Flux.
ERROR: LoadError: LoadError: BoundsError: attempt to access 0-element Array{IVertex,1} at index [1]
Stacktrace:
[1] include_from_node1(::String) at ./loading.jl:569
[2] include(::String) at ./sysimg.jl:14
[3] include_from_node1(::String) at ./loading.jl:569
[4] include(::String) at ./sysimg.jl:14
[5] anonymous at ./<missing>:2
while loading /Users/alha02/.julia/v0.6/Flux/src/layers/affine.jl, in expression starting on line 3
while loading /Users/alha02/.julia/v0.6/Flux/src/Flux.jl, in expression starting on line 30
ERROR: Failed to precompile Flux to /Users/alha02/.julia/lib/v0.6/Flux.ji.
Stacktrace:
[1] compilecache(::String) at ./loading.jl:703
[2] _require(::Symbol) at ./loading.jl:490
[3] require(::Symbol) at ./loading.jl:398
Tried removing the lib folder first, with the same result.
It would be great if we could have an implementation of elastic nets in this framework, using the MXNet or TensorFlow backends for performance.
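For reference, an elastic net is just a linear model whose loss carries a combined L1 and L2 penalty on the weights, so a minimal backend-agnostic sketch in Flux's style of the time (the model and penalty weights below are illustrative, not part of any existing API) might look like:

```julia
using Flux

# Illustrative linear model with tracked parameters.
W = param(randn(1, 10))
b = param(zeros(1))
m(x) = W * x .+ b

λ1, λ2 = 0.01, 0.01  # illustrative penalty weights
# Elastic net objective: squared error plus L1 and L2 penalties on W.
penalty() = λ1 * sum(abs, W) + λ2 * sum(abs2, W)
loss(x, y) = Flux.mse(m(x), y) + penalty()
```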
Currently:
julia> w = param(rand(5));
julia> w[1] = 3
ERROR: indexing not defined for TrackedArray{…,Array{Float64,1}}
Stacktrace:
[1] setindex!(::TrackedArray{…,Array{Float64,1}}, ::Int64, ::Int64) at ./abstractarray.jl:966
[2] macro expansion at ./REPL.jl:97 [inlined]
[3] (::Base.REPL.##1#2{Base.REPL.REPLBackend})() at ./event.jl:73
On the other hand, getindex is available:
julia> w[1]
Tracked 0-dimensional Array{Float64,0}:
0.686571
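As a workaround, one can mutate the underlying array directly, assuming the tracked array stores its values in a data field as Flux's tracker did at the time. Note this bypasses gradient tracking, so the write is invisible to backpropagation:

```julia
using Flux

w = param(rand(5))
w.data[1] = 3.0   # mutates the wrapped Array; Tracker does not record this write
```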
Hi, I got this error while trying to use sparse matrices and vectors with Flux.
julia> a = TrackedArray(sparse([1, 2, 3]))
Tracked 3-element SparseVector{Int64,Int64}:
1
2
3
julia> a[2]
ERROR: MethodError: no method matching spzeros(::Type{Int64})
Closest candidates are:
spzeros(::Type{T}, ::Integer) where T at sparse/sparsevector.jl:47
spzeros(::Type{Tv}, ::Integer, ::Integer) where Tv at sparse/sparsematrix.jl:1375
spzeros(::Type{Tv}, ::Type{Ti<:Integer}, ::Integer) where {Tv, Ti<:Integer} at sparse/sparsevector.jl:48
...
Stacktrace:
[1] getindex(::TrackedArray{…,SparseVector{Int64,Int64}}, ::Int64) at /home/user/.julia/v0.6/Flux/src/tracker/lib.jl:9
From what I know, the problem is a call to similar with an empty tuple as the last parameter (in the function toarray). In Julia there is no such thing as a 0-dimensional sparse array, AFAIK.
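The method error is easy to reproduce in isolation: spzeros has no zero-dimensional method, which is presumably what the empty-tuple similar call ends up hitting.

```julia
# On Julia 0.7+ this needs `using SparseArrays`; on 0.6 it is in Base.
spzeros(Int64, 3)  # fine: a 3-element SparseVector of zeros
spzeros(Int64)     # MethodError: no 0-dimensional sparse array exists
```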
Thank you
Hi,
Because the first argument of the train! function is named loss, I thought I needed to pass just the loss function (e.g. Flux.mse). Upon further reading I figured out that this wasn't the case, as seen in the MNIST example in the model zoo.
I think better naming or clearer documentation would help prevent this issue for future Flux beginners.
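For anyone else who hits this: the loss argument is the full objective closing over the model, not a bare metric. An illustrative sketch in the model-zoo style of the time (the model and data here are made up):

```julia
using Flux

m = Chain(Dense(10, 5, relu), Dense(5, 2), softmax)
loss(x, y) = Flux.crossentropy(m(x), y)    # full objective, closing over m
data = [(rand(10), Flux.onehot(1, 1:2))]   # toy (input, target) pair
Flux.train!(loss, data, SGD(params(m), 0.1))  # not Flux.train!(Flux.mse, ...)
```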
Thank you
I tried the https://github.com/MikeInnes/Flux.jl/blob/master/examples/MNIST.jl example with Julia 0.6, replacing mxnet(m) with tf(m), which seems buggy. My TensorFlow.jl was installed simply with Pkg.add("TensorFlow"), which didn't give me any errors. But
Flux.train!(model, train, η = 1e-3,
cb = [()->@show accuracy(m, test)])
gave me the following:
INFO: Epoch 1
WARNING: .+ is no longer a function object; use broadcast(+, ...) instead
Stacktrace:
[1] depwarn(::String, ::Symbol) at ./deprecated.jl:70
[2] (::Base.##712#713)(::TensorFlow.Tensor{Float32}, ::TensorFlow.Tensor{Float32}) at ./deprecated.jl:346
[3] graph(::Base.##712#713, ::TensorFlow.Tensor{Float32}, ::TensorFlow.Tensor{Float32}) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:26
[4] graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:44
[5] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:64
[6] imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/Flux/src/compiler/interp.jl:24
[7] (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[8] ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[9] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[10] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[11] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[12] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[13] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.Line, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:62
[14] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[15] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[16] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.Frame{Flux.Affine}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:62
[17] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[18] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[19] interpret(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:50
[20] interpret(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex, ::TensorFlow.Tensor{Float32}, ::Vararg{TensorFlow.Tensor{Float32},N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:53
[21] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:66
[22] imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/compiler/interp.jl:24
[23] (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[24] ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[25] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[26] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[27] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[28] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[29] graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:44
[30] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:64
[31] imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/compiler/interp.jl:24
[32] (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[33] ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[34] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[35] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[36] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[37] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[38] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:66
[39] imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/compiler/interp.jl:24
[40] (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[41] ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[42] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[43] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[44] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[45] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[46] graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:44
[47] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:64
[48] imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/compiler/interp.jl:24
[49] (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[50] ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[51] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[52] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[53] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[54] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[55] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:66
[56] imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/compiler/interp.jl:24
[57] (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[58] ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[59] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[60] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[61] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[62] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[63] graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:44
[64] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:64
[65] imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/compiler/interp.jl:24
[66] (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[67] ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[68] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[69] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[70] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[71] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[72] interpret(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:50
[73] interpret(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex, ::TensorFlow.Tensor{Float32}, ::Vararg{TensorFlow.Tensor{Float32},N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:53
[74] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Chain, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:66
[75] #tograph#3(::Bool, ::Function, ::Flux.Chain, ::TensorFlow.Tensor{Float32}, ::Vararg{TensorFlow.Tensor{Float32},N} where N) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:72
[76] #makesession#4(::TensorFlow.Session, ::Function, ::Flux.Chain, ::Tuple{Array{Float32,2}}) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/model.jl:14
[77] (::Flux.TF.Model)(::Array{Float64,2}, ::Vararg{Array{Float64,2},N} where N) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/model.jl:69
[78] macro expansion at /Users/kai/.julia/v0.6/Flux/src/training.jl:33 [inlined]
[79] macro expansion at /Users/kai/.julia/v0.6/Juno/src/progress.jl:128 [inlined]
[80] macro expansion at /Users/kai/.julia/v0.6/Flux/src/training.jl:15 [inlined]
[81] macro expansion at /Users/kai/.julia/v0.6/Juno/src/progress.jl:128 [inlined]
[82] #train!#119(::Array{##5#6,1}, ::Int64, ::Float64, ::Function, ::Function, ::Flux.TF.Model, ::Array{Tuple{Array{Float64,1},Array{Int64,1}},1}) at /Users/kai/.julia/v0.6/Flux/src/training.jl:29
[83] (::Flux.#kw##train!)(::Array{Any,1}, ::Flux.#train!, ::Flux.TF.Model, ::Array{Tuple{Array{Float64,1},Array{Int64,1}},1}) at ./<missing>:0
[84] include_string(::String, ::String) at ./loading.jl:515
[85] execute_request(::ZMQ.Socket, ::IJulia.Msg) at /Users/kai/.julia/v0.6/IJulia/src/execute_request.jl:160
[86] eventloop(::ZMQ.Socket) at /Users/kai/.julia/v0.6/IJulia/src/eventloop.jl:8
[87] (::IJulia.##11#14)() at ./task.jl:335
while loading In[8], in expression starting on line 1
WARNING: .+ is no longer a function object; use broadcast(+, ...) instead
Stacktrace:
[1] depwarn(::String, ::Symbol) at ./deprecated.jl:70
[2] (::Base.##712#713)(::TensorFlow.Tensor{Float32}, ::TensorFlow.Tensor{Float32}) at ./deprecated.jl:346
[3] graph(::Base.##712#713, ::TensorFlow.Tensor{Float32}, ::TensorFlow.Tensor{Float32}) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:26
[4] graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:44
[5] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:64
[6] imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/Flux/src/compiler/interp.jl:24
[7] (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[8] ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[9] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[10] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[11] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[12] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[13] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.Line, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:62
[14] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[15] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[16] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::DataFlow.Frame{Flux.Affine}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:62
[17] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[18] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[19] interpret(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex, ::IVertex, ::Vararg{IVertex,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:50
[20] interpret(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex, ::TensorFlow.Tensor{Float32}, ::Vararg{TensorFlow.Tensor{Float32},N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:53
[21] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:66
[22] imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/compiler/interp.jl:24
[23] (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[24] ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[25] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[26] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Flux.Affine, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[27] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[28] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
[29] graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:44
[30] interp(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/backend/tensorflow/graph.jl:64
[31] imap(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/Flux/src/compiler/interp.jl:24
[32] (::DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[33] ilambda(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[34] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[35] iline(::Function, ::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Function, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:81
[36] (::DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}})(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::Vararg{Any,N} where N) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:13
[37] interpv(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,Flux.TF.#interp}}}}, ::IVertex) at /Users/kai/.julia/v0.6/DataFlow/src/interpreter.jl:43
without doing any training.
julia> using Flux: onehot, onehotbatch
julia> using CuArrays
julia> x=cu(onehotbatch(1:3,1:4))
4×3 Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}:
true false false
false true false
false false true
false false false
julia> a=cu(rand(5,4))
5×4 CuArray{Float32,2}:
0.217491 0.0198405 0.881203 0.362562
0.966335 0.674233 0.0883822 0.461275
0.476342 0.58854 0.66264 0.731656
0.19153 0.894201 0.919836 0.0179432
0.201591 0.29232 0.863611 0.437137
julia> a*x
signal (11): Segmentation fault
while loading no file, in expression starting on line 0
createSentinel at /home/phuoc/packages/julia/usr/include/llvm/ADT/ilist.h:142 [inlined]
ensureHead at /home/phuoc/packages/julia/usr/include/llvm/ADT/ilist.h:147 [inlined]
getTail at /home/phuoc/packages/julia/usr/include/llvm/ADT/ilist.h:317 [inlined]
end at /home/phuoc/packages/julia/usr/include/llvm/ADT/ilist.h:366 [inlined]
end at /home/phuoc/packages/julia/usr/include/llvm/IR/BasicBlock.h:229 [inlined]
allocate_frame at /home/phuoc/packages/julia/src/llvm-gcroot.cpp:827
runOnFunction at /home/phuoc/packages/julia/src/llvm-gcroot.cpp:1235 [inlined]
runOnModule at /home/phuoc/packages/julia/src/llvm-gcroot.cpp:1209
_ZN4llvm6legacy15PassManagerImpl3runERNS_6ModuleE at /home/phuoc/packages/julia/usr/bin/../lib/libLLVM-3.9.so (unknown line)
LLVMRunPassManager at /home/phuoc/packages/julia/usr/bin/../lib/libLLVM-3.9.so (unknown line)
macro expansion at /home/phuoc/.julia/v0.6/LLVM/src/base.jl:22 [inlined]
LLVMRunPassManager at /home/phuoc/.julia/v0.6/LLVM/src/../lib/3.9/libLLVM_h.jl:2693 [inlined]
run! at /home/phuoc/.julia/v0.6/LLVM/src/passmanager.jl:34 [inlined]
#46 at /home/phuoc/.julia/v0.6/CUDAnative/src/jit.jl:365
Type at /home/phuoc/.julia/v0.6/LLVM/src/passmanager.jl:28
optimize! at /home/phuoc/.julia/v0.6/CUDAnative/src/jit.jl:333
unknown function (ip: 0x7f49d0aaab7d)
jl_call_fptr_internal at /home/phuoc/packages/julia/src/julia_internal.h:339 [inlined]
jl_call_method_internal at /home/phuoc/packages/julia/src/julia_internal.h:358 [inlined]
jl_apply_generic at /home/phuoc/packages/julia/src/gf.c:1933
#compile_function#51 at /home/phuoc/.julia/v0.6/CUDAnative/src/jit.jl:406
unknown function (ip: 0x7f49d0a5ee19)
jl_call_fptr_internal at /home/phuoc/packages/julia/src/julia_internal.h:339 [inlined]
jl_call_method_internal at /home/phuoc/packages/julia/src/julia_internal.h:358 [inlined]
jl_apply_generic at /home/phuoc/packages/julia/src/gf.c:1933
cufunction at /home/phuoc/.julia/v0.6/CUDAnative/src/jit.jl:459
macro expansion at /home/phuoc/.julia/v0.6/CUDAnative/src/execution.jl:108 [inlined]
_cuda at /home/phuoc/.julia/v0.6/CUDAnative/src/execution.jl:80
jl_call_fptr_internal at /home/phuoc/packages/julia/src/julia_internal.h:339 [inlined]
jl_call_method_internal at /home/phuoc/packages/julia/src/julia_internal.h:358 [inlined]
jl_apply_generic at /home/phuoc/packages/julia/src/gf.c:1933
_unsafe_getindex! at /home/phuoc/.julia/v0.6/CuArrays/src/indexing.jl:63
jl_call_fptr_internal at /home/phuoc/packages/julia/src/julia_internal.h:339 [inlined]
jl_call_method_internal at /home/phuoc/packages/julia/src/julia_internal.h:358 [inlined]
jl_apply_generic at /home/phuoc/packages/julia/src/gf.c:1933
jl_apply at /home/phuoc/packages/julia/src/julia.h:1424 [inlined]
jl_invoke at /home/phuoc/packages/julia/src/gf.c:51
macro expansion at ./multidimensional.jl:460 [inlined]
_unsafe_getindex at ./multidimensional.jl:453
macro expansion at ./multidimensional.jl:442 [inlined]
_getindex at ./multidimensional.jl:438 [inlined]
getindex at ./abstractarray.jl:882 [inlined]
* at /home/phuoc/.julia/v0.6/Flux/src/onehot.jl:21
unknown function (ip: 0x7f49d0a46fb6)
jl_call_fptr_internal at /home/phuoc/packages/julia/src/julia_internal.h:339 [inlined]
jl_call_method_internal at /home/phuoc/packages/julia/src/julia_internal.h:358 [inlined]
jl_apply_generic at /home/phuoc/packages/julia/src/gf.c:1933
do_call at /home/phuoc/packages/julia/src/interpreter.c:75
eval at /home/phuoc/packages/julia/src/interpreter.c:242
jl_interpret_toplevel_expr at /home/phuoc/packages/julia/src/interpreter.c:34
jl_toplevel_eval_flex at /home/phuoc/packages/julia/src/toplevel.c:577
jl_toplevel_eval_in at /home/phuoc/packages/julia/src/builtins.c:496
eval at ./boot.jl:235
unknown function (ip: 0x7f49f724c84f)
jl_call_fptr_internal at /home/phuoc/packages/julia/src/julia_internal.h:339 [inlined]
jl_call_method_internal at /home/phuoc/packages/julia/src/julia_internal.h:358 [inlined]
jl_apply_generic at /home/phuoc/packages/julia/src/gf.c:1933
eval_user_input at ./REPL.jl:66
unknown function (ip: 0x7f49f72ba95f)
jl_call_fptr_internal at /home/phuoc/packages/julia/src/julia_internal.h:339 [inlined]
jl_call_method_internal at /home/phuoc/packages/julia/src/julia_internal.h:358 [inlined]
jl_apply_generic at /home/phuoc/packages/julia/src/gf.c:1933
macro expansion at ./REPL.jl:97 [inlined]
#1 at ./event.jl:73
unknown function (ip: 0x7f49d09f4a3f)
jl_call_fptr_internal at /home/phuoc/packages/julia/src/julia_internal.h:339 [inlined]
jl_call_method_internal at /home/phuoc/packages/julia/src/julia_internal.h:358 [inlined]
jl_apply_generic at /home/phuoc/packages/julia/src/gf.c:1933
jl_apply at /home/phuoc/packages/julia/src/julia.h:1424 [inlined]
start_task at /home/phuoc/packages/julia/src/task.c:267
unknown function (ip: 0xffffffffffffffff)
Allocations: 17784899 (Pool: 17782775; Big: 2124); GC: 23
fish: “julia” terminated by signal SIGSEGV (Address boundary error)
The distinction between a parameter and a tracked parameter is not clear cut. For example, functions like params, forparams, and mapparams operate on any member of an object, whether it is a TrackedArray or not. On the other hand, the function param is essentially a constructor for a TrackedArray. But not every parameter needs to be tracked, and some confusion may arise:
julia> type Linear; w; b; end
julia> Flux.treelike(Linear) #needed by params
# only one tracked member
julia> lin = Linear(Flux.param(rand(2,2)), zeros(2))
Linear(param([0.563915 0.232109; 0.461798 0.7487]), [0.0, 0.0])
# returning both the tracked and non-tracked member
julia> params(lin)
2-element Array{Any,1}:
param([0.563915 0.232109; 0.461798 0.7487])
[0.0, 0.0]
One way out would be the following:
1. Keep params as it is. A parameter is generally something characterizing an internal state of a layer.
2. Rename param to track for creating TrackedArrays.
3. Add tracked, similar to params but returning only tracked variables.
4. Change opt=SGD(params(net)) to opt=SGD(tracked(net)) (also opt=SGD(net) may be convenient to have).
bye
C
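To make point 3 concrete, a minimal sketch of the suggested tracked function (hypothetical, not part of Flux's API) could simply filter the output of params:

```julia
# Hypothetical sketch of the proposed `tracked` helper (not Flux API):
# return only the TrackedArray members that `params` finds.
using Flux

tracked(m) = filter(p -> p isa Flux.Tracker.TrackedArray, params(m))

# With the Linear example above, this would return only the tracked weight:
# lin = Linear(Flux.param(rand(2,2)), zeros(2))
# tracked(lin)  # only param([...]), not the plain zeros(2) bias
```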
I have been testing out and playing with the Flux examples to get a feel for how things work. I'm particularly interested in the translation.jl example but noticed this disclaimer:
Note that this model exercises some of the more advanced parts of the compiler and isn't stable for general use yet.
I tried running it anyway just to see what would happen and didn't see any results. I do have a question about the input, though: I do not see any input source specified there, but then there are a few details I'm not familiar with. Is more needed to make this example work?
Hi,
Flux looks really cool and I like the simplicity of the API. I'm not sure if this is the right platform for this discussion, but it would be nice if some of the magic parts of Flux were a little easier to follow. I think it would help if there were fewer macros and better use of types and multiple dispatch. Some examples include:
- Limiting @net to only providing syntactic sugar rather than including other behaviour that isn't clearly explained in the docs (e.g., is there an easy way to reuse outputs more than once outside of the @net syntax?). More explicit use of the model type hierarchy would help, as, at least in the documentation, it is unclear that @net is also subtyping Flux.Model.
- An explicit backend interface, for example:
abstract Backend
type MXNet <: Backend end
function compile(backend::MXNet, m::Flux.Model)
# this is where all the scary inspection of your model code is done in order to
# compile it into an MXModel
# Return MXModel
end
function execute!(m::MXModel)
# ...
end
I'm still reading through the code, so I apologize if I'm missing some of the requirements that necessitate the heavy use of macros and code evaluation.
julia> using Flux
INFO: Precompiling module Flux.
ERROR: LoadError: LoadError: StackOverflowError:
Stacktrace:
[1] include_from_node1(::String) at ./loading.jl:552
[2] include(::String) at ./sysimg.jl:14
[3] include_from_node1(::String) at ./loading.jl:552
[4] include(::String) at ./sysimg.jl:14
[5] anonymous at ./<missing>:2
while loading /Users/alha02/.julia/v0.6/Flux/src/dims/catmat.jl, in expression starting on line 42
while loading /Users/alha02/.julia/v0.6/Flux/src/Flux.jl, in expression starting on line 17
ERROR: Failed to precompile Flux to /Users/alha02/.julia/lib/v0.6/Flux.ji.
Stacktrace:
[1] compilecache(::String) at ./loading.jl:686
[2] _require(::Symbol) at ./loading.jl:473
[3] require(::Symbol) at ./loading.jl:386
This was from a fresh clone from this repo and I am using julia 0.6 pre-beta 459 on OSX. Do you know what the problem might be?
It would be great if we could have an implementation of smoothing splines together with GCV in this framework, using the MXNet or TensorFlow backends for performance.
I am missing the calculation of Jacobians:
julia> using Flux
julia> using Flux: back!
julia> x = randn(2)
2-element Array{Float64,1}:
-0.488323
0.143252
julia> y = Dense(2,2)(x)
Tracked 2-element Array{Float64,1}:
-0.0087318
0.0192173
julia> back!(y)
ERROR: MethodError: no method matching back!(::TrackedArray{…,Array{Float64,1}})
Closest candidates are:
back!(::Flux.Tracker.TrackedArray, ::Any) at /home/fredrikb/.julia/v0.6/Flux/src/tracker/back.jl:39
back!(::Flux.Tracker.TrackedArray{T,0,A} where A where T) at /home/fredrikb/.julia/v0.6/Flux/src/tracker/back.jl:43
Furthermore, the following confuses me:
julia> x = param(randn(2))
Tracked 2-element Array{Float64,1}:
0.567788
-0.32716
julia> y = Dense(2,2)(x)
Tracked 2-element Array{Float64,1}:
0.00648598
-0.00119976
julia> back!(y[1])
julia> x.grad
2-element Array{Float64,1}:
-0.00194926
-0.000263109
while the following returns an error
julia> x[1].grad
ERROR: UndefRefError: access to undefined reference
It seems the gradient information is lost when accessing an element.
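For what it's worth, a Jacobian can be sketched by seeding back! with one basis vector per output, using the two-argument back!(y, Δ) method listed in the error message above. This is a rough sketch against Flux's 0.x tracker, not a tested implementation:

```julia
# Rough sketch (assumes Flux 0.x tracker semantics): build the Jacobian
# row by row by seeding the backward pass with unit vectors.
using Flux
using Flux.Tracker: back!

function jacobian(f, x)
    xt = param(x)               # track the input
    y  = f(xt)
    J  = zeros(length(y), length(x))
    for i in 1:length(y)
        Δ = zeros(length(y)); Δ[i] = 1
        back!(y, Δ)             # seed output i
        J[i, :] = xt.grad       # gradient of y[i] w.r.t. x
        xt.grad .= 0            # clear accumulated gradients between rows
    end
    J
end
```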
It would be helpful to have an embedding layer for a large-vocabulary input sequence (http://pytorch.org/docs/master/nn.html#sparse-layers):
Something like:
m = Chain(
Embed(V, E),
LSTM(E, 256),
Dense(256, V),
softmax)
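For illustration, a minimal Embed layer could be a lookup into a trainable weight matrix. This is a hypothetical sketch; the name Embed and its layout are assumptions, not Flux API at the time of this issue:

```julia
# Hypothetical Embed layer sketch: a table of E-dimensional vectors
# for a vocabulary of size V, indexed by integer token ids.
struct Embed
    W  # E × V weight matrix
end
Embed(V::Integer, E::Integer) = Embed(param(randn(E, V)))
(e::Embed)(x::Integer) = e.W[:, x]           # single token -> E-vector
(e::Embed)(x::AbstractVector) = e.W[:, x]    # sequence -> E × length(x)
```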
Line 42:
(m::Recur)(s::Seq) = Seq(map(m, x.data))
should be
(m::Recur)(s::Seq) = Seq(map(m, s.data))
I am trying to unroll LSTM but get the error below:
julia> using Flux
julia> using Flux.Compiler: unroll
julia> m = unroll(LSTM(2,2), 5)
ERROR: MethodError: no method matching postwalk(::Flux.Compiler.##17#18, ::Void)
WARNING: Error showing method candidates, aborted
Stacktrace:
[1] #unrollgraph#29(::Array{Any,1}, ::Function, ::Flux.Recur{Flux.LSTMCell{Flux.Dense{NNlib.#σ,TrackedArray{…,Array{Float64,2}},TrackedArray{…,Array{Float64,1}}},Flux.Dense{Base.#tanh,TrackedArray{…,Array{Float64,2}},TrackedArray{…,Array{Float64,1}}},TrackedArray{…,Array{Float64,1}}}}, ::Int64) at /home/phuoc/.julia/v0.6/Flux/src/compiler/loops.jl:159
[2] unroll(::Flux.Recur{Flux.LSTMCell{Flux.Dense{NNlib.#σ,TrackedArray{…,Array{Float64,2}},TrackedArray{…,Array{Float64,1}}},Flux.Dense{Base.#tanh,TrackedArray{…,Array{Float64,2}},TrackedArray{…,Array{Float64,1}}},TrackedArray{…,Array{Float64,1}}}}, ::Int64) at /home/phuoc/.julia/v0.6/Flux/src/compiler/loops.jl:162
Is there a quick fix? Many thanks.
Hi, I would like to add Knet as a backend.
Could anyone give me some advice on how and where to start?
Consider the following layer Res2, which contains two Dense layers in a chain, with the function of the layer being activation(x + chain(x)):
struct Res2
chain
end
Res2(inputwidth,width) = Res2(Chain(Dense(inputwidth,width,swish),Dense(width,inputwidth)))
(r::Res2)(x) = swish.(x .+ r.chain(x))
n = 5
np = 10
m = Chain(
Dense(n,np,swish),
Res2(np,np),
Res2(np,np),
Res2(np,np),
Res2(np,np),
Dense(np,n)
)
When I call params on the model m, the parameters of the Res2 layers do not show up. Is this a bug, or have I misunderstood how to compose layers?
julia> params(m)
4-element Array{Any,1}:
param([-0.00680718 -0.00810156 … -0.0130675 -0.0217635; 0.00893031 -0.0112561 … -0.0132078 0.000226305; … ; 0.00464268 0.0015668 … 0.00110918 0.00111789; -0.00921265 -0.00389085 … 0.0112846 -0.00805689])
param([-0.0169659, -0.0180195, 0.000163538, -0.0111752, 0.013014, 0.00813894, 0.00311244, -0.00264249, -0.00702083, 0.0110054])
param([0.0106695 -0.0045525 … -0.0077128 -0.0104561; -0.0212993 -0.0137948 … 0.00591983 -0.0233666; … ; 0.00780111 0.0193906 … 0.00700548 0.0125713; -0.0177297 -0.00128902 … -0.00923536 9.27876e-5])
param([0.00431632, 0.00832052, -0.0134394, -0.00536808, -0.0129291])
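If I read the Flux 0.x conventions right (compare the Linear example elsewhere on this page), the likely cause is that Res2 was never registered with treelike, so params cannot recurse into its fields. This is an assumption, not a confirmed diagnosis:

```julia
# Register Res2 so that `params` recurses into its fields
# (assuming Flux 0.x's treelike mechanism).
Flux.treelike(Res2)
params(m)  # should now also include the weights inside each Res2
```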
I have been looking forward to playing with Flux.jl for a while and decided now is the time. I summarize my first experiences from reading through the docs, to hopefully highlight some potential improvements to what I realize is a very early version of documentation.
- Models are sometimes written as x*W and sometimes as W*x, which confuses the reader (at least me) about input dimensions.
- I found that mymodel[1:3](x) calculates the output of the third layer. This is really convenient, but I couldn't find it in the docs. I am also interested in knowing how I calculate all (or some) intermediate layer activations without calculating anything twice, as in TensorFlow's run(session, [l1, l2, output], ...).
- The documentation for Flux.train! tells me how to specify the number of epochs, but I don't know if there's a default value or if it is determined automatically based on progress.
I find Flux very interesting, especially after having watched a presentation of Flux somewhere on YouTube, and hope to be able to contribute some time in the future!
Hi,
When running the MNIST example below, I get the following error:
ERROR: LoadError: MethodError: Cannot `convert` an object of type Flux.OneHotMatrix{Array{Flux.OneHotVector,1}} to an object of type CLArrays.CLArray
This may have arisen from a call to the constructor CLArrays.CLArray(...),
since type constructors fall back to convert methods.
Stacktrace:
[1] CLArrays.CLArray(::Flux.OneHotMatrix{Array{Flux.OneHotVector,1}}) at ./sysimg.jl:24
[2] include_from_node1(::String) at ./loading.jl:569
[3] include(::String) at ./sysimg.jl:14
[4] process_options(::Base.JLOptions) at ./client.jl:305
[5] _start() at ./client.jl:371
while loading /Users/rveltz/work/prog_gd/julia/flux-mnist-cl.jl, in expression starting on line 20
because the type OneHotMatrix has not been wrapped into CLArrays. Is it an easy fix?
Thank you for your help,
Best regards.
using Flux, MNIST
using Flux: onehotbatch, argmax, mse, throttle
using Base.Iterators: repeated
x, y = traindata()
y = onehotbatch(y, 0:9)
m = Chain(
Dense(28^2, 32, relu),
Dense(32, 10),
softmax)
# using CuArrays
# x, y = cu(x), cu(y)
# m = mapparams(cu, m)
using CLArrays
CLArrays.init(CLArrays.devices()[2])
cl = CLArray
x, y = cl(x), cl(y)
m = mapparams(cl, m)
loss(x, y) = mse(m(x), y)
dataset = repeated((x, y), 200)
evalcb = () -> @show(loss(x, y))
opt = SGD(params(m), 0.1)
Flux.train!(loss, dataset, opt, cb = throttle(evalcb, 5))
# Check the prediction for the first digit
argmax(m(x[:,1]), 0:9) == argmax(y[:,1], 0:9)
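One possible workaround (an assumption, not a confirmed fix) is to materialize the one-hot labels as a dense Float32 matrix before moving them to the device, so that no OneHotMatrix method is needed on the CLArray side:

```julia
# Workaround sketch: densify the one-hot matrix before transferring it,
# so CLArrays only ever sees a plain Float32 array.
y = onehotbatch(y, 0:9)
y = cl(Float32.(collect(y)))  # dense copy on the OpenCL device
```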
Fixed. Not sure how to delete this.
I got this error when trying a sequence-to-sequence model as follows:
encode = LSTM(N, 256)
decode = Chain(LSTM(N, 256), Dense(256, N), softmax)
encode.(xs)
decode[1].cell.h = encode.cell.h
decode[1].cell.c = encode.cell.c
Shouldn't the states (h, c) be kept outside the LSTM model?
On current master, the following functions are no longer defined:
- back! seems to have moved to FluxCore but is not exported
- onehot: the file containing the function seems not to be included
julia> Pkg.test("Flux")
INFO: Testing Flux
Test Summary: | Pass Total
Batching | 3 3
Test Summary: | Pass Total
Basics | 7 7
FeedForward interface: Error During Test
Got an exception of type DataFlow.Interpreter.Exception{AssertionError} outside of a @test
AssertionError: FullyConnected only accepts SymbolicNode either as positional or keyword arguments, not both.
in #FullyConnected#3931(::Array{Any,1}, ::Function, ::Type{MXNet.mx.SymbolicNode}, ::MXNet.mx.SymbolicNode, ::Vararg{MXNet.mx.SymbolicNode,N}) at symbolic-node.jl:654
in (::MXNet.mx.#kw##FullyConnected)(::Array{Any,1}, ::MXNet.mx.#FullyConnected, ::Type{MXNet.mx.SymbolicNode}, ::MXNet.mx.SymbolicNode, ::Vararg{MXNet.mx.SymbolicNode,N}) at <missing>:0
in #FullyConnected#3935(::Array{Any,1}, ::Function, ::MXNet.mx.SymbolicNode, ::Vararg{MXNet.mx.SymbolicNode,N}) at symbolic-node.jl:696
in (::MXNet.mx.#kw##FullyConnected)(::Array{Any,1}, ::MXNet.mx.#FullyConnected, ::MXNet.mx.SymbolicNode) at <missing>:0
in graph(::DataFlow.Interpreter.Context{DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iline,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ilambda,DataFlow.Interpreter.##1#2{Flux.#imap,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#iargs,DataFlow.Interpreter.##1#2{DataFlow.Interpreter.#ituple,Flux.MX.#graph′}}}}}}, ::Flux.Affine, ::MXNet.mx.SymbolicNode) at graph.jl:36
in #tograph#1(::Bool, ::Function, ::Flux.Capacitor, ::MXNet.mx.SymbolicNode, ::Vararg{MXNet.mx.SymbolicNode,N}) at /Users/michael/.julia/v0.5/Flux/src/backend/mxnet/graph.jl:87
in (::Flux.MX.#kw##tograph)(::Array{Any,1}, ::Flux.MX.#tograph, ::Flux.Capacitor, ::MXNet.mx.SymbolicNode, ::Vararg{MXNet.mx.SymbolicNode,N}) at ./<missing>:0
in #FeedForward#2(::Symbol, ::Symbol, ::MXNet.mx.Context, ::Type{T}, ::Flux.Chain) at /Users/michael/.julia/v0.5/Flux/src/backend/mxnet/model.jl:121
in macro expansion; at /Users/michael/.julia/v0.5/Flux/test/backend/mxnet.jl:26 [inlined]
in macro expansion; at ./test.jl:674 [inlined]
in macro expansion; at /Users/michael/.julia/v0.5/Flux/test/backend/mxnet.jl:25 [inlined]
in macro expansion; at ./test.jl:674 [inlined]
in anonymous at ./<missing>:?
in include_from_node1(::String) at ./loading.jl:488
in include_from_node1(::String) at /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?
in include_from_node1(::String) at ./loading.jl:488
in include_from_node1(::String) at /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
in _start() at /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?
INFO: The following warning is normal
[20:33:13] /Users/michael/.julia/v0.5/MXNet/deps/src/mxnet/dmlc-core/include/dmlc/./logging.h:300: [20:33:13] src/operator/tensor/./matrix_op-inl.h:460: Check failed: lshape[1] == rshape[0] (20 vs. 21) dot shape error: (1,20) X (21,15)
Stack trace returned 53 entries:
[bt] (0) 0 libmxnet.so 0x000000031865d4e8 _ZN4dmlc15LogMessageFatalD2Ev + 40
[bt] (1) 1 libmxnet.so 0x0000000318bbe02a _ZN5mxnet2op8DotShapeERKN4nnvm9NodeAttrsEPNSt3__16vectorINS1_6TShapeENS5_9allocatorIS7_EEEESB_ + 5050
[bt] (2) 2 libmxnet.so 0x0000000318ed957a _ZZN4nnvm4pass12_GLOBAL__N_19InferAttrINS_6TShapeEZNKS1_3$_0clENS_5GraphEEUlRKS3_E_DnEES5_OS5_T_PKcSC_SC_SC_SC_T0_T1_ENKUljbE_clEjb + 2698
[bt] (3) 3 libmxnet.so 0x0000000318ed802a _ZNSt3__110__function6__funcIN4nnvm4pass12_GLOBAL__N_13$_0ENS_9allocatorIS5_EEFNS2_5GraphES8_EEclEOS8_ + 2858
[bt] (4) 4 libmxnet.so 0x0000000318ec30cb _ZN4nnvm11ApplyPassesENS_5GraphERKNSt3__16vectorINS1_12basic_stringIcNS1_11char_traitsIcEENS1_9allocatorIcEEEENS6_IS8_EEEE + 1419
[bt] (5) 5 libmxnet.so 0x0000000318c32cb0 _ZN4nnvm9ApplyPassENS_5GraphERKNSt3__112basic_stringIcNS1_11char_traitsIcEENS1_9allocatorIcEEEE + 208
[bt] (6) 6 libmxnet.so 0x0000000318c36281 _ZN4nnvm4pass10InferShapeENS_5GraphENSt3__16vectorINS_6TShapeENS2_9allocatorIS4_EEEENS2_12basic_stringIcNS2_11char_traitsIcEENS5_IcEEEE + 929
[bt] (7) 7 libmxnet.so 0x0000000318c60c13 _ZN5mxnet4exec13GraphExecutor9InitGraphEN4nnvm6SymbolERKNS_7ContextERKNSt3__13mapINS7_12basic_stringIcNS7_11char_traitsIcEENS7_9allocatorIcEEEES4_NS7_4lessISE_EENSC_INS7_4pairIKSE_S4_EEEEEERKNS7_6vectorINS_7NDArrayENSC_ISP_EEEEST_RKNSO_INS_9OpReqTypeENSC_ISU_EEEEST_ + 4163
[bt] (8) 8 libmxnet.so 0x0000000318c5f1af _ZN5mxnet4exec13GraphExecutor4InitEN4nnvm6SymbolERKNS_7ContextERKNSt3__13mapINS7_12basic_stringIcNS7_11char_traitsIcEENS7_9allocatorIcEEEES4_NS7_4lessISE_EENSC_INS7_4pairIKSE_S4_EEEEEERKNS7_6vectorINS_7NDArrayENSC_ISP_EEEEST_RKNSO_INS_9OpReqTypeENSC_ISU_EEEEST_PNS_8ExecutorE + 111
[bt] (9) 9 libmxnet.so 0x0000000318c65f0d _ZN5mxnet8Executor4BindEN4nnvm6SymbolERKNS_7ContextERKNSt3__13mapINS6_12basic_stringIcNS6_11char_traitsIcEENS6_9allocatorIcEEEES3_NS6_4lessISD_EENSB_INS6_4pairIKSD_S3_EEEEEERKNS6_6vectorINS_7NDArrayENSB_ISO_EEEESS_RKNSN_INS_9OpReqTypeENSB_IST_EEEESS_PS0_ + 285
[bt] (10) 10 libmxnet.so 0x0000000318c213ac MXExecutorBindEX + 1852
[bt] (11) 11 libmxnet.so 0x0000000318c20be9 MXExecutorBind + 89
[bt] (12) 12 ??? 0x00000003165226d8 0x0 + 13259384536
[bt] (13) 13 ??? 0x0000000316522949 0x0 + 13259385161
[bt] (14) 14 libjulia.0.5.1.dylib 0x0000000106aeb608 jl_apply_generic + 1000
[bt] (15) 15 ??? 0x00000003165216f8 0x0 + 13259380472
[bt] (16) 16 ??? 0x00000003165219e3 0x0 + 13259381219
[bt] (17) 17 libjulia.0.5.1.dylib 0x0000000106aeb608 jl_apply_generic + 1000
[bt] (18) 18 ??? 0x000000031651cb2c 0x0 + 13259361068
[bt] (19) 19 ??? 0x000000031651cc7e 0x0 + 13259361406
[bt] (20) 20 ??? 0x0000000316500f1e 0x0 + 13259247390
[bt] (21) 21 ??? 0x00000003165438aa 0x0 + 13259520170
[bt] (22) 22 ??? 0x0000000316543e77 0x0 + 13259521655
[bt] (23) 23 libjulia.0.5.1.dylib 0x0000000106aeb608 jl_apply_generic + 1000
[bt] (24) 24 libjulia.0.5.1.dylib 0x0000000106b013b8 do_call + 200
[bt] (25) 25 libjulia.0.5.1.dylib 0x0000000106aff73c eval + 860
[bt] (26) 26 libjulia.0.5.1.dylib 0x0000000106b17aa7 jl_toplevel_eval_flex + 1511
[bt] (27) 27 libjulia.0.5.1.dylib 0x0000000106af9333 jl_toplevel_eval_in_warn + 899
[bt] (28) 28 ??? 0x00000003165430d5 0x0 + 13259518165
[bt] (29) 29 ??? 0x00000003164f9a1e 0x0 + 13259217438
[bt] (30) 30 ??? 0x00000003164fa720 0x0 + 13259220768
[bt] (31) 31 libjulia.0.5.1.dylib 0x0000000106b17a03 jl_toplevel_eval_flex + 1347
[bt] (32) 32 libjulia.0.5.1.dylib 0x0000000106af54fe jl_parse_eval_all + 1214
[bt] (33) 33 libjulia.0.5.1.dylib 0x0000000106b1813b jl_load_ + 187
[bt] (34) 34 sys.dylib 0x00000001085221bf julia_include_from_node1_20312 + 479
[bt] (35) 35 sys.dylib 0x00000001085223cc jlcall_include_from_node1_20312 + 12
[bt] (36) 36 libjulia.0.5.1.dylib 0x0000000106aeb608 jl_apply_generic + 1000
[bt] (37) 37 libjulia.0.5.1.dylib 0x0000000106b013b8 do_call + 200
[bt] (38) 38 libjulia.0.5.1.dylib 0x0000000106aff73c eval + 860
[bt] (39) 39 libjulia.0.5.1.dylib 0x0000000106b0102c eval_body + 1180
[bt] (40) 40 libjulia.0.5.1.dylib 0x0000000106b17ab2 jl_toplevel_eval_flex + 1522
[bt] (41) 41 libjulia.0.5.1.dylib 0x0000000106af54fe jl_parse_eval_all + 1214
[bt] (42) 42 libjulia.0.5.1.dylib 0x0000000106b1813b jl_load_ + 187
[bt] (43) 43 sys.dylib 0x00000001085221bf julia_include_from_node1_20312 + 479
[bt] (44) 44 sys.dylib 0x00000001085223cc jlcall_include_from_node1_20312 + 12
[bt] (45) 45 libjulia.0.5.1.dylib 0x0000000106aeb608 jl_apply_generic + 1000
[bt] (46) 46 sys.dylib 0x0000000108546f88 julia_process_options_21666 + 2440
[bt] (47) 47 sys.dylib 0x0000000108548b5d julia__start_21657 + 1213
[bt] (48) 48 sys.dylib 0x0000000108549559 jlcall__start_21657 + 9
[bt] (49) 49 libjulia.0.5.1.dylib 0x0000000106aeb608 jl_apply_generic + 1000
[bt] (50) 50 julia 0x0000000106ad01e8 true_main + 104
[bt] (51) 51 julia 0x0000000106ad015c main + 108
[bt] (52) 52 julia 0x0000000106ad00d4 start + 52
Test Summary: | Pass Error Total
MXNet | 9 1 10
Backward Pass | 5 5
FeedForward interface | 1 1
Stack Traces | 3 3
ERROR: LoadError: LoadError: Some tests did not pass: 9 passed, 0 failed, 1 errored, 0 broken.
in finish(::Base.Test.DefaultTestSet) at ./test.jl:498
in macro expansion; at ./test.jl:681 [inlined]
in anonymous at ./<missing>:?
in include_from_node1(::String) at ./loading.jl:488
in include_from_node1(::String) at /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?
in include_from_node1(::String) at ./loading.jl:488
in include_from_node1(::String) at /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
in _start() at /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?
while loading /Users/michael/.julia/v0.5/Flux/test/backend/mxnet.jl, in expression starting on line 4
while loading /Users/michael/.julia/v0.5/Flux/test/runtests.jl, in expression starting on line 19
================================[ ERROR: Flux ]=================================
failed process: Process(`/Applications/Julia-0.5.app/Contents/Resources/julia/bin/julia -Ccore2 -J/Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib --compile=yes --depwarn=yes --check-bounds=yes --code-coverage=none --color=yes --compilecache=yes /Users/michael/.julia/v0.5/Flux/test/runtests.jl`, ProcessExited(1)) [1]
================================================================================
ERROR: Flux had test errors
in #test#61(::Bool, ::Function, ::Array{AbstractString,1}) at ./pkg/entry.jl:749
in (::Base.Pkg.Entry.#kw##test)(::Array{Any,1}, ::Base.Pkg.Entry.#test, ::Array{AbstractString,1}) at ./<missing>:0
in (::Base.Pkg.Dir.##2#3{Array{Any,1},Base.Pkg.Entry.#test,Tuple{Array{AbstractString,1}}})() at ./pkg/dir.jl:31
in cd(::Base.Pkg.Dir.##2#3{Array{Any,1},Base.Pkg.Entry.#test,Tuple{Array{AbstractString,1}}}, ::String) at ./file.jl:59
in #cd#1(::Array{Any,1}, ::Function, ::Function, ::Array{AbstractString,1}, ::Vararg{Array{AbstractString,1},N}) at ./pkg/dir.jl:31
in (::Base.Pkg.Dir.#kw##cd)(::Array{Any,1}, ::Base.Pkg.Dir.#cd, ::Function, ::Array{AbstractString,1}, ::Vararg{Array{AbstractString,1},N}) at ./<missing>:0
in #test#3(::Bool, ::Function, ::String, ::Vararg{String,N}) at ./pkg/pkg.jl:258
in test(::String, ::Vararg{String,N}) at ./pkg/pkg.jl:258
versioninfo(true)
Julia Version 0.5.1
Commit 6445c82 (2017-03-05 13:25 UTC)
Platform Info:
OS: macOS (x86_64-apple-darwin13.4.0)
CPU: Intel(R) Core(TM) i7-5557U CPU @ 3.10GHz
WORD_SIZE: 64
uname: Darwin 16.4.0 Darwin Kernel Version 16.4.0: Thu Dec 22 22:53:21 PST 2016; root:xnu-3789.41.3~3/RELEASE_X86_64 x86_64 i386
Memory: 16.0 GB (1037.703125 MB free)
Uptime: 453797.0 sec
Load Avg: 1.72314453125 2.021484375 1.99951171875
Intel(R) Core(TM) i7-5557U CPU @ 3.10GHz:
speed user nice sys idle irq
#1 3100 MHz 232596 s 0 s 142994 s 1100455 s 0 s
#2 3100 MHz 79704 s 0 s 38395 s 1357794 s 0 s
#3 3100 MHz 212008 s 0 s 101053 s 1162834 s 0 s
#4 3100 MHz 83302 s 0 s 38650 s 1353939 s 0 s
BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
LAPACK: libopenblas64_
LIBM: libopenlibm
LLVM: libLLVM-3.7.1 (ORCJIT, broadwell)
Environment:
TERM = xterm-256color
PATH = /Applications/Julia-0.5.app/Contents/Resources/julia/bin:/Library/TeX/texbin:/Users/michael/anaconda/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/opt/X11/bin:/Library/TeX/texbin
XPC_FLAGS = 0x0
HOME = /Users/michael
FONTCONFIG_PATH = /Applications/Julia-0.5.app/Contents/Resources/julia/etc/fonts
Package Directory: /Users/michael/.julia/v0.5
59 required packages:
- ArrayViews 0.6.4
- Atom 0.5.9+ master
- BenchmarkTools 0.0.7
- Bio 0.4.7
- CSV 0.1.2
- Clustering 0.7.0
- ColorBrewer 0.3.0
- Contour 0.2.0
- CrossfilterCharts 0.1.1
- DataFrames 0.8.5+ master
- DataFramesMeta 0.1.3
- DataTables 0.0.1
- Distributions 0.12.1
- FactCheck 0.4.3
- FileIO 0.3.0
- Flux 0.1.0
- FreqTables 0.0.2
- GLM 0.6.1+ master
- GLPlot 0.0.4
- GLVisualize 0.2.2
- GR 0.19.0+ master
- Gadfly 0.5.3
- Glob 1.1.0
- Grid 0.4.2
- HypothesisTests 0.4.0
- IJulia 1.4.1
- ImageTransformations 0.1.0
- JLD 0.6.9
- LaTeXStrings 0.2.0
- Lint 0.2.5+ master
- MAT 0.3.1
- MXNet 0.2.1
- MarketData 0.6.0
- MixedModels 0.7.6
- NamedArrays 0.5.3
- NetCDF 0.4.1
- PhyloNetworks 0.5.0+ crsl4/master
- PlotRecipes 0.2.0+ master
- PlotThemes 0.1.1+ JuliaPlots/master
- PlotlyJS 0.5.2+ master
- ProfileView 0.1.5
- PyPlot 2.3.1
- Quandl 0.6.1
- Query 0.3.1
- RCall 0.6.4+ master
- RDatasets 0.2.0
- RecipesBase 0.1.0
- Reexport 0.0.3
- Requests 0.4.1
- Rsvg 0.0.2
- Shapefile 0.0.3
- SimJulia 0.3.14+ master
- SimpleTraits 0.3.0
- StaticArrays 0.3.0
- Stats 0.1.0
- TimeSeries 0.9.1+ JuliaStats/master
- TreeView 0.1.0
- UnicodePlots 0.2.3
- VisualRegressionTests 0.1.0
162 additional packages:
- ASTInterpreter 0.0.4
- AbstractTrees 0.0.4
- ArgParse 0.4.0
- AutoHashEquals 0.1.1
- AxisAlgorithms 0.1.6
- AxisArrays 0.0.5
- BGZFStreams 0.1.3
- BandedMatrices 0.2.1+ master
- BaseTestNext 0.2.2
- BinDeps 0.4.7
- Blink 0.5.1+ master
- Blosc 0.2.0
- BufferedStreams 0.3.0
- COFF 0.0.2
- CRC 1.2.0
- Cairo 0.2.35
- Calculus 0.2.1
- CatIndices 0.0.2
- CategoricalArrays 0.1.2
- CodeTools 0.4.3+ master
- Codecs 0.3.0
- ColorTypes 0.3.4
- ColorVectorSpace 0.4.0
- Colors 0.7.3
- Combinatorics 0.3.2
- CommonSubexpressions 0.0.1
- Compat 0.20.0
- Compose 0.4.5
- ComputationalResources 0.0.2
- Conda 0.5.1
- CoordinateTransformations 0.4.0
- CustomUnitRanges 0.0.4
- DWARF 0.1.0
- DataArrays 0.3.12
- DataFlow 0.1.0
- DataStreams 0.1.2
- DataStructures 0.5.3
- DiffBase 0.0.4
- Distances 0.4.1
- DocStringExtensions 0.3.1
- Documenter 0.9.0
- DualNumbers 0.3.0
- ELF 0.1.0
- EzXML 0.4.3
- FFTViews 0.0.2
- FixedPointNumbers 0.3.4
- FixedSizeArrays 0.2.5 master
- Formatting 0.2.0
- ForwardDiff 0.3.4
- FreeType 1.2.0 master
- FunctionWrappers 0.0.1
- GLAbstraction 0.3.1
- GLFW 1.2.2
- GLText 0.0.4
- GLWindow 0.3.2
- GZip 0.2.20
- Gallium 0.0.4+ master
- GeometryTypes 0.2.2 master
- Graphics 0.1.4
- Gtk 0.11.0
- GtkUtilities 0.2.2
- HDF5 0.7.3
- Hexagons 0.0.4
- Hiccup 0.1.1
- Homebrew 0.4.2
- HttpCommon 0.2.6
- HttpParser 0.2.0
- HttpServer 0.1.7
- ImageAxes 0.1.1
- ImageCore 0.1.4
- ImageFiltering 0.1.2
- ImageMetadata 0.2.1
- Images 0.8.0+ master
- ImmutableArrays 0.0.12
- IndexableBitVectors 0.1.1
- IndirectArrays 0.1.1
- Interpolations 0.3.8
- IntervalSets 0.0.3
- IntervalTrees 0.1.0
- Iterators 0.3.0
- JSON 0.8.3
- JuliaParser 0.7.4
- Juno 0.2.7+ master
- KernelDensity 0.3.2
- LNR 0.0.2
- Lazy 0.11.5 master
- LegacyStrings 0.2.0
- Libz 0.2.4
- LightGraphs 0.7.3
- LightXML 0.4.0
- LineSearches 0.1.5
- Loess 0.1.0
- MachO 0.0.4
- MacroTools 0.3.6
- MappedArrays 0.0.6
- MathProgBase 0.6.1
- MbedTLS 0.4.3
- Measures 0.0.3
- Media 0.2.5+ master
- MeshIO 0.0.6
- MetaPkg 0.0.0- master (unregistered)
- ModernGL 0.1.1
- Mustache 0.1.3
- Mux 0.2.3
- NLopt 0.3.4
- NaNMath 0.2.2
- NamedTuples 1.0.0
- NearestNeighbors 0.2.0
- Nettle 0.3.0
- NetworkLayout 0.0.1
- NullableArrays 0.1.0
- ObjFileBase 0.0.4
- OffsetArrays 0.2.14
- Optim 0.7.7
- PDMats 0.5.6
- Packing 0.0.4
- ParserCombinator 1.7.11
- PlotUtils 0.3.0+ JuliaPlots/master
- Plots 0.10.3+ JuliaPlots/master
- PolynomialFactors 0.0.3
- Polynomials 0.1.3
- PositiveFactorizations 0.0.4
- Primes 0.1.2
- PyCall 1.10.0
- QuadGK 0.1.1
- QuartzImageIO 0.2.1
- Quaternions 0.1.1
- RData 0.0.4
- RangeArrays 0.1.2
- Ratios 0.0.4
- Reactive 0.3.7 master
- Requires 0.3.0
- Rmath 0.1.6
- Roots 0.3.0
- Rotations 0.3.5
- SHA 0.3.2
- SIUnits 0.1.0
- ShowItLikeYouBuildIt 0.0.1
- Showoff 0.0.7
- SignedDistanceFields 0.1.0
- SortingAlgorithms 0.1.0
- SpatialEcology 0.0.0- use-views-for-subsetting (unregistered)
- SpecialFunctions 0.1.1
- StatPlots 0.2.1+ JuliaPlots/master
- StatsBase 0.13.1 JuliaStats/master
- StatsFuns 0.4.0
- StructIO 0.0.2
- TerminalUI 0.0.2
- TexExtensions 0.0.3
- TextWrap 0.1.6
- TikzGraphs 0.3.0
- TikzPictures 0.3.5
- TiledIteration 0.0.2
- Tokenize 0.1.6+ master
- URIParser 0.1.8
- UnicodeFun 0.0.2
- VT100 0.1.0
- VectorizedRoutines 0.0.2+ master
- WeakRefStrings 0.2.0
- WebSockets 0.2.1
- WoodburyMatrices 0.2.2
- ZMQ 0.4.1
Along with other recurrent layers we're missing; see the current implementations of LSTM and RNN for reference.
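As an example of the pattern, here is a minimal hand-rolled cell for a GRU (one commonly requested recurrent layer; the equations follow the standard formulation, and the struct and field names are illustrative, not Flux's internals):

```julia
# Minimal GRU cell: update gate z, reset gate r, candidate state hcand.
struct GRUCell
    Wz; Uz; bz   # update gate parameters
    Wr; Ur; br   # reset gate parameters
    Wh; Uh; bh   # candidate state parameters
end

sigm(x) = 1 / (1 + exp(-x))

function (c::GRUCell)(h, x)
    z     = sigm.(c.Wz*x .+ c.Uz*h .+ c.bz)
    r     = sigm.(c.Wr*x .+ c.Ur*h .+ c.br)
    hcand = tanh.(c.Wh*x .+ c.Uh*(r .* h) .+ c.bh)
    return (1 .- z) .* h .+ z .* hcand
end

# Usage: hidden size 3, input size 2.
c = GRUCell(randn(3,2), randn(3,3), randn(3),
            randn(3,2), randn(3,3), randn(3),
            randn(3,2), randn(3,3), randn(3))
h2 = c(randn(3), randn(2))
```

A Flux version would wrap the same step in the Recur machinery the way LSTM and RNN do.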
Trying to bundle the softmax activation in with a Dense() layer causes problems. Observe:
julia> model = Chain(Dense(10, 1), softmax); model(randn(10,10))
Tracked 1×10 Array{Float64,2}:
1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0
Versus:
julia> model = Chain(Dense(10, 1, softmax)); model(randn(10,10))
ERROR: MethodError: objects of type ForwardDiff.Dual{Void,Float64,2} are not callable
Stacktrace:
[1] (::Flux.##58#59{NNlib.#softmax})(::ForwardDiff.Dual{Void,Float64,2}, ::ForwardDiff.Dual{Void,Float64,2}) at ./<missing>:0
[2] broadcast_t(::Function, ::Type{Any}, ::Tuple{Base.OneTo{Int64},Base.OneTo{Int64}}, ::CartesianRange{CartesianIndex{2}}, ::Array{ForwardDiff.Dual{Void,Float64,2},2}, ::Array{ForwardDiff.Dual{Void,Float64,2},1}) at ./broadcast.jl:256
[3] broadcast_c at ./broadcast.jl:319 [inlined]
[4] broadcast at ./broadcast.jl:434 [inlined]
[5] tracked_broadcast(::Function, ::TrackedArray{…,Array{Float64,2}}, ::TrackedArray{…,Array{Float64,1}}) at /root/.julia/v0.6/Flux/src/tracker/lib.jl:107
[6] (::Flux.Dense{NNlib.#softmax,TrackedArray{…,Array{Float64,2}},TrackedArray{…,Array{Float64,1}}})(::Array{Float64,2}) at /root/.julia/v0.6/Flux/src/layers/basic.jl:73
[7] mapfoldl_impl(::Base.#identity, ::Flux.##55#56, ::Array{Float64,2}, ::Array{Any,1}, ::Int64) at ./reduce.jl:43
[8] (::Flux.Chain)(::Array{Float64,2}) at /root/.julia/v0.6/Flux/src/layers/basic.jl:30
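The first result is easy to explain: softmax normalizes each column, and with a single output row every column holds exactly one entry, so each normalizes to 1.0. A pure-Julia sketch of that behavior (no Flux needed; softmax_cols is an illustrative stand-in for NNlib's softmax, written in modern dims= syntax):

```julia
# Column-wise softmax, as applied in Chain(Dense(10, 1), softmax):
softmax_cols(x) = exp.(x) ./ sum(exp.(x), dims=1)

# With a single output row each column is a single number,
# so it normalizes to exactly 1.0.
softmax_cols([0.3 -1.2 2.5])  # 1×3 matrix of ones
```

The second error is a separate problem: passing softmax as the Dense activation makes the layer broadcast it element-wise, which softmax's array signature cannot support.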
When trying to run the very first piece of example code, @net f(x) = x .* x, I get a bewildering error. This is in Juno, but the same thing happens at the regular Julia command line. Pkg.test("Flux") results in all tests passing.
> @net f(x) = x .* x
Unsupported model expression f(x) = begin # console, line 1:
x .* x
end
in include_string(::String, ::String) at loading.jl:441
in include_string(::String, ::String) at sys.dylib:?
in eval(::Module, ::Any) at boot.jl:234
in eval(::Module, ::Any) at sys.dylib:?
in (::Atom.##65#68)() at eval.jl:102
in withpath(::Atom.##65#68, ::Void) at utils.jl:30
in withpath(::Function, ::Void) at eval.jl:38
in macro expansion at eval.jl:101 [inlined]
in (::Atom.##64#67{Dict{String,Any}})() at task.jl:60
Here's my Julia version:
Julia Version 0.5.2 (2017-05-06 16:34 UTC)
Official http://julialang.org/ release
x86_64-apple-darwin13.4.0
I have been trying to figure out how to use the weightdecay function and the Momentum optimizer.
It seems weightdecay should be passed to the optimiser function, e.g. when an optimizer is created. None of the optimizers, however, has an option to pass in a weight-decay parameter. Am I supposed to create a separate optimizer for weight decay and momentum that is called before the optimizer that calls descent?
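For what it's worth, L2 weight decay composes with any gradient-based update by adding λ·w to the gradient before the step. A hedged sketch of the arithmetic (sgd_decay!, η, and λ are illustrative names, not Flux's API):

```julia
# SGD step with L2 weight decay folded into the gradient:
# w ← w − η · (∇ + λ·w). A momentum optimizer would wrap
# the same modified gradient instead of the raw one.
function sgd_decay!(w, grad; η = 0.1, λ = 1e-4)
    w .-= η .* (grad .+ λ .* w)
    return w
end
```

So conceptually decay is a gradient transformation that runs before the optimizer's own update rule, which matches the "called before the optimizer" intuition.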
Not really an issue here, but maybe we want to reconsider the argument order in the constructor of the dense layer. Maybe it is just my personal taste, but it feels kind of unnatural to me that Dense(in, out) represents an out × in matrix; I'd rather have Dense(out, in).
I'm suggesting this because breaking things now is still quite OK, and this is a badly breaking change.
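For context, the weight is out × in because the layer computes W*x with x of length in:

```julia
# A Dense-style layer with in = 10, out = 5 holds a 5×10 weight matrix,
# so W*x maps a length-10 input to a length-5 output.
W = randn(5, 10)
x = randn(10)
length(W * x) == 5  # true
```

The disagreement is only about which order the constructor should take; the matrix shape is forced by the multiplication.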
There is an implementation in MXNet: apache/mxnet#6832
The eventual plan is to build a new compiler-level AD that better exploits Julia's compilation, provides a more functional interface, and supports nested differentiation. A question here is how to support the grad(f, x)-style interface while also still allowing abstraction and modularity in layers and their weights.
I see this looking something like:
W = randn(5,5)
b = randn(5)
loss(x, y) = mse(W*x .+ b, y)
dW, db = grad(loss, (W, b), x, y)
W and b are treated as implicit arguments to the function; this is nice in that it's essentially the ideal functional interface, but without the mess of hundreds of explicit arguments.
Models will implement params, as they do now, and whatever arrays they return will be treated as trainable parameters (dparams = grad(model, params(model), args...)). We'll also have a Freeze layer to treat things as constant, e.g. m = Freeze(Dense(10, 5)); params(m) == []. Freezing parameters is a little more coarse-grained compared to now, but that's a small loss compared to the gains.
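To make the implicit-argument idea concrete, here is a toy grad that closes over the loss and differentiates with respect to the given parameter arrays by central finite differences (purely illustrative; fd_grad is a made-up name and the real implementation would be AD, not numeric differencing):

```julia
# Finite-difference sketch of grad(loss, ps, args...): perturb each entry
# of each parameter array in place and measure the change in the loss.
function fd_grad(loss, ps::Tuple, args...)
    eps = 1e-6
    map(ps) do p
        g = similar(p, Float64)
        for i in eachindex(p)
            old = p[i]
            p[i] = old + eps; fplus  = loss(args...)
            p[i] = old - eps; fminus = loss(args...)
            p[i] = old
            g[i] = (fplus - fminus) / (2eps)
        end
        g
    end
end

# Usage: W is an implicit argument of loss, passed only via the tuple.
W = [3.0]
loss(x) = sum(abs2, W .* x)   # d/dW = 2·W·x²
dW, = fd_grad(loss, (W,), 2.0)
```

Note how the loss never takes W as an explicit argument; the parameter tuple is what ties the gradient request to the arrays, which is exactly the modularity question above.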