JuliaRobotics / IncrementalInference.jl
Clique recycling non-Gaussian (multi-modal) factor graph solver; also see Caesar.jl.
License: MIT License
autoinit keyword not working as intended?
The following code:
addNode!(fg, sym, Point2)
pp = PriorPoint2(MvNormal(pos, cov) )
addFactor!(fg, [sym;], pp, autoinit=true)
throws the error: BoundsError: attempt to access 0×100 Array{Float64,2} at index [1, 1]
when inferOverTree! tries to initialize the first Point2 :x1 again (even though it should already be initialized?).
Adding one line:
addNode!(fg, sym, Point2)
pp = PriorPoint2(MvNormal(pos, cov) )
addFactor!(fg, [sym;], pp, autoinit=true)
IncrementalInference.doautoinit!(fg,sym)
removes the error and initializes all the Point2 variables as expected.
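For reference, a minimal self-contained reproduction sketch of the workaround, assuming RoME provides Point2/PriorPoint2 and initfg() for graph construction, with illustrative pos and cov values:
using Distributions
using IncrementalInference, RoME
fg = initfg()
pos = [0.0; 0.0]                    # illustrative prior mean
cov = [0.01 0.0; 0.0 0.01]          # illustrative prior covariance
for sym in [:x1, :x2]
  addNode!(fg, sym, Point2)
  pp = PriorPoint2(MvNormal(pos, cov))
  addFactor!(fg, [sym;], pp, autoinit=true)
  IncrementalInference.doautoinit!(fg, sym)  # manual call removes the BoundsError
end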
evalFactor2 produced an out-of-bounds result (17 not < 15); see:
https://travis-ci.org/JuliaRobotics/IncrementalInference.jl/jobs/394191898
Visible in the DB version after inference has updated all local information on the DB.
module TestTest
using KernelDensityEstimate, HDF5
import KernelDensityEstimate: root
import HDF5: root
end
WARNING: ignoring conflicting import of HDF5.root into TestTest
# reverse produces
module TestTest2
using KernelDensityEstimate, HDF5
import HDF5: root
import KernelDensityEstimate: root
end
WARNING: ignoring conflicting import of KernelDensityEstimate.root into TestTest2
Importing these individually seems to work, but importing them together uncovers the conflict, although HDF5.root and KernelDensityEstimate.root are clearly different. Maybe the issue lies in dependencies lower down?
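A hedged workaround sketch: skip the conflicting imports entirely and bind each root through its module-qualified name instead.
module TestTest3
using KernelDensityEstimate, HDF5
# avoid `import ...: root` altogether; qualified access keeps both functions resolvable
const kde_root  = KernelDensityEstimate.root
const hdf5_root = HDF5.root
end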
The current code results in a bad graph with weird errors. Could also just add a test rather than a hard failure, in case the user is unsure.
Need to add these to the required installs:
sudo apt-get install libfontconfig1
sudo apt-get install gettext
sudo apt-get install libcairo2
sudo apt-get install libpango1.0-0
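These can also be installed in one command:
sudo apt-get install libfontconfig1 gettext libcairo2 libpango1.0-0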
Some node labels are longer than a single letter plus a number. For example, one may want to add node labels such as:
:x1
:s1
:p2
:offset3
:cal10_3
:pt100_1
Currently these are not all discovered as one would expect with ls(fg, sym, api=localapi).
Some kind of History object?
Seemingly after the partial constraints upgrade, some variables are left with large variance; likely a missed assignment somewhere.
Likely also associated with multisession, since that is when the problem appeared.
See testGenericWrapParam.jl for an example. When the root clique has two variables, a singleton and a pairwise constraint, the iteration does only one pass and then propagates belief.
Solving and DB all seem to run fine, but x1 should be much more tightly constrained by the prior. This used to work; likely an issue with a DataLayerAPI omission.
ensureAllInitialized iterates over all nodes and takes >1 second per node.
I am getting the following error when adding a factor under the default thread model (multithreaded):
julia> addFactor!(fg,[:p1_1,:p2_1],sf)
Error thrown in threaded loop on thread 2: Base.MethodError(f=typeof(Sonar.ominus)(), args=(Array{Float64, 1}[], Array{Float64, 1}[-8.6476e-11, -9.00376e-11, -7.83454e-11]), world=0x0000000000005bcd)
Error thrown in threaded loop on thread 3: Base.MethodError(f=typeof(Sonar.ominus)(), args=(Array{Float64, 1}[], Array{Float64, 1}[-6.63141e-12, 1.78934e-10, -2.86034e-11]), world=0x0000000000005bcd)
Error thrown in threaded loop on thread 0: Base.MethodError(f=typeof(Sonar.ominus)(), args=(Array{Float64, 1}[], Array{Float64, 1}[-8.6476e-11, -9.00376e-11, -7.83454e-11]), world=0x0000000000005bcd)
Error thrown in threaded loop on thread 1: Base.MethodError(f=typeof(Sonar.ominus)(), args=(Array{Float64, 1}[], Array{Float64, 1}[-6.63141e-12, 1.78934e-10, -2.86034e-11]), world=0x0000000000005bcd)[drct]
Error thrown in threaded loop on thread 1: Base.MethodError(f=typeof(Sonar.ominus)(), args=(Array{Float64, 1}[], Array{Float64, 1}[-1.757e-11, -6.54201e-11, 1.89608e-10]), world=0x0000000000005bcd)
Error thrown in threaded loop on thread 2: Base.MethodError(f=typeof(Sonar.ominus)(), args=(Array{Float64, 1}[], Array{Float64, 1}[-1.757e-11, -6.54201e-11, 1.89608e-10]), world=0x0000000000005bcd)
Error thrown in threaded loop on thread 3: Base.MethodError(f=typeof(Sonar.ominus)(), args=(Array{Float64, 1}[], Array{Float64, 1}[-1.757e-11, -6.54201e-11, 1.89608e-10]), world=0x0000000000005bcd)
Error thrown in threaded loop on thread 0: Base.MethodError(f=typeof(Sonar.ominus)(), args=(Array{Float64, 1}[], Array{Float64, 1}[-1.757e-11, -6.54201e-11, 1.89608e-10]), world=0x0000000000005bcd)
Error thrown in threaded loop on thread 3: Base.MethodError(f=Sonar.Point3Point3(Zij=Distributions.MvNormal{Float64, PDMats.PDMat{Float64, Array{Float64, 2}}, Array{Float64, 1}}(μ=Array{Float64, 1}[0, 0, 0], Σ=PDMats.PDMat{Float64, Array{Float64, 2}}(dim=3, mat=Array{Float64, 2}[1, 0, 0, 0, 1, 0, 0, 0, 1], chol=Base.LinAlg.Cholesky{Float64, Array{Float64, 2}}(factors=Array{Float64, 2}[1, 0, 0, 0, 1, 0, 0, 0, 1], uplo=Char(0x00000055))))), args=(Array{Float64, 1}[-6.77912e-11, 5.10863e-11, -5.84564e-11], IncrementalInference.FactorMetadata(factoruserdata=#<null>, variableuserdata=Array{Any, 1}[
RoME.Point3(dims=3, labels=Array{String, 1}[]),
RoME.Point3(dims=3, labels=Array{String, 1}[])], variablesmalldata=#<null>, solvefor=:p2_1, variablelist=Array{Symbol, 1}[
:p1_1,
:p2_1], dbg=false), 39, (Array{Float64, 2}[-1.60718, -0.827202, -0.560053, -1.0462, 1.18009, 1.21969, -1.81406, 0.362306, 0.128432, 0.728457, -0.165647, 0.719411, 2.05103, -1.85439, 0.261555, -0.301361, 0.602443, -1.38964, -1.0192, 1.11935, -0.387305, -0.24777, -0.894487, 0.727994, 0.392731, 1.50659, 0.629323, 0.850487, 0.373695, -0.487173, -0.933595, -0.940111, 0.584677, 1.25121, -1.79677, -1.14011, -0.103531, -0.742929, -1.12193, 0.860561, 0.605493, -2.10195, 0.00396295, 1.94143, 0.300195, 1.93993, -0.28632, -0.363934, 0.678636, -2.20704, 0.356065, 0.771001, -0.368969, 0.154438, 0.887804, -0.611348, 0.771007, 0.874874, -0.101308, -0.334538, -0.132682, 2.46788, 0.213221, -0.734656, -0.986243, -0.776817, -1.24047, -1.2143, 1.35807, 0.313686, 0.862886, 0.483994, 1.33577, -0.484943, -0.149989, -1.16246, -0.322664, -1.91383, -0.0700782, -0.589759, 0.494971, -0.668845, -0.255701, 1.5658, -0.942667, 0.954469, 0.0246011, -1.24151, -1.25566, 1.12589, -1.51048, 1.54752, -0.0774541, -0.381969, -0.697515, -0.208141, -0.631817, 1.53478, 0.0639398, 0.474297, -1.10951, 1.98979, -0.636412, -0.785697, 0.778927, -2.33789, 0.461508, 0.522421, 0.68605, -0.426236, -0.843736, 2.13346, -0.686019, -1.00737, -0.898299, 0.0109888, 1.12862, 2.30876, 0.898455, -0.0431169, 1.37305, 0.597929, 0.503052, 0.407354, 0.157966, -1.37054, -1.97835, 0.188369, 0.484641, -1.58252, 0.59619, 0.261537, -0.00671637, -1.12001, 0.404141, 0.789859, -0.0560027, -0.568049, 0.35966, 0.667681, 1.02214, 1.03014, -0.0972912, -0.400676, -0.962565, -1.01264, -0.335247, -0.0050443, 0.609426, 0.599653],), Array{Float64, 2}[-8.6476e-11, -9.00376e-11, -7.83454e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.63141e-12, 1.78934e-10, -2.86034e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -8.6476e-11, -9.00376e-11, -7.83454e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.63141e-12, 1.78934e-10, -2.86034e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], Array{Float64, 2}[-6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]), world=0x0000000000005bcd)
Error thrown in threaded loop on thread 2: Base.MethodError(f=Sonar.Point3Point3(Zij=Distributions.MvNormal{Float64, PDMats.PDMat{Float64, Array{Float64, 2}}, Array{Float64, 1}}(μ=Array{Float64, 1}[0, 0, 0], Σ=PDMats.PDMat{Float64, Array{Float64, 2}}(dim=3, mat=Array{Float64, 2}[1, 0, 0, 0, 1, 0, 0, 0, 1], chol=Base.LinAlg.Cholesky{Float64, Array{Float64, 2}}(factors=Array{Float64, 2}[1, 0, 0, 0, 1, 0, 0, 0, 1], uplo=Char(0x00000055))))), args=(Array{Float64, 1}[-6.77912e-11, 5.10863e-11, -5.84564e-11], IncrementalInference.FactorMetadata(factoruserdata=#<null>, variableuserdata=Array{Any, 1}[
RoME.Point3(dims=3, labels=Array{String, 1}[]),
RoME.Point3(dims=3, labels=Array{String, 1}[])], variablesmalldata=#<null>, solvefor=:p2_1, variablelist=Array{Symbol, 1}[
:p1_1,
:p2_1], dbg=false), 27, (Array{Float64, 2}[-1.60718, -0.827202, -0.560053, -1.0462, 1.18009, 1.21969, -1.81406, 0.362306, 0.128432, 0.728457, -0.165647, 0.719411, 2.05103, -1.85439, 0.261555, -0.301361, 0.602443, -1.38964, -1.0192, 1.11935, -0.387305, -0.24777, -0.894487, 0.727994, 0.392731, 1.50659, 0.629323, 0.850487, 0.373695, -0.487173, -0.933595, -0.940111, 0.584677, 1.25121, -1.79677, -1.14011, -0.103531, -0.742929, -1.12193, 0.860561, 0.605493, -2.10195, 0.00396295, 1.94143, 0.300195, 1.93993, -0.28632, -0.363934, 0.678636, -2.20704, 0.356065, 0.771001, -0.368969, 0.154438, 0.887804, -0.611348, 0.771007, 0.874874, -0.101308, -0.334538, -0.132682, 2.46788, 0.213221, -0.734656, -0.986243, -0.776817, -1.24047, -1.2143, 1.35807, 0.313686, 0.862886, 0.483994, 1.33577, -0.484943, -0.149989, -1.16246, -0.322664, -1.91383, -0.0700782, -0.589759, 0.494971, -0.668845, -0.255701, 1.5658, -0.942667, 0.954469, 0.0246011, -1.24151, -1.25566, 1.12589, -1.51048, 1.54752, -0.0774541, -0.381969, -0.697515, -0.208141, -0.631817, 1.53478, 0.0639398, 0.474297, -1.10951, 1.98979, -0.636412, -0.785697, 0.778927, -2.33789, 0.461508, 0.522421, 0.68605, -0.426236, -0.843736, 2.13346, -0.686019, -1.00737, -0.898299, 0.0109888, 1.12862, 2.30876, 0.898455, -0.0431169, 1.37305, 0.597929, 0.503052, 0.407354, 0.157966, -1.37054, -1.97835, 0.188369, 0.484641, -1.58252, 0.59619, 0.261537, -0.00671637, -1.12001, 0.404141, 0.789859, -0.0560027, -0.568049, 0.35966, 0.667681, 1.02214, 1.03014, -0.0972912, -0.400676, -0.962565, -1.01264, -0.335247, -0.0050443, 0.609426, 0.599653],), Array{Float64, 2}[-8.6476e-11, -9.00376e-11, -7.83454e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.63141e-12, 1.78934e-10, -2.86034e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -8.6476e-11, -9.00376e-11, -7.83454e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.63141e-12, 1.78934e-10, -2.86034e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], Array{Float64, 2}[-6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]), world=0x0000000000005bcd)
Error thrown in threaded loop on thread 0: Base.MethodError(f=Sonar.Point3Point3(Zij=Distributions.MvNormal{Float64, PDMats.PDMat{Float64, Array{Float64, 2}}, Array{Float64, 1}}(μ=Array{Float64, 1}[0, 0, 0], Σ=PDMats.PDMat{Float64, Array{Float64, 2}}(dim=3, mat=Array{Float64, 2}[1, 0, 0, 0, 1, 0, 0, 0, 1], chol=Base.LinAlg.Cholesky{Float64, Array{Float64, 2}}(factors=Array{Float64, 2}[1, 0, 0, 0, 1, 0, 0, 0, 1], uplo=Char(0x00000055))))), args=(Array{Float64, 1}[-6.77912e-11, 5.10863e-11, -5.84564e-11], IncrementalInference.FactorMetadata(factoruserdata=#<null>, variableuserdata=Array{Any, 1}[
RoME.Point3(dims=3, labels=Array{String, 1}[]),
RoME.Point3(dims=3, labels=Array{String, 1}[])], variablesmalldata=#<null>, solvefor=:p2_1, variablelist=Array{Symbol, 1}[
:p1_1,
:p2_1], dbg=false), 1, (Array{Float64, 2}[-1.60718, -0.827202, -0.560053, -1.0462, 1.18009, 1.21969, -1.81406, 0.362306, 0.128432, 0.728457, -0.165647, 0.719411, 2.05103, -1.85439, 0.261555, -0.301361, 0.602443, -1.38964, -1.0192, 1.11935, -0.387305, -0.24777, -0.894487, 0.727994, 0.392731, 1.50659, 0.629323, 0.850487, 0.373695, -0.487173, -0.933595, -0.940111, 0.584677,1.25121, -1.79677, -1.14011, -0.103531, -0.742929, -1.12193, 0.860561, 0.605493, -2.10195, 0.00396295, 1.94143, 0.300195, 1.93993, -0.28632, -0.363934, 0.678636, -2.20704, 0.356065, 0.771001, -0.368969, 0.154438, 0.887804, -0.611348, 0.771007, 0.874874, -0.101308, -0.334538, -0.132682, 2.46788, 0.213221, -0.734656, -0.986243, -0.776817, -1.24047, -1.2143, 1.35807, 0.313686, 0.862886, 0.483994, 1.33577, -0.484943, -0.149989, -1.16246, -0.322664, -1.91383, -0.0700782, -0.589759, 0.494971, -0.668845, -0.255701, 1.5658, -0.942667, 0.954469, 0.0246011, -1.24151,-1.25566, 1.12589, -1.51048, 1.54752, -0.0774541, -0.381969, -0.697515, -0.208141, -0.631817, 1.53478, 0.0639398, 0.474297, -1.10951, 1.98979, -0.636412, -0.785697, 0.778927, -2.33789, 0.461508, 0.522421, 0.68605, -0.426236, -0.843736, 2.13346, -0.686019, -1.00737, -0.898299, 0.0109888, 1.12862, 2.30876, 0.898455, -0.0431169, 1.37305, 0.597929, 0.503052, 0.407354, 0.157966, -1.37054, -1.97835, 0.188369, 0.484641, -1.58252, 0.59619, 0.261537, -0.00671637, -1.12001, 0.404141, 0.789859, -0.0560027, -0.568049, 0.35966, 0.667681, 1.02214, 1.03014, -0.0972912, -0.400676, -0.962565, -1.01264, -0.335247, -0.0050443, 0.609426, 0.599653],), Array{Float64, 2}[-8.6476e-11, -9.00376e-11, -7.83454e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.63141e-12, 1.78934e-10, -2.86034e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,0, 0, 0, 0, 0, -8.6476e-11, -9.00376e-11, -7.83454e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.63141e-12, 1.78934e-10, -2.86034e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], Array{Float64, 2}[-6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0,0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,-6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]), world=0x0000000000005bcd)
Error thrown in threaded loop on thread 1: Base.MethodError(f=Sonar.Point3Point3(Zij=Distributions.MvNormal{Float64, PDMats.PDMat{Float64, Array{Float64, 2}}, Array{Float64, 1}}(μ=Array{Float64, 1}[0, 0, 0], Σ=PDMats.PDMat{Float64, Array{Float64, 2}}(dim=3, mat=Array{Float64, 2}[1, 0, 0, 0, 1, 0, 0, 0, 1], chol=Base.LinAlg.Cholesky{Float64, Array{Float64, 2}}(factors=Array{Float64, 2}[1, 0, 0, 0, 1, 0, 0, 0, 1], uplo=Char(0x00000055))))), args=(Array{Float64, 1}[-6.77912e-11, 5.10863e-11, -5.84564e-11], IncrementalInference.FactorMetadata(factoruserdata=#<null>, variableuserdata=Array{Any, 1}[
RoME.Point3(dims=3, labels=Array{String, 1}[]),
RoME.Point3(dims=3, labels=Array{String, 1}[])], variablesmalldata=#<null>, solvefor=:p2_1, variablelist=Array{Symbol, 1}[
:p1_1,
:p2_1], dbg=false), 14, (Array{Float64, 2}[-1.60718, -0.827202, -0.560053, -1.0462, 1.18009, 1.21969, -1.81406, 0.362306, 0.128432, 0.728457, -0.165647, 0.719411, 2.05103, -1.85439, 0.261555, -0.301361, 0.602443, -1.38964, -1.0192, 1.11935, -0.387305, -0.24777, -0.894487, 0.727994, 0.392731, 1.50659, 0.629323, 0.850487, 0.373695, -0.487173, -0.933595, -0.940111, 0.584677, 1.25121, -1.79677, -1.14011, -0.103531, -0.742929, -1.12193, 0.860561, 0.605493, -2.10195, 0.00396295, 1.94143, 0.300195, 1.93993, -0.28632, -0.363934, 0.678636, -2.20704, 0.356065, 0.771001, -0.368969, 0.154438, 0.887804, -0.611348, 0.771007, 0.874874, -0.101308, -0.334538, -0.132682, 2.46788, 0.213221, -0.734656, -0.986243, -0.776817, -1.24047, -1.2143, 1.35807, 0.313686, 0.862886, 0.483994, 1.33577, -0.484943, -0.149989, -1.16246, -0.322664, -1.91383, -0.0700782, -0.589759, 0.494971, -0.668845, -0.255701, 1.5658, -0.942667, 0.954469, 0.0246011, -1.24151, -1.25566, 1.12589, -1.51048, 1.54752, -0.0774541, -0.381969, -0.697515, -0.208141, -0.631817, 1.53478, 0.0639398, 0.474297, -1.10951, 1.98979, -0.636412, -0.785697, 0.778927, -2.33789, 0.461508, 0.522421, 0.68605, -0.426236, -0.843736, 2.13346, -0.686019, -1.00737, -0.898299, 0.0109888, 1.12862, 2.30876, 0.898455, -0.0431169, 1.37305, 0.597929, 0.503052, 0.407354, 0.157966, -1.37054, -1.97835, 0.188369, 0.484641, -1.58252, 0.59619, 0.261537, -0.00671637, -1.12001, 0.404141, 0.789859, -0.0560027, -0.568049, 0.35966, 0.667681, 1.02214, 1.03014, -0.0972912, -0.400676, -0.962565, -1.01264, -0.335247, -0.0050443, 0.609426, 0.599653],), Array{Float64, 2}[-8.6476e-11, -9.00376e-11, -7.83454e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.63141e-12, 1.78934e-10, -2.86034e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -8.6476e-11, -9.00376e-11, -7.83454e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.63141e-12, 1.78934e-10, -2.86034e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], Array{Float64, 2}[-6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -6.77912e-11, 5.10863e-11, -5.84564e-11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]), world=0x0000000000005bcd)[2x0p,d3,N50],vertex [389] "p1_1p2_1f1"
It works fine when single-threaded:
julia> addFactor!(fg,[:p1_1,:p2_1],sf,threadmodel=IIF.SingleThreaded)
vertex [390] "p1_1p2_1f2"
Same as writeGraphPdf but for the Bayes tree, or a keyword argument to give wipeBuildNewTree! a filepath?
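A sketch of the proposed call, assuming a drawpdf-style keyword; filepath is the hypothetical addition this issue asks about:
tree = wipeBuildNewTree!(fg, drawpdf=true, filepath="/tmp/bt.pdf")  # filepath is proposed, not current API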
re #118
addprocs(1)
...
rmprocs([2])
...
addprocs(...)
Pkg.test("IncrementalInference")
Then the tests fail (on Travis-CI).
Do multi-process inference over tree
child.attributes["label"] = "x5,x2,l1,x4,: "
child.attributes["label"] = "x3,: x2,x4,l1,"
child.attributes["label"] = "x1,: x2,"
child.attributes["label"] = "x7,x6,: x5,"
Start Clique x7,x6,: x5, =============================
====================== Clique x7,x6,: x5, =============================
Start Clique x1,: x2, =============================
====================== Clique x1,: x2, =============================
Start Clique x3,: x2,x4,l1, =============================
====================== Clique x3,: x2,x4,l1, =============================
Start Clique x5,x2,l1,x4,: =============================
asyncProcessPostStacks -- 1, cliq=x5,x2,l1,x4,: , start on child x7,x6,: x5, haskey=false
End Clique x1,: x2, =============================
End Clique x3,: x2,x4,l1, =============================
From worker 3: up w 0 msgs------------- functional mcmc ----------------x1,: x2,
From worker 3: #1 --
From worker 3: fmcmc! -- finished on x1,: x2,
From worker 3: ------------- functional mcmc ----------------x1,: x2,
From worker 4: up w 0 msgs------------- functional mcmc ----------------x3,: x2,x4,l1,
From worker 4: #1 --
From worker 4: fmcmc! -- finished on x3,: x2,x4,l1,
From worker 4: ------------- functional mcmc ----------------x3,: x2,x4,l1,
From worker 3: #1 -- x1 [2x,d1,N200],
From worker 3: fmcmc! -- finished on x1,: x2,
From worker 3: ------------- functional mcmc ----------------x1,: x2,
From worker 3: #1 -- x2 [direct]
From worker 3: fmcmc! -- finished on x1,: x2,
From worker 4: #1 -- x3 [4x,d1,N200],
From worker 4: fmcmc! -- finished on x3,: x2,x4,l1,
From worker 4: ------------- functional mcmc ----------------x3,: x2,x4,l1,
From worker 4: #1 -- x2 [direct] x4 [direct] l1 [direct]
From worker 4: fmcmc! -- finished on x3,: x2,x4,l1,
No output has been received in the last 10m0s, this potentially indicates a stalled build or something wrong with the build itself.
Check the details on how to adjust your build configuration on: https://docs.travis-ci.com/user/common-build-problems/#Build-times-out-because-no-output-was-received
The build has been terminated
Running it locally and interrupting at this break point gives:
From worker 4: ------------- functional mcmc ----------------x3,: x2,x4,l1,
From worker 4: #1 -- x2 [direct] x4 [direct] l1 [direct]
From worker 4: fmcmc! -- finished on x3,: x2,x4,l1,
^C
signal (2): Interrupt
while loading /home/dehann/.julia/v0.5/IncrementalInference/test/fourdoortest.jl, in expression starting on line 71
unknown function (ip: 0x7f164d3c1568)
uv__epoll_wait at /home/centos/buildbot/slave/package_tarball64/build/deps/srccache/libuv-8d5131b6c1595920dd30644cd1435b4f344b46c8/src/unix/linux-syscalls.c:321
uv__io_poll at /home/centos/buildbot/slave/package_tarball64/build/deps/srccache/libuv-8d5131b6c1595920dd30644cd1435b4f344b46c8/src/unix/linux-core.c:267
uv_run at /home/centos/buildbot/slave/package_tarball64/build/deps/srccache/libuv-8d5131b6c1595920dd30644cd1435b4f344b46c8/src/unix/core.c:354
process_events at ./libuv.jl:82
wait at ./event.jl:147
wait at ./event.jl:27
wait at ./event.jl:319
sleep at ./event.jl:360
asyncProcessPostStacks! at /home/dehann/.julia/v0.5/IncrementalInference/src/SolveTree01.jl:530
unknown function (ip: 0x7f142df0a3f0)
jl_call_method_internal at /home/centos/buildbot/slave/package_tarball64/build/src/julia_internal.h:189 [inlined]
jl_apply_generic at /home/centos/buildbot/slave/package_tarball64/build/src/gf.c:1942
#37 at ./task.jl:360
unknown function (ip: 0x7f142df08aaf)
jl_call_method_internal at /home/centos/buildbot/slave/package_tarball64/build/src/julia_internal.h:189 [inlined]
jl_apply_generic at /home/centos/buildbot/slave/package_tarball64/build/src/gf.c:1942
jl_apply at /home/centos/buildbot/slave/package_tarball64/build/src/julia.h:1392 [inlined]
start_task at /home/centos/buildbot/slave/package_tarball64/build/src/task.c:253
unknown function (ip: 0xffffffffffffffff)
unknown function (ip: 0xffffffffffffffff)
Allocations: 40912850 (Pool: 40895095; Big: 17755); GC: 82
signal (11): Segmentation fault
while loading /home/dehann/.julia/v0.5/IncrementalInference/test/fourdoortest.jl, in expression starting on line 71
=====================================================================[ ERROR: IncrementalInference ]======================================================================
InterruptException:
==========================================================================================================================================================================
ERROR: IncrementalInference had test errors
in #test#61(::Bool, ::Function, ::Array{AbstractString,1}) at ./pkg/entry.jl:740
in (::Base.Pkg.Entry.#kw##test)(::Array{Any,1}, ::Base.Pkg.Entry.#test, ::Array{AbstractString,1}) at ./<missing>:0
in (::Base.Pkg.Dir.##2#3{Array{Any,1},Base.Pkg.Entry.#test,Tuple{Array{AbstractString,1}}})() at ./pkg/dir.jl:31
in cd(::Base.Pkg.Dir.##2#3{Array{Any,1},Base.Pkg.Entry.#test,Tuple{Array{AbstractString,1}}}, ::String) at ./file.jl:59
in #cd#1(::Array{Any,1}, ::Function, ::Function, ::Array{AbstractString,1}, ::Vararg{Array{AbstractString,1},N}) at ./pkg/dir.jl:31
in (::Base.Pkg.Dir.#kw##cd)(::Array{Any,1}, ::Base.Pkg.Dir.#cd, ::Function, ::Array{AbstractString,1}, ::Vararg{Array{AbstractString,1},N}) at ./<missing>:0
in #test#3(::Bool, ::Function, ::String, ::Vararg{String,N}) at ./pkg/pkg.jl:258
in test(::String, ::Vararg{String,N}) at ./pkg/pkg.jl:258
Implement autoinit using localProduct
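A minimal sketch of what this could look like, assuming localProduct(fg, sym) returns the product belief over the variable's locally connected factors (the exact return signature is an assumption):
for sym in [:x1, :x2, :x3]
  pd, = localProduct(fg, sym)        # product of beliefs from all locally connected factors
  setValKDE!(getVert(fg, sym), pd)   # install as the variable's initialization
end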
Remove deprecated code from FactorGraph01:addNode!, specifically code related to stdev, here:
setDefaultNodeData!(vert, zeros(dims,N), zeros(0,0), dodims, N, dims, initialized=!autoinit, softtype=st)
Use ContinuousScalar and autoinit=true. The new API is mostly backwards compatible with the previous interface strategy.
Create a graph with N=100, then try to run inferOverTree!(..., N=200) and:
ERROR: BoundsError: attempt to access 3x100 Array{Float64,2}:
29.0817 29.4922 29.6258 29.6568 29.4491 … 29.1488 29.3001 29.2623 30.166 31.2157
2.24395 2.10491 2.56677 1.61801 1.33273 2.59031 0.557518 3.43693 1.11443 2.60181
0.158318 0.161798 0.157094 0.15671 0.160954 0.158168 0.157333 0.157259 0.152206 0.16651
at index [Colon(),101]
in throw_boundserror at abstractarray.jl:156
in solveSetSeps at /home/dehann/.julia/v0.4/IncrementalInference/src/TreePotentials02.jl:181
in evalPotential at /home/dehann/.julia/v0.4/IncrementalInference/src/TreePotentials02.jl:243
in evalFactor2 at /home/dehann/.julia/v0.4/IncrementalInference/src/TreePotentials01.jl:149
in findRelatedFromPotential at /home/dehann/.julia/v0.4/IncrementalInference/src/TreePotentials01.jl:154
in packFromLocalPotentials! at /home/dehann/.julia/v0.4/IncrementalInference/src/SolveTree01.jl:120
in cliqGibbs at /home/dehann/.julia/v0.4/IncrementalInference/src/SolveTree01.jl:140
in fmcmc! at /home/dehann/.julia/v0.4/IncrementalInference/src/SolveTree01.jl:179
in upGibbsCliqueDensity at /home/dehann/.julia/v0.4/IncrementalInference/src/SolveTree01.jl:279
in upMsgPassingRecursive at /home/dehann/.julia/v0.4/IncrementalInference/src/SolveTree01.jl:458
in inferOverTreeR! at /home/dehann/.julia/v0.4/IncrementalInference/src/SolveTree01.jl:734
Define a factor where static calibration parameters (such as extrinsics) can be solved as a separate inference problem after the primary SLAM solution completes. These two solves could in theory iterate from one to the other. The benefit is that calibration parameters should not always just be part of the SLAM solve, even though calibration is very similar in spirit. For example:
fg = ...
addFactor!(fg, [:x1;:x2], SpecialFactor(..., specialVar) )
...
batchSolve!(fg)
# somehow trigger the internal Rao-Blackwellized calibration problem
batchSolveRaoBVars!(fg, [SpecialVarList] )
# now repeat the original SLAM problem with new "calibrated" values
batchSolve!(fg)
cc @pvazteixeira, @nicrip
Repeated constraints with potentially different values added to the graph.
So we do not have to fight over /tmp/fg.dot!
Calling inferOverTree!(...) without setting N results in an error.
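In the meantime, passing N explicitly avoids the error, e.g.:
inferOverTree!(fg, tree, N=100)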
Something like this:
fg = ...
dbgdest = Dict{Symbol, Any}()
tree = wipeBuildNewTree!(fg)
inferOverTree!(fg, tree, dbg=dbgdest)
dbgdest[:viatree][:cliq1][:up][...]
# or
dbgdest[:viafg][1]
This would also store contents contained in factor types that explicitly have the fieldname dbg.
MethodError: convert has no method matching convert(::Type{Array{IncrementalInference.CliqGibbsMC,1}}, ::Type{Union{}})
This may have arisen from a call to the constructor Array{IncrementalInference.CliqGibbsMC,1}(...),
since type constructors fall back to convert methods.
Closest candidates are:
call{T}(::Type{T}, ::Any)
convert{T}(::Type{Array{T,1}}, !Matched::Range{T})
convert{T,S,N}(::Type{Array{T,N}}, !Matched::SubArray{S,N,P<:AbstractArray{T,N},I<:Tuple{Vararg{Union{AbstractArray{T,1},Colon,Int64}}},LD})
...
in call at /home/dehann/.julia/v0.4/IncrementalInference/src/SolveTree01.jl:26
in upGibbsCliqueDensity at /home/dehann/.julia/v0.4/IncrementalInference/src/SolveTree01.jl:298
in anonymous at multi.jl:920
in run_work_thunk at multi.jl:661
in run_work_thunk at multi.jl:670
in anonymous at task.jl:58
in remotecall_fetch at multi.jl:747
in call_on_owner at multi.jl:793
in asyncProcessPostStacks! at /home/dehann/.julia/v0.4/IncrementalInference/src/SolveTree01.jl:625
Currently FactorGraph contains both.
Looks like many solver processes and visualization processes end up getting EADDRNOTAVAIL errors. Very sporadic.
ERROR: LoadError: connect: address not available (EADDRNOTAVAIL)
in connect! at socket.jl:648
in connect at socket.jl:671
in open_stream at /home/dehann/.julia/v0.4/Requests/src/streaming.jl:186
in do_stream_request at /home/dehann/.julia/v0.4/Requests/src/Requests.jl:350
in do_request at /home/dehann/.julia/v0.4/Requests/src/Requests.jl:290
in get at /home/dehann/.julia/v0.4/Requests/src/Requests.jl:413
in get at /home/dehann/.julia/v0.4/Requests/src/Requests.jl:412
in request at /home/dehann/.julia/v0.4/Neo4j/src/Neo4j.jl:174
in get_vertex at /home/dehann/.julia/v0.4/CloudGraphs/src/CloudGraphs.jl:260
in getExVertFromCloud at /home/dehann/.julia/v0.4/IncrementalInference/src/CloudGraphIntegration.jl:16
in getExVertFromCloud at /home/dehann/.julia/v0.4/IncrementalInference/src/CloudGraphIntegration.jl:21
[inlined code] from /home/dehann/.julia/v0.4/Caesar/src/DBCollectionsViewerService.jl:65
in anonymous at no file:0
in include at ./boot.jl:261
in include_from_node1 at ./loading.jl:320
in process_options at ./client.jl:280
in _start at ./client.jl:378
while loading /home/dehann/.julia/v0.4/Caesar/src/DBCollectionsViewerService.jl, in expression starting on line 41
and
ERROR: LoadError: connect: address not available (EADDRNOTAVAIL)
in connect! at socket.jl:648
in connect at socket.jl:671
in open_stream at /home/dehann/.julia/v0.4/Requests/src/streaming.jl:186
in do_stream_request at /home/dehann/.julia/v0.4/Requests/src/Requests.jl:350
in do_request at /home/dehann/.julia/v0.4/Requests/src/Requests.jl:290
in get at /home/dehann/.julia/v0.4/Requests/src/Requests.jl:413
in get at /home/dehann/.julia/v0.4/Requests/src/Requests.jl:412
in request at /home/dehann/.julia/v0.4/Neo4j/src/Neo4j.jl:174
in get_vertex at /home/dehann/.julia/v0.4/CloudGraphs/src/CloudGraphs.jl:260
in getExVertFromCloud at /home/dehann/.julia/v0.4/IncrementalInference/src/CloudGraphIntegration.jl:16
Some factors may have user-defined configuration files that, although saved in the jld's, change over time and are no longer compatible with the new computational types. The usual options include saving the IIF/RoME/Caesar git SHA and telling the user to go there. What else can we do?
cc @mc2922
In case the user wants to see only their own output; this won't be the default.
cc @mc2922
This initialization procedure with setValKDE! appears to take tight GPS priors into account correctly, whereas doautoinit! appears to initialize around (0,0) only.
for sm in [Symbol("x$i") for i in 1:5]
IIF.doautoinit!(fg,sm)
setValKDE!(getVert(fg,sm),kde!(getSample(getfnctype(getVert(fg, Symbol("$(sm)f1"),nt=:fnc)),N)[1]))
end
See: PropFac/examples/testBackwards.jl
Currently, node labels are assumed to start with either x or l, followed by numbers only. It would be useful to have a more flexible convention for node labels, supporting, for instance, things like p74_31.
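A hedged sketch of one possible permissive pattern; the prefix set and separator rules are assumptions, not a decided convention:
# matches :x1, :s1, :offset3, :cal10_3, :pt100_1, :p74_31, ...
const LABEL_PATTERN = r"^[A-Za-z]+\d+(_\d+)?$"
isnodelabel(sym::Symbol) = occursin(LABEL_PATTERN, string(sym))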
I tried adding the newest RoMEPlotting stuff, but there seem to be some conflicts with required versions. Here are some of them:
(v1.0) pkg> dev RoMEPlotting
Updating git-repo `https://github.com/JuliaRobotics/RoMEPlotting.jl.git`
[ Info: Path `/home/johan/.julia/dev/RoMEPlotting` exists and looks like the correct package, using existing path instead of cloning
Resolving package versions...
ERROR: Unsatisfiable requirements detected for package IncrementalInference [904591bb]:
IncrementalInference [904591bb] log:
├─possible versions are: [0.0.1-0.0.2, 0.1.0, 0.2.0-0.2.11, 0.3.0-0.3.9, 0.4.0-0.4.5] or uninstalled
├─restricted to versions 0.4.4-* by RoMEPlotting [238d586b], leaving only versions 0.4.4-0.4.5
│ └─RoMEPlotting [238d586b] log:
│ ├─possible versions are: 0.0.1 or uninstalled
│ └─RoMEPlotting [238d586b] is fixed to version 0.0.1+
└─restricted to versions 0.4.3 by an explicit requirement — no versions left
(v1.0) pkg> add IncrementalInference
Resolving package versions...
ERROR: Unsatisfiable requirements detected for package NLsolve [2774e3e8]:
NLsolve [2774e3e8] log:
├─possible versions are: [0.1.0-0.1.4, 0.2.0, 0.3.0-0.3.3, 0.4.0, 0.5.0, 0.6.0-0.6.1, 0.7.0-0.7.3, 0.8.0, 0.9.0-0.9.1, 0.10.0-0.10.1, 0.11.0, 0.12.0-0.12.1, 0.13.0, 0.14.0-0.14.1, 1.0.0-1.0.1, 1.1.0-1.1.1, 2.0.0-2.0.1, 2.1.0, 3.0.0-3.0.1] or uninstalled
├─restricted by julia compatibility requirements to versions: 3.0.0-3.0.1 or uninstalled
├─restricted by compatibility requirements with Caesar [62eebf14] to versions: [2.0.0-2.0.1, 2.1.0, 3.0.0-3.0.1], leaving only versions: 3.0.0-3.0.1
│ └─Caesar [62eebf14] log:
│ ├─possible versions are: [0.0.1-0.0.5, 0.1.0-0.1.1, 0.2.0-0.2.3] or uninstalled
│ └─restricted to versions 0.2.1 by an explicit requirement, leaving only versions 0.2.1
└─restricted by compatibility requirements with IncrementalInference [904591bb] to versions: 2.1.0 — no versions left
└─IncrementalInference [904591bb] log:
├─possible versions are: [0.0.1-0.0.2, 0.1.0, 0.2.0-0.2.11, 0.3.0-0.3.9, 0.4.0-0.4.5] or uninstalled
└─restricted to versions 0.4.5 by an explicit requirement, leaving only versions 0.4.5
This suddenly seems to fail for some reason:
using IncrementalInference
...
addprocs(3)
...
using IncrementalInference
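A hedged guess at a workaround, assuming the failure is the second using after changing the worker pool; loading the package only once, after all addprocs calls, may avoid the re-require:
addprocs(3)
using IncrementalInference   # single load, after the worker pool is final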
Possibly losing information, or not reconstructing the FactorGraph object, when loading from a jld2 file.
Some situations require the measurement sampling process to know which variable is being solved for.
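A hedged sketch of how this could look, reusing the FactorMetadata type (which already carries a solvefor field, as visible in the threaded-error dump above); the three-argument getSample signature, MyFactor, and its z1/z2 fields are all hypothetical illustrations, not current API:
using Distributions
struct MyFactor            # hypothetical factor with two measurement models
  z1::Normal{Float64}
  z2::Normal{Float64}
end
# proposed extension: the sampler receives the FactorMetadata of the current solve
function getSample(fct::MyFactor, N::Int, fmd::FactorMetadata)
  # branch the sampling process on which variable is currently being solved for
  fmd.solvefor == fmd.variablelist[1] && return (reshape(rand(fct.z1, N), 1, N), )
  return (reshape(rand(fct.z2, N), 1, N), )
end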
adopting issue defined in JuliaLang/METADATA.jl#19282
Local testing breaks in a newly created test file here (error below):
cc @pkofod -- thanks for taking a look.
I'm a little at a loss as to what is happening. I am using NLsolve in a functional way: not just passing one function to nlsolve, but programmatically building the required function and passing that ccw = CommonConvWrapper wrapper function to nlsolve(ccw, x0). ccw then internally does a few things and calls a function defined by the user with a much larger parameter list, which is in ccw.usrfnc!. This all works fine in NLsolve v2.1.0.
MethodError: no method matching iterate(::Nothing)
Closest candidates are:
iterate(!Matched::Core.SimpleVector) at essentials.jl:589
iterate(!Matched::Core.SimpleVector, !Matched::Any) at essentials.jl:589
iterate(!Matched::ExponentialBackOff) at error.jl:171
...
copyto!(::Array{Float64,1}, ::Nothing) at abstractarray.jl:646
(::getfield(NLsolve, Symbol("#f!#1")){CommonConvWrapper{OneDimensionTest{Normal{Float64}}}})(::Array{Float64,1}, ::Array{Float64,1}) at helpers.jl:6
(::getfield(NLSolversBase, Symbol("#fj!#19")){getfield(NLsolve, Symbol("#f!#1")){CommonConvWrapper{OneDimensionTest{Normal{Float64}}}},DiffEqDiffTools.JacobianCache{Array{Float64,1},Array{Float64,1},Array{Float64,1},Val{:central},Float64,Val{true}}})(::Array{Float64,1}, ::Array{Float64,2}, ::Array{Float64,1}) at oncedifferentiable.jl:93
value_jacobian!!(::OnceDifferentiable{Array{Float64,1},Array{Float64,2},Array{Float64,1}}, ::Array{Float64,1}, ::Array{Float64,2}, ::Array{Float64,1}) at interface.jl:130
value_jacobian!! at interface.jl:128 [inlined]
trust_region_(::OnceDifferentiable{Array{Float64,1},Array{Float64,2},Array{Float64,1}}, ::Array{Float64,1}, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Float64, ::Bool, ::NLsolve.NewtonTrustRegionCache{Array{Float64,1}}) at trust_region.jl:119
trust_region at trust_region.jl:229 [inlined]
trust_region at trust_region.jl:229 [inlined]
#nlsolve#14(::Symbol, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Static, ::Float64, ::Bool, ::Int64, ::Float64, ::typeof(nlsolve), ::OnceDifferentiable{Array{Float64,1},Array{Float64,2},Array{Float64,1}}, ::Array{Float64,1}) at nlsolve.jl:23
#nlsolve at nlsolve.jl:0 [inlined]
#nlsolve#15 at nlsolve.jl:60 [inlined]
nlsolve(::Function, ::Array{Float64,1}) at nlsolve.jl:50
top-level scope at none:0
include_string(::Module, ::String, ::String) at loading.jl:1005
include_string(::Module, ::String, ::String, ::Int64) at eval.jl:30
(::getfield(Atom, Symbol("##114#119")){String,Int64,String})() at eval.jl:94
withpath(::getfield(Atom, Symbol("##114#119")){String,Int64,String}, ::String) at utils.jl:30
withpath at eval.jl:46 [inlined]
#113 at eval.jl:93 [inlined]
with_logstate(::getfield(Atom, Symbol("##113#118")){String,Int64,String}, ::Base.CoreLogging.LogState) at logging.jl:397
with_logger at logging.jl:493 [inlined]
#112 at eval.jl:92 [inlined]
hideprompt(::getfield(Atom, Symbol("##112#117")){String,Int64,String}) at repl.jl:85
macro expansion at eval.jl:91 [inlined]
(::getfield(Atom, Symbol("##111#116")))(::Dict{String,Any}) at eval.jl:86
handlemsg(::Dict{String,Any}, ::Dict{String,Any}) at comm.jl:164
(::getfield(Atom, Symbol("##19#21")){Array{Any,1}})() at task.jl:259
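For reference, a minimal self-contained sketch of the callable-wrapper pattern described above, with hypothetical names (MiniCCW stands in for CommonConvWrapper); it assumes the newer NLsolve in-place convention f!(residual, x), since the argument order changed across NLsolve versions and may be relevant to this break:
using NLsolve
struct MiniCCW
  usrfnc!::Function   # user-defined residual with a larger internal parameter list
end
function (ccw::MiniCCW)(res::Vector{Float64}, x::Vector{Float64})
  ccw.usrfnc!(res, x)   # mutate res in place; do not rely on the return value
  return nothing
end
ccw = MiniCCW((res, x) -> (res[1] = x[1]^2 - 2.0))
sol = nlsolve(ccw, [1.0])   # converges to sqrt(2) under f!(residual, x)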
For example, a sequence of poses joined by factors and a separate landmark with a prior.
attributes["numposes"] = 0 if MM created landmark
It is possible to change the data types in IIF such that loadjld no longer works. Augment the existing save/loadjld functions to also load an "older" existing jld file.
Defaults to 200 in some instances, particularly in JuliaRobotics/RoME.jl/test/testDidsonFunctions.jl before the line:
L1pts = evalFactor2(fg, f2, fg.IDs[:l1])
FactorGraph01.jl has the line:
setDefaultFactorNode!(fgl, newvert, Xi, deepcopy(usrfnc))
The deepcopy seems unnecessary.