stor-i / GaussianProcesses.jl

A Julia package for Gaussian Processes

Home Page: https://stor-i.github.io/GaussianProcesses.jl/latest/

License: Other

Languages: Jupyter Notebook 89.52%, Julia 10.27%, Python 0.11%, MATLAB 0.10%

GaussianProcesses.jl's People

Contributors

andreasnoack, baggepinnen, bisraelsen, blegat, chris-nemeth, chrisrackauckas, cstjean, devmotion, dressel, fairbrot, github-actions[bot], jbrea, kykim0, lootie, maximerischard, mgcth, michaelchirico, nemethis, nosferican, okonsamuel, pitmonticone, red-portal, samuelwiqvist, scheidan, szcf-weiya, thomaspinder, tkelman


GaussianProcesses.jl's Issues

Observation noise documentation error

In the 1-D regression example, logObsNoise is presented as the "log standard deviation of the noise", but in the code this input is never multiplied by 2 before being exponentiated. Also, the printout of a GP seems to indicate that logObsNoise should be a variance. I think it is inconsistent as it stands.

As a side note, should the field name be different? Maybe logNoiseVar? It's longer and uglier, but explicit.
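
For context, a minimal illustration of the two readings, assuming logObsNoise enters the covariance diagonal through exp; this is not the package's actual code.

# hypothetical illustration of the two conventions for logObsNoise
logObsNoise = -1.0
noise_as_std = exp(2*logObsNoise)   # if logObsNoise is a log standard deviation,
                                    # the covariance diagonal needs exp(2*logObsNoise)
noise_as_var = exp(logObsNoise)     # if it is a log variance, exp(logObsNoise) is enough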

Julia 0.5 error on `optimize!()`

I am getting the following error and am not sure what is wrong. I am calling optimize!() with a valid gp. I wasn't having this problem in Julia 0.4, and just switched to Julia 0.5.

ERROR: LoadError: MethodError: no method matching set_params!(::GaussianProcesses.GP, ::Float64; noise=true, mean=true, kern=true)
Closest candidates are:
  set_params!(::GaussianProcesses.GP, ::Array{Float64,1}; noise, mean, kern) at /home/brett/.julia/v0.5/GaussianProcesses/src/GP.jl:275
  set_params!{K<:GaussianProcesses.Kernel}(::GaussianProcesses.Masked{K<:GaussianProcesses.Kernel}, ::Any) at /home/brett/.julia/v0.5/GaussianProcesses/src/kernels/masked_kernel.jl:55 got unsupported keyword arguments "noise", "mean", "kern"
 in #optimize!#19(::Bool, ::Bool, ::Bool, ::Optim.ConjugateGradient{Void,Optim.##29#31,LineSearches.#hagerzhang!}, ::Array{Any,1}, ::Function, ::GaussianProcesses.GP) at /home/brett/.julia/v0.5/GaussianProcesses/src/optimize.jl:20
 in optimize!(::GaussianProcesses.GP) at /home/brett/.julia/v0.5/GaussianProcesses/src/optimize.jl:17
 in macro expansion; at /home/brett/GitProjects/TALAF_publications/Figures/scripts/1dBO_example.jl:130 [inlined]
 in anonymous at ./<missing>:?
 in include_from_node1(::String) at ./loading.jl:488
while loading /home/brett/GitProjects/TALAF_publications/Figures/scripts/1dBO_example.jl, in expression starting on line 96

Use Plots.jl for plotting

Using the Plots.jl package would remove the need to define optional package dependencies and define plotting functions for each plotting back end. This would open up many new plotting back-ends (such as Plotly) and only the skeleton package RecipesBase.jl would need to be added to REQUIRE.
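
A minimal sketch of what such a recipe could look like, assuming a one-dimensional GP and that predict(gp, xpred) returns pointwise means and variances; the gp.x and gp.y fields follow examples elsewhere in this tracker, and this is not the package's actual recipe.

using RecipesBase

@recipe function f(gp::GP, xpred::AbstractVector)
    mu, sig2 = predict(gp, xpred)      # predictive mean and pointwise variance (assumed signature)
    ribbon := 2*sqrt.(sig2)            # shade a ±2σ band around the mean
    @series begin                      # overlay the observations as a scatter series
        seriestype := :scatter
        vec(gp.x), gp.y
    end
    xpred, mu                          # main series: the predictive mean
end

With such a recipe, any Plots.jl backend (PyPlot, GR, Plotly, ...) could render plot(gp, xpred) without backend-specific glue code in this package.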

Benchmarks against other Gaussian Processes packages

At this point, it would be interesting to run some benchmarks with simple kernels and moderately large datasets to see how this package compares to others in terms of performance. Other packages of interest include:

  • gpml
  • GPy
  • GPflow
  • any others?
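
As a starting point on this package's side, a rough sketch of a timing harness; the constructor and optimize! calls follow examples elsewhere in this tracker, and the data size and comparison scripts for the other packages are left open.

using GaussianProcesses

n = 2000                                      # moderately large data set
x = 2π * rand(1, n)
y = vec(sin.(x)) + 0.1*randn(n)
gp = GP(x, y, MeanZero(), SE(0.0, 0.0), -2.0)
@time optimize!(gp)                           # time hyperparameter optimisation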

Non-constant observation noise

Hi

Thank you for creating this package. I only found out about GP regression a few months ago and I am hoping to use it in an algorithm I plan to write soon.
Going through the README, it looks as though you can only specify a single value for the (log) observation noise, which I assume is applied to all observations equally.
In my case (I do research in X-ray crystallography) each observation is given its own error. Is there support for specifying a vector of (log) observation noise values, one per observation?
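
This does not appear to be supported at the moment. Conceptually, the change would be in how the noise enters the training covariance; the constant-noise line below is quoted from another issue in this tracker, and the vector version is a hypothetical sketch.

# Constant noise (current behaviour, quoted from the update_mll! issue):
#   gp.cK = PDMat(crossKern(gp.x,gp.k) + exp(2*gp.logNoise)*eye(gp.nobsv) + 1e-8*eye(gp.nobsv))
# Per-observation noise (hypothetical), with a vector logNoiseVec of log standard deviations:
#   gp.cK = PDMat(crossKern(gp.x,gp.k) + diagm(exp.(2 .* logNoiseVec)) + 1e-8*eye(gp.nobsv))

# Runnable illustration of the diagonal term only:
n = 5
logNoiseVec = log.(0.05 .+ 0.1*rand(n))       # one log standard deviation per observation
noise_diag = diagm(exp.(2 .* logNoiseVec))    # replaces exp(2*logNoise)*eye(n)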

Wrong gp.cK calculation in the GP.jl function update_mll! when noise is used

  • the calculation of gp.cK in the function update_mll! in GP.jl gives wrong results when noise is used
  • in line 55, a factor of 2 should be added to logNoise inside the exp

Old version:
gp.cK = PDMat(crossKern(gp.x,gp.k) + exp(gp.logNoise)*eye(gp.nobsv) + 1e-8*eye(gp.nobsv))

New version:
gp.cK = PDMat(crossKern(gp.x,gp.k) + exp(2*gp.logNoise)*eye(gp.nobsv) + 1e-8*eye(gp.nobsv))

I checked the results against some test data from the GPML MATLAB toolbox; maybe someone can add some logical tests too ;-)
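
A sketch of the kind of check meant here: compare gp.mLL against a direct evaluation of the zero-mean Gaussian marginal log likelihood with exp(2*logNoise) on the diagonal. The cov(kern, X) call and the mLL field are assumptions based on usage elsewhere in this tracker.

using GaussianProcesses

n = 20
x = collect(linspace(0, 2π, n))
y = sin.(x) + 0.1*randn(n)
logNoise = -2.0
kern = SE(0.0, 0.0)
gp = GP(x, y, MeanZero(), kern, logNoise)

K = cov(kern, x') + exp(2*logNoise)*eye(n)             # noise enters as a variance
direct = -0.5*dot(y, K\y) - 0.5*logdet(K) - 0.5*n*log(2π)
isapprox(gp.mLL, direct; rtol=1e-6)                    # should hold once the 2*logNoise fix is in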

@document and Julia 0.4

The current master produces an error when the module is loaded with Julia v0.4, as Docile is not imported and so @document is not available.

Check argument types

Now that we can have non-Gaussian data (e.g. Vector{Bool} in the classification case), it would be useful to introduce an ArgumentError when initialising a GP with the GP() constructor.
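
A minimal sketch of the kind of check meant here, with hypothetical names; the package's actual constructors may organise this differently.

# hypothetical helper; not the package's actual code
function check_observations(x::AbstractMatrix, y::AbstractVector)
    size(x, 2) == length(y) ||
        throw(ArgumentError("number of input columns must match the number of observations"))
    eltype(y) <: Union{Float64, Bool} ||
        throw(ArgumentError("observations must be Float64 (regression) or Bool (classification)"))
end

check_observations(rand(2, 5), rand(5))   # passes; a mismatched or unsupported y would throw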

Extend to non-Gaussian likelihoods

At the moment, parameter inference works on the assumption that the observations are Gaussian. This provides a closed form expression for the marginal likelihood used by the optimizer. This needs to be extended to non-Gaussian likelihoods where inference is performed using, for example, Laplace approximations or expectation-propagation.

Derivative of log likelihood with respect to mean parameters seems to have the wrong sign

The gradient of the log likelihood with respect to the mean parameter doesn't match its numerical approximation. In fact it seems to be the right number but the wrong sign. The gradient is computed in GP.jl as gp.dmLL[i+noise] = -dot(Mgrads[:,i],gp.alpha). I'm not sure how that's obtained, so I don't know if that minus sign should just be removed, or if the issue runs deeper.

using GaussianProcesses

# Simulate example
n = 10
x = 2π * rand(n)
μ = 0.2
y = μ + sin(x) + 0.05*randn(n)
mConst = MeanConst(0.0) # constant mean
kern = SE(0.0,0.0)
logObsVar = -1.0
gp = GP(x,y,mConst,kern, logObsVar)      # Fit the GP

GaussianProcesses.update_mll_and_dmll!(gp)
prev_mLL=gp.mLL
prev_dmLL=gp.dmLL
prev_params=GaussianProcesses.get_params(gp)

dθ=[0.0, 1.0, 0.0, 0.0]*1e-4 # increment mean parameter
GaussianProcesses.set_params!(gp, prev_params.+dθ)
GaussianProcesses.update_mll_and_dmll!(gp)

println("change in log likelihood: ", gp.mLL-prev_mLL)
println("expected change in log likelihood: ", dot(prev_dmLL, dθ))

change in log likelihood: 4.723900128666969e-5
expected change in log likelihood: -4.7253655770280115e-5

No problem for the other parameters:

prev_mLL=gp.mLL
prev_dmLL=gp.dmLL

prev_params=GaussianProcesses.get_params(gp)

dθ=[1.0, 0.0, 1.0, 1.0]*1e-4 # increment all other parameters
GaussianProcesses.set_params!(gp, prev_params.+dθ)
GaussianProcesses.update_mll_and_dmll!(gp)

println("change in log likelihood: ", gp.mLL-prev_mLL)
println("expected change in log likelihood: ", dot(prev_dmLL, dθ))

change in log likelihood: -0.0007003607397644274
expected change in log likelihood: -0.0007003395285220553

predict forces Sigma to have all non-negative elements

In GP.jl, the "_predict" function computes Sigma, then does Sigma = max(Sigma,0) before returning it.
Unless this isn't the standard element-wise "max" function, this is obviously not what you want, since covariance matrices can have negative off-diagonal elements.
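
For illustration, a runnable sketch of the distinction on a plain covariance matrix (not package code): clamping the full matrix zeroes legitimate negative off-diagonals, while clamping only the vector of pointwise variances is safe.

Sigma = [0.5 -0.2; -0.2 0.5]          # a valid covariance with a negative off-diagonal
bad   = max.(Sigma, 0.0)              # what max(Sigma, 0) does: no longer the same covariance
good  = max.(diag(Sigma), 0.0)        # clamping only the variances is harmless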

Base computations on PDmats.jl

Great package! Especially having the gradient ready for Optim is very neat.

It may be a good idea to base the computations on the PDMats package. This would

  1. outsource the problem of a numerically correct implementation (although I think you did a good job here),
  2. make use of sparsity of covariance matrices for kernels with compact support.

The latter point was actually my motivation to include sparse matrices in PDMats.
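
For example, basic PDMats.jl usage (the kernel matrix here is just a stand-in):

using PDMats

K = [1.0 0.3; 0.3 1.0]        # stand-in for a covariance matrix
cK = PDMat(K)                  # stores the Cholesky factorisation once
y = [0.5, -0.2]
alpha = cK \ y                 # solve K*alpha = y via the cached factorisation
q = invquad(cK, y)             # y' * inv(K) * y without forming the inverse
l = logdet(cK)                 # log determinant from the Cholesky factor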

PS: Also MLKernels might be interesting to look at.

Numerical issues relating to Cholesky decomposition

The Gaussian process fitting sometimes fails with a PosDefException, especially when the observation noise is low. This also leads to occasional failures of the optimize! function. The error can be avoided by increasing the noise parameter of the GP, or by fixing the noise in optimize!.

An automated way of dealing with this will be required; for instance, the Rasmussen (GPML) package for MATLAB automatically adds a small amount of noise to the covariance matrix when the noise parameter is very small.
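
One common automated workaround, sketched here rather than taken from the package: grow a small diagonal jitter until the Cholesky factorisation succeeds.

# hypothetical helper; not the package's actual code
function jittered_chol(K::Matrix{Float64}; jitter=1e-10, maxjitter=1e-4)
    while jitter <= maxjitter
        try
            return chol(K + jitter*eye(size(K, 1))), jitter
        catch err
            isa(err, Base.LinAlg.PosDefException) || rethrow(err)
            jitter *= 10     # increase the jitter and try again
        end
    end
    error("covariance matrix could not be made positive definite")
end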

Using keyword arguments in optimize!

Hi all,

I have a problem with passing keyword arguments from optimize! to the function optimize of the Optim.jl package.

When I try to set the number of iterations for the optimization (i.e. setting the keyword argument iterations) I get:

[screenshot: kwarg_problem]

What is the proper way to use keyword arguments in optimize!?

Information about my system:
[screenshot: juliaversioninfo]

Version of GaussianProcesses.jl:
[screenshot: gpversion]

Variational approximations

In the case of non-Gaussian likelihoods, we're using the mcmc function to infer the model parameters and latent function, which we can think of as our posterior. Alternatively, we could use variational inference to get an approximation to the posterior. This is done using optimisation rather than sampling so should be considerably faster.

We could implement the ADVI algorithm, which in our case is even simpler as our latent function and parameters are already defined on the unconstrained real space. Therefore, we'd essentially be using optimisation to find a normal approximation to our posterior.

Write more rigorous unit tests

At a minimum, unit tests should verify that all mean and kernel functions work without error. Some verification that the output is correct (for example by comparison with other packages) is also desirable.
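
A sketch of the kind of smoke test meant here, looping over a few of the kernels and means named elsewhere in this tracker; exact constructor signatures may differ between versions.

using GaussianProcesses
using Base.Test

x = 2π * rand(1, 10)
y = vec(sin.(x))
for kern in Any[SE(0.0, 0.0), RQIso(0.0, 0.0, 0.0)]
    for mean in Any[MeanZero(), MeanConst(0.0)]
        gp = GP(x, y, mean, kern, -2.0)
        mu, sig2 = predict(gp, x)
        @test length(mu) == 10      # fitting and prediction run and have the right shape
    end
end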

Introducing GeoStats.jl

I am happy to introduce GeoStats.jl, a package that contains generalizations of Gaussian processes to situations where the mean of the random field is unknown and where more general covariance metrics are needed. It includes Gaussian processes as a special case and supports many other features for geostatistical analysis.

GeoStats.jl has a lot of overlap with GaussianProcesses.jl at the implementation level, and some of you might find it useful in your work. I opened this issue to share the project with you.

I am planning to unify many other geostatistical methods in future releases; please feel free to watch the project or mention it in the README if you think it could be useful to other people in the community.

I have written a few examples to illustrate the current features:

http://nbviewer.jupyter.org/github/juliohm/GeoStats.jl/tree/master/examples/

Unit tests for GPMC

Current unit tests are set for the exact GP constructor and should be extended to GPMC as well.

Function predict in GP.jl seems to be slow for the default case (full_cov = false)

Hello,

I tried some performance tests for the predict function in GP.jl. For my test data the default case was 3x-5x slower than the full_cov case.
Maybe someone can replace the original (upper) version with the faster one in the lower code example.
Additionally, I suggest adding some performance tests to ensure that future changes do not decrease performance.
(Note: as always with performance tests, Julia needs some warm-up calls before the timings stabilise.)

Original "slow" version:

function predict(gp::GP, x::Matrix{Float64}; full_cov::Bool=false)
    size(x,1) == gp.dim || throw(ArgumentError("Gaussian Process object and input observations do not have consistent dimensions"))
    if full_cov
        return _predict(gp, x)
    else
        ## calculate prediction for each point independently
        mu = Array(Float64, size(x,2))
        Sigma = similar(mu)
        for k in 1:size(x,2)
            out = _predict(gp, x[:,k:k])
            mu[k] = out[1][1]
            Sigma[k] = out[2][1]
        end
        return mu, Sigma
    end
end

Faster one:

function predict(gp::GP, x::Matrix{Float64}; full_cov::Bool=false)
    size(x,1) == gp.dim || throw(ArgumentError("Gaussian Process object and input observations do not have consistent dimensions"))
    mu, Sigma = _predict(gp, x)
    if !full_cov # use the reduced (pointwise) predictive covariance
        Sigma = diag(Sigma)
    end
    return mu, Sigma
end

Please correct me if any information is lost in my suggestion.

Optimize! causes a DomainError

A domain error is triggered when running

mZero = MeanZero()                             
kern = Mat(5/2,[0.0,0.0],0.0)     
gp = GP(X,Y,mZero,kern)
optimize!(gp; method=Optim.ConjugateGradient())

I'm using

GaussianProcesses             0.4.0+
Julia Version 0.5.1

I've struggled to find a minimal example that triggers the error; the best I could do is the values of X and Y given at the end.

It seems to me that the error comes from line 107 in GP.jl
gp.mLL = -dot((gp.y - μ),gp.alpha)/2.0 - logdet(gp.cK)/2.0 - gp.nobsv*log(2π)/2.0

which returns NaN when gp.cK has a negative determinant.

Running this code on my laptop where

GaussianProcesses             0.4.0
Julia Version 0.5.1-pre+31

generates a different error when running optimize!(gp; method=Optim.ConjugateGradient())

ERROR: MethodError: no method matching set_params!(::GaussianProcesses.GP, ::Float64; noise=true, mean=true, kern=true)
Closest candidates are:
  set_params!(::GaussianProcesses.GP, ::Array{Float64,1}; noise, mean, kern) at /home/art/.julia/v0.5/GaussianProcesses/src/GP.jl:275
  set_params!{K<:GaussianProcesses.Kernel}(::GaussianProcesses.Masked{K<:GaussianProcesses.Kernel}, ::Any) at /home/art/.julia/v0.5/GaussianProcesses/src/kernels/masked_kernel.jl:55 got unsupported keyword arguments "noise", "mean", "kern"
 in #optimize!#19(::Bool, ::Bool, ::Bool, ::Optim.ConjugateGradient{Void,Optim.##29#31,LineSearches.#hagerzhang!}, ::Array{Any,1}, ::Function, ::GaussianProcesses.GP) at /home/art/.julia/v0.5/GaussianProcesses/src/optimize.jl:20
 in (::GaussianProcesses.#kw##optimize!)(::Array{Any,1}, ::GaussianProcesses.#optimize!, ::GaussianProcesses.GP) at ./<missing>:0

I'm using the data

Y = [-0.000160691 0.000561494 -0.000308228 4.14104e-5 0.000306943 6.24922e-5 -0.00013596 9.83276e-5 -0.000105637 -2.11221e-5 0.00373866 0.000200135 -0.000462546 -0.000230539 0.0003362 -0.000120488 0.000201228 0.000141567 6.60807e-5 -0.000240906 0.00527562 0.00112132 0.000880385 -0.000602714 -0.000203268 2.6165e-5 0.000257117 0.000272523 -0.000526565 -0.000142842 0.0113493 0.00218881 0.000720203 0.000591266 0.000606935 0.000629937 1.19301e-5 0.000336753 1.32784e-5 5.98963e-5 0.00307527 0.00158222 0.000883546 0.000434129 -0.000172114 0.000570647 -0.000293091 -0.000187017 0.000111851 0.00037517][:]

X =[0.01 0.2
0.01 0.4
0.01 0.6
0.01 0.8
0.01 1.0
0.01 1.2
0.01 1.4
0.01 1.6
0.01 1.8
0.01 2.0
0.02 0.2
0.02 0.4
0.02 0.6
0.02 0.8
0.02 1.0
0.02 1.2
0.02 1.4
0.02 1.6
0.02 1.8
0.02 2.0
0.03 0.2
0.03 0.4
0.03 0.6
0.03 0.8
0.03 1.0
0.03 1.2
0.03 1.4
0.03 1.6
0.03 1.8
0.03 2.0
0.04 0.2
0.04 0.4
0.04 0.6
0.04 0.8
0.04 1.0
0.04 1.2
0.04 1.4
0.04 1.6
0.04 1.8
0.04 2.0
0.05 0.2
0.05 0.4
0.05 0.6
0.05 0.8
0.05 1.0
0.05 1.2
0.05 1.4
0.05 1.6
0.05 1.8
0.05 2.0]'

Time to draw samples grows quadratically with number of samples

I would have thought that the time to draw samples grows linearly with the number of points sampled (although I know next to nothing about GPs). It seems to grow roughly quadratically (using the setup from the README section "Sampling from the GP"):

julia> @time prior=rand(gp, linspace(-5,5,10), 10);
  0.000228 seconds (348 allocations: 10.891 KB)

julia> @time prior=rand(gp, linspace(-5,5,100), 10);
  0.002107 seconds (3.27 k allocations: 621.484 KB)

julia> @time prior=rand(gp, linspace(-5,5,1000), 10);
  0.112869 seconds (33.06 k allocations: 54.125 MB, 8.04% gc time)

julia> @time prior=rand(gp, linspace(-5,5,10000), 10);
 15.259130 seconds (339.22 k allocations: 5.223 GB, 2.18% gc time)

Is this correct?

Sampling from prior or posterior distribution

Is it possible to draw random samples from a GP? At the very least, sampling from the prior would be very easy to implement. This would be useful for me and others as well, I'd expect. I think the conventional API would be along the lines of:

gp = GP(x,y,m,k)
rand(gp,10) # draw 10 samples from posterior

I'm also interested in sampling from the prior distribution. This is perhaps a separate issue, but I would prefer that there be a separate function fit!(gp::GP,x::Array,y::Array) to fit the model. This way I could draw samples from the prior like so:

gp = GP(m,k)  # create prior
rand(gp,10)   # draw 10 samples from prior
fit!(gp,x,y)   # fit to data
rand(gp,10)   # draw 10 samples from posterior

Edit/Update: I dug into the code a little bit and found the full_cov option, which suggests a wrapper along the lines of:

using Distributions

function rand(gp::GP, n::Int; x_ax=collect(linspace(0,1)))
    mu,sigma = predict(gp,x_ax;full_cov=true)
    return rand(MvNormal(mu,sigma),n)
end

I think this sort of works? I tend to get a Base.LinAlg.PosDefException unless I add a small identity matrix to sigma. Also, I have had trouble using gp = GP() to sample from the prior. Any help on this is greatly appreciated!

Use RDatasets in classification notebook

The data in the notebook directory is available through the RDatasets package:

using RDatasets

crab_data = dataset("MASS", "crabs")

Using RDatasets will eliminate the need to store the data files currently being used. It may also serve as a good example of how to construct a GP object from a DataFrame.

Link functions

For GP classification we have a probit link function. It would be nice if we could allow users to choose a different link function, e.g. logit.
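
For reference, the two inverse link functions in question, using Distributions.jl only for the normal CDF; these are standalone definitions, not the package's API.

using Distributions

probit_inv(f) = cdf(Normal(), f)      # inverse probit link: Φ(f), the current default
logit_inv(f)  = 1/(1 + exp(-f))       # inverse logit link: the logistic sigmoid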

Can't make plot() work in Julia 0.5

I am using the linear_trend.jl example:

using GaussianProcesses
using PyPlot

x=[-4.0,-3.0,-1.0,0.0,2.0];
y = 2.0x + 0.5rand(5);
xpred = collect(-5.0:0.1:5.0);
mLin = MeanLin([0.5]) # linear mean function
kern = SE(0.0,0.0) # squared exponential kernel function
gp = GP(x,y,mLin,kern) # fit the GP

This is supposed to use the function PyPlot.plot() defined in src/glue/PyPlot.jl, but it doesn't work.

The code above runs and the gp object is correctly returned, but when I do

plot(gp)

I get a long error:

ERROR: PyError (:PyObject_Call) <type 'exceptions.TypeError'>
TypeError('float() argument must be a string or a number',)
File "/usr/local/lib/python2.7/site-packages/matplotlib/pyplot.py", line 3161, in plot
ret = ax.plot(*args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/matplotlib/__init__.py", line 1819, in inner
return func(ax, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/matplotlib/axes/_axes.py", line 1383, in plot
self.add_line(line)
File "/usr/local/lib/python2.7/site-packages/matplotlib/axes/_base.py", line 1703, in add_line
self._update_line_limits(line)
File "/usr/local/lib/python2.7/site-packages/matplotlib/axes/_base.py", line 1725, in _update_line_limits
path = line.get_path()
File "/usr/local/lib/python2.7/site-packages/matplotlib/lines.py", line 938, in get_path
self.recache()
File "/usr/local/lib/python2.7/site-packages/matplotlib/lines.py", line 634, in recache
y = np.asarray(yconv, np.float_)
File "/usr/local/lib/python2.7/site-packages/numpy/core/numeric.py", line 482, in asarray
return array(a, dtype, copy=False, order=order)

in pyerr_check at /usr/home/ko/.julia/v0.5/PyCall/src/exception.jl:56 [inlined]
in pyerr_check at /usr/home/ko/.julia/v0.5/PyCall/src/exception.jl:61 [inlined]
in macro expansion at /usr/home/ko/.julia/v0.5/PyCall/src/exception.jl:81 [inlined]
in #_pycall#66(::Array{Any,1}, ::Function, ::PyCall.PyObject, ::GaussianProcesses.GP, ::Vararg{GaussianProcesses.GP,N}) at /usr/home/ko/.julia/v0.5/PyCall/src/PyCall.jl:550
in _pycall(::PyCall.PyObject, ::GaussianProcesses.GP, ::Vararg{GaussianProcesses.GP,N}) at /usr/home/ko/.julia/v0.5/PyCall/src/PyCall.jl:538
in #pycall#70(::Array{Any,1}, ::Function, ::PyCall.PyObject, ::Type{PyCall.PyAny}, ::GaussianProcesses.GP, ::Vararg{GaussianProcesses.GP,N}) at /usr/home/ko/.julia/v0.5/PyCall/src/PyCall.jl:572
in pycall(::PyCall.PyObject, ::Type{PyCall.PyAny}, ::GaussianProcesses.GP, ::Vararg{GaussianProcesses.GP,N}) at /usr/home/ko/.julia/v0.5/PyCall/src/PyCall.jl:572
in #plot#85(::Array{Any,1}, ::Function, ::GaussianProcesses.GP, ::Vararg{GaussianProcesses.GP,N}) at /usr/home/ko/.julia/v0.5/PyPlot/src/PyPlot.jl:172
in plot(::GaussianProcesses.GP, ::Vararg{GaussianProcesses.GP,N}) at /usr/home/ko/.julia/v0.5/PyPlot/src/PyPlot.jl:169
in _init at /usr/local/lib/julia/sys.so:? (repeats 2 times)
in eval_user_input(::Any, ::Base.REPL.REPLBackend) at ./REPL.jl:64
in macro expansion at ./REPL.jl:95 [inlined]
in (::Base.REPL.##3#4{Base.REPL.REPLBackend})() at ./event.jl:68

julia>

Flexible priors

The current interface for setting priors is restricted to simple models. This needs to be expanded to be applicable to the fixed kernels.

The best way forward is probably to implement parameters as objects, making it easier to assign priors to the parameters.
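
A hypothetical sketch of the "parameters as objects" idea (not the package's design), with a prior attached directly to each parameter via Distributions.jl:

using Distributions

type Param                          # Julia 0.5/0.6-era mutable struct
    value::Float64
    prior::Distribution
end

logprior(p::Param) = logpdf(p.prior, p.value)

lengthscale = Param(0.0, Normal(0.0, 1.0))   # e.g. a log length-scale with a standard normal prior
logprior(lengthscale)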

RQIso cov function is type-unstable

using GaussianProcesses
k2=RQIso(0.0,0.0,0.0)
cov(k2, 3.0)
@code_warntype cov(k2, 3.0)

produces

Variables:
  #self#::Base.#cov
  rq::GaussianProcesses.RQIso
  r::Float64

Body:
  begin 
      SSAValue(2) = (Core.getfield)(rq::GaussianProcesses.RQIso,:σ2)::Float64
      SSAValue(1) = (Base.box)(Base.Float64,(Base.add_float)(1.0,(Base.box)(Base.Float64,(Base.div_float)(r::Float64,(Base.box)(Base.Float64,(Base.mul_float)((Base.box)(Base.Float64,(Base.mul_float)(2.0,(Core.getfield)(rq::GaussianProcesses.RQIso,:α)::Float64)),(Core.getfield)(rq::GaussianProcesses.RQIso,:ℓ2)::Float64))))))
      SSAValue(0) = (Base.box)(Base.Float64,(Base.neg_float)((Core.getfield)(rq::GaussianProcesses.RQIso,:α)::Float64))
      return (SSAValue(2) * $(Expr(:invoke, LambdaInfo for ^(::Float64, ::Float64), :(Base.^), SSAValue(1), SSAValue(0))))::ANY
  end::ANY

I think this is actually a Julia bug, but I'm not sure how to reproduce it...

MethodError in predict_y when full_cov true

d, n = 3, 10

x = 2π * rand(d, n)
y = Float64[sum(sin.(x[:,i])) for i in 1:n]/d
mZero = MeanZero()
kern = SE(0.0,0.0)

gp = GPE(x, y, mZero, kern)
y_pred, sig = predict_y(gp, x)  # no problem
y_pred, sig = predict_y(gp, x; full_cov=true)  # MethodError
MethodError: no method matching +(::PDMats.PDMat{Float64,Array{Float64,2}}, ::Float64)
Closest candidates are:
  +(::Any, ::Any, ::Any, ::Any...) at operators.jl:424
  +(::Bool, ::T<:AbstractFloat) where T<:AbstractFloat at bool.jl:96
  +(::Float64, ::Float64) at float.jl:375
  ...

Stacktrace:
 [1] #predict_y#79(::Bool, ::Function, ::GaussianProcesses.GPE, ::Array{Float64,2}) at /Users/imolk/Library/Julia/packages/v0.6/GaussianProcesses/src/GPE.jl:221
 [2] (::GaussianProcesses.#kw##predict_y)(::Array{Any,1}, ::GaussianProcesses.#predict_y, ::GaussianProcesses.GPE, ::Array{Float64,2}) at ./<missing>:0

Features vs GPy

How does GaussianProcesses.jl compare - feature-wise - with GPy?
Is it production-ready?
Thanks!

Syntax changes for julia v0.6

I'm going to start a new branch for updating the syntax for julia v0.6. I'm going to assume we're not bothered with retaining v0.5 syntax compatibility, as that's a lot more work. Is that ok?
