Comments (7)
Do you mean that the prediction functions do not work at all with interaction covariates, or that you are struggling to make them work in the way you want? Could you be a bit more specific?
An even better way would be to provide a minimal working example demonstrating your problem. Not necessarily with your data - quite the contrary, actually: simulated data that sufficiently represents your modelling task/questions would be best.
from hmsc.
Thanks for your quick answer. What I mean is that I am not sure how to interpret my beta parameter estimates and the beta plot when using an interaction term between two covariates. I thought I would use the prediction functions in the same way as in normal regression modelling, so I tried the `constructGradient` function with the focal variable being the interaction covariate, but it doesn't seem to work. I am not sure whether `constructGradient` is the best way to interpret interaction terms, but I don't know what other way would be best... sorry if it is a very stupid question! Thanks
For example:

```r
library(Hmsc)

set.seed(1)
n = 50
x = rnorm(n)
x2 = rnorm(n)
beta1 = 0
beta2 = 1
L = beta1 + beta2*x
y1 = L + rnorm(n, sd = 1)
Y = as.matrix(y1)
XData = data.frame(x = x, x2 = x2)
m = Hmsc(Y = Y, XData = XData, XFormula = ~ x*x2)
nChains = 2
thin = 5
samples = 1000
transient = 500*thin
verbose = 500*thin
m = sampleMcmc(m, thin = thin, samples = samples,
               transient = transient, nChains = nChains,
               verbose = verbose)
# Here I want to visualise the prediction for the interaction term,
# but this fails:
Gradient = constructGradient(m, focalVariable = "x:x2")
```
from hmsc.
`constructGradient` really cannot do this. It produces a data frame containing the variables the formula needs; the model matrix is then built from that data frame internally. For instance, for factor variables it will contain the original factors, but not their contrasts (which are used internally in the model matrix). For a term like `~ poly(x, 2)` it will contain only `x`, and the polynomial is built internally. The same holds for your interaction: your formula `~ x*x2` contains only the two variables `x` and `x2`, and these are internally expanded into the model-matrix terms (`(Intercept)`, `x`, `x2`, `x:x2`).

There is a clear reason for this. When you set variables to arbitrary values in the model frame, the internally calculated terms should reflect those arbitrary values. That is, the term `x:x2` should be the product of your fixed arbitrary values of `x` and `x2`. There is no way of doing this the other way round: an infinite number of pairs of `x` and `x2` give the same product `x:x2`. You do not have a variable `x:x2`, only the variables `x` and `x2`, and `x:x2` is calculated from these internally.
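This expansion can be seen directly in base R, without any Hmsc code: `model.matrix` builds the interaction column as an elementwise product of the two variables in the data frame.

```r
# The formula ~ x*x2 expands into four model-matrix columns, even
# though the data frame holds only the two variables x and x2.
set.seed(1)
d <- data.frame(x = rnorm(3), x2 = rnorm(3))
mm <- model.matrix(~ x*x2, data = d)
colnames(mm)  # "(Intercept)" "x" "x2" "x:x2"

# The x:x2 column is simply the elementwise product of x and x2,
# computed internally from the supplied variables:
all.equal(mm[, "x:x2"], d$x * d$x2, check.attributes = FALSE)  # TRUE
```

This is why fixing `x:x2` to arbitrary values cannot work: the column is always recomputed from `x` and `x2`.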
You cannot inspect the interaction alone, so your approach was doomed from the start (even if some magic made it work in `constructGradient`). An interaction is an ... uhh ... interaction, and it only makes sense when inspected together with the main effects. My feeling is that `constructGradient` is too simple a tool for effective inspection of interactions; you should either compare the results against a similar model without the interaction, or craft a more appropriate design yourself, such as a grid of `x` and `x2` values, and plot the results as a surface or perspective plot. Even that can be difficult to understand and interpret.
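A hedged sketch of the grid idea, assuming the fitted model `m` from the example above (a model without random effects; exact `predict` behaviour may vary between Hmsc versions): evaluate the fitted response over a grid of `x` and `x2` and draw it as a surface.

```r
# Build a 25 x 25 grid over the two covariates
grid <- expand.grid(x  = seq(-2, 2, length.out = 25),
                    x2 = seq(-2, 2, length.out = 25))

# predict() for Hmsc objects accepts new XData; expected = TRUE gives
# the expected response rather than noisy posterior-predictive draws.
# The result is a list with one matrix per posterior sample.
pred <- predict(m, XData = grid, expected = TRUE)

# Posterior mean prediction, reshaped for a perspective plot
# (expand.grid varies x fastest, so rows of z index x, columns x2)
Epred <- Reduce("+", pred) / length(pred)
z <- matrix(Epred[, 1], nrow = 25)
persp(unique(grid$x), unique(grid$x2), z,
      xlab = "x", ylab = "x2", zlab = "E[y]", theta = 40, phi = 25)
```

With an interaction present, the surface will be twisted rather than planar, which is exactly what a single one-dimensional gradient cannot show.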
There is an option of setting `non.focalVariables` to constant values (`non.focalVariables = list(variable = list(type = 3, value))`), in which case the plot against `x` is linearly related to the plot against `x:x2`; but you should repeat this for several values of `x2` to see how the interaction influences the results.
Numerical interaction terms can be difficult to interpret, but they may be the best choice in this case.
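As a hedged sketch of the `non.focalVariables` suggestion (again assuming the fitted model `m` from above; argument forms may differ slightly between Hmsc versions): keep `x` as the focal variable, fix `x2` at a few constant values with type 3, and compare the resulting gradient plots.

```r
# One gradient plot per fixed value of x2; if the interaction matters,
# the slope of the response against x changes with the chosen x2.
for (x2.value in c(-1, 0, 1)) {
  Gradient <- constructGradient(m, focalVariable = "x",
                                non.focalVariables = list(x2 = list(3, x2.value)))
  predY <- predict(m, Gradient = Gradient, expected = TRUE)
  plotGradient(m, Gradient, pred = predY, measure = "Y", index = 1)
}
```

If the three curves are parallel, the interaction contributes little; diverging slopes show directly how `x2` modifies the effect of `x`.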
from hmsc.
OK, thanks for this! Yes, I thought `constructGradient` was not the way to go, but I thought I would ask in case I had missed something.
I guess my question then is how to interpret the output of the beta plot with my interaction term? I am guessing the beta plot alone is not enough here, and the comparison with a model without the interaction term is maybe the way to go (as suggested above)?
But then I still have the same interpretation issue: if the model with the interaction is different from / "better" than the other model, what is the explanation for that? For example, if my variables are "Distance to feature" and "Elevation", and the model with an interaction term between the two covariates fits better than the model without it, how do I interpret the result: with greater distance, do we detect fewer/more species at higher or at lower elevation?
Thanks
from hmsc.
I think my issue is more about the interpretation of the beta results themselves, because the interaction makes them hard to interpret. For example, here is one plot I got with interaction terms between multiple covariates, but I am not sure what the best interpretation of this result is. For the interaction term Distance:Management (mgt), I get a lot of negative responses, but I am not sure how to read that: does it mean that when both Distance and Management increase, we detect fewer species?
Thanks for the help
Solene
from hmsc.
Great thanks! That helps a lot.
I will give it a go with some plots!
Cheers
Solene
from hmsc.