
simulator's People

Contributors

adw96, jacobbien

simulator's Issues

large parameter values error

Hi Jacob! When using large parameter values, I get a mysterious error message that took me a long time to debug. The issue is that the files are written to, e.g., normal_random_variables/n_1e+05/model.Rdata, which fails the is_valid_component_name check. Could scipen in options() be temporarily changed?

I know I can use n_over_a_thousand as an input instead; I'm reporting this to save others from spending as much time as I did debugging that mysterious error.

Thanks for your help!

make_data <- function(n) {
  new_model(name = "normal_random_variables",
            label = sprintf("Some data with n=%s", n),
            params = list(n = n),
            simulate = function(n, nsim) {
              replicate(nsim, rnorm(n), simplify = FALSE)
            })
}
mysterious_error <- new_simulation(name = "large-values",
                                   label = "Large values break even the best software") %>%
  generate_model(make_data, seed = 1,
                 n = as.list(c(1e3, 1e4, 1e5, 1e6)),
                 vary_along = c("n")) 

Determine which method is crashing when running in parallel

Hi, Jacob,

First, this is a fantastic package and it has helped me a lot -- thanks so much for producing and maintaining it.

I was wondering if there is any way to detect which method is crashing my simulation when running run_method in parallel. For example, here is a toy example with 2 methods, one of which randomly crashes.

test-crash.txt

I would love to be able to determine which method is crashing, and be able to pull the simulated data that's crashing it so I can explore the behaviour of the method and debug.

Can the simulator help me with this?

Thanks for your help!

Amy
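
One approach that may help here (a sketch, not a built-in simulator feature; it assumes the wrapped method takes only model and draw) is to wrap each method's function in tryCatch so a crash is recorded in the output instead of killing the parallel run:

safe_method <- function(m) {
  # m is an existing Method object; the wrapper returns the original output,
  # or a list carrying the error message if the method crashes on a draw.
  new_method(name = paste0(m@name, "_safe"),
             label = paste0(m@label, " (safe)"),
             method = function(model, draw) {
               tryCatch(m@method(model, draw),
                        error = function(e) list(error = conditionMessage(e)))
             })
}

After run_method(sim, lapply(my_methods, safe_method)), the outputs that contain an error element identify both the failing method and the draw, so the offending simulated data can be pulled for debugging.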

Error: sha1() has no method for the [phyloseq] class

Hi fellow simulator users - posting an issue and a workaround to save time for others. If you are working with models that take in S4 objects, you may get the error Error: sha1() has no method for the [...] class. For example, I was running something like

new_simulation("Fancy Sim", "my-sim") %>%
  generate_model(my_model, 
                 ps = my_nonstandard_S4_object, 
                 sig_sq = as.list(seq(0, 10, length.out = 11)),
                 vary_along = "sig_sq") %>%
  simulate_from_model(nsim = 10)

and my error was Error: sha1() has no method for the 'phyloseq' class.

The specific line that fails in simulator is digest::sha1(nonstandard_S4_object), which, as far as I can tell, is used to create unique identifiers for each draw from the model. If your S4 object doesn't have a representation using digest::sha1, your simulator code will fail.
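
If you want to check up front whether an object can be passed as a model parameter, here is a quick pre-flight sketch (my_nonstandard_S4_object stands in for your own object):

library(digest)

can_hash <- function(obj) {
  # TRUE if sha1() can produce a hash for this object, FALSE otherwise
  tryCatch({ sha1(obj); TRUE }, error = function(e) FALSE)
}

can_hash(my_nonstandard_S4_object)  # FALSE means convert it to a plain structure first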

In my case, I could convert a phyloseq S4 object to a tibble/data frame using speedyseq::as_tibble a la

new_simulation("Fancy Sim", "my-sim") %>%
  generate_model(my_model_v2, 
                 ps = my_nonstandard_S4_object %>% speedyseq::as_tibble, 
                 sig_sq = as.list(seq(0, 10, length.out = 11)),
                 vary_along = "sig_sq") %>%
  simulate_from_model(nsim = 10)

and this worked fine (with a corresponding modification to my_model).

Hope this helps others - I had no idea what sha1() was and had to do some digging, so I hope this saves you some time 😸

Vector model parameters in models lead to extremely long file names (which can exceed Windows file name limit)

I've run into an issue when trying to simulate data from a model with many parameters, some of which can be vectors. The names of the model directories for which vector arguments are passed are very long, eventually exceeding Windows' permitted file name length and causing an error.

Below is a reproducible example:

library(simulator)

make_data <- function(beta1, beta2, beta3, beta4) {
  new_model(name = "test",
            label = "test",
            params = list(beta1 = beta1, beta2 = beta2, beta3 = beta3, beta4 = beta4),
            simulate = function(nsim, beta1, beta2, beta3, beta4) {
              data <- vector(mode = "list", length = nsim)
              for(i in 1:nsim){
                x <- rnorm(1000)
                x_matrix <- matrix(x, nrow = 200, ncol = 5)
                y1 <- x_matrix %*% beta1
                y2 <- x_matrix %*% beta2
                y3 <- x_matrix %*% beta3
                y4 <- x_matrix %*% beta4
                data[[i]] <- cbind(x_matrix, y1, y2, y3, y4)
              }
              return(data)
            })
}

sim <- new_simulation(name = "long_names",
                      label = "file names get too long") %>%
  generate_model(make_data, seed = 123,
                 beta1 = list(c(1,1,1,1,1), c(2,2,2,2,2)),
                 beta2 = list(c(1,1,1,1,1), c(2,2,2,2,2)),
                 beta3 = list(c(1,1,1,1,1), c(2,2,2,2,2)),
                 beta4 = list(c(1,1,1,1,1), c(2,2,2,2,2)),
                 vary_along = c("beta1", "beta2", "beta3", "beta4")) %>% 
  simulate_from_model(nsim = 2) 
# I get an error here because Windows cannot create the new directories due to the file name length
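
A possible workaround until the package handles this (an untested sketch; beta_settings and make_data_by_id are names I made up) is to vary along a short scenario id instead of the vectors themselves, and look the vectors up inside the model function, so only the short id ends up in the directory name:

library(simulator)

# Two named settings; the full vectors live here, not in the directory name.
beta_settings <- list(
  s1 = list(beta1 = rep(1, 5), beta2 = rep(1, 5),
            beta3 = rep(1, 5), beta4 = rep(1, 5)),
  s2 = list(beta1 = rep(2, 5), beta2 = rep(2, 5),
            beta3 = rep(2, 5), beta4 = rep(2, 5))
)

make_data_by_id <- function(setting_id) {
  betas <- beta_settings[[setting_id]]
  new_model(name = "test_short_names",
            label = paste0("beta setting ", setting_id),
            params = c(list(setting_id = setting_id), betas),
            simulate = function(nsim, beta1, beta2, beta3, beta4) {
              lapply(seq_len(nsim), function(i) {
                x_matrix <- matrix(rnorm(1000), nrow = 200, ncol = 5)
                cbind(x_matrix,
                      x_matrix %*% beta1, x_matrix %*% beta2,
                      x_matrix %*% beta3, x_matrix %*% beta4)
              })
            })
}

sim <- new_simulation(name = "short_names",
                      label = "short directory names") %>%
  generate_model(make_data_by_id, seed = 123,
                 setting_id = as.list(names(beta_settings)),
                 vary_along = "setting_id") %>%
  simulate_from_model(nsim = 2)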

Models with different parameters are run using the same seed

Thanks for developing this package - I really like the way it structures simulations.

I've been running some simulations and have noticed some unexpected correlations in my results. I've traced this to the following issue:

Expected behavior: Data generated from a model with different parameter values will be independent

Observed behaviour: Some data generated from models with different parameters is perfectly correlated

Example

library(simulator)

make_data <- function(n_covariates) {
  new_model(name = "test",
            label = paste0("n_covariates = ", n_covariates),
            params = list(n_covariates = n_covariates),
            simulate = function(n_covariates, nsim) {
              data <- vector(mode = "list", length = nsim)
              for (i in 1:nsim) {
                x <- vector(mode = "list", length = n_covariates)
                for (j in 1:n_covariates) {
                  x[[j]] <- rnorm(1000)
                }
                data[[i]] <- x
              }
              return(data)
            })
}

sim <- new_simulation(name = "correlated_draws",
               label = "data from different models is the same") %>%
  generate_model(make_data, seed = 1234,
               n_covariates = list(1,2,3,4,5,6,7,8,9,10),
               vary_along = c("n_covariates")) %>% 
  simulate_from_model(nsim = 2) 
  
x1_model1.1 <- draws(sim)[[1]]@draws$r1.1[[1]]
x1_model2.1 <- draws(sim)[[2]]@draws$r1.1[[1]]

cor(x1_model1.1, x1_model2.1) # This is 1; I'd expect it to be near zero since draws from different models should be independent

I believe this issue is caused by generate_model passing the same seed to generate_model_single for each value of the parameter being varied along. This happens even if no seed is set, because the default seed of 123 is used. A workaround is to use seed = NULL in generate_model.
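
For concreteness, the reported workaround looks like this (a sketch; note that with seed = NULL the model generation is no longer tied to a fixed seed, so record the seed yourself if you need exact reproducibility):

sim <- new_simulation(name = "uncorrelated_draws",
                      label = "data from different models should differ") %>%
  generate_model(make_data, seed = NULL,
                 n_covariates = as.list(1:10),
                 vary_along = c("n_covariates")) %>%
  simulate_from_model(nsim = 2)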

simple simulation is too slow

First of all, thank you very much for this package. I really like its modularity.

I am running a simple toy simulation to assess the performance of the sample mean at sample sizes 5, 10 and 15 if the underlying distribution is standard Gaussian. Unfortunately, the simulation is taking a lot more time than I expected.

If I implement the simulation using the package "SimDesign", I get the following benchmark for 1000 replicates

   user  system elapsed 
   1.93    0.37    2.47

while for the "simulator" package, I am getting

user  system elapsed 
 333.61    0.08  334.13

Here's the code that I am using

#MSE of the sample mean
library(SimDesign)
library(simulator)

#SimDesign implementation

n <- c(5, 10, 15)

Design <- expand.grid(n = n)

Generate <- function(condition, fixed_objects = NULL){
  rnorm(n = condition$n)
}

Analyse <- function(condition, dat, fixed_objects = NULL){
  mean(dat)
}

Summarise <- function(condition, results, fixed_objects = NULL){
  c(MSE = mean(results^2))
}

system.time(
  results <- runSimulation(design = Design, replications = 1000,
                           generate = Generate, analyse = Analyse,
                           summarise = Summarise)
)

#simulator implementation
make_normal_model <- function(n) {
  new_model(name = "normal",
            label = sprintf("N(0, 1) with sample size %s", n),
            params = list(n = n),
            simulate = function(n, nsim) {
              Y <- replicate(n = nsim, expr = rnorm(n = n))
              return(split(Y, col(Y)))
            })
}

mean_method <- new_method(name = "mean", label = "Sample Mean", method = function(model, draw){
  mean(draw)
})

squaredError_metric <- new_metric(name = "se", label = "Squared Error",
                                  metric = function(model, out) {
                                    return((out$out)^2)
                                  })

system.time({
  sim <- new_simulation(name = "mean", label = "mean") %>%
    generate_model(make_normal_model, n = list(5, 10, 15), vary_along = "n") %>%
    simulate_from_model(nsim = 1000) %>%
    run_method(methods = list(mean_method)) %>%
    evaluate(metrics = list(squaredError_metric))
})

tabulate_eval(sim, metric_name = "se")

I would really appreciate it if you could point me to where the problem lies, or confirm whether this is an issue with the simulator.

PS: I am using R version 3.6.0 on a PC with Win 10 x64, Core i3, a (magnetic) HDD and 4 GB RAM.
Also, the problem seems to be with the run_method step, as it takes much longer than simulating the data and applying the metric.
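
To help isolate the bottleneck, each stage can be timed separately (a quick sketch reusing the objects defined above; the simulation gets a fresh name so it does not collide with the earlier run):

sim <- new_simulation(name = "mean-profiled", label = "mean (profiled)") %>%
  generate_model(make_normal_model, n = list(5, 10, 15), vary_along = "n")
system.time(sim <- simulate_from_model(sim, nsim = 1000))               # data generation
system.time(sim <- run_method(sim, methods = list(mean_method)))        # method step
system.time(sim <- evaluate(sim, metrics = list(squaredError_metric)))  # metric step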

Thank you.

Best way to combine models parameters with evals

Hi everyone -- I was looking for an easier way to combine information about the model parameters for each simulation with the resulting data.frame(evals) object, and @jacobbien kindly suggested the following easy solution that I'm posting here for the benefit of others who may want this functionality.

For a simulation called sim, you can access the model information in the following way

ev_df <- sim %>% evals %>% as.data.frame
model_df <- sim %>% model %>% as.data.frame
ev_with_model_params <- dplyr::right_join(model_df, ev_df, by = c("name" = "Model"))

In particular, the columns of model_df contain the model parameters, saving you from string processing ev_df$Model to get the model parameters.

Hope this helps someone and many thanks to @jacobbien for providing this solution!

Problem in `is_valid_rij_list()`

Here is a minimal example from @gregfaletto showing the problem:

library(simulator)

set.seed(12375)

simulate_func <- function(x, nsim) {
  ret_list <- list()
  for (i in 1:nsim) {
    ret_list[[i]] <- 1
  }
  return(ret_list)
}

error_sim_model <- function(x) {
  my_model <- new_model(name = "error_sim_model",
                        label = "err_model",
                        params = list(x = x),
                        simulate = simulate_func)
  return(my_model)
}

bad_meth <- new_method("bad_meth", "Bad method",
                       method = function(model, draw) {
                         if (as.logical(rbinom(n = 1, size = 1, prob = 0.5))) {
                           return(list(a = 2))
                         } else {
                           return(list(b = 3))
                         }
                       })

sim <- new_simulation("sim", "Error sim")

sim <- generate_model(sim, error_sim_model, x=2)

sim <- simulate_from_model(sim, nsim = 20)

sim <- run_method(sim, list(bad_meth))

Add Model Component to existing simulation

In the Benjamini-Hochberg vignette you show how to add more draws to an existing simulation, but here I'm trying to add more Model Components to a simulation. The reprex below is my attempt; it looks like the draws are being regenerated for the existing Model Components. Is there a way to make only the newly added Model Component get new draws?

library(simulator)
library(mvtnorm)
make_correlated_pvalues <- function(n, pi0, rho) {
  # Gaussian copula model...
  #
  # n pvalues, the first n*pi0 of which are null, coming from a multivariate
  # normal with all correlations rho.
  sigma <- matrix(rho, n, n)
  diag(sigma) <- 1
  n0 <- round(n * pi0)
  delta <- 2 # size of signal
  mu <- rep(c(0, delta), c(n0, n - n0)) # n0 are null
  new_model(name = "correlated-pvalues",
            label = sprintf("pi0 = %s, rho = %s", pi0, rho),
            params = list(n = n, rho = rho, sigma = sigma,
                          pi0 = pi0, mu = mu, delta = delta,
                          nonnull = which(mu != 0)),
            simulate = function(n, mu, sigma, nsim) {
              # this function must return a list of length nsim
              x <- rmvnorm(nsim, mean = mu, sigma = sigma)
              pvals <- 1 - pnorm(x)
              return(split(pvals, row(pvals))) # make each row its own list element
            })
}


make_bh <- function(q) {
  # q is the desired level of control for the FDR
  new_method(name = paste0("bh", q),
             label = sprintf("BH (q = %s)", q),
             settings = list(q = q),
             method = function(model, draw, q) {
               p <- sort(draw)
               cutline <- seq(model$n) * q / model$n
               threshold <- max(p[p <= cutline], 0)
               list(rejected = which(draw <= threshold))
             })
}

qvalues <- c(0.05, 0.1, 0.2)
bh_methods <- sapply(qvalues, make_bh)



fdp <- new_metric(name = "fdp",
                  label = "false discovery proportion",
                  metric = function(model, out) {
                    fp <- setdiff(out$rejected, model$nonnull)
                    nd <- max(length(out$rejected), 1)
                    return(length(fp) / nd)
                  })

nd <- new_metric(name = "nd",
                 label = "number of discoveries",
                 metric = function(model, out) length(out$rejected))


name_of_simulation <- "fdr"
sim <- new_simulation(name = name_of_simulation,
                      dir = "simulation/",
                      label = "False Discovery Rate") %>%
  generate_model(make_correlated_pvalues, seed = 123,
                 n = 100,
                 pi0 = 0.8,
                 rho = 0.1) %>%
  simulate_from_model(nsim = 2, index = 1:2) %>%
  run_method(bh_methods, parallel = list(socket_names = 4, libraries = "mvtnorm")) %>%
  evaluate(list(fdp, nd))
#> ..Created model and saved in correlated-pvalues/n_100/pi0_0.8/rho_0.1/model.Rdata
#> ..Simulated 2 draws in 0.08 sec and saved in correlated-pvalues/n_100/pi0_0.8/rho_0.1/r1.Rdata
#> ..Simulated 2 draws in 0.07 sec and saved in correlated-pvalues/n_100/pi0_0.8/rho_0.1/r2.Rdata
#> Shutting down cluster.
#> ..Performed BH (q = 0.05) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.05) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.1) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.1) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.2) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.2) in 0 seconds (on average over 2 sims)
#> ..Created 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 1, method_name = "bh0.05", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 2, method_name = "bh0.05", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 1, method_name = "bh0.1", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 2, method_name = "bh0.1", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 1, method_name = "bh0.2", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 2, method_name = "bh0.2", out_loc = "out", simulator.files = "files") 
#> in parallel.
#> ..Evaluated BH (q = 0.05) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.05) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.1) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.1) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.2) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.2) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)

sim %>% model()
#> Model Component
#>  name: correlated-pvalues/n_100/pi0_0.8/rho_0.1
#>  label: pi0 = 0.8, rho = 0.1
#>  params: n rho sigma pi0 mu delta nonnull
#>  (Add @params to end of this object to see parameters.)
#>  (Add @simulate to end of this object to see how data is simulated.)

sim <- sim %>% 
  generate_model(make_correlated_pvalues, seed = 123,
                 n = 100,
                 pi0 = 1,
                 rho = 0.9) %>%
  simulate_from_model(nsim = 2, index = 1:2) %>%
  run_method(bh_methods, parallel = list(socket_names = 4, libraries = "mvtnorm")) %>%
  evaluate(list(fdp, nd))
#> ..Created model and saved in correlated-pvalues/n_100/pi0_1/rho_0.9/model.Rdata
#> ..Simulated 2 draws in 0.03 sec and saved in correlated-pvalues/n_100/pi0_0.8/rho_0.1/r1.Rdata
#> ..Simulated 2 draws in 0.02 sec and saved in correlated-pvalues/n_100/pi0_0.8/rho_0.1/r2.Rdata
#> ..Simulated 2 draws in 0.03 sec and saved in correlated-pvalues/n_100/pi0_1/rho_0.9/r1.Rdata
#> ..Simulated 2 draws in 0.04 sec and saved in correlated-pvalues/n_100/pi0_1/rho_0.9/r2.Rdata
#> Shutting down cluster.
#> ..Performed BH (q = 0.05) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.05) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.1) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.1) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.2) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.2) in 0 seconds (on average over 2 sims)
#> ..Created 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 1, method_name = "bh0.05", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 2, method_name = "bh0.05", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 1, method_name = "bh0.1", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 2, method_name = "bh0.1", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 1, method_name = "bh0.2", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_0.8/rho_0.1", index = 2, method_name = "bh0.2", out_loc = "out", simulator.files = "files") 
#> in parallel.
#> Shutting down cluster.
#> ..Performed BH (q = 0.05) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.05) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.1) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.1) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.2) in 0 seconds (on average over 2 sims)
#> ..Performed BH (q = 0.2) in 0 seconds (on average over 2 sims)
#> ..Created 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_1/rho_0.9", index = 1, method_name = "bh0.05", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_1/rho_0.9", index = 2, method_name = "bh0.05", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_1/rho_0.9", index = 1, method_name = "bh0.1", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_1/rho_0.9", index = 2, method_name = "bh0.1", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_1/rho_0.9", index = 1, method_name = "bh0.2", out_loc = "out", simulator.files = "files") 
#> new("OutputRef", dir = "/tmp/RtmpVM9xW2/reprex153e551eed9f/simulation", model_name = "correlated-pvalues/n_100/pi0_1/rho_0.9", index = 2, method_name = "bh0.2", out_loc = "out", simulator.files = "files") 
#> in parallel.
#> ..Evaluated BH (q = 0.05) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.05) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.1) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.1) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.2) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.2) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.05) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.05) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.1) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.1) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.2) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)
#> ..Evaluated BH (q = 0.2) in terms of 
#> false discovery proportion, number of discoveries, Computing time (sec)

sim %>% model()
#> [[1]]
#> Model Component
#>  name: correlated-pvalues/n_100/pi0_0.8/rho_0.1
#>  label: pi0 = 0.8, rho = 0.1
#>  params: n rho sigma pi0 mu delta nonnull
#>  (Add @params to end of this object to see parameters.)
#>  (Add @simulate to end of this object to see how data is simulated.)
#> 
#> [[2]]
#> Model Component
#>  name: correlated-pvalues/n_100/pi0_1/rho_0.9
#>  label: pi0 = 1, rho = 0.9
#>  params: n rho sigma pi0 mu delta nonnull
#>  (Add @params to end of this object to see parameters.)
#>  (Add @simulate to end of this object to see how data is simulated.)
#> 
#> attr(,"class")
#> [1] "listofModels" "list"

Created on 2019-12-04 by the reprex package (v0.3.0)
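
One possible approach (an untested sketch, not an answer from the package authors; it assumes subset_simulation can filter models by a parameter condition) is to generate the new Model Component and then run the expensive steps only on a subset of the simulation containing it, leaving the existing model's draws untouched:

sim <- sim %>%
  generate_model(make_correlated_pvalues, seed = 123,
                 n = 100, pi0 = 1, rho = 0.9)

# Run the pipeline only for the newly added model (filtering on rho is assumed
# to single it out); results are written under the same simulation directory.
new_only <- subset_simulation(sim, rho == 0.9) %>%
  simulate_from_model(nsim = 2, index = 1:2) %>%
  run_method(bh_methods) %>%
  evaluate(list(fdp, nd))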
