bluegreen-labs / snotelr

a snow data network (SNOTEL) R package

Home Page: https://bluegreen-labs.github.io/snotelr/

License: GNU Affero General Public License v3.0

R 98.17% CSS 1.83%
snotel climate-data data-retrieval precipitation-data r-package rstats snow

snotelr's Introduction

snotelr

R-CMD-check codecov CRAN_Status_Badge DOI

snotelr is an R toolbox that facilitates easy SNOTEL data exploration and downloads through a convenient R Shiny-based GUI. In addition, it provides a routine to extract basic snow phenology metrics.

How to cite this package in your article

You can cite this package like this: "we obtained data from SNOTEL using the snotelr R package (Hufkens 2022)". Here is the full bibliographic reference to include in your reference list:

Hufkens, K. (2022). snotelr: a toolbox to facilitate easy SNOTEL data exploration and downloads in R. Zenodo. https://doi.org/10.5281/zenodo.7012728.

Installation

stable release

To install the current stable release from CRAN, use:

install.packages("snotelr")
library("snotelr")

Using the GUI requires additional packages, which are optional (suggested) dependencies and are therefore not installed automatically:

install.packages(c("DT","shinydashboard", "plotly", "leaflet"))

development release

To install the development release of the package, run the following commands:

if(!require(remotes)){install.packages("remotes")}
remotes::install_github("bluegreen-labs/snotelr")
library("snotelr")

Vignettes are not rendered by default. If you want to include this additional documentation, use:

if(!require(remotes)){install.packages("remotes")}
remotes::install_github("bluegreen-labs/snotelr", build_vignettes = TRUE)
library("snotelr")

Use

Most people will prefer the GUI to explore data on the fly. To invoke the GUI, use the following command:

library(snotelr)
snotel_explorer()

This will start a Shiny application with an R backend in your default browser. The first window displays all site locations and allows subsetting of the data by state or by a bounding box. The bounding box is selected by clicking its top-left and bottom-right corners.

(Figure: map of SNOTEL site locations)

The plot data tab allows interactive viewing of the snow water equivalent (SWE) data together with a covariate (temperature, precipitation). The SWE time series also marks snow phenology statistics, namely the day of:

  • first snow melt
  • a continuous snow free season (last snow melt)
  • first snow accumulation (first snow deposited)
  • continuous snow accumulation (permanent snow cover)
  • seasonal maximum SWE (and its amount)

All values are provided both relative to January 1st of the (spring) year mentioned and as absolute dates.

(Figure: SWE time series with snow phenology markers)

To access the full list of SNOTEL sites and associated metadata, use the snotel_info() function.

# returns the site info as snotel_metadata.txt in the current working directory
snotel_info(path = ".") 

# export to a data frame
meta_data <- snotel_info(path = NULL) 

# show the first lines of the data frame
head(meta_data)
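
The metadata behaves like a regular data frame, so the state and bounding-box filters offered by the GUI can be reproduced directly. A minimal sketch, assuming the columns state, latitude and longitude (the fields used by the bundled Shiny app) and two-letter state codes:

# sites in Colorado
co_sites <- subset(meta_data, state == "CO")

# sites within a rough bounding box (decimal degrees)
box_sites <- subset(
  meta_data,
  latitude > 39 & latitude < 41 &
    longitude > -107 & longitude < -105
)

head(co_sites$site_name)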

To query data for a given site, e.g. site 924 as shown in the image above, use:

snotel_download(site_id = 924)
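
By default the data are written to disk. The sketch below keeps the data in memory instead; it assumes the internal argument of snotel_download() returns a data frame rather than writing a file (check ?snotel_download for the behaviour of your version):

# download site 924 straight into a data frame
df <- snotel_download(site_id = 924, internal = TRUE)
head(df)

# several sites can be combined by looping over their IDs
sites <- c(429, 924)
all_data <- do.call(rbind, lapply(sites, snotel_download, internal = TRUE))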

For in-depth analysis, the statistics shown in the GUI can be retrieved using the snotel_phenology() function:

# with df a SNOTEL file or data frame in your R workspace
snotel_phenology(df)
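
The returned table includes, among others, the columns year, first_snow_melt, last_snow_melt, first_snow_acc, cont_snow_acc, max_swe and max_swe_doy (the fields the GUI plots). A minimal sketch of a trend analysis similar to the GUI's phenology panel, assuming those column names:

# download a site, derive its snow phenology and fit a simple trend
df   <- snotel_download(site_id = 924, internal = TRUE)
phen <- snotel_phenology(df)

# linear trend in the timing of the first snow melt (day of year vs. year)
trend <- lm(first_snow_melt ~ year, data = phen)
summary(trend)

# long-term mean of the seasonal maximum SWE
mean(phen$max_swe, na.rm = TRUE)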

References

Hufkens, K. (2022). snotelr: a toolbox to facilitate easy SNOTEL data exploration and downloads in R. Zenodo. https://doi.org/10.5281/zenodo.7012728.

Acknowledgements

This project was in part supported by the National Science Foundation’s Macro-system Biology Program (award EF-1065029) and the Marie Skłodowska-Curie Action (H2020 grant 797668). Logo design elements are taken from the FontAwesome library according to these terms, where the globe element was inverted and intersected.

snotelr's People

Contributors

dfosterhill, dschneiderch, khufkens, potterzot


snotelr's Issues

Possible Typos

Hi!

Thanks for the repository. While exploring it a little bit, I came across a few things that I think might be typos. Please look at them and see if they are.

  1. download_snotel()

In the documentation for this function, an example is given as
# df = snotel_download(site = snotel_info())
where snotel_download seems to be a backwards version of the real function. Obviously this isn't a function, and I think the words were just reversed.

  2. snotel_phenology()

In the README, the function snow_phenology() is advertised, which I think is meant to be snotel_phenology().

GUI launch failure from spaces in library path name

Hi there. Thank you for your work developing this package. It will be very helpful for my research. I have a similar issue to the previous user: failure of the GUI, except this time I think it has to do with the install path for the library.

[1] "no metadata cached, downloading metadata"
'C:/Users/Mic' is not recognized as an internal or external command,
operable program or batch file.
Warning: running command 'C:\Windows\system32\cmd.exe /c C:/Users/Mic Newc/Documents/R/win-library/3.1/snotelr/phantomjs/phantomjs.exe scrape.js > scrape.html' had status 1
Warning in shell(sprintf("%sphantomjs.exe scrape.js > scrape.html", phantomjs_path)) :
'C:/Users/Mic Newc/Documents/R/win-library/3.1/snotelr/phantomjs/phantomjs.exe scrape.js > scrape.html' execution failed with error code 1
Warning: Error in strsplit: non-character argument
Stack trace (innermost first):
42: strsplit
41: lapply
40: tolower
39: snotel.info
2: shiny::runApp
1: snotel.explorer
Error in strsplit(df$site_name, "\(") : non-character argument

Do you have any suggestions?

create static meta-data file?

Even with RSelenium and wdman on CRAN, the whole routine remains a mess to unit test, and the phantomjs solution feels contrived.

Considering pulling the data yearly through a Travis CI script, pushing it back to the repo, and flagging it for re-release. More work, but probably cleaner.
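
A minimal sketch of such a refresh script, assuming it is run on a schedule by the CI service and the resulting CSV is committed back to the repository (the output path is a placeholder):

# refresh_metadata.R: regenerate the cached SNOTEL site list
library(snotelr)

meta <- snotel_info(path = NULL)  # pull the current site metadata as a data frame
write.csv(meta, "data-raw/snotel_metadata.csv", row.names = FALSE)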

Where can I find the downloaded data?

Hello Koen,

Thanks for developing this package. It looks great, especially the Shiny app!

I have trouble finding the downloaded data. Where can I find it within my workspace?

dat <- download.snotel(site = 1263)
> head(dat)
NULL

new CRAN release v1.2

prepare for release:

  • Check CRAN issues
  • urlchecker::url_check()
  • devtools::check(document = FALSE)
  • rhub::check_for_cran()
  • Update cran-comments.md
  • Update NEWS.md
  • R CMD check --as-cran
  • git push

Submit to CRAN:

  • devtools::submit_cran()
  • Approve email
  • Accepted email

CRAN release 1.4

prepare for release:

  • Check CRAN issues
  • urlchecker::url_check()
  • devtools::check(document = FALSE)
  • rhub::check_for_cran()
  • Update cran-comments.md
  • Update NEWS.md
  • R CMD check --as-cran
  • git push

Submit to CRAN:

  • devtools::submit_cran()
  • Approve email
  • Accepted email
  • tag release on github

segfault when loading shiny app

Very neat package!
Unfortunately the GUI won't load completely. I get a segfault, I think while downloading the metadata.
I'm on macOS 10.11.6 using R 3.3.2 installed with homebrew.

> snotel.explorer()

Listening on http://127.0.0.1:5235
data.table 1.9.6  For help type ?data.table or https://github.com/Rdatatable/data.table/wiki
The fastest way to learn (by data.table authors): https://www.datacamp.com/courses/data-analysis-the-data-table-way
[1] "no metadata cached, downloading metadata"
sh: /data/Dropbox/Research_Projects/working/evergreen_phenology/code/phantomjs/./phantomjs_osx: No such file or directory

 *** caught segfault ***
address 0x60, cause 'memory not mapped'

Traceback:
 1: .Call("xml2_doc_namespaces", PACKAGE = "xml2", doc)
 2: doc_namespaces(doc)
 3: xml_ns.xml_document(x)
 4: xml_ns(x)
 5: xpath_search(x$node, x$doc, xpath = xpath, nsMap = ns, num_results = Inf)
 6: xml_find_all.xml_node(x, make_selector(css, xpath))
 7: xml2::xml_find_all(x, make_selector(css, xpath))
 8: html_nodes.default(main, sel_data)
 9: rvest::html_nodes(main, sel_data)
10: eval(expr, envir, enclos)
11: eval(lhs, parent, parent)
12: rvest::html_nodes(main, sel_data) %>% rvest::html_table()
13: snotel.info(path = ".")
14: ..stacktraceon..({    library(shiny)    library(shinydashboard)    library(leaflet)    library(plotly)    library(DT)    library(data.table)    library(zoo)    path = sprintf("%s/shiny/snotel_explorer", path.package("snotelr"))    setwd(tempdir())    if (!file.exists("snotel_metadata.csv")) {        print("no metadata cached, downloading metadata")        snotel.info(path = ".")    }    m_time = file.info("snotel_metadata.csv")$mtime    diff_days = as.Date(m_time) - as.Date(Sys.Date())    if (diff_days > 30) {        print("metadata outdated (> 30 days), refreshing")        snotel.info()    }    df = as.data.frame(fread("snotel_metadata.csv", header = TRUE,         sep = ","))    myIcon = icons(iconUrl = sprintf("%s/snow_icon.svg", path),         iconWidth = 17, iconHeight = 17)    df$preview = apply(df, 1, function(x) paste("<table width=200px, border=0px>",         "<tr>", "<td><b>", x[3], "</b></td>", "</tr>", "<tr>",         "<td>", "Elev.: ", x[9], " m", "</td>", "</tr>", "<tr>",         "<td>", "Start Data: ", x[5], "</td>", "</tr>", "<tr>",         "<td>", "End Data: ", x[6], "</td>", "</tr>", "</table>",         sep = ""))    server = function(input, output, session) {        v1 = reactiveValues()        v2 = reactiveValues()        reset = reactiveValues()        row_clicked = reactiveValues()        filteredData = function() {            if (!is.null(isolate(v2$lat))) {                if (input$state == "ALL") {                  df[which(df$latitude < isolate(v1$lat) & df$latitude >                     isolate(v2$lat) & df$longitude > isolate(v1$lon) &                     df$longitude < isolate(v2$lon)), ]                }                else {                  df[which(df$latitude < isolate(v1$lat) & df$latitude >                     isolate(v2$lat) & df$longitude > isolate(v1$lon) &                     df$longitude < isolate(v2$lon) & df$state ==                     input$state), ]                }            }            else {                if (input$state == "ALL") {                  return(df)                }                else {                  return(df[df$state == input$state, ])                }            }        }        getValueData = function(table) {            nr_sites = length(unique(table$site_id))            output$site_count = renderInfoBox({                valueBox(nr_sites, "Sites", icon = icon("list"),                   color = "blue")            })            nr_years = round(sum((as.Date(df$end) - as.Date(df$start))/365,                 na.rm = TRUE))            output$year_count = renderInfoBox({                valueBox(nr_years, "Snow Seasons", icon = icon("list"),                   color = "blue")            })        }        output$map = renderLeaflet({            map = leaflet(df) %>% addTiles("http://server.arcgisonline.com/ArcGIS/rest/services/World_Imagery/MapServer/tile/{z}/{y}/{x}.jpg",                 attribution = "Tiles &copy; Esri &mdash; Source: Esri, i-cubed, USDA, USGS, AEX, GeoEye, Getmapping, Aerogrid, IGN, IGP, UPR-EGP, and the GIS User Community",                 group = "World Imagery") %>% addProviderTiles("OpenTopoMap",                 group = "Open Topo Map") %>% addMarkers(lat = ~latitude,                 lng = ~longitude, icon = myIcon, popup = ~preview) %>%                 addLayersControl(baseGroups = c("World Imagery",                   "Open Topo Map"), position = c("topleft"),                   options = layersControlOptions(collapsed = TRUE)) %>%                 setView(lng = -116, lat = 41, zoom = 4)        })   
     observe({            leafletProxy("map", data = filteredData()) %>% clearMarkers() %>%                 addMarkers(lat = ~latitude, lng = ~longitude,                   icon = ~myIcon, popup = ~preview)            output$table = DT::renderDataTable({                tmp = filteredData()[, -c(1, 4, 7:10, 12)]                return(tmp)            }, selection = "single", options = list(lengthMenu = list(c(5,                 10), c("5", "10")), pom = list("longitude")),                 extensions = "Responsive")            getValueData(filteredData())        })        observeEvent(input$map_click, {            if (!is.null(isolate(v2$lat))) {                v1$lat = NULL                v2$lat = NULL                v1$lon = NULL                v2$lon = NULL                leafletProxy("map", data = filteredData()) %>%                   clearMarkers() %>% clearShapes() %>% addMarkers(lat = ~latitude,                   lng = ~longitude, icon = ~myIcon, popup = ~preview)                getValueData(filteredData())            }            else {                if (!is.null(isolate(v1$lat))) {                  v2$lat = input$map_click$lat                  v2$lon = input$map_click$lng                }                else {                  v1$lat = input$map_click$lat                  v1$lon = input$map_click$lng                  leafletProxy("map", data = filteredData()) %>%                     clearMarkers() %>% addMarkers(lat = ~latitude,                     lng = ~longitude, icon = ~myIcon, popup = ~preview) %>%                     addCircleMarkers(lng = isolate(v1$lon), lat = isolate(v1$lat),                       color = "red", radius = 3, fillOpacity = 1,                       stroke = FALSE)                }            }            if (!is.null(isolate(v2$lat))) {                tmp = filteredData()                if (dim(tmp)[1] != 0) {                  leafletProxy("map", data = tmp) %>% clearMarkers() %>%                     addMarkers(lat = ~latitude, lng = ~longitude,                       icon = ~myIcon, popup = ~preview) %>% addRectangles(lng1 = isolate(v1$lon),                     lat1 = isolate(v1$lat), lng2 = isolate(v2$lon),                     lat2 = isolate(v2$lat), fillColor = "transparent",                     color = "grey")                  output$table = DT::renderDataTable({                    tmp = filteredData()[, -c(1, 4, 7:10, 12)]                    return(tmp)                  }, selection = "single", options = list(lengthMenu = list(c(5,                     10), c("5", "10")), pom = list("longitude")),                     extensions = c("Responsive"))                  getValueData(filteredData())                }                else {                  v1$lat = NULL                  v2$lat = NULL                  v1$lon = NULL                  v2$lon = NULL                  leafletProxy("map", data = filteredData()) %>%                     clearMarkers() %>% clearShapes() %>% addMarkers(lat = ~latitude,                     lng = ~longitude, icon = ~myIcon, popup = ~preview)                }            }        })        downloadData = function(myrow) {            if (length(myrow) == 0) {                return(NULL)            }            df = filteredData()            site = df$site_id[as.numeric(myrow)]            progress = shiny::Progress$new()            on.exit(progress$close())            progress$set(message = "Status:", value = 0)            progress$set(value = 0.2, detail = "Reading SNOTEL data")            status = list.files(getwd(), pattern = 
sprintf("^snotel_%s\\.csv$",                 site))[1]            if (is.na(status)) {                status = try(download.snotel(site = site, path = "."))            }            if (inherits(status, "try-error")) {                progress$set(value = 0.3, detail = "download error!")                return(NULL)            }            else {                file = list.files(getwd(), pattern = sprintf("^snotel_%s\\.csv$",                   site))[1]                data = read.table(file, header = TRUE, sep = ",")                data = snotel.metric(data)                progress$set(value = 0.5, detail = "Calculating snow phenology")                transitions = snow.phenology(data)                print(transitions)                output = list(data, transitions)                progress$set(value = 1, detail = "Done")                return(output)            }        }        inputData = reactive({            downloadData(as.numeric(input$table_row_last_clicked))        })        output$time_series_plot = renderPlotly({            labels_covariate_col = "rgb(231,41,138)"            covariate_col = "rgba(255,51,0,0.5)"            primary_col = "rgba(51,102,255,0.8)"            envelope_col = "rgba(128,128,128,0.05)"            ltm_col = "rgba(128,128,128,0.8)"            labels = c(`Snow Water Equivalent (mm)` = "snow_water_equivalent",                 `temperature (C)` = "temperature_mean", `precipitation (mm)` = "precipitation")            primary_label = names(labels)[which(labels == input$primary)]            covariate_label = names(labels)[which(labels == input$covariate)]            data = inputData()            plot_data = data[[1]]            transition_data = data[[2]]            if (is.null(plot_data) || nrow(plot_data) == 0) {                ax = list(title = "", zeroline = FALSE, showline = FALSE,                   showticklabels = FALSE, showgrid = FALSE)                if (length(input$table_row_last_clicked) != 0) {                  p = plot_ly(x = 0, y = 0, text = "NO DATA AVAILABLE, SELECT A NEW SITE FOR PLOTTING",                     mode = "text", textfont = list(color = "#000000",                       size = 16)) %>% layout(xaxis = ax, yaxis = ax)                }                else {                  p = plot_ly(x = 0, y = 0, text = "SELECT A SITE FOR PLOTTING",                     mode = "text", textfont = list(color = "#000000",                       size = 16)) %>% layout(xaxis = ax, yaxis = ax)                }            }            else {                plot_data$primary = plot_data[, which(colnames(plot_data) ==                   input$primary)]                plot_data$covariate = plot_data[, which(colnames(plot_data) ==                   input$covariate)]                plot_data$primary = zoo::na.approx(plot_data$primary,                   na.rm = FALSE)                plot_data$covariate = zoo::na.approx(plot_data$covariate,                   na.rm = FALSE)                plot_data$date = as.Date(plot_data$date)                plot_data$doy = as.numeric(format(plot_data$date,                   "%j"))                plot_data$year = as.numeric(format(plot_data$date,                   "%Y"))                first_snow_melt = as.Date(sprintf("%s-%s", transition_data$year,                   transition_data$first_snow_melt), "%Y-%j")                cont_snow_acc = as.Date(sprintf("%s-%s", transition_data$year,                   transition_data$cont_snow_acc), "%Y-%j")                last_snow_melt = as.Date(sprintf("%s-%s", transition_data$year,                  
 transition_data$last_snow_melt), "%Y-%j")                first_snow_acc = as.Date(sprintf("%s-%s", transition_data$year,                   transition_data$first_snow_acc), "%Y-%j")                max_swe_date = as.Date(sprintf("%s-%s", transition_data$year,                   transition_data$max_swe_doy), "%Y-%j")                if (input$plot_type == "daily") {                  ay1 = list(title = primary_label, tickfont = list(color = primary_col),                     titlefont = list(color = primary_col), showgrid = FALSE)                  ay2 = list(tickfont = list(color = covariate_col),                     titlefont = list(color = covariate_col),                     overlaying = "y", title = "", side = "right",                     showgrid = FALSE)                  p = plot_ly(data = plot_data, x = ~date, y = ~covariate,                     yaxis = "y2", mode = "lines", type = "scatter",                     name = covariate_label, line = list(color = covariate_col)) %>%                     add_trace(y = ~primary, mode = "lines", type = "scatter",                       yaxis = "y1", line = list(color = primary_col),                       name = primary_label) %>% add_trace(x = first_snow_melt,                     y = rep(0, length(first_snow_melt)), mode = "markers",                     type = "scatter", yaxis = "y1", marker = list(color = "red",                       symbol = "square"), line = list(width = 0),                     name = "first snow melt") %>% add_trace(x = last_snow_melt,                     y = rep(0, length(last_snow_melt)), mode = "markers",                     type = "scatter", yaxis = "y1", marker = list(color = "red",                       symbol = "circle"), line = list(width = 0),                     name = "last snow melt") %>% add_trace(x = first_snow_acc,                     y = rep(0, length(first_snow_acc)), mode = "markers",                     type = "scatter", yaxis = "y1", marker = list(color = "blue",                       symbol = "circle"), line = list(width = 0),                     name = "first snow accumulation") %>% add_trace(x = cont_snow_acc,                     y = rep(0, length(cont_snow_acc)), mode = "markers",                     type = "scatter", yaxis = "y1", marker = list(color = "blue",                       symbol = "square"), line = list(width = 0),                     name = "continuous snow accumulation") %>%                     add_trace(x = max_swe_date, y = transition_data$max_swe,                       mode = "markers", type = "scatter", yaxis = "y1",                       marker = list(color = "green", symbol = "square"),                       line = list(width = 0), name = "maximum SWE") %>%                     layout(xaxis = list(title = "Date"), yaxis = ay1,                       yaxis2 = ay2, showlegend = TRUE, title = sprintf("Site ID: %s",                         filteredData()[as.numeric(input$table_row_last_clicked),                           11]))                }                else if (input$plot_type == "yearly") {                  ltm = plot_data %>% group_by(doy) %>% summarise(mn = mean(primary),                     sd = sd(primary), doymn = mean(doy))                  p = ltm %>% plot_ly(x = ~doymn, y = ~mn, mode = "lines",                     type = "scatter", name = "LTM", line = list(color = ltm_col),                     inherit = FALSE) %>% add_trace(x = ~doymn,                     y = ~mn - sd, mode = "lines", type = "scatter",                     fill = "none", line = list(width = 0, color = 
envelope_col),                     showlegend = FALSE, name = "SD") %>% add_trace(x = ~doymn,                     y = ~mn + sd, type = "scatter", mode = "lines",                     fill = "tonexty", line = list(width = 0,                       color = envelope_col), showlegend = TRUE,                     name = "SD") %>% add_trace(data = plot_data,                     x = ~doy, y = ~primary, split = ~year, type = "scatter",                     mode = "lines", name = primary_label, line = list(color = "Set1"),                     showlegend = TRUE) %>% layout(xaxis = list(title = "DOY"),                     yaxis = list(title = primary_label), title = sprintf("Site ID: %s",                       filteredData()[as.numeric(input$table_row_last_clicked),                         11]))                }                else if (input$plot_type == "snow_phen") {                  if (is.null(transition_data)) {                    ax = list(title = "", zeroline = FALSE, showline = FALSE,                       showticklabels = FALSE, showgrid = FALSE)                    p = plot_ly(x = 0, y = 0, text = "NO SNOW PHENOLOGY DATA AVAILABLE",                       mode = "text", textfont = list(color = "#000000",                         size = 16)) %>% layout(xaxis = ax, yaxis = ax)                  }                  else {                    if (nrow(transition_data) < 9) {                      ax = list(title = "", zeroline = FALSE,                         showline = FALSE, showticklabels = FALSE,                         showgrid = FALSE)                      p = plot_ly(x = 0, y = 0, text = "NOT ENOUGH DATA FOR A MEANINGFUL ANALYSIS",                         mode = "text", textfont = list(color = "#000000",                           size = 16)) %>% layout(xaxis = ax,                         yaxis = ax)                    }                    else {                      sos_col = "rgb(231,41,138)"                      eos_col = "rgba(231,41,138,0.4)"                      gsl_col = "rgba(102,166,30,0.8)"                      ay1 = list(title = "DOY", showgrid = FALSE)                      ay2 = list(overlaying = "y", title = "Days",                         side = "left", showgrid = FALSE)                      reg_sos = lm(transition_data$first_snow_melt ~                         transition_data$year)                      reg_eos = lm(transition_data$cont_snow_acc ~                         transition_data$year)                      reg_eos_sum = summary(reg_eos)                      reg_sos_sum = summary(reg_sos)                      r2_sos = round(reg_sos_sum$r.squared, 2)                      slp_sos = round(reg_sos_sum$coefficients[2,                         1], 2)                      r2_eos = round(reg_eos_sum$r.squared, 2)                      slp_eos = round(reg_eos_sum$coefficients[2,                         1], 2)                      p = plot_ly(x = transition_data$year, y = transition_data$first_snow_melt,                         mode = "markers", type = "scatter", name = "SOS") %>%                         add_trace(x = transition_data$year, y = transition_data$cont_snow_acc,                           mode = "markers", type = "scatter",                           name = "EOS") %>% add_trace(x = transition_data$year[as.numeric(names(reg_sos$fitted.values))],                         y = reg_sos$fitted.values, mode = "lines",                         type = "scatter", name = sprintf("R2: %s| slope: %s",                           r2_sos, slp_sos), line = list(width = 2)) %>%                         
add_trace(x = transition_data$year[as.numeric(names(reg_eos$fitted.values))],                           y = reg_eos$fitted.values, type = "scatter",                           mode = "lines", name = sprintf("R2: %s| slope: %s",                             r2_eos, slp_eos), line = list(width = 2)) %>%                         layout(xaxis = list(title = "Year"),                           yaxis = ay1, showlegend = TRUE)                    }                  }                }            }        })    }})
15: eval(expr, envir, enclos)
16: eval(exprs, envir)
17: sourceUTF8(serverR, envir = new.env(parent = globalenv()))
18: func(fname, ...)
19: serverSource()
20: serverFuncSource()
21: withReactiveDomain(NULL, serverFuncSource())
22: force(expr)
23: withRestoreContext(shinysession$restoreContext, {    msg$data <- applyInputHandlers(msg$data)    switch(msg$method, init = {        serverFunc <- withReactiveDomain(NULL, serverFuncSource())        if (!identicalFunctionBodies(serverFunc, appvars$server)) {            appvars$server <- serverFunc            if (!is.null(appvars$server)) {                attr(appvars$server, "shinyServerFunction") <- TRUE                registerDebugHook("server", appvars, "Server Function")            }        }        if (.globals$showcaseOverride && exists(".clientdata_url_search",             where = msg$data)) {            mode <- showcaseModeOfQuerystring(msg$data$.clientdata_url_search)            if (!is.null(mode)) shinysession$setShowcase(mode)        }        shinysession$manageInputs(msg$data)        if (!is.null(msg$data$.clientdata_singletons)) {            shinysession$singletons <- strsplit(msg$data$.clientdata_singletons,                 ",")[[1]]        }        local({            args <- argsForServerFunc(serverFunc, shinysession)            withReactiveDomain(shinysession, {                do.call(wrapFunctionLabel(appvars$server, "server",                   ..stacktraceon = TRUE), args)            })        })    }, update = {        shinysession$manageInputs(msg$data)    }, shinysession$dispatch(msg))    shinysession$manageHiddenOutputs()    if (exists(".shiny__stdout", globalenv()) && exists("HTTP_GUID",         ws$request)) {        shiny_stdout <- get(".shiny__stdout", globalenv())        writeLines(paste("_n_flushReact ", get("HTTP_GUID", ws$request),             " @ ", sprintf("%.3f", as.numeric(Sys.time())), sep = ""),             con = shiny_stdout)        flush(shiny_stdout)        flushReact()        writeLines(paste("_x_flushReact ", get("HTTP_GUID", ws$request),             " @ ", sprintf("%.3f", as.numeric(Sys.time())), sep = ""),             con = shiny_stdout)        flush(shiny_stdout)    }    else {        flushReact()    }    flushAllSessions()})
24: withReactiveDomain(shinysession, {    if (is.character(msg))         msg <- charToRaw(msg)    traceOption <- getOption("shiny.trace", FALSE)    if (isTRUE(traceOption) || traceOption == "recv") {        if (binary)             message("RECV ", "$$binary data$$")        else message("RECV ", rawToChar(msg))    }    if (identical(charToRaw("\003\xe9"), msg))         return()    msg <- decodeMessage(msg)    if (is.null(shinysession$restoreContext)) {        bookmarkStore <- getShinyOption("bookmarkStore", default = "disable")        if (bookmarkStore == "disable") {            shinysession$restoreContext <- RestoreContext$new()        }        else {            shinysession$restoreContext <- RestoreContext$new(msg$data$.clientdata_url_search)        }    }    withRestoreContext(shinysession$restoreContext, {        msg$data <- applyInputHandlers(msg$data)        switch(msg$method, init = {            serverFunc <- withReactiveDomain(NULL, serverFuncSource())            if (!identicalFunctionBodies(serverFunc, appvars$server)) {                appvars$server <- serverFunc                if (!is.null(appvars$server)) {                  attr(appvars$server, "shinyServerFunction") <- TRUE                  registerDebugHook("server", appvars, "Server Function")                }            }            if (.globals$showcaseOverride && exists(".clientdata_url_search",                 where = msg$data)) {                mode <- showcaseModeOfQuerystring(msg$data$.clientdata_url_search)                if (!is.null(mode)) shinysession$setShowcase(mode)            }            shinysession$manageInputs(msg$data)            if (!is.null(msg$data$.clientdata_singletons)) {                shinysession$singletons <- strsplit(msg$data$.clientdata_singletons,                   ",")[[1]]            }            local({                args <- argsForServerFunc(serverFunc, shinysession)                withReactiveDomain(shinysession, {                  do.call(wrapFunctionLabel(appvars$server, "server",                     ..stacktraceon = TRUE), args)                })            })        }, update = {            shinysession$manageInputs(msg$data)        }, shinysession$dispatch(msg))        shinysession$manageHiddenOutputs()        if (exists(".shiny__stdout", globalenv()) && exists("HTTP_GUID",             ws$request)) {            shiny_stdout <- get(".shiny__stdout", globalenv())            writeLines(paste("_n_flushReact ", get("HTTP_GUID",                 ws$request), " @ ", sprintf("%.3f", as.numeric(Sys.time())),                 sep = ""), con = shiny_stdout)            flush(shiny_stdout)            flushReact()            writeLines(paste("_x_flushReact ", get("HTTP_GUID",                 ws$request), " @ ", sprintf("%.3f", as.numeric(Sys.time())),                 sep = ""), con = shiny_stdout)            flush(shiny_stdout)        }        else {            flushReact()        }        flushAllSessions()    })})
25: messageHandler(binary, msg)
26: withCallingHandlers(expr, error = function(e) {    if (is.null(attr(e, "stack.trace", exact = TRUE))) {        calls <- sys.calls()        attr(e, "stack.trace") <- calls        stop(e)    }})
27: captureStackTraces(expr)
28: withCallingHandlers(captureStackTraces(expr), error = function(cond) {    if (inherits(cond, "shiny.silent.error"))         return()    if (isTRUE(getOption("show.error.messages"))) {        printError(cond, full = full, offset = offset)    }})
29: withLogErrors(messageHandler(binary, msg))
30: handler(binary, message)
31: doTryCatch(return(expr), name, parentenv, handler)
32: tryCatchOne(expr, names, parentenv, handlers[[1L]])
33: tryCatchList(expr, classes, parentenv, handlers)
34: tryCatch(expr, error = function(e) {    call <- conditionCall(e)    if (!is.null(call)) {        if (identical(call[[1L]], quote(doTryCatch)))             call <- sys.call(-4L)        dcall <- deparse(call)[1L]        prefix <- paste("Error in", dcall, ": ")        LONG <- 75L        msg <- conditionMessage(e)        sm <- strsplit(msg, "\n")[[1L]]        w <- 14L + nchar(dcall, type = "w") + nchar(sm[1L], type = "w")        if (is.na(w))             w <- 14L + nchar(dcall, type = "b") + nchar(sm[1L],                 type = "b")        if (w > LONG)             prefix <- paste0(prefix, "\n  ")    }    else prefix <- "Error : "    msg <- paste0(prefix, conditionMessage(e), "\n")    .Internal(seterrmessage(msg[1L]))    if (!silent && identical(getOption("show.error.messages"),         TRUE)) {        cat(msg, file = stderr())        .Internal(printDeferredWarnings())    }    invisible(structure(msg, class = "try-error", condition = e))})
35: try(handler(binary, message))
36: (function (handle, binary, message) {    for (handler in .wsconns[[as.character(handle)]]$.messageCallbacks) {        result <- try(handler(binary, message))        if (inherits(result, "try-error")) {            .wsconns[[as.character(handle)]]$close()            return()        }    }})("140370466651744", FALSE, "{\"method\":\"init\",\"data\":{\"state\":\"ALL\",\"primary\":\"snow_water_equivalent\",\"covariate\":\"temperature_mean\",\"plot_type\":\"daily\",\".clientdata_output_site_count_hidden\":false,\".clientdata_output_year_count_hidden\":false,\".clientdata_output_map_hidden\":false,\".clientdata_output_table_hidden\":true,\".clientdata_output_time_series_plot_hidden\":true,\".clientdata_pixelratio\":2,\".clientdata_url_protocol\":\"http:\",\".clientdata_url_hostname\":\"127.0.0.1\",\".clientdata_url_port\":\"5235\",\".clientdata_url_pathname\":\"/\",\".clientdata_url_search\":\"\",\".clientdata_url_hash_initial\":\"\",\".clientdata_singletons\":\"\",\".clientdata_allowDataUriScheme\":true}}")
37: eval(substitute(expr), envir, enclos)
38: evalq((function (handle, binary, message) {    for (handler in .wsconns[[as.character(handle)]]$.messageCallbacks) {        result <- try(handler(binary, message))        if (inherits(result, "try-error")) {            .wsconns[[as.character(handle)]]$close()            return()        }    }})("140370466651744", FALSE, "{\"method\":\"init\",\"data\":{\"state\":\"ALL\",\"primary\":\"snow_water_equivalent\",\"covariate\":\"temperature_mean\",\"plot_type\":\"daily\",\".clientdata_output_site_count_hidden\":false,\".clientdata_output_year_count_hidden\":false,\".clientdata_output_map_hidden\":false,\".clientdata_output_table_hidden\":true,\".clientdata_output_time_series_plot_hidden\":true,\".clientdata_pixelratio\":2,\".clientdata_url_protocol\":\"http:\",\".clientdata_url_hostname\":\"127.0.0.1\",\".clientdata_url_port\":\"5235\",\".clientdata_url_pathname\":\"/\",\".clientdata_url_search\":\"\",\".clientdata_url_hash_initial\":\"\",\".clientdata_singletons\":\"\",\".clientdata_allowDataUriScheme\":true}}"),     <environment>)
39: doTryCatch(return(expr), name, parentenv, handler)
40: tryCatchOne(expr, names, parentenv, handlers[[1L]])
41: tryCatchList(expr, names[-nh], parentenv, handlers[-nh])
42: doTryCatch(return(expr), name, parentenv, handler)
43: tryCatchOne(tryCatchList(expr, names[-nh], parentenv, handlers[-nh]),     names[nh], parentenv, handlers[[nh]])
44: tryCatchList(expr, classes, parentenv, handlers)
45: tryCatch(evalq((function (handle, binary, message) {    for (handler in .wsconns[[as.character(handle)]]$.messageCallbacks) {        result <- try(handler(binary, message))        if (inherits(result, "try-error")) {            .wsconns[[as.character(handle)]]$close()            return()        }    }})("140370466651744", FALSE, "{\"method\":\"init\",\"data\":{\"state\":\"ALL\",\"primary\":\"snow_water_equivalent\",\"covariate\":\"temperature_mean\",\"plot_type\":\"daily\",\".clientdata_output_site_count_hidden\":false,\".clientdata_output_year_count_hidden\":false,\".clientdata_output_map_hidden\":false,\".clientdata_output_table_hidden\":true,\".clientdata_output_time_series_plot_hidden\":true,\".clientdata_pixelratio\":2,\".clientdata_url_protocol\":\"http:\",\".clientdata_url_hostname\":\"127.0.0.1\",\".clientdata_url_port\":\"5235\",\".clientdata_url_pathname\":\"/\",\".clientdata_url_search\":\"\",\".clientdata_url_hash_initial\":\"\",\".clientdata_singletons\":\"\",\".clientdata_allowDataUriScheme\":true}}"),     <environment>), error = function (x) x, interrupt = function (x) x)
46: .Call("httpuv_run", PACKAGE = "httpuv", timeoutMillis)
47: run(timeoutMs)
48: service(timeout)
49: serviceApp()
50: withCallingHandlers(expr, error = function(e) {    if (is.null(attr(e, "stack.trace", exact = TRUE))) {        calls <- sys.calls()        attr(e, "stack.trace") <- calls        stop(e)    }})
51: captureStackTraces({    scheduleFlush()    while (!.globals$stopped) {        serviceApp()        Sys.sleep(0.001)    }})
52: ..stacktraceoff..(captureStackTraces({    scheduleFlush()    while (!.globals$stopped) {        serviceApp()        Sys.sleep(0.001)    }}))
53: shiny::runApp(appDir, display.mode = "normal", launch.browser = TRUE)
54: snotel.explorer()

Possible actions:
1: abort (with core dump, if enabled)
2: normal R exit
3: exit R without saving workspace
4: exit R saving workspace

phenology labels

Are the end of snow (EOS) / start of snow (SOS) labels on the phenology plot reversed? I'm assuming that's what the acronyms refer to, but I'm not totally sure. Thanks.

(Figure: phenology plot with SOS/EOS labels)

alternative cache of site metadata for computers without port access

Some users (e.g. myself) don't have sufficient privileges to open a port with RSelenium as done here. Would there be interest in an alternative caching method that doesn't require the memoise and wdman packages but instead saves the site metadata as a tempfile? I'm not familiar with memoise, but an alternative that doesn't require opening a port would be helpful.
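
A minimal sketch of such a file-based cache, mirroring the 30-day age check used by the bundled Shiny app; the cache location and the threshold are assumptions:

# cache the site metadata in the R temporary directory
cache_file <- file.path(tempdir(), "snotel_metadata.csv")

stale <- !file.exists(cache_file) ||
  difftime(Sys.time(), file.info(cache_file)$mtime, units = "days") > 30

if (stale) {
  meta <- snotel_info(path = NULL)
  write.csv(meta, cache_file, row.names = FALSE)
} else {
  meta <- read.csv(cache_file)
}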

Error in strsplit(df$site_name, "\\(") : non-character argument

Hi. I have used snotelr for a few years. It worked great for me last week. This morning, all of my scripts are now throwing errors. When I do something as simple as:

library(snotelr)
meta_data <- snotel_info()
head(meta_data)

I get the error:

Error in strsplit(df$site_name, "\(") : non-character argument

I would appreciate any information about how to resolve this. I didn't change anything from last week to this week.

Add new images

Dynamic rendering removed the doc directory. Include new images.

First run of snotelr fails (dependencies)

Hi there, just using snotelr for the first time here; looks like an awesome package. I ran into a first-install issue that may be pertinent for other users.

R version 4.1.2 (2021-11-01)
Platform: x86_64-apple-darwin17.0 (64-bit)
Running under: macOS Monterey 12.0.1

The following ran fine in R, except for a console output warning that I didn't have the foresight to copy. And, more importantly, the new window that opened in my browser gave a "dashboard header" error and failed to produce the map/plot GUI.

install.packages("snotelr")
library(snotelr)
snotel_explorer()

I had to install 3 additional packages before the explorer worked on my browser:

install.packages(c("shinydashboard", "plotly", "leaflet"))

Once those were installed, all was good. Seems like these should be dependencies, rather than suggests.

first and last snow melt day of year capped at 179

Hello, it looks like both the "first_snow_melt_doy" and "last_snow_melt_doy" values in the SNOTEL phenology data are capped at DOY 179.

You can also see the effects of this in the phenology plot below (site id 679). Is this an error, or am I missing something in the interpretation of the data? Thank you. - Jeff

(Figure: phenology plot for site 679)
