climate4r's People

Contributors

cofinoa, gutierjm, jbedia, jesusff, miturbide, rmanzanas, zequihg50

climate4r's Issues

Service temporarily unavailable

Hello,
I got the error below while trying to access the open data:

Error: Service temporarily unavailable
The server is temporarily unable to service your request, please try again later.

Is there any way to access or download the data in this situation?

This is the code:

loginUDG(username = "", password = "")

data("PRUDENCEregions", package = "visualizeR")
bb <- PRUDENCEregions["FR"]@bbox
lonLim <- c(-5,9); latLim <- c(42,51)

# var.list: vector of variable names (definition not shown)
grid.list <- lapply(var.list, function(x) {
  loadGridData(dataset = "ECMWF_ERA-Interim-ESD",
               var = x,
               lonLim = lonLim,
               latLim = latLim,
               years = 1980:2008)
})
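A transient outage like this can sometimes be ridden out with a simple retry wrapper around the download call. This is an illustrative sketch only: the function name, attempt count and delay are my own choices, not part of climate4R.

```r
# Retry a data-loading call a few times before giving up.
# load_fun: a zero-argument function wrapping the loadGridData() call.
load_with_retry <- function(load_fun, attempts = 3, wait = 60) {
  for (i in seq_len(attempts)) {
    out <- tryCatch(load_fun(), error = function(e) e)
    if (!inherits(out, "error")) return(out)
    message("Attempt ", i, " failed: ", conditionMessage(out))
    if (i < attempts) Sys.sleep(wait)
  }
  stop("All ", attempts, " attempts failed")
}
```

Usage would be, e.g., load_with_retry(function() loadGridData(dataset = "ECMWF_ERA-Interim-ESD", var = x, lonLim = lonLim, latLim = latLim, years = 1980:2008)).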

Error doing bias correction for CMIP6 tasmax

#********************************************************************************************

# load observed "tasmax" data

obs.tx <- loadGridData(dataset = "tasmax_1995.2014_K.nc",
                       var = "tasmax", lonLim = c(20.25, 94.75), latLim = c(10.25, 39.75),
                       season = 1:12, years = 1995:2014)

[2022-07-12 08:28:03] Defining geo-location parameters
[2022-07-12 08:28:03] Defining time selection parameters
[2022-07-12 08:28:03] Retrieving data subset ...
[2022-07-12 08:28:07] Done

h.tx <- loadGridData(dataset = "tasmax_day_ACCESS-ESM1-5_historical.r1i1p1f1.gn.1995-2014.nc",
                     var = "tasmax", lonLim = c(20.25, 94.75), latLim = c(10.25, 39.75),
                     season = 1:12, years = 1995:2014)

[2022-07-12 08:28:07] Defining geo-location parameters
[2022-07-12 08:28:07] Defining time selection parameters
[2022-07-12 08:28:07] Retrieving data subset ...
[2022-07-12 08:28:10] Done

BCC.temp.eqm <- biasCorrection(y = obs.tx, x = h.tx,
                               newdata = h.tx,
                               method = "eqm",
                               extrapolation = "constant",
                               wet.threshold = 0.1)

[2022-07-12 08:28:10] Trying to determine the time zone...
[2022-07-12 08:28:10] Time zone identified and set to GMT
See 'setGridDates.asPOSIXlt' to change the time zone
[2022-07-12 08:28:13] Already complete date record. Nothing was done
[2022-07-12 08:28:14] Trying to determine the time zone...
[2022-07-12 08:28:14] Time zone identified and set to GMT
See 'setGridDates.asPOSIXlt' to change the time zone
[2022-07-12 08:28:16] Already complete date record. Nothing was done
[2022-07-12 08:28:17] Trying to determine the time zone...
[2022-07-12 08:28:17] Time zone identified and set to GMT
See 'setGridDates.asPOSIXlt' to change the time zone
[2022-07-12 08:28:19] Already complete date record. Nothing was done
[2022-07-12 08:28:27] Argument precipitation is set as FALSE, please ensure that this matches your data.
[2022-07-12 08:28:33] Number of windows considered: 1...
[2022-07-12 08:28:34] Bias-correcting 1 members separately...
Error in if (class(mat) == "numeric") mat <- as.matrix(mat) :
the condition has length > 1

How can I solve this problem? The unit of the tasmax datasets is K.
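The error points at a class(mat) == "numeric" check inside the called code: since R 4.0 a matrix has class c("matrix", "array"), and since R 4.2 a multi-element condition in if() is an error rather than a warning. A minimal sketch of the robust form of that check (coerce_to_matrix is my own illustrative name, not a climate4R function):

```r
# Robust replacement for `if (class(mat) == "numeric") mat <- as.matrix(mat)`:
# is.numeric()/is.matrix() avoid comparing a multi-element class() vector.
coerce_to_matrix <- function(mat) {
  if (is.numeric(mat) && !is.matrix(mat)) mat <- as.matrix(mat)
  mat
}
```

Until the package code is updated to checks of this form, the usual workarounds are upgrading to a newer package release or running under an R version older than 4.2.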

How to save bias-corrected grids?

After the biasCorrection command, I need to save the resulting grid in netCDF format. I use the grid2nc command and the file is correctly created. But when I try to reopen it with loadGridData, I get this error:

Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, : java.lang.ArrayIndexOutOfBoundsException: Index 3 out of bounds for length 3

Actually, I note that files correctly opened by loadGridData (i.e. ESGF model outputs) have this output:

dataInventory("path_to_file")
$pr =
$Description
'Precipitation'
...... etc........
$Dimensions
$time
...etc....
$lat
...etc...
$lon
...etc...

On the contrary, files from bias-corrected grids have this type of output:

dataInventory("path_to_file")
$pr =
$Description
'Precipitation'
...... etc........
$Dimensions
$lat
...etc...
$lon
...etc...

It seems there is no time dimension, and the error seems to depend on it (am I right?).
But time data are actually present in the grid! Only the format is different.
It looks like this:

$Dates
$start
'2021-01-01 UTC'.....................

In "ordinary" grids, on the other hand, it looks like this:

$Dates
$start
'2021-01-01 12:00:00 GMT'.....................

Can anybody help me solve this?
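One way to confirm the diagnosis: climate4R grids record their dimension names in the "dimensions" attribute of the Data array, so a round-trip that drops the time dimension should be visible there. A small sketch (has_time_dim is my own helper name):

```r
# TRUE if the grid's Data array still carries a "time" dimension.
has_time_dim <- function(grid) {
  "time" %in% attr(grid[["Data"]], "dimensions")
}
```

Running this on the grid before grid2nc and on the reloaded grid would show at which step the dimension is lost.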

Unexpected negative values from biasCorrection()

I am trying to use biasCorrection() with the isimip3 algorithm. Here is the code:

pr.proj.BC <- biasCorrection(pr.arcis.ORIG.hist, pr.hist, pr.proj, precipitation=TRUE, wet.threshold=1,method="isimip3")

Now:
pr.arcis.ORIG.hist is a 2x2 km grid for the period 1971-2005, an interpolation of station data. Variable: daily cumulative precipitation.
pr.hist is a 12x12 km grid for the period 1971-2005, the output of a CORDEX model for the historical period. Variable: daily cumulative precipitation.
pr.proj is a 12x12 km grid for the period 2006-2015, the output of a CORDEX model for the projection period. Variable: daily cumulative precipitation.

The issue is that at some grid points the resulting grid pr.proj.BC has negative values for the pr variable. How is that possible? Is there an error in the R script, or is the mistake mine?
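If the negative values turn out to be small numerical artifacts of the correction rather than a data problem, a pragmatic post-processing step is to clamp precipitation at zero. A sketch (the helper name is mine; whether clamping is appropriate depends on the magnitude of the negatives):

```r
# Set any negative precipitation values in a climate4R-style grid to zero.
clamp_nonnegative <- function(grid) {
  grid[["Data"]][grid[["Data"]] < 0] <- 0
  grid
}
```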

PackagesNotFoundError: The following packages are not available from current channels: Climate4R

Hi there,

I have downloaded Miniconda and used the code given on GitHub to create a new environment and download the climate4r packages. However, I get the error PackagesNotFoundError after running the command below.

conda install -c conda-forge -c r -c defaults -c santandermetgroup climate4r

This may be a stupid question, but I am new to Miniconda/Anaconda and do not know how to resolve this.

Python dependencies for 'downscaleR'

@zequihg50 highlighted the following in issue #60:

Note however that downscaleR has several dependencies on python that need to be considered to be included in the package. See the scripts.

This applies to the isimip3 implementation for bias correction.

biasCorrection() shows an error about data dimensions

I used monthly observation data and model historical output to do bias correction; both had the same time length and data structure. But R reports an error:

Error in arr[, , , ind, , ] <- grid[["Data"]] : number of items to replace is not a multiple of replacement length

And when I call biasCorrection() with no arguments, R shows:

Error in match.arg(method, choices = c("delta", "scaling", "eqm", "pqm", : 'arg' must be of length 1
In addition: Warning message:
In if (method == "gqm") stop("'gqm' is not a valid choice anymore. Use method = 'pqm' instead and set fitdistr.args = list(densfun = 'gamma')") : the condition has length > 1 and only the first element will be used

Because I'm a beginner in R, I have no idea how to solve this problem. Can you help me?
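The "number of items to replace is not a multiple of replacement length" message typically means the observation and model grids do not share the same array shape (different spatial extents, resolutions or time lengths). A quick consistency check to run before biasCorrection() (helper name is mine):

```r
# TRUE if two climate4R-style grids have Data arrays of identical shape.
same_grid_shape <- function(a, b) {
  identical(dim(a[["Data"]]), dim(b[["Data"]]))
}
```

If the shapes differ, interpolating one grid onto the other's geometry (e.g. with transformeR's interpGrid, as used elsewhere on this page) before bias correction is the usual remedy.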

fillGridDates :: Error in aperm.default(grid$Data, perm)

Re: bias correction using RCM data (REMO2015). I have RCM NetCDF files that I have translated to the climate4R format:

[screenshot of the grid structure]

This was modeled on the test data in climate4R.datasets. The base raster is +proj=longlat +datum=WGS84 +no_defs, so not the original RCM grid. I have a biasCorrection() call that works just fine, tested with the climate4R.datasets. My issue is that when I use my y data (and the matching x, which is identical except for Data) I get this error:

Error in aperm.default(grid$Data, perm) :
'perm' is of wrong length 6 (!= 9)

I have diagnosed this as coming from the fillGridDates(y) line in biasCorrection(). However, I am unable to find a fix. I should note the data are consecutive; there are no time gaps to infill. The data cover the SAM CORDEX region: 5600 days on a 367 x 445 grid, with missing data (non-land pixels) set to NaN.
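The aperm() message ('perm' is of wrong length) suggests the permutation built from the grid's "dimensions" attribute disagrees with the actual rank of the Data array, which can happen when a hand-built grid carries stale or duplicated dimension names. A sanity check to run on y before calling biasCorrection() (helper name is mine):

```r
# TRUE if the number of named dimensions matches the array's actual rank.
dims_consistent <- function(grid) {
  length(attr(grid[["Data"]], "dimensions")) == length(dim(grid[["Data"]]))
}
```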

Why does climate4R use so much CPU? Too slow

options(java.parameters = "-Xmx8g")

library(climate4R.UDG)
library(loadeR)
library(loadeR.2nc)
library(transformeR)
library(climate4R.datasets)
library(downscaleR)
library(visualizeR)
library(VALUE)
library(climate4R.value)

vars <- c("var151","var165","var166") #psl; uas; vas
varp <- c("var131@85000","var132@85000","var129@50000") #131-ua; 132-va; 130-ta; 129-zg;
grid.list <- lapply(vars, function(x) {
  loadGridData(dataset = "/home/inspur/working/climate4r/ERA-I/box_surface_interim_1979_2018.nc",
               var = x,
               years = 1990:2018)
})
grid.listp <- lapply(varp, function(x) {
  loadGridData(dataset = "/home/inspur/working/climate4r/ERA-I/box_pressure_interim_1979_2018.nc",
               var = x,
               years = 1990:2018)
})
pred <- downscaleCV(xs, wsobs, folds = 3, sampling.strategy = "kfold.chronological",
                    scaleGrid.args = list(type = "standardize"),
                    method = "GLM",
                    prepareData.args = list(
                      "spatial.predictors" = list(which.combine = getVarNames(xs), v.exp = 0.9)))

downscaleCV uses 12603% of CPU (spread over many cores) and is still far too slow. Why? A 30x40 box of the ERA-Interim dataset was used for downscaling; the dataset is small enough, so why does it take so many resources? Here is the CentOS 7 top output:
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
553757 inspur 20 0 105.6g 74.1g 28200 R 12603 7.4 440:22.97 R
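Not an answer to the root cause, but runaway %CPU like this often comes from an over-subscribed multithreaded BLAS/OpenMP layer underneath R, where many threads contend rather than cooperate. Capping the thread counts before the heavy calls is a common first experiment; whether these variables are honoured depends on the BLAS your R is linked against (an assumption on my part, not something climate4R documents):

```r
# Limit implicit threading; set before the numerical work starts.
Sys.setenv(OMP_NUM_THREADS = "4",
           OPENBLAS_NUM_THREADS = "4",
           MKL_NUM_THREADS = "4")
```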

loadGridData fails to find named variable in netCDF file

netcdf file in question:

netcdf file:/Users/dev/Downloads/MPI-ESM-LR_FWI_ET017_cal.nc {
  dimensions:
    time = UNLIMITED;   // (23741 currently)
    lon = 69;
    lat = 64;
  variables:
    double time(time=23741);
      :standard_name = "time";
      :long_name = "time";
      :units = "days since 1971-01-01 UTC";
      :calendar = "proleptic_gregorian";
      :axis = "T";
      :_ChunkSizes = 512U; // uint

    double lon(lon=69);
      :standard_name = "longitude";
      :long_name = "longitude";
      :units = "degrees_east";
      :axis = "X";

    double lat(lat=64);
      :standard_name = "latitude";
      :long_name = "latitude";
      :units = "degrees_north";
      :axis = "Y";

    float fwi(time=23741, lat=64, lon=69);
      :_FillValue = 1.0E20f; // float
      :missing_value = 1.0E20f; // float
      :description = "Fire Weather Index";
      :longname = "fwi";
      :units = "-";
      :_ChunkSizes = 1U, 64U, 69U; // uint
> x <- loadGridData(dataset = "MPI-ESM-LR_FWI_ET017_cal.nc", var = "fwi") 
Error: Variable requested not found
Check 'dataInventory' output and/or dictionary 'identifier'.
> dataInventory("MPI-ESM-LR_FWI_ET017_cal.nc")
[2021-06-15 12:03:44] Doing inventory ...
Error: No variables found

So loadGridData can't find the variable named fwi. I should note this is take 2: the original calendar was gregorian (not proleptic_gregorian), and loading that file yielded:

Jun 15, 2021 12:05:57 PM ucar.nc2.dt.grid.GridCoordSys <init>
SEVERE: N/A: Error reading time coord= time
java.lang.IllegalArgumentException: days since 1971-01-01 utc does not match (\w*)\s*since\s*([\+\-\d]+)([ t]([\.\:\d]*)([ \+\-]\S*)?z?)?$
[snip]

I switched calendars with CDO, and now the variable has disappeared even though it is there per ncdump...
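The SEVERE log above shows what the Java NetCDF reader chokes on: the udunits string "days since 1971-01-01 UTC" does not match its date-unit regex, and the trailing "UTC" token is the likely culprit. Rewriting the time:units attribute to a plain form may help; here is a sketch of the string transformation only (fix_time_units is my own name; actually applying it to the file, e.g. via ncdf4::ncatt_put or an NCO/CDO attribute edit, is left as an assumption to verify):

```r
# Replace a trailing "UTC" timezone token with an explicit midnight time,
# e.g. "days since 1971-01-01 UTC" -> "days since 1971-01-01 00:00:00".
fix_time_units <- function(units) {
  sub("\\s+UTC$", " 00:00:00", units)
}
```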

conda install fails, biasCorrection isimip3 fails

After a clean install using Miniconda as per the GitHub instructions:

out <- biasCorrection(y = y, x = x, newdata = newdata, method = "isimip3")
Error in match.arg(method, choices = c("delta", "scaling", "eqm", "pqm",  : 
  'arg' should be one of “delta”, “scaling”, “eqm”, “pqm”, “gpqm”, “mva”, “loci”, “ptr”, “variance”

R is also out of date: it is version 3.6.1 (2019-07-05).

Also

WARNING: Your current version of transformeR (v1.7.2) is not up-to-date
WARNING: Your current version of downscaleR (v3.1.0) is not up-to-date

The other BC methods work, but I want to use isimip3 and qdm, which are missing.

So I try a clean new install using:

    > library(devtools)
    > install_github(c("SantanderMetGroup/loadeR.java",
                 "SantanderMetGroup/climate4R.UDG",
                 "SantanderMetGroup/loadeR",
                 "SantanderMetGroup/transformeR",
                 "SantanderMetGroup/visualizeR",
                 "SantanderMetGroup/downscaleR"))

Since isimip3 needs dask, I install dask using conda.

> out <- biasCorrection(y = y, x = x, newdata = newdata, method = "isimip3")
Time difference of 1 days
Time difference of 1 days
Time difference of 1 days
[2021-05-03 16:45:33] Argument precipitation is set as FALSE, please ensure that this matches
 your data.
[2021-05-03 16:45:34] Number of windows considered: 1...
[2021-05-03 16:45:34] Bias-correcting 1 members separately...
Error in py_run_file_impl(file, local, convert) : 
  ModuleNotFoundError: No module named 'dask'
Detailed traceback:
  File "<string>", line 122, in <module>
  File "/home/miniconda3/lib/R/library/reticulate/python/rpytools/loader.py", line 4
4, in _import_hook
    level=level
In addition: Warning messages:
1: In fillGridDates(y) : Undefined time zone
2: In fillGridDates(x) : Undefined time zone
3: In fillGridDates(newdata) : Undefined time zone

I am stuck on why isimip3 fails; the dask module is installed.
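When the Python-side ModuleNotFoundError persists even though dask is installed, a frequent cause is that reticulate has bound to a different interpreter than the conda environment holding dask. A configuration sketch (the environment name "climate4r" is an assumption; use whichever env you installed dask into, and note this must run before any Python code is executed in the session):

```r
library(reticulate)
# Bind reticulate to the conda env that has dask, BEFORE any Python runs:
use_condaenv("climate4r", required = TRUE)
py_config()                  # shows which interpreter is actually active
py_module_available("dask")  # should report TRUE
```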

Cannot access CORDEX datasets

Hi there,

I want to use the CORDEX non-commercial datasets, but the loadGridData function fails to find the files returned by UDG.datasets(pattern = "CORDEXi-EAS-22.*rcp85", full.info = TRUE).

Also, I can't access the dataset from https://data.meteo.unican.es/udg-tap/home, although I have the "Cordex-Noncommercial" item among my authorized datasets.

Can you help me with accessing the dataset?

Error when trying to load daily precipitation data

I want to create a climatology for CMIP6 precipitation data, but I want each year of data used to compute the climatology to reflect the monthly total precipitation as opposed to the mean. The climatology will then be the mean total precipitation across the grid. I think I do this with the loadGridData command outlined below, but I am getting an error.

testPR <- makeAggregatedDataset(source.dir = "C:/Cmip6/pr/daily", recursive = TRUE, ncml.file = "testPR.ncml")
# create dictionary and convert to mm
dictionaryPR <- tempfile(pattern = "cmipPR", fileext = ".dic")
writeLines(c("identifier,short_name,time_step,lower_time_bound,upper_time_bound,cell_method,offset,scale,deaccum,derived,interface",
             "pr,pr,24h,0,24,sum,0,86400,0,0,"), dictionaryPR)

I think there are two potential routes here: (1)

#Pull down the data aggregating by daily total/sum
testPR <- loadGridData("testPR.ncml", "pr", dictionary = dictionaryPR, aggr.m = "sum", time = "DD")

I don't know if this will work, as I am getting the following:

[2020-11-09 15:13:09] Defining harmonization parameters for variable "pr"
[2020-11-09 15:13:09] Defining geo-location parameters
[2020-11-09 15:13:09] Defining time selection parameters
NOTE: Daily data will be monthly aggregated
[2020-11-09 15:13:09] Retrieving data subset ...
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl,  : 
  java.lang.OutOfMemoryError: Java heap space

Does this error reflect an issue with my hardware, or is it something you can suggest a fix for? (I should mention this is only loading three yearly files as a test case.)

Alternatively, I could apply the following per year of data in a loop, unless there is a better way?

# Create the climatology (mean monthly values for total rainfall)
testclimatology <- climatology(testPR, clim.fun = list(FUN = "sum", na.rm = TRUE))

Many thanks.
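On the java.lang.OutOfMemoryError: the rJava heap defaults to a small limit and must be raised before any Java-backed climate4R package is loaded. This is the same options() call that appears in other reports on this page; the 8 GB figure is an arbitrary choice to tune to your RAM:

```r
# Must run in a fresh session, BEFORE library(loadeR) or any rJava load.
options(java.parameters = "-Xmx8g")
```

If the heap still overflows, loading one year at a time (the loop idea above) or subsetting with lonLim/latLim reduces the size of each Java-side read.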

How can I address this error: Error in aperm.default(grid$Data, perm) : 'perm' is of wrong length 6 (!= 4)

I'm trying to regrid CORDEX data to the E-OBS grid, as in example 1 at https://github.com/SantanderMetGroup/climate4R/blob/master/notebooks/2018_climate4R_example1.Rmd.
I am only interested in regridding and masking the data (not bias correcting), so I am only pulling down a small amount of the E-OBS data:

# Lat and Lon to maximise the area contained in the Cordex file
lon <- c(-40.375, 64.4)
lat <- c(22.2,72.42)# 66.65)

eobs<-"http://opendap.knmi.nl/knmi/thredds/dodsC/e-obs_0.25regular/tx_0.25deg_reg_v17.0.nc"
#load the eobs data
Eobs.grid <- loadGridData(eobs, var = "tx",
                   season = 1:12,
                   years = 1971:1980,
                   lonLim = lon,
                   latLim = lat,
                   aggr.m = "sum",
                   condition = "GT",
                   threshold = 25)

makeAggregatedDataset(source.dir="C:/UsersDocuments/Climex/tasmax/",recursive = T, ncml.file = "test.ncml")
di <- dataInventory("test.ncml")
str(di$tasmax)

#---------------------------------CREATE THE DIC FILE-----------------------------------------------------------------
dictionary <- tempfile(pattern = "dicCDX", fileext = ".dic")
# Fill it with the conversion factor for Celsius.
writeLines(c("identifier,short_name,time_step,lower_time_bound,upper_time_bound,cell_method,offset,scale,deaccum,derived,interface","tasmax,tasmax,24h,0,24,max,-273.15,1,0,0,"), dictionary)

#------------PULL DOWN THE DATA
test<-loadGridData("test.ncml","tasmax",dictionary = dictionary,lonLim = lon,latLim = lat)
#Get the grid from the Eobs data and apply it to the Cordex data
test <- interpGrid(test, getGrid(Eobs.grid))

#---------------Create monthly climatologies-----------------------------------
library(dplyr)

l <- lapply(1:12, function(i) {
  subsetGrid(test, season = i) %>% climatology()
})

mg <- do.call("makeMultiGrid", c(l, skip.temporal.check = TRUE))
mg$Data<-drop(mg$Data)

#Make the land mask
eobs.mask <- gridArithmetics(Eobs.grid, 0, operator = "*")
#Transpose the data
#eobs.mask$Data<-aperm(eobs.mask$Data)
#Apply the land mask

masked_data <- gridArithmetics(mg, eobs.mask, operator = "+")

Then I get the error:

Error in aperm.default(grid$Data, perm) :
'perm' is of wrong length 6 (!= 4)

When I look at the attributes of the two lists, I see that the dates are different (screenshots of the two Dates attributes omitted).

So I made the dates identical using eobs.mask$Dates <- mg$Dates, but I still get the error. I also tried transposing the data within both lists, but the error persists. Has this something to do with the dimensions? I saw this error reported elsewhere: SantanderMetGroup/downscaleR#30

Error in (min(piece, na.rm = TRUE) - head):(max(piece, na.rm = TRUE) + : result would be too long a vector

Hi,
I tried to bias-correct CMIP6 data using ERA5. I tried eqm, pqm and qdm; they all returned the same error. Only isimip3 worked.
Please see the error below.

# precipitation

BCC_pr_eqm <- biasCorrection(y = OBS_pr, x = BCC_pr, precipitation = TRUE,
                             method = "eqm", wet.threshold = 0.1, window = c(30, 15),
                             cross.val = "kfold", folds = fold,
                             parallel = TRUE, max.ncores = 12, ncores = 16)

[2021-12-09 20:48:29] Trying to determine the time zone...
[2021-12-09 20:48:29] Time zone identified and set to GMT
See 'setGridDates.asPOSIXlt' to change the time zone
[2021-12-09 20:48:29] Trying to determine the time zone...
[2021-12-09 20:48:29] Time zone identified and set to GMT
See 'setGridDates.asPOSIXlt' to change the time zone
[2021-12-09 20:48:29] Trying to determine the time zone...
[2021-12-09 20:48:29] Time zone identified and set to GMT
See 'setGridDates.asPOSIXlt' to change the time zone
Validation 1, 5 remaining
[2021-12-09 20:48:30] Argument precipitation is set as TRUE, please ensure that this matches your data.
Error in (min(piece, na.rm = TRUE) - head):(max(piece, na.rm = TRUE) + :
result would be too long a vector
In addition: Warning messages:
1: In array(data = c(as.numeric(yearList[seq(2, length(yearList), 3)]), :
NAs introduced by coercion
2: In array(data = c(as.numeric(yearList[seq(2, length(yearList), 3)]), :
NAs introduced by coercion
3: In min(indDays[indObs]) :
no non-missing arguments to min; returning Inf
4: In max(indDays[indObs]) :
no non-missing arguments to max; returning -Inf

problem with dependency "terra" for package "geoprocessoR"

The dependency ‘terra >= 1.5.12’ is required for package ‘geoprocessoR’. This is the error message:

Error: package or namespace load failed for ‘geoprocessoR’ in loadNamespace(i, c(lib.loc, .libPaths()), versionCheck = vI[[i]]):
namespace ‘terra’ 1.4-22 is being loaded, but >= 1.5.12 is required
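The message says the installed terra (1.4-22) is older than required, so the fix is to update terra itself before loading geoprocessoR again. A minimal sketch (assumes a CRAN mirror is reachable):

```r
install.packages("terra")   # pulls the current CRAN release (>= 1.5.12)
packageVersion("terra")     # verify it now satisfies the requirement
```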

Creating monthly climatologies with climatology and loadGridData

This isn't an issue but rather a request for advice: I want to ascertain whether I am understanding/using the climatology and loadGridData functions properly.

I want to create a monthly climatology of CMIP data. I have six example historical files in a folder (1849-1855, daily) for tasmax, which I aggregated and then loaded using:

test <- loadGridData("test.ncml", "tasmax", dictionary = dictionary, aggr.m = "mean")

This returns a dataset of size 73 x 256 x 512. What I want is a final file of size 12 x 256 x 512 (one grid for each month, averaged across all the chosen years).

I think I need to write a loop in which I load each month of data at a time and compute the monthly climatology:

for (i in 1:12) {
  test <- loadGridData("test.ncml", "tasmax", dictionary = dictionary, season = i)
  A[[i]] <- climatology(test)
  rm(test)
}

Am I missing something in the aggregation arguments? It seems that the aggr.m option gives all the individual month means (as opposed to a mean of all Januaries, a mean of all Februaries, a mean of all Marches, etc.).
Thanks for your time.
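The aggregation being asked for (one mean per calendar month across all years, collapsing 73 monthly fields into 12) can be seen in miniature on a plain vector. This toy uses base R only and stands in for the per-grid-cell computation that the loop above performs:

```r
# 73 consecutive monthly values (Jan 1849 .. Jan 1855) -> 12 monthly means
dates <- seq(as.Date("1849-01-01"), by = "month", length.out = 73)
vals  <- seq_len(73)                        # stand-in for tasmax fields
clim  <- tapply(vals, format(dates, "%m"), mean)
length(clim)  # 12: one climatological mean per calendar month
```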

Failing to install loadeR and climate4R.climdex

Hi!

I'm following the installation instructions from README.md, but I fail to install loadeR and climate4R.climdex with both options:

  1. Option 1: conda install -y -c conda-forge -c r -c defaults -c santandermetgroup r-climate4r cannot resolve the environment for more than 5 hours (I assume that using conda instead of mamba is OK);
  2. Option 2: installation of the packages results in a non-zero exit status (screenshot of the error omitted).

When I try to load loadeR or climate4R.climdex I get an error (screenshot omitted).

Importantly, transformeR, downscaleR and visualizeR are loading.

Any solutions? And thank you in advance 👍 !

Difficulties in installing the package

Kind developers,
I'm experiencing some issues during the installation of the package climate4R.climdex. I followed the instructions in your README, but here is the result:

Error: Failed to install 'climate4R.climdex' from GitHub:
(converted from warning) installation of package ‘C:/Users/MAZANE~1/AppData/Local/Temp/RtmpUrPQal/file2f806ead3cca/climate4R.climdex_0.2.1.tar.gz’ had non-zero exit status

Thank you.

Can't load models from UDG.datasets()

It's been a while since I've used climate4R. Yesterday I ran the same script that worked a few months ago, but now loadGridData fails. For example:

tasmin.hist <- loadGridData(dataset = "CORDEX-EUR-11_CNRM-CERFACS-CNRM-CM5_historical_r1i1p1_CLMcom-CCLM4-8-17_v1",
                            var = "tas",
                            lonLim = lon,
                            latLim = lat)

This gives the error:
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, :
java.lang.NullPointerException: Cannot invoke "org.apache.http.client.methods.CloseableHttpResponse.close()" because "closeableHttpResponse" is null

Is this related to different versions of the packages? Some guidance would be greatly appreciated!

loadGridData() doesn't recognize vertical levels

Hi,

The file "24h_ok.nc" contains 4 dimensions (3 longitudes, 3 latitudes, 50 levels and 61 times) for 3 variables: "ssrd", "strd" and "tp".

When I apply loadGridData() or dataInventory() I get the 50 vertical levels for the "ssrd" and "strd" variables, but only levels 4, 20 and 37 for "tp".

I checked using other commands and libraries, for instance ncdf4::nc_open(), and I was able to retrieve the 50 levels for "tp". Any idea what the problem could be when using loadGridData()?

24h_ok.zip


I know I can get a level if I use loadGridData(forecast_24h, var = paste0(v, "@", 1)), where "v" is the target variable, but the problem is I am still only getting levels 4, 20 and 37 for "tp".

Thank you very much in advance!
