
geocompx / geocompr


Geocomputation with R: an open source book

Home Page: https://r.geocompx.org/

License: Other

R 87.98% CSS 0.08% TeX 11.56% Makefile 0.28% Dockerfile 0.11%
r spatial book geo geospatial geospatial-data simple-features raster maps mapping-tools education rstats rspatial geos gdal postgis course geography geocomputation


Geocomputation with R

Binder RstudioCloud Actions Docker discord Open in GitHub Codespaces

Introduction

This repository hosts the code underlying Geocomputation with R, a book by Robin Lovelace, Jakub Nowosad, and Jannes Muenchow. If you find the contents useful, please cite it as follows:

Lovelace, Robin, Jakub Nowosad and Jannes Muenchow (2019). Geocomputation with R. The R Series. CRC Press.

The first version of the book has been published by CRC Press in the R Series and can be viewed online at bookdown.org. Read the latest version at r.geocompx.org.

Note: we are actively working on the Second Edition 🏗

Summary of the changes

Since commencing work on the Second Edition in September 2021, much has changed, including:

  • Replacement of raster with terra in Chapters 1 to 7 (see commits related to this update here)
  • Update of Chapter 7 to mention alternative ways of reading in OSM data (#656)
  • Refactor build settings so the book builds on Docker images in the geocompr/docker repo
  • Improve the experience of using the book in Binder (ideal for trying out the code before installing or updating the necessary R packages), as documented in issue #691 (thanks to yuvipanda)
  • Improved communication of binary spatial predicates in Chapter 4 (see #675)
  • New section on the links between subsetting and clipping (see #698) in Chapter 5
  • New section on the dimensionally extended 9 intersection model (DE-9IM)
  • New chapter on raster-vector interactions split out from Chapter 5
  • New section on the sfheaders package
  • New section in Chapter 2 on spherical geometry engines and the s2 package
  • Replacement of code based on the old mlr package with code based on the new mlr3 package, as described in a huge pull request
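To give a flavour of the DE-9IM material mentioned above: each spatial relation is encoded as a nine-character string (as returned by sf::st_relate()), and relations such as 'within' are defined by matching that string against a pattern. The pattern-matching logic can be sketched in base R; this is an illustrative sketch, not code from the book:

```r
# A DE-9IM string records the dimension (0, 1, 2, or F for empty) of the
# intersections between the interiors, boundaries and exteriors of two
# geometries; sf::st_relate() returns such strings. A pattern such as
# "T*F**F***" ('within') uses T for 'any non-empty intersection' and
# * for 'anything'.
de9im_matches = function(code, pattern) {
  cs = strsplit(code, "")[[1]]
  ps = strsplit(pattern, "")[[1]]
  ok = mapply(function(ch, pt) {
    pt == "*" || (pt == "T" && ch %in% c("0", "1", "2")) || ch == pt
  }, cs, ps)
  all(ok)
}
de9im_matches("0FFFFF212", "T*F**F***") # a point within a polygon: TRUE
```

In practice sf provides st_within(), st_touches() and friends for the common patterns; st_relate() is useful for custom relations.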

See https://github.com/geocompx/geocompr/compare/1.9...main for a continuously updated summary of the changes to date. At the time of writing (April 2022), more than 10k lines of code/prose have been added, along with lots of refactoring!

Contributions at this stage are very welcome.

Contributing

We encourage contributions on any part of the book, including:

  • improvements to the text, e.g., clarifying unclear sentences, fixing typos (see guidance from Yihui Xie);
  • changes to the code, e.g., to do things in a more efficient way;
  • suggestions on content (see the project’s issue tracker);
  • improvements to and alternative approaches in the Geocompr solutions booklet hosted at r.geocompx.org/solutions (see a blog post on how to update solutions in files such as _01-ex.Rmd here)

See our-style.md for the book’s style.

Many thanks to all contributors to the book so far via GitHub (this list will update automatically): prosoitos, florisvdh, babayoshihiko, katygregg, tibbles-and-tribbles, Lvulis, rsbivand, iod-ine, KiranmayiV, cuixueqin, defuneste, zmbc, erstearns, FlorentBedecarratsNM, dcooley, darrellcarvalho, marcosci, appelmar, MikeJohnPage, eyesofbambi, krystof236, nickbearman, tylerlittlefield, giocomai, KHwong12, LaurieLBaker, MarHer90, mdsumner, pat-s, sdesabbata, ahmohil, ateucher, annakrystalli, andtheWings, kant, gavinsimpson, Himanshuteli, yutannihilation, howardbaek, jimr1603, jbixon13, olyerickson, yvkschaefer, katiejolly, kwhkim, layik, mpaulacaldas, mtennekes, mvl22, ganes1410, richfitz, VLucet, wdearden, yihui, adambhouston, chihinl, cshancock, e-clin, ec-nebi, gregor-d, jasongrahn, p-kono, pokyah, schuetzingit, tim-salabim, tszberkowitz, vlarmet.

During the project we aim to contribute ‘upstream’ to the packages that make geocomputation with R possible. This impact is recorded in our-impact.csv.

Downloading the source code

The recommended way to get the source code underlying Geocomputation with R onto your computer is to clone the repo. You can do this on any computer with Git installed with the following command:

git clone https://github.com/geocompx/geocompr.git

An alternative approach, which we recommend for people who want to contribute to open source projects hosted on GitHub, is to install the gh CLI tool. You can then clone a fork of the source code, which you can change and share (including via Pull Requests to improve the book), with the following command:

gh repo fork geocompx/geocompr
# (gh repo clone geocompx/geocompr also works)

Both of those methods require Git to be installed. If it is not, you can download the book’s source code from https://github.com/geocompx/geocompr/archive/refs/heads/main.zip. To increase reproducibility and reduce time spent clicking around, download and unzip the source code from the R command line:

u = "https://github.com/geocompx/geocompr/archive/refs/heads/main.zip"
f = basename(u)
download.file(u, f)                      # download the zip file
unzip(f)                                 # unzip it, creating geocompr-main/
file.rename("geocompr-main", "geocompr") # rename the unzipped directory
rstudioapi::openProject("geocompr")      # or open the folder in VS Code / another IDE

Reproducing the book in R/RStudio/VS Code

To ease reproducibility, we created the geocompkg package. Install it with the following commands:

install.packages("remotes")
# To reproduce the first Part (Chapters 1 to 8):
install.packages(
  'geocompkg',
  repos = c('https://geocompr.r-universe.dev', 'https://cloud.r-project.org'),
  dependencies = TRUE
)

Installing geocompkg will also install the core packages required to reproduce Part 1 of the book (Chapters 1 to 8). Note: you may also need to install system dependencies if you are running Linux (recommended) or macOS. The remotes package is only needed for the alternative installation from GitHub shown below.

To reproduce the book in its entirety, run the following commands (which install additional ‘Suggests’ packages; this may take some time!):

# Install packages to fully reproduce book (may take several minutes):
options(repos = c(
  geocompx = 'https://geocompx.r-universe.dev',
  cran = 'https://cloud.r-project.org/'
))
# From geocompx.r-universe.dev (recommended):
install.packages("geocompkg", dependencies = TRUE)

# Alternatively from GitHub:
remotes::install_github("geocompr/geocompkg", dependencies = TRUE)

You need recent versions of the GDAL, GEOS, PROJ and udunits libraries installed for this to work on Mac and Linux. See the sf package’s README for information on that. After the dependencies have been installed, you should be able to build and view a local version of the book with:

# Change this depending on where you have the book code stored:
rstudioapi::openProject("~/Downloads/geocompr")
# or run `code /location/of/geocompr` in the system terminal
# or `cd /location/of/geocompr`, then start R, then:
bookdown::render_book("index.Rmd") # to build the book
browseURL("_book/index.html")      # to view it
# Or, to serve a live preview of the book and observe the impact of changes:
bookdown::serve_book()

Geocompr in a devcontainer

A great feature of VS Code is devcontainers, which allow you to develop in an isolated Docker container. If you have VS Code and the necessary dependencies installed on your computer, you can build Geocomputation with R in a devcontainer (see #873 for details).

Geocompr in Binder

For many people, the quickest way to get started with Geocomputation with R is in a web browser via Binder. To launch an interactive RStudio Server instance, click on the following button, which will open mybinder.org with an R installation that has all the dependencies needed to reproduce the book:

Launch Rstudio Binder

You can also experiment with the repository in RStudio Cloud by clicking on this link (requires log-in):

Launch Rstudio Cloud

Geocomputation with R in a Docker container

See the geocompx/docker repository for details.

Reproducing this README

To reduce the book’s dependencies, scripts that generate input for the book, and that only need to be run infrequently, are run when this README is created.

The additional packages required for this can be installed as follows:

source("code/extra-pkgs.R")

With these additional dependencies installed, you should be able to run the following scripts, which create content for the book that we have removed from the main book build to reduce package dependencies and build times:

source("code/01-cranlogs.R")
source("code/sf-revdep.R")
source("code/09-urban-animation.R")
source("code/09-map-pkgs.R")

Note: the .Rproj file is configured to build a website, not a single page. To reproduce this README, use the following command:

rmarkdown::render("README.Rmd", output_format = "github_document", output_file = "README.md")

Citations

The main packages used in this book are cited from packages.bib. Other citations are stored online using Zotero.

If you would like to add to the references, please use Zotero, join the open group, and add your citation to the open geocompr library.

We use the following citation key format:

[auth:lower]_[veryshorttitle:lower]_[year]
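For example, for this book's own 2019 citation (Lovelace et al., Geocomputation with R), the pattern would yield a key along the lines of the following (the exact key depends on Better BibTeX's title-shortening rules, so treat this as illustrative):

```
lovelace_geocomputation_2019
```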

This can be set from inside the Zotero desktop app, with the Better BibTeX plugin installed (see github.com/retorquere/zotero-better-bibtex), by selecting the following menu options (shortcut: Alt+E followed by N), as illustrated in the figure below:

Edit > Preferences > Better BibTeX

Zotero settings: these are useful if you want to add references.

When you export the citations as a .bib file from Zotero, use the Better BibTex (not BibLaTeX) format.

We use Zotero because it is a powerful open source reference manager that integrates well with citation tools in VS Code and RStudio.



geocompr's Issues

Write chapter 5 on transforming data

Will contain information on CRS transformations (raster and vector) as well as affine transformations. Could also cover transformations of raster datasets.

Issues with dplyr verbs

From Chapter 3

library(tidyverse)
library(sf) #0.4-3
library(spData) # includes spatial data and datasets
f = system.file("shapes/wrld.shp", package = "spData")
world = st_read(f)


# this works
world_few_rows = world[world$population > 1e9,]

#OR
world_few_rows = world %>% 
        filter(population > 1e9)
Error in filter_impl(.data, quo) : Result must have length 177, not 12180

and

world_orig = world # create copy of world dataset for future reference
world = select(world_orig, name_long, continent, population = pop)
 #Error in select.sf(world_orig, name_long, continent, population = pop) : requires dplyr > 0.5.0: install that first, then reinstall sf


> sessionInfo()
R version 3.3.2 (2016-10-31)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows >= 8 x64 (build 9200)

locale:
[1] LC_COLLATE=English_Canada.1252  LC_CTYPE=English_Canada.1252   
[3] LC_MONETARY=English_Canada.1252 LC_NUMERIC=C                   
[5] LC_TIME=English_Canada.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
 [1] bindrcpp_0.1       spData_0.1-18      sf_0.4-3           dplyr_0.6.0       
 [5] purrr_0.2.2.2      readr_1.1.1        tidyr_0.6.3        tibble_1.3.1      
 [9] ggplot2_2.2.1.9000 tidyverse_1.1.1   

loaded via a namespace (and not attached):
 [1] Rcpp_0.12.10.1   bindr_0.1        cellranger_1.1.0 plyr_1.8.4      
 [5] forcats_0.2.0    tools_3.3.2      jsonlite_1.4     lubridate_1.6.0 
 [9] gtable_0.2.0     nlme_3.1-131     lattice_0.20-34  rlang_0.1.1     
[13] psych_1.7.5      DBI_0.6-1        parallel_3.3.2   haven_1.0.0     
[17] stringr_1.2.0    httr_1.2.1       knitr_1.16       xml2_1.1.1      
[21] hms_0.3          grid_3.3.2       glue_1.0.0       R6_2.2.1        
[25] readxl_1.0.0     foreign_0.8-68   udunits2_0.13    reshape2_1.4.2  
[29] modelr_0.1.0     magrittr_1.5     units_0.4-4      scales_0.4.1    
[33] assertthat_0.2.0 mnormt_1.5-5     rvest_0.3.2      colorspace_1.3-2
[37] stringi_1.1.5    lazyeval_0.2.0   munsell_0.4.3    broom_0.4.2  

Installing geocompr fails

Installation of each package fails with the message: 'Don't know how to decompress files with extension - '. Here, - appears to be the version number of the particular package, e.g. 3-23 or 94-5.

I'm using RStudio 1.0.136 (R v3.2.0) on Mac OS X 10.10.5

Thanks for your help - looking forward to reading (contributing?)

Stu

Make point-pattern.Rmd a chapter

I think this should be the new chapter 10, after raster-vector.

Sound good? Please give it a bash if you find some spare time (p.s. there are more typos in there - try to spot them!) @Nowosad.

Section on best practices for visualisation

Hi there, your book already looks amazing!

It would be great to add a section dedicated to best practices for visualisation, e.g. use of color-blind friendly palettes. A package that provides some nice palettes is viridis. There are also tools to select good color schemes for maps and other graphics (e.g. colorbrewer2).

The use of these palettes throughout the book would also help strengthen the message.
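As a base-R illustration of the idea (note: viridis itself is a separate package; since R 3.6, palettes from the same family are also built into grDevices, which this sketch uses):

```r
# grDevices::hcl.colors() (R >= 3.6) provides perceptually uniform,
# largely colour-blind friendly palettes, including the viridis family
# suggested above.
pal = hcl.colors(5, palette = "viridis")
pal # five hex colour codes, from dark purple through to yellow
```

Such a palette could then be passed to plotting functions, e.g. via the col argument of plot() or image().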

Name

I think Geocomputation with R is a better working title and that the repo would benefit from being called geocompR but let's decide after April 11th.

issue with dplyr and shiny

I am trying to use dplyr verbs in reactive functions in a shiny app (and I am pretty new to shiny). The app is quite long, so I am not attaching it. Basically, the issue is with the instruction within the server function that subsets one of the datasets I am loading:

  # some code here
  # load files (from the data folder) ---------------------------------------

  studies <- readRDS("data/studies.rds")
  samples <- readRDS("data/samples.rds")
  edges <- readRDS("data/edges.rds")
  taxa <- readRDS("data/taxa.rds")
  version <- readRDS("data/version.rds")
 
 # some more code here defining the ui

server <- function(input, output) {
  # some code irrelevant to the issue (defines other outputs)
  
  # samples table, view tab -------------------------------------------------
  # Render selectInput for study, View tab 
    
    output$choose_study <- renderUI({
    study_names <- studies$studyId
    selectInput("view_study",
                label = h5("Select one or more studies to explore samples"), 
                choices = study_names, 
                selected = "ST1",
                multiple = TRUE)    
  })
  
  # Reactively subset so that only the selected study is viewed

  f_vsamples <- reactive({
    subset(samples, studyId %in% input$view_study, 
           select = c(label_2, s_type, n_reads2, foodId, llabel, L1, description))
  })

  output$vsamples_table <- DT::renderDataTable(
    f_vsamples(),
    rownames = F,
    escape = F,
    colnames = c("label" = "label_2", 
                 "sample type" = "s_type", 
                 "reads" = "n_reads2",
                 "code" = "foodId", 
                 "ext. code" = "llabel",
                 "food group" = "L1",
                 "description" = "description"),
    options = list(pageLength = 5,
                   lengthMenu = c(5, 10, 25, 20),
                   columnDefs = list(
                     list(
                       targets = 5,
                       render = JS(
                         "function(data, type, row, meta) {",
                         "return type === 'display' && data.length > 30 ?",
                         "'<span title=\"' + data + '\">' + data.substr(0, 30) + '...</span>' : data;","}"
                       )
                     )
                   )
                   
    )
  )
}

this works fine and the table is rendered correctly. However, when I replace it with

  f_vsamples <- reactive({
    dplyr::filter(samples, studyId %in% input$view_study) %>% 
           dplyr::select(label_2, s_type, n_reads2, foodId, llabel, L1, description)
  })

the table is still generated correctly, but I get a warning in the R console

Listening on http://127.0.0.1:4980
Warning: Error in filter_impl: Result must have length 1723, not 0
Stack trace (innermost first):
    99: <Anonymous>
    98: stop
    97: filter_impl
    96: filter.tbl_df
    95: dplyr::filter
    94: <reactive:f_vsamples> [/Users/eugenio/Dropbox/incomune/Progetti attivi/FoodMicrobioNet/attività2017/filterApp/appv2.R#240]
    83: f_vsamples
    82: exprFunc
    81: widgetFunc
    80: func
    79: origRenderFunc
    78: renderFunc
    77: origRenderFunc
    76: output$vsamples_table
     1: runApp

Am I doing something wrong?

Everything goes fine if I try this

require(shiny)
require(dplyr)
n <- 10
model.data0 <- 
  data.frame( "COURSE" = sample(LETTERS[1:3], n, replace=TRUE),
              "VALUE"  = sample(1:10, n, replace=TRUE),
              "TODROP" = sample(1:10, n, replace=TRUE)) 

ui <- fluidPage( 
  sidebarLayout(
    sidebarPanel(
      uiOutput('choose_course')
    ),
    mainPanel(
      tableOutput('courseTable')
    )
  )
)

server <- function(input, output, session) {
  # Build data, would be replaced by the csv loading in your case
  model.data0 <- 
    data.frame( "COURSE" = sample(LETTERS[1:3], n, replace=TRUE),
                "VALUE"  = sample(1:10, n, replace=TRUE),
                "TODROP" = sample(1:10, n, replace=TRUE)) 
  
  # Render selectInput 
  output$choose_course <- renderUI({
    course.names <- as.vector( unique(model.data0$COURSE) )
    selectInput("courses","Choose courses", choices=course.names, multiple=TRUE)    
  })
  
  # Subset so that only the selected rows are in model.data
  model.data <- reactive({
    # subset(model.data0(), COURSE %in% input$courses)
    dplyr::filter(model.data0, COURSE %in% input$courses) %>%
      dplyr::select(COURSE, VALUE)
  })
  
  output$courseTable <- renderTable({ model.data() })
}
runApp(shinyApp(ui,server))

which is taken from https://stackoverflow.com/questions/37887482/filtering-from-selectinput-in-r-shiny. In this example, using dplyr::filter or subset does not really matter, and neither does the position of the code building model.data0. So the culprit is likely the way I am using dplyr verbs in the server part of my app. Please advise.

README.Rmd issue

There is a minor issue with README.Rmd. If you use the knit button, it doesn't give a proper .md file. This is probably because we use "Project build tools: Website". I found a temporary solution:

rmarkdown::render('README.Rmd', output_format = 'md_document', output_file = 'README.md')

load spData from github not cran

Current (25 June):
2 Geographic data in R
Prerequisites
This chapter requires the packages sf, and spData to be installed and loaded:
library(sf)
library(spData)

Should be:
devtools::install_github("nowosad/spData")

Reference issue in the second chapter

@Robinlovelace there is a mention of three packages - sp, rgdal, and rgeos - in the second chapter; however, references are missing there. It looks like they are not in the geocompr package description, so there are no valid bib entries for them.
The question is: should we add them to the description, or should we add their bib entries to the ref.bib file?

Additional datasets to use

Finding interesting datasets to showcase algorithms can be very difficult.

One option could be to use ECMWF and Copernicus data. If you are not familiar with them, ECMWF stands for the European Centre for Medium-Range Weather Forecasts; it is an intergovernmental organisation and stores the largest meteorological data archive in the world (global coverage). Copernicus is a European programme that provides satellite and in-situ observations for a number of domains (e.g. biodiversity, environmental protection, climate, public health, tourism, etc.).

There are countless uses for these datasets!

gpkg instead of shp

My idea is to mostly use the .gpkg format for vector examples. We could show one example with .shp at the beginning and briefly explain that we will use .gpkg later in the book. We could also add an extended explanation of why .gpkg is better in an appendix to the book. What do you think about it @Robinlovelace ?

make build and make html errors

@Robinlovelace can you check whether make build or make html works for you? I get an error for lines 119-128 in 03-attr.Rmd:

Error in as(st_geometry(x), "Spatial") :
  no method or default for coercing "sfc_MULTIPOLYGON" to "Spatial"
Calls: <Anonymous> ... eval -> eval -> set_units -> st_area ->
 <Anonymous> -> as

Definition of "Multiple Features"

Section 2.1.4 says: "So far, our geometry types have just included one feature. To represent multiple features in one object, we can use the “multi”-version of each geometry type:"

But I think (and a careful reading of the standard would confirm this) these things are still "single features". They are just single features with multiple (i.e., compound) geometries. You only get multiple features when you create an sfc object.

Michael Sumner's rasters tips

https://twitter.com/mdsumner/status/860482390702960641

  • get the thing into memory for processing: readAll(x) /1
  • doing multiple extractions on same vector layer, use cellFromXY, cellFromPolygons etc. to build index /2
  • don't use projectRaster(raster, crs = projection(shp)), use spTransform(shp, projection(raster)) /3
  • extract on a one-layer RasterLayer may be slow, use extract(brick(raster()), ...) one weird trick /4
  • extract tiled file is VERY SLOW, readAll into memory, or rewrite untiled writeRaster(raster(), "file.tif") if too big for mem /5
  • read the performance vignette for large files https://cran.r-project.org/web/packages/raster/vignettes/functions.pdf … /6
  • (read the introduction vignette too) /7
  • a brick on file in .grd format is slow for crop(brick, ex) for every layer, it's better if brick is in memory /8
  • re /8 this is not true for a brick on file in NetCDF format, internally optimized for crop() on all layers /9
  • crop, extract, other ops will be slow for a stack composed of multiple single-layer files, try conversion to a bulk format /10
  • do try coarse grained parallel::parLapply(cluster, list-of-rasters-or-files, rasterOps) it can be effective /11
  • explore velox, fasterize, and unrelated packages like spatstat, oce / 12
  • sometimes raster(readGDAL(file)) is faster than raster(file) /13

The idea from https://twitter.com/clavitolo/status/865671631368105984

Section on projections and datum

People working with geospatial data rarely work in just one projection; however, this topic is often neglected in books.

A brief intro and explanation of how to transform existing maps using various different projections would be great. But more importantly, it would be extremely interesting to read about what can go badly wrong with projection mismatches.

Ideas for additional topics to include

List of suggestions for the second edition. See below for various ideas; see #12 (comment) for an older version of this list that includes ideas already implemented.

Part I

  • Additional operations on vector data: st_convex_hull, st_triangulate, st_polygonize, st_segmentize, st_boundary and st_node - see c1d1c0e
  • S2 geometry
  • geos package by @paleolimbot

Part II

  • 3d mapping section / chapter - section on that

  • ? A new part on visualizing geographic data with static/interactive/other chapters. Just thinking it could be usefully split-up given its popularity (RL)

Part III

  • Content in the algorithms chapter building on work by Stan Openshaw, father of 'geocomputation', e.g. a function that implements GAM, with reference to the series of conferences called GeoComputation. [aka how to implement spatial ideas in R]
  • Include something on facets ordered by location

Other

Michael Sumner's vectors tips

https://twitter.com/mdsumner/status/860676991040733184

  • rmapshaper::ms_simplify for reducing resolution while maintaining topology (shared edges) and object data /1
    - raster::shapefile for easy read/write /2 (but say no to shp)
  • rgeos::gBuffer(xsdf, width = 0, byid = TRUE) and similar w sf::st_buffer will resolve many topology problems, geometry errors /3
  • ggpolypath will draw polygon holes correctly but is limited with some aesthetics because identity is implicit in gg not absolute /4
    - ggplot2-like fortify to df decomposition is rogue but helpful for some tasks, see spbabel for examples including recomposition /5
    - use df[c("x", "y")] <- as.data.frame(rgdal::project(as.matrix(df[c("lon", "lat")]), prj)) for rogue DIY transforms /6
  • - don't go rogue until you know what you're doing with from/to proj crs ... /7
  • ultimate auto-graticule is probably unicorn, see graticule pkg for manual control over getting exactly what's needed /7
  • your map is a unique creation, do whatever rogue or otherwise to get what you need /8
  • use mapview and leaflet and plotly to interactively explore your map data /9
  • use fasterize for rasterizing /10
  • use sf::st_crs for EPSG look up /11
  • use Geopackage for data you care about, shapefile do a lot of dumb down, though sometimes it's the only option /12
  • sf uses planar assumptions for projected but ellipsoidal for longlat /13
  • set crs NA for planar in longlat, transform back then forward for ellipsoidal in projected (need tell much longer story here) /14
  • sf::st_segmentize in longlat? You probably wanted vertical orthodromes and horizontal loxodromes, i.e planar assumption /15
  • sf poly constructors do heavy checking on ring closure, diy/rogue for fast creation when checking not needed /16
  • use sf st_read/read_sf and write counterparts for wicked fast I/O, use as(x, "Spatial") / st_as_sf rather than rgdal for sp I/O /17
  • try geom_sf in dev version of tidyverse/ggplot2 share your real world examples! /18
  • use spex pkg to get an extent polygon (rather than just a raw extent with no crs) /19

Section on web services and OGC standards

OGC standards such as WMS/WFS/WCS/... are becoming very common (hurray!) and many data providers serve layers via web services.

A section on how to assemble a typical request and visualise the response would be fantastic!
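To illustrate, a typical WMS GetMap request can be assembled by hand as a URL query string. This is a base-R sketch: the parameter names follow the OGC WMS 1.3.0 standard, but the endpoint and layer name below are hypothetical placeholders, not a real service:

```r
# Assemble a WMS 1.3.0 GetMap request URL by hand (illustrative only).
base_url = "https://example.com/wms" # hypothetical endpoint
params = c(
  service = "WMS", version = "1.3.0", request = "GetMap",
  layers  = "example:layer",  # hypothetical layer name
  crs     = "EPSG:4326",
  bbox    = "24,-125,50,-66", # minLat,minLon,maxLat,maxLon (WMS 1.3.0 axis order)
  width   = "600", height = "300",
  format  = "image/png"
)
url = paste0(base_url, "?",
             paste(names(params), params, sep = "=", collapse = "&"))
url
```

The response could then be fetched with download.file(url, "map.png", mode = "wb") and visualised in R, e.g. with a PNG reader.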

Make chapter beginnings consistent

Currently some have this:

---
knit: "bookdown::preview_chapter"
---

They should all have prerequisites...

@Nowosad note my use of pacman's p_load() for package loading - you happy with that? I think it's the best solution and that the benefits justify the costs of having another (minor) dependency for the book.
