rules_r's Introduction

R Rules for Bazel

General Information

Rules

Repository Rules

Container Rules

Overview

These rules are used for building R packages with Bazel. Although R has an excellent package management system, there is no continuous build and integration system for entire R package repositories. An advantage of using Bazel, over a custom solution of tracking the package dependency graph and triggering builds accordingly on each commit, is that R packages can be built and tested as part of one build system in multi-language monorepos.

These rules are mature for production use.

Getting started

The following assumes that you are familiar with how to use Bazel in general.

To begin, you can add the following or equivalent to your WORKSPACE file:

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

# Change master to the git tag you want.
http_archive(
    name = "rules_r",
    strip_prefix = "rules_r-master",
    urls = ["https://github.com/grailbio/rules_r/archive/master.tar.gz"],
)

load("@rules_r//R:dependencies.bzl", "r_register_toolchains", "r_rules_dependencies")

r_rules_dependencies()

r_register_toolchains()

You can load the rules in your BUILD file like so:

load("@rules_r//R:defs.bzl",
     "r_pkg", "r_library", "r_unit_test", "r_pkg_test")

Advanced users can also set up Gazelle to maintain the BUILD files for the R packages in their repo automatically.

Configuration

The following software must be installed on your system:

1. bazel (v5.0.0 or above)
2. R (4.1.2 or above; should be locatable using the `PATH` environment variable)

NOTE: After re-installing or upgrading R, please reset the registered toolchain with bazel sync --configure to rebuild your packages with the new installation.

NOTE: It is possible to use R from a bazel package instead of a system installation. See the toolchain r-toolchain-nix in the tests directory as an example.

For each package, you can also specify a different Makevars file that can be used to have finer control over native code compilation. The site-wide Makevars files are configured by default in the toolchains, and these define the compiler toolchain to use and the flags needed for these toolchains for reproducible builds.

For macOS, this setup will help you cover the requirements for a large number of packages:

brew install gcc pkg-config icu4c openssl

For Ubuntu, this (or equivalent for other Unix systems) helps:

apt-get install pkgconf libssl-dev libxml2-dev libcurl4-openssl-dev

Note

To avoid interference from other packages during the build (for example, other versions installed manually by the user), it is recommended that packages other than those with recommended priority be installed in the directory pointed to by R_LIBS_USER. The Bazel build process can then hide all the other packages from R by setting a different value for R_LIBS_USER.

When moving to Bazel for installing R packages on your system, we recommend cleaning up existing machines:

sudo Rscript \
  -e 'options("repos"="https://cloud.r-project.org")' \
  -e 'lib <- c(.Library, .Library.site)' \
  -e 'non_base_pkgs <- installed.packages(lib.loc=lib, priority=c("recommended", "NA"))[, "Package"]' \
  -e 'remove.packages(non_base_pkgs, lib=lib)'

# If not set up already, create the directory for R_LIBS_USER.
Rscript \
  -e 'dir.create(Sys.getenv("R_LIBS_USER"), recursive=TRUE, showWarnings=FALSE)'

For more details on how R searches different paths for packages, see the documentation for .libPaths.

External packages

To depend on external packages from CRAN and other remote repos, you can define the packages as a CSV with three columns -- Package, Version, and sha256. Then use the r_repository_list rule to define an R repository for each package. For packages not in a CRAN-like repo (e.g. on GitHub), you can use the r_repository rule directly. For packages on your local system but outside your main repository, you will have to use local_repository with a saved BUILD file; the same applies to VCS repositories.
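
For illustration, the packages.csv referenced below might look like this (the header row is required; the rows and checksums shown here are placeholders, not real values):

Package,Version,sha256
R6,2.5.1,<sha256 of the R6 source archive>
digest,0.6.29,<sha256 of the digest source archive>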

load("@rules_r//R:repositories.bzl", "r_repository", "r_repository_list")

# R packages with non-standard sources.
r_repository(
    name = "R_plotly",
    sha256 = "24c848fa2cbb6aed6a59fa94f8c9b917de5b777d14919268e88bff6c4562ed29",
    strip_prefix = "plotly-a60510e4bbce5c6bed34ef6439d7a48cb54cad0a",
    urls = [
        "https://github.com/ropensci/plotly/archive/a60510e4bbce5c6bed34ef6439d7a48cb54cad0a.tar.gz",
    ],
)

# R packages with standard sources.
# See below for an example of how to generate the CSV package_list.
r_repository_list(
    name = "r_repositories_bzl",
    build_file_overrides = "@myrepo//third-party/R:build_file_overrides.csv",
    package_list = "@myrepo//third-party/R:packages.csv",
    remote_repos = {
        "BioCsoft": "https://bioconductor.org/packages/3.14/bioc",
        "BioCann": "https://bioconductor.org/packages/3.14/data/annotation",
        "BioCexp": "https://bioconductor.org/packages/3.14/data/experiment",
        "CRAN": "https://cloud.r-project.org",
    },
)

load("@r_repositories_bzl//:r_repositories.bzl", "r_repositories")

r_repositories()

The list of all external R packages configured this way can be obtained from your shell with

$ bazel query 'filter(":R_", //external:*)'

NOTE: Periods ('.') in the package names are replaced with underscores ('_') because bazel does not allow periods in repository names.

To generate and maintain a CSV file containing all your external dependencies for use with r_repository_list, you can use the functions in the script repo_management.R.

For example:

script="/path/to/rules_r/scripts/repo_management.R"
package_list_csv="/path/to/output/csv/file"
packages="comma-separated list of packages you want to add to the local cache"
bioc_version="bioc_version to use, e.g. 3.11"

# This will be the cache directory for a local copy of all the packages.
# The output CSV will always reflect the state of this directory.
local_r_repo="${HOME}/.cache/r-repo"

Rscript - <<EOF
source('${script}')
pkgs <- strsplit('${packages}', ',')[[1]]
# Set ForceDownload to TRUE when switching R or Bioc versions.
# options("ForceDownload" = TRUE)
# Keep in sync with r_repository_list in WORKSPACE.
options(repos = c(
    BioCsoft = "https://bioconductor.org/packages/${bioc_version}/bioc",
    BioCann = "https://bioconductor.org/packages/${bioc_version}/data/annotation",
    BioCexp = "https://bioconductor.org/packages/${bioc_version}/data/experiment",
    CRAN = "https://cloud.r-project.org")
)
addPackagesToRepo(pkgs, repo_dir = '${local_r_repo}')
packageList('${local_r_repo}', '${package_list_csv}')
EOF

Examples

Some examples are available in the tests directory of this repo.

Also see Razel scripts that provide utility functions to generate BUILD files and WORKSPACE rules.

Contributing

Contributions are most welcome. Please submit a pull request giving the owners of this GitHub repo access to your branch for minor style-related edits, etc. We recommend opening an issue first to discuss the nature of your change before beginning work on it.

Known Issues

Please check the open issues on the GitHub repo.

Rules

r_pkg

r_pkg(srcs, pkg_name, deps, cc_deps, build_args, install_args, config_override,
      roclets, roclets_deps, makevars, env_vars, inst_files, tools, build_tools,
      metadata, stamp)

Rule to install the package and its transitive dependencies in the Bazel sandbox, so it can be depended upon by other package builds.

The builds produced from this rule are tested to be byte-for-byte reproducible with the same R installation. For native code compilation, the compiler flags for reproducibility are defined in the default site Makevars file in the local toolchain. If using your own toolchain, ensure that your site Makevars file has the right flags.

Implicit output targets
name.bin.tar.gz: Binary archive of the package.
name.tar.gz: Source archive of the package.
name.so: Shared archive of the package's native code; an empty file if the package has no native code.
Attributes
srcs

List of files, required

Source files to be included for building the package.

pkg_name

String; optional

Name of the package if different from the target name.

deps

List of labels; optional

R package dependencies of type `r_pkg` or `r_library`.

cc_deps

List of labels; optional

cc_library dependencies for this package.

build_args

List of strings; default ["--no-build-vignettes", "--no-manual"]

Additional arguments to supply to R CMD build. Note that building vignettes is disabled by default so that users do not need a TeX installation. To build vignettes, override this attribute and ensure that the relevant binaries are available in your system default PATH (usually /usr/bin and /usr/local/bin).

install_args

List of strings; optional

Additional arguments to supply to R CMD INSTALL.

config_override

File; optional

Replace the package configure script with this file.

roclets

List of strings; optional

roclets to run before installing the package. If this is non-empty, then you must specify roclets_deps as the R package you want to use for running roclets. The runtime code will check if devtools is available and use `devtools::document`, failing which, it will check if roxygen2 is available and use `roxygen2::roxygenize`.

roclets_deps

List of labels; optional

roxygen2 or devtools dependency for running roclets.

makevars

File; optional

Additional Makevars file supplied as R_MAKEVARS_USER.

env_vars

Dictionary; optional

Extra environment variables to define for building the package.

inst_files

Label keyed Dictionary; optional

Files to be bundled with the package through the inst directory. The values of the dictionary will specify the package relative destination path. For example, '' will bundle the files to the top level directory, and 'mydir' will bundle all files into a directory mydir.

tools

List of labels; optional

Executables that code in this package will try to find in the system.

build_tools

List of labels; optional

Executables that native code compilation will try to find in the system.

metadata

String keyed Dictionary; optional

Metadata key-value pairs to add to the DESCRIPTION file before building. When text is enclosed within `{}`, bazel volatile and stable status files will be used to substitute the text. Inclusion of these files in the build has consequences on local and remote caching. Also see `stamp`.

stamp

Integer; default -1

Same behavior as the stamp attribute in cc_binary rule.
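
As a sketch of how some of these attributes fit together (all names and labels are hypothetical, and the keys available for metadata substitution depend on your Bazel workspace status setup):

r_pkg(
    name = "mypkg",
    srcs = glob(["**"], exclude = ["BUILD"]),
    pkg_name = "myPkg",                    # installed name differs from the target name
    deps = ["@R_Rcpp//Rcpp"],              # r_pkg dependencies
    cc_deps = ["//third-party/cc:mylib"],  # hypothetical cc_library dependency
    makevars = ":Makevars",                # supplied as R_MAKEVARS_USER
    metadata = {
        "BuildTimestamp": "{BUILD_TIMESTAMP}",  # substituted from the volatile status file
    },
    stamp = 1,
)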

r_library

r_library(pkgs, library_path)

Executable rule to install the given packages and all dependencies to a user provided or system default R library. Run the target with --help for usage information.

The rule used to provide a tar archive of the library as an implicit output. That feature is now its own rule, r_library_tar. See the documentation for the r_library_tar rule and example usage with the container_image rule.

Attributes
pkgs

List of labels, required

Package (and dependencies) to install.

library_path

String; optional

Library location for the installation, if different from the system default. For runtime overrides, use bazel run [target] -- -l [path].
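
For example (the target name and path are illustrative):

r_library(
    name = "prod_library",
    pkgs = [":mypkg"],
    library_path = "lib/R/site-library",
)

Running bazel run //:prod_library -- -l /tmp/Rlibs would override the destination at run time.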

r_unit_test

r_unit_test(pkg, suggested_deps, env_vars, tools, data)

Rule to keep all deps in the sandbox, and run the provided R test scripts.

When run with bazel coverage, this rule will also produce a coverage report in Cobertura XML format. The coverage report will contain coverage for R code in the package, and C/C++ code in the src directory of R packages.

Attributes
pkg

Label; required

R package (of type r_pkg) to test.

suggested_deps

List of labels; optional

R package dependencies of type `r_pkg` or `r_library`.

env_vars

Dictionary; optional

Extra environment variables to define before running the test.

tools

List of labels; optional

Executables to be made available to the test.

data

List of labels; optional

Data to be made available to the test.

r_pkg_test

r_pkg_test(pkg, suggested_deps, check_args, env_vars, tools, data)

Rule to keep all deps of the package in the sandbox, build a source archive of this package, and run R CMD check on the package source archive in the sandbox.

Attributes
pkg

Label; required

R package (of type r_pkg) to test.

suggested_deps

List of labels; optional

R package dependencies of type `r_pkg` or `r_library`.

check_args

List of strings; default ["--no-build-vignettes, "--no-manual"]

Additional arguments to supply to R CMD build. Note that building vignettes is disabled by default to not require Tex installation for users. In order to build vignettes, override this attribute, and ensure that the relevant binaries are available in your system default PATH (usually /usr/bin and /usr/local/bin)

env_vars

Dictionary; optional

Extra environment variables to define before running the test.

tools

List of labels; optional

Executables to be made available to the test.

data

List of labels; optional

Data to be made available to the test.

r_binary

r_binary(name, src, deps, data, env_vars, tools, rscript_args, script_args)

Build a wrapper shell script for running an executable which will have all the specified R packages available.

The target can be executed standalone, with bazel run, or called from other executables if RUNFILES_DIR is exported in the environment with the runfiles of the root executable.
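
A minimal sketch (the script, dependency label, data file, and arguments are hypothetical):

r_binary(
    name = "report",
    src = "report.R",  # begins with "#!/usr/bin/env Rscript" and has execute permissions
    deps = ["@R_optparse//optparse"],
    data = ["data/input.csv"],
    script_args = ["--verbose"],
)

Run it with bazel run //:report, or call it from another executable as described above.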

Attributes
src

File; required

An Rscript-interpreted file, or a file with executable permissions.

deps

List of labels; optional

Dependencies of type r_binary, r_pkg, or r_library.

data

List of labels; optional

Files needed by this rule at runtime.

env_vars

Dictionary; optional

Extra environment variables to define before running the binary.

tools

List of labels; optional

Executables to be made available to the binary.

rscript_args

List of strings; optional

If the src file does not have executable permissions, arguments for the Rscript interpreter. We recommend using a shebang line and giving your script execute permissions instead of using this.

script_args

List of strings; optional

A list of arguments to pass to the src script.

r_test

r_test(name, src, deps, data, env_vars, tools, rscript_args, script_args)

This is identical to r_binary but is run as a test.

r_markdown

r_markdown(name, src, deps, data, env_vars, tools, rscript_args, script_args,
render_function="rmarkdown::render", input_argument="input", output_dir_argument="output_dir",
render_args)

This rule renders an R Markdown file by generating a stub that calls the render function. The render function and its argument names default to rmarkdown::render but can be customized. Note that render_args will need to be quoted appropriately if set. This rule can be used wherever an r_binary rule can be used.

If arguments are given on the command line when running the target, flags of the form --arg=value are passed as keyword arguments to the render function. The values can be arbitrary R expressions, and strings will need to be quoted. The last argument without the -- prefix is taken as the output directory; otherwise the output directory is the render function's default, typically the same directory as the input file.
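
For example (the document and dependency label are hypothetical):

r_markdown(
    name = "render_report",
    src = "report.Rmd",
    deps = ["@R_rmarkdown//rmarkdown"],
)

Running bazel run //:render_report -- --params='list(year = 2021)' /tmp/reports would pass params as a keyword argument to rmarkdown::render and use /tmp/reports as the output directory.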

r_toolchain

r_toolchain(r, rscript, version, args, makevars_site, env_vars, tools, files, system_state_file)

Toolchain to specify the tools and environment for performing build actions. Also see r_register_toolchains for how to configure the default registered toolchains.
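
A sketch of a custom toolchain definition (all paths and labels are illustrative; for the complete wiring, including the toolchain() wrapper and registration, see the r-toolchain-nix example in the tests directory):

r_toolchain(
    name = "my-r-toolchain",
    r = "/opt/R/4.1.2/bin/R",
    rscript = "/opt/R/4.1.2/bin/Rscript",
    version = "4.1",
    makevars_site = ":Makevars.site",
    system_state_file = ":system_state.txt",  # rebuild packages when the captured system state changes
)

Register your toolchain in the WORKSPACE before calling r_register_toolchains() to give it preference.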

Attributes
r

String; default R

Absolute path to R, or name of R executable; the search path will include the directories for tools attribute.

rscript

String; default Rscript

Absolute path to Rscript, or name of Rscript executable; the search path will include the directories for tools attribute.

version

String; optional

If provided, ensure version of R matches this string in x.y form. This version check is performed in the `r_pkg` and `r_binary` (and by extension, `r_test` and `r_markdown`) rules. For stronger guarantees, perform this version check when generating the `system_state_file` (see attribute below).

args

List of strings; default ["--no-save", "--no-site-file", "--no-environ"]

Arguments to R and Rscript, in addition to `--slave --no-restore --no-init-file`.

makevars_site

Label; optional

Site-wide Makevars file.

env_vars

Dictionary; optional

Environment variables for BUILD actions.

tools

List of labels; optional

Additional tools to make available in PATH.

files

List of labels; optional

Additional files available to the BUILD actions.

system_state_file

Label; optional

A file that captures your system state. Use it to rebuild all R packages whenever the contents of this file change. This is ideally generated by a repository_rule with `configure = True`, so that a call to `bazel sync --configure` resets this file.

Repository Rules

r_repository

r_repository(urls, strip_prefix, type, sha256, build_file, rscript)

Repository rule in place of new_http_archive that can run razel to generate the BUILD file automatically. See section on external packages and Razel scripts.

Attributes
urls

List of strings; required

URLs from which the package source archive can be fetched.

strip_prefix

String; optional

The prefix to strip from all file paths in the archive.

type

String; optional

Type of the archive file (zip, tgz, etc.).

sha256

String; optional

sha256 checksum of the archive to verify.

build_file

File; optional

Optional BUILD file for this repo. If not provided, one will be generated.

razel_args

Dictionary; optional

Other arguments to supply to buildify function in razel.

rscript

String; optional

Name, path or label (must start with `@` or `//`) of the interpreter to use for running the razel script.

r_repository_list

r_repository_list(package_list, build_file_overrides, remote_repos, other_args, rscript)

Repository rule that generates a bzl file containing a macro, to be called as r_repositories(), with r_repository definitions for the packages in the package_list CSV. See the section on external packages.

Attributes
package_list

File; required

CSV containing packages with name, version and sha256; with a header.

build_file_overrides

File; optional

CSV containing package name and BUILD file path; with a header.

remote_repos

Dictionary; optional

Repos to use for fetching the archives.

other_args

Dictionary; optional

Other arguments to supply to generateWorkspaceMacro function in razel.

rscript

String; optional

Name, path or label (must start with `@` or `//`) of the interpreter to use for running the razel script.

r_version

String; optional

If provided, ensure version of R matches this string in x.y form.

r_rules_dependencies

load("@rules_r//R:dependencies.bzl", "r_rules_dependencies")

r_rules_dependencies()

Repository rule that provides repository definitions for dependencies of the BUILD system. One such dependency is the site-wide Makevars file.

r_coverage_dependencies

load("@rules_r//R:dependencies.bzl", "r_coverage_dependencies")

r_coverage_dependencies()

load("@r_coverage_deps_bzl//:r_repositories.bzl", coverage_deps = "r_repositories")

coverage_deps()

Repository rule that provides repository definitions for dependencies in computing code coverage for unit tests. Not needed if users already have a repository definition for the covr package.

r_register_toolchains

load("@rules_r//R:dependencies.bzl", "r_register_toolchains")

r_register_toolchains(r_home, strict, makevars_site, version, args, tools)

Repository rule that generates and registers a platform independent toolchain of type r_toolchain based on the user's system and environment. If you want to register your own toolchain for specific platforms, register them before calling this function in your WORKSPACE file to give them preference.

NOTE: These toolchains read your system state and cache the findings for future runs. Whenever you install a new R version, or if you want to reset the toolchain for any reason, run:

bazel sync --configure
Attributes
r_home

String; optional

A path to `R_HOME` (as returned by `R RHOME`). If not specified, the rule looks for R and Rscript in `PATH`. The environment variable `BAZEL_R_HOME` takes precedence over this value.

strict

Bool; default True

Fail if R is not found on the host system.

makevars_site

Bool; default True

Generate a site-wide Makevars file.

version

String; optional

version attribute value for r_toolchain.

args

List of strings; default ["--no-save", "--no-site-file", "--no-environ"]

args attribute value for r_toolchain.

tools

List of strings; optional

tools attribute value for r_toolchain.
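
For example, a WORKSPACE call that pins the R version and points to a specific installation (values are illustrative):

r_register_toolchains(
    r_home = "/opt/R/4.1.2/lib/R",  # as returned by `R RHOME`
    version = "4.1",
    strict = True,
)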

rules_r's People

Contributors

collinmeltonatgrail, curiousleo, eddelbuettel, forestfang-stripe, gnome-skillet, hchauvin, jesseschalken, jmillikin-stripe, rj3d, siddharthab, subhachandrachandra


rules_r's Issues

trouble building r_pkg depending on cc_library

I am building an R package via r_pkg that depends on a cc_library A. Library A depends on two system libraries B and C that I've imported as external repositories via new_local_repository and then created a rule for these using cc_library.
Library A is built fine but my R package is having problems linking against A, and what's more it seems to try and include B and C in the linking statement.
The error I see depends on whether I'm running sandboxed or not. If sandboxed, I see:

cp: cannot create regular file 'external/R_flowCore/flowCore/src/libhdf5.so': Read-only file system
cp: cannot create regular file 'external/R_flowCore/flowCore/src/libarmadillo.so': Read-only file system

those are the system dependencies B and C.
If I don't run sandboxed I see:

g++: error: libhdf5.so: No such file or directory
g++: error: libarmadillo.so: No such file or directory

I don't fully understand what directories bazel is doing its work in, so I'm having trouble understanding why I see these particular errors, especially the latter non-sandboxed one. Poking around, I do see an R package source directory into which these .so files have been copied, but it doesn't contain the compiled .o files for the rest of the cpp code, so presumably the build directory is elsewhere.

I've looked at rules_r/R/scripts/build.sh and there's a lot of complexity there that I'm having difficulty unravelling.

Finally, I thought this may be related to the bundled configure script, but I see this same behavior irrespective of whether I use the package's bundled configure file or remove the configure script.

I'd like to understand:

  • in what directory are the g++ build commands invoked? I know R usually creates a temporary directory, but I'm having trouble getting insight into what the bash script is actually doing. Is there a mechanism by which I can start to trace the inner workings here?
  • Alternately is there a higher level explanation of the build rule available somewhere.
  • Alternately, @siddharthab if you are amenable perhaps I could get a bit of your time on a call or synchronous chat to dive a bit deeper?
    Thanks in advance for all your help.

Respect topological ordering for cc_deps

Currently, the libraries to link are aggregated as such:

c_libs_flags = [l.path for d in cc_deps for l in d.cc.libs]

As a consequence, in the following case, the linker complains that some symbols in B are not found, because the dependencies are not given to the linker in topological order:

cc_library A transitively depends on cc_library B
cc_deps = [":B", ":A"]
Command: g++ ... -lB -lA

The solution is to aggregate the d.cc.libs using the depset mechanism instead of simply iterating through the list (I suppose for d.cc.libs the order is "topological").

Stamp package description files with build time metadata

Have the ability for users to specify a metadata dictionary with values allowing substitutions from the volatile and stable status files. Have a boolean attribute stamp to allow users to specify if stable status file should be a dependency.

r_pkg broken when PKG_SRC_DIR empty

Hello,
It looks like the r_pkg rule fails when the WORKSPACE file is in the root of the package. A simple R package repo with this issue: https://github.com/knightdave/simpleR .

rules_r version: 0.5.9.tar.gz

❯ bazel --version
bazel 3.5.0
❯ bazel build --verbose_failures //...               
Starting local Bazel server and connecting to it...
INFO: Invocation ID: 164d6861-16b7-4791-92fc-7e51dbb3f51b
INFO: Analyzed target //:simpleR (18 packages loaded, 115 targets configured).
INFO: Found 1 target...
ERROR: /hidden/simpleR/BUILD.bazel:18:6: Building R package simpleR failed (Exit 1): build.sh failed: error executing command 
  (cd /hidden/.cache/bazel/_bazel_rycerzd/ff5bd6ed62b50ccee0dfeb7260c1c601/sandbox/linux-sandbox/4/execroot/__main__ && \
  exec env - \
    BAZEL_R_DEBUG=false \
    BAZEL_R_VERBOSE=false \
    BUILD_ARGS=''\''--no-build-vignettes'\'' '\''--no-manual'\''' \
    BUILD_TOOLS_EXPORT_CMD='export PATH' \
    CONFIG_OVERRIDE='' \
    C_CPP_FLAGS='' \
    C_LIBS_FLAGS='' \
    C_SO_FILES='' \
    EXPORT_ENV_VARS_CMD='' \
    FLOCK_PATH=bazel-out/host/bin/external/com_grail_rules_r/R/scripts/flock \
    INSTALL_ARGS='' \
    INSTRUMENTED=false \
    INSTRUMENT_SCRIPT=external/com_grail_rules_r/R/scripts/instrument.R \
    INST_FILES_MAP='' \
    METADATA_MAP='' \
    PKG_BIN_ARCHIVE=bazel-out/k8-fastbuild/bin/simpleR.bin.tar.gz \
    PKG_LIB_PATH=bazel-out/k8-fastbuild/bin/lib \
    PKG_NAME=simpleR \
    PKG_SRC_ARCHIVE=bazel-out/k8-fastbuild/bin/simpleR.tar.gz \
    PKG_SRC_DIR='' \
    R='/usr/bin/R --slave --no-restore --no-init-file --no-save --no-site-file --no-environ' \
    REQUIRED_VERSION='' \
    ROCLETS='' \
    RSCRIPT='/usr/bin/Rscript --no-init-file --no-save --no-site-file --no-environ' \
    R_LIBS_DEPS='' \
    R_LIBS_ROCLETS='' \
    R_MAKEVARS_SITE='' \
    R_MAKEVARS_USER='' \
    STATUS_FILES=bazel-out/volatile-status.txt \
  external/com_grail_rules_r/R/scripts/build.sh)
Execution platform: @local_config_platform//:host

Use --sandbox_debug to see verbose messages from the sandbox build.sh failed: error executing command 
  (cd /hidden/.cache/bazel/_bazel_rycerzd/ff5bd6ed62b50ccee0dfeb7260c1c601/sandbox/linux-sandbox/4/execroot/__main__ && \
  exec env - \
    BAZEL_R_DEBUG=false \
    BAZEL_R_VERBOSE=false \
    BUILD_ARGS=''\''--no-build-vignettes'\'' '\''--no-manual'\''' \
    BUILD_TOOLS_EXPORT_CMD='export PATH' \
    CONFIG_OVERRIDE='' \
    C_CPP_FLAGS='' \
    C_LIBS_FLAGS='' \
    C_SO_FILES='' \
    EXPORT_ENV_VARS_CMD='' \
    FLOCK_PATH=bazel-out/host/bin/external/com_grail_rules_r/R/scripts/flock \
    INSTALL_ARGS='' \
    INSTRUMENTED=false \
    INSTRUMENT_SCRIPT=external/com_grail_rules_r/R/scripts/instrument.R \
    INST_FILES_MAP='' \
    METADATA_MAP='' \
    PKG_BIN_ARCHIVE=bazel-out/k8-fastbuild/bin/simpleR.bin.tar.gz \
    PKG_LIB_PATH=bazel-out/k8-fastbuild/bin/lib \
    PKG_NAME=simpleR \
    PKG_SRC_ARCHIVE=bazel-out/k8-fastbuild/bin/simpleR.tar.gz \
    PKG_SRC_DIR='' \
    R='/usr/bin/R --slave --no-restore --no-init-file --no-save --no-site-file --no-environ' \
    REQUIRED_VERSION='' \
    ROCLETS='' \
    RSCRIPT='/usr/bin/Rscript --no-init-file --no-save --no-site-file --no-environ' \
    R_LIBS_DEPS='' \
    R_LIBS_ROCLETS='' \
    R_MAKEVARS_SITE='' \
    R_MAKEVARS_USER='' \
    STATUS_FILES=bazel-out/volatile-status.txt \
  external/com_grail_rules_r/R/scripts/build.sh)
Execution platform: @local_config_platform//:host

Use --sandbox_debug to see verbose messages from the sandbox
cp: cannot stat '/DESCRIPTION': No such file or directory
Target //:simpleR failed to build
INFO: Elapsed time: 8.045s, Critical Path: 0.27s
INFO: 3 processes: 3 remote cache hit.
FAILED: Build did NOT complete successfully

The same package works when I put it in a subfolder.

Why not let Bazel compile C++ as well?

Currently, r_pkg leans pretty heavily on R to build *.cpp code, which is pretty slow.

What if, instead, C++ code was built with Bazel (which is fast), and then passed to r_pkg through cc_dep, already prebuilt? I've hacked up something like this and it seems to work fine, and it's much faster. I had to use an empty *.cpp file to get r_pkg to link the *.so, and I had to put libRcpp.so into a location where it could easily be found, but other than that this required no changes to the C++ code. The resulting package seems to work OK under r_unit_test. Not sure what r_pkg_test would think of it, but it'll probably be fine as well, since it works OK with other cc_deps and this is no different.

So basically the suggestion here is to modify the rule such that it compiles all C++ source with cc_library (and lines up the *.so's and headers for that to work, so symlinking hack is not necessary), and then feeds the result into build.sh much like it would feed cc_deps.

In r_library, do not create tar unless asked for

Creating a tar copies all dependencies in the tar. This can be an expensive operation if done as part of an automated build of all library targets.

The current recommendation is to use the tag manual to skip building library targets.

Using the fact that in sandboxed builds, the runfiles are symlinks, we can let the runfiles be the package files, and on running the library target, copy the library directories into the destination, following symlinks.

Create an r_binary rule

Rule analogous to sh_binary but for the Rscript interpreter. Will have attributes srcs, data and r_pkg_deps.

Shared objects are thrown out by R CMD build

Technically, r_pkg is supposed to collect *.so's and bundle them into the package, which is what it does at the preliminary build step. However, this doesn't really work because R CMD build throws out those *.so's, not recognizing them as a valid part of the package, so by the time build.sh needs to run R CMD install they're no longer there.

It would seem that, instead, they should be copied after build archive is expanded. I'm not sure if install won't throw them out, but cursory search in install.R seems to suggest that it shouldn't.

Custom toolchain for R

We rely on system provided R and are not controlling the version of R being used, or the packages installed at .Library search path.

A toolchain to have better control will make the build more hermetic.

ARM support

Let's test that we can support ARM processors, specifically on the new AWS instance types. We may have to wait for R 4.1.0 for full support.

For macOS, CRAN recommendation is to use Rosetta for R versions less than 4.1.0.

Long R_LIBS (R_LIBS_DEPS) not parsed properly by R CMD INSTALL

When the list of dependencies gets really long, R CMD INSTALL no longer sees the installed dependencies when building a package.
I can confirm this by shortening R_LIBS_DEPS, or by installing the packages that aren't found into the R/site-library, then R CMD INSTALL sees them properly.
Two questions:

  1. is there a known work-around?
  2. Is it possible to have rules_r install to a "global" R_LIBS directory rather than package-specific R_LIBS directories that then get joined into a long ungainly string?
    I can clarify this issue further if needed.
    This is on ubuntu.

r_repository workspace rule to run razel automatically

This rule can be like an http_archive, and generate a default BUILD file for the package.

Modalities to obtain the R package can be, either exclusive or in order of preference,

  1. local path
  2. URLs (with optional sha256)
  3. package name and version; the default repositories can be configured as a .Rprofile file through another WORKSPACE rule.

sourcing razel fails when output user root has '\'

To reproduce:

cd tests
bazel --output_user_root=/tmp/a\\b sync
...
ERROR: An error occurred during the fetch of repository 'r_repositories_bzl':
   Failed to generate bzl file: 

Error in file(filename, "r", encoding = encoding) : 
  cannot open the connection
Calls: source -> file
In addition: Warning message:
In file(filename, "r", encoding = encoding) :
  cannot open file '/private/tmp//fbcd335144c4da9c79038f9ce88e9f6d/external/com_grail_rules_r/scripts/razel.R': No such file or directory
Execution halted

Reproducible builds

The packages built have stamped information about the build timestamp, the source directory, and the library directory for the installation. This is especially bothersome with Docker images as different layers are created with each build.

The build timestamp can be fixed to an empty string with the --built_timestamp flag to R CMD INSTALL. For the rest, we need to build and install in a constant directory, which means fixing a /tmp path for a package, and acquiring a lock on that path so that builds in other workspaces do not interfere with this build.

roclets attribute for r_pkg

r_pkg can take a list of strings as a "roclets" attribute to run roxygen on package source code before building it. This will automate a manual step that developers do frequently before running a build or test.

Update covr dependency to CRAN

We have been using a covr dependency with some of our own modifications. Would be nice to get these changes upstreamed and then depend on a CRAN release directly.

The PR is r-lib/covr#468.

Repository rules for R

We currently use new_http_archive for external R packages, and explicitly list CRAN and BioC repo URLs for the package. A new repository rule for R remote repositories should help.

r_library does not clean up before copying

Running r_library targets should clean up the destination packages before copying new files there. Otherwise, previously installed files from inst/, etc. will continue to exist in the new install.

Is it possible to include a py_binary in the deps of a r_binary?

I wanted to use my R code to call Python code that is also built with Bazel. Currently, I have to build them as two separate targets (a py_binary and an r_binary). But is there a way that I can put the py_binary as a dep of the r_binary, so that when I build the r_binary, it will automatically build the py_binary?

Do not use the same file descriptor for lock

In build.sh, we use 200 as the constant for the file descriptor that we lock.

This is most likely causing unwanted waits for the lock to be released when multiple packages are being built simultaneously.

This is just a hunch though that needs to be validated.

`r_pkg_test` does not seem to work with `pkg_name`

r_pkg_test rules do not seem to work correctly when the target r_pkg has a different name than pkg_name.

Given the following BUILD file for a package:

load("@com_grail_rules_r//R:defs.bzl", "r_pkg", "r_pkg_test")

r_pkg(
    name = "dummy",
    pkg_name = "<pkgname>",
    srcs = glob(
        ["**"],
        exclude = [
            "BUILD",
        ],
    )
)

r_pkg_test(
    name = "check",
    pkg = ":dummy",
)

I see the following when trying to run the check target (lightly edited for privacy):

bazel run //<pkgname>:check
Loading:
Loading: 0 packages loaded
Analyzing: target //<pkgname>:check (1 packages loaded, 0 targets configured)
INFO: Analyzed target //<pkgname>:check (27 packages loaded, 375 targets configured).
INFO: Found 1 target...
bazel: Entering directory `<snip>/execroot/<pkgname>/'
[0 / 2] [Prepa] BazelWorkspaceStatusAction stable-status.txt
bazel: Leaving directory `<snip>/execroot/<pkgname>/'
Target //<pkgname>:check up-to-date:
  bazel-bin/check
INFO: Elapsed time: 1.024s, Critical Path: 0.66s
INFO: 10 processes: 7 internal, 3 linux-sandbox.
INFO: Build completed successfully, 10 total actions
INFO: Running command line: external/bazel_tools/tools/test/test-setup.sh ./check
INFO: Build completed successfully, 10 total actions
exec ${PAGER:-/usr/bin/less} "$0" || exit 1
Executing tests from //:check
-----------------------------------------------------------------------------
* using log directory ‘<snip>/execroot/<pkgname>/bazel-out/k8-fastbuild/bin/check.runfiles/<pkgname>/dummy.Rcheck’
* using R version 3.6.3 (2020-02-29)
* using platform: x86_64-pc-linux-gnu (64-bit)
* using session charset: UTF-8
* checking package directory ... ERROR
package directory `<snip>/execroot/<pkgname>/bazel-out/k8-fastbuild/bin/check.runfiles/<pkgname>/dummy.Rcheck/00_pkg_src/dummy’ does not exist
* DONE

Status: 1 ERROR
See
  ‘<snip>/execroot/<pkgname>/bazel-out/k8-fastbuild/bin/check.runfiles/<pkgname>/dummy.Rcheck/00check.log’
for details.

This is almost certainly because R CMD check expects the tarball name to match the package name, but when using pkg_name this will not be the case.

Redesign r_pkg_test

r_pkg_test builds the source archive of the package independently of the r_pkg rule.

This means all the package build configuration and dependencies have to be specified twice.

Write tests for reproducibility

We currently don't test that our rules result in a reproducible build on execution on the same machine. These tests have been performed manually before.

r_unit_test should be able to declare data dependencies

I think it'd be cool if r_unit_test could define data dependencies so that Bazel could download and symlink testdata for unit tests in an atomic, pipelined, and checksummed fashion. Currently the only deps for r_unit_test can be other packages. While that might be appropriate for r_pkg_test, seems like for r_unit_test that's too restrictive.

In particular, this could be useful during development if the authors are not planning to release the package, or if package will be modified before release to be able to do a package level test instead. Indeed, the package could dynamically choose either Bazel managed or pre-installed data source depending on how it's being tested.

Native library dependencies in r_pkg

We rely on a package's configure script to locate the system dependency. We should be able to specify some system requirements through Bazel and provide configure flags to use the Bazel built requirement.

Code coverage

It would be possible to get code coverage for r_pkg as follows:

  • The covr R package could be used for a mix of offline and online instrumentation: the package loader script would be modified offline to instrument the package when it is loaded. Some covr internals have to be used here, but thanks to rules_r it is possible to pin down a compatible version.
  • A small function in R converts the covr line-by-line format to an LCOV coverage report.
  • The LCOV coverage reports are collected by Bazel into bazel-testlogs when testing with 'bazel coverage'.

It is possible to wrap an r_binary invocation in a similar way.

The LCOV reports can then be consumed in many ways, for instance using the genhtml command-line utility.

Code coverage can be transitive if --instrumentation_filter is properly set, and covr integrates nicely with native code, either generated by rules_r or handled through cc_deps.

Check minimum R version

The implementation of these rules expects a minimum of R 3.3. We should check for it on bootstrap and message the user accordingly.

Variable brew install prefix

We assume a brew install prefix of /usr/local in Makevars.darwin for finding gfortran.

Provide a default Makevars file target that is dynamically generated and uses the right brew prefix for the user.

Make reproducible mode the default

Reproducible mode has been the default at GRAIL for ~8 months and has been working without any issues. We should write more unit tests for it and make it the default and only mode.

How do you keep track of per-pkg system libraries?

I'm building custom images containing R libs (among other things) using GoogleContainerTools/distroless/package_manager. I have worked through collecting and adding R deps to packages.csv, so all of my R world dependencies are fine.

But those R libraries have system library dependencies. When you load the R libraries in R code, they look for the system libraries they use, and throw an error at the first one they can't find.

I am not a native R person, so please forgive me if this is a stupid question: is there any way to find the native deps of R libraries? Iterating over this is really time consuming at scale.

Any help appreciated.

Deployment story

Is there a deployment story native to Bazel? Or is that out of scope for this?

Add an option for R CMD install to work on a writable copy of the source files

Having a symlinked, read-only reference to the source files can be problematic for some configure scripts, e.g., for the XML or Rgraphviz packages (they overwrite source files, they create temporary files within the source tree, ...). One quick fix is to make R CMD install work, for those packages, on a full, writable recursive copy of the source files.

I propose a new copy_srcs bool attribute to r_pkg to enable that. This would reuse the machinery of REPRODUCIBLE_BUILD and copy sources with cp -LpR ... instead of cp -a (-L dereferences symlinks).

r_pkg_test does not configure cc_deps and Makevars

The test target should get this information from the package provider, and make the Makevars, headers and library archives available as runfiles. The paths, etc. should also change to represent short paths.

Run travis on OSX as well

I think it was not available in the past, but now it is possible to run travis on OSX for Open Source projects.

how to install r check deps into system R library path

here is part of my BUILD file

PKG_NAME = "flowWorkspace"
PKG_CHECK_DEPS = [
    "@R_knitr//knitr:knitr",
    "@R_ggcyto//:ggcyto",
    
]
r_pkg(
    name = PKG_NAME,
    srcs = glob(
        ["**"],
        exclude = [
            "BUILD",
            "src/Makevars",
        ],
    ),
    cc_deps = [
        ":flowWorkspace_hdrs",
    ],
    config_override = ":empty",
    makevars = ":flowworkspace_makevars",
    pkg_name = "flowWorkspace",
    deps = [
        "@R_Biobase//Biobase",
        "@R_DelayedArray//DelayedArray",
        "@R_RBGL//RBGL",
        "@R_Rcpp//Rcpp",
        "@R_RcppParallel//RcppParallel",
         ],
)

r_library(
    name="library",
    pkgs = [PKG_NAME]
)

r_pkg_test(
    name = "check",
    timeout = "short",
    pkg = PKG_NAME,
    check_args = ["--no-codoc", "--no-manual", "--no-build-vignettes"],
    suggested_deps = PKG_CHECK_DEPS,
)

I have encountered a test error during R CMD check (invoked by bazel run packages/flowWorkspace:check )

I'd like to reproduce and troubleshoot the test error interactively at a regular R console with devtools::test().
I know all the r packages and its deps can be copied to system R library path by running the r_library target (i.e. bazel run packages/flowWorkspace:library), but it will miss the packages defined in PKG_CHECK_DEPS.
I can't paste these to deps list of r_pkg rule since that will fail the bazel build due to some circular dependencies from suggests deps of r package.

I wonder if there is a convenient way to copy the packages listed in PKG_CHECK_DEPS (i.e. bazel-out/k8-fastbuild/bin/packages/flowWorkspace/check.runfiles/ I guess) to system R path instead of manually doing it.

tar_dir attribute for r_library

With a default value of '.', this attribute can specify the path in which the packages are installed inside the tarball. Running the rule will also copy the packages from this path into the library directory.

r_pkg use of cc_deps - does it use the includes directive of cc_library?

Question about having an r_pkg depend on a cc_library through cc_deps.
cc_library has a includes argument that tells downstream dependencies how to search for and use its header files (according to my understanding).
But it seems that this is not passed on to R CMD build via r_pkg?
For example I am only seeing the includes related to pkg_deps not cc_deps.
The consequence for my build is that I can't compile against a pure C++ library dependency.
Normally for pure R, I would pass stuff in via configure_args I think, but I don't know the sandbox path ahead of time.

Here's my C++ dependency; it compiles fine:

cc_library(
    name = "cytolib",
    srcs = glob([
        "src/*.cpp"
    ]),
    hdrs = glob([
        "include/cytolib/*.hpp",
        "include/cytolib/*.h",
    ]),
    copts = [
        "-I.",
        "-std=c++17",
        "-fopenmp",
    ],
    linkopts = ["-lhdf5_serial"],
    include_prefix="cytolib",
    includes=["."],
    strip_include_prefix="include/cytolib",
    deps = [
        "@boost_hdrs//:boost",
        "@hdf5_hdrs//:hdf5",
        "@hdf5_lib//:libhdf5",
    ],
)

and here's my R package that depends on it.

r_pkg(
    name = "flowCore",
    srcs = glob(["**"], exclude=["BUILD"]),
    pkg_name = "flowCore",
    deps =  [
    '@R_Biobase//Biobase:Biobase',
    '@R_BiocGenerics//BiocGenerics:BiocGenerics',
    '@R_Rcpp//Rcpp:Rcpp',
    '@R_matrixStats//matrixStats:matrixStats',
    '@R_RcppArmadillo//RcppArmadillo:RcppArmadillo',
    '@R_S4Vectors//S4Vectors:S4Vectors',
    ],
    cc_deps = ["@cytolib_repo//packages/cytolib:cytolib"],
)

The compilation command I see is:

g++ -std=gnu++11 -I'/home/gfinak/.cache/bazel/_bazel_gfinak/838b2a13fcb4260193034d31efdec1fb/sandbox/linux-sandbox/238/execroot/ozette_mono_repo/bazel-out/k8-fastbuild/bin/packages/cytolib/cytolib' -I"/usr/local/lib/R/include" -DNDEBUG -DROUT -pthread -I/usr/local/include -I/usr/local/cytolib/include -I/usr/local/include/tiledb -DBOOST_NO_AUTO_PTR  -I'/home/gfinak/.cache/bazel/_bazel_gfinak/838b2a13fcb4260193034d31efdec1fb/sandbox/linux-sandbox/238/execroot/ozette_mono_repo/bazel-out/k8-fastbuild/bin/external/R_Rcpp/Rcpp/lib/Rcpp/include' -I'/home/gfinak/.cache/bazel/_bazel_gfinak/838b2a13fcb4260193034d31efdec1fb/sandbox/linux-sandbox/238/execroot/ozette_mono_repo/bazel-out/k8-fastbuild/bin/external/R_RcppArmadillo/RcppArmadillo/lib/RcppArmadillo/include' -I/usr/local/include -Wno-builtin-macro-redefined -D__DATE__="redacted" -D__TIMESTAMP__="redacted" -D__TIME__="redacted" -fdebug-prefix-map="/home/gfinak/.cache/bazel/_bazel_gfinak/838b2a13fcb4260193034d31efdec1fb/sandbox/linux-sandbox/238/execroot/ozette_mono_repo/="   -fpic  -g -O2  -c RcppExports.cpp -o RcppExports.o

with the error:

pairVectorRcppWrap.h:10:10: fatal error: cytolib/compensation.hpp: No such file or directory
   10 | #include <cytolib/compensation.hpp>

So it's not finding the header files for cytolib, since they don't seem to be passed on.

And what I would like / expect to see is:

g++ -std=gnu++11 -I'/home/gfinak/.cache/bazel/_bazel_gfinak/838b2a13fcb4260193034d31efdec1fb/sandbox/linux-sandbox/238/execroot/ozette_mono_repo/bazel-out/k8-fastbuild/bin/packages/cytolib/_virtual_includes/cytolib' -I"/usr/local/lib/R/include" -DNDEBUG -DROUT -pthread -I/usr/local/include -I/usr/local/cytolib/include -I/usr/local/include/tiledb -DBOOST_NO_AUTO_PTR  -I'/home/gfinak/.cache/bazel/_bazel_gfinak/838b2a13fcb4260193034d31efdec1fb/sandbox/linux-sandbox/238/execroot/ozette_mono_repo/bazel-out/k8-fastbuild/bin/external/R_Rcpp/Rcpp/lib/Rcpp/include' -I'/home/gfinak/.cache/bazel/_bazel_gfinak/838b2a13fcb4260193034d31efdec1fb/sandbox/linux-sandbox/238/execroot/ozette_mono_repo/bazel-out/k8-fastbuild/bin/external/R_RcppArmadillo/RcppArmadillo/lib/RcppArmadillo/include' -I/usr/local/include -Wno-builtin-macro-redefined -D__DATE__="redacted" -D__TIMESTAMP__="redacted" -D__TIME__="redacted" -fdebug-prefix-map="/home/gfinak/.cache/bazel/_bazel_gfinak/838b2a13fcb4260193034d31efdec1fb/sandbox/linux-sandbox/238/execroot/ozette_mono_repo/="   -fpic  -g -O2  -c RcppExports.cpp -o RcppExports.o

With the difference being in the first -I flag:

-I'/home/gfinak/.cache/bazel/_bazel_gfinak/838b2a13fcb4260193034d31efdec1fb/sandbox/linux-sandbox/238/execroot/ozette_mono_repo/bazel-out/k8-fastbuild/bin/packages/cytolib/_virtual_includes/cytolib'

Would love some insights if anyone has some.

r_repositories_list works on one machine but not on another

I have a WORKSPACE which works on my dev machine but breaks on my team's jenkins box.

My dev laptop is Fedora, and the jenkins box is ubuntu. I don't think that should make a difference but it's the one difference in the environment I can think of.

I had to apt-get install r-base because prior to this there was no R on the jenkins box.

Any help appreciated.

WORKSPACE snippet:

http_archive(
    name = "com_grail_rules_r",
    strip_prefix = "rules_r-master",
    urls = ["https://github.com/grailbio/rules_r/archive/master.tar.gz"],
)

load("@com_grail_rules_r//R:dependencies.bzl", "r_register_toolchains", "r_rules_dependencies")

r_rules_dependencies()

r_register_toolchains()

load("@com_grail_rules_r//R:repositories.bzl", "r_repository_list", "r_repository")

r_repository(
    name = "R_data_table",
    sha256 = "f5b2b7d44ef5d8cb3505b4e6b4c4539e7a2132dffc5516da6f717fa51ebe9d3b",
    strip_prefix = "data.table-c0052964694a4c618ab182aa474f924d40576d94",
    urls = [
        "https://github.com/Rdatatable/data.table/archive/c0052964694a4c618ab182aa474f924d40576d94.tar.gz",
    ],
)

r_repository_list(
    name = "r_repositories_bzl",
    package_list = "//sample_r:packages.csv",
    remote_repos = {
        "CRAB": "https://cloud.r-project.org",
    },
)

load("@r_repositories_bzl//:r_repositories.bzl", "r_repositories")

r_repositories()

load("@com_grail_rules_r//R:dependencies.bzl", "r_coverage_dependencies")

r_coverage_dependencies()

bash log:

$ bazel build //sample_r:sample
INFO: Writing tracer profile to '/home/jenkinsadmin/.cache/bazel/_bazel_jenkinsadmin/b33a4b3c4e6f0c61d58e78df9100fe07/command.profile.gz'
INFO: SHA256 (https://github.com/grailbio/rules_r/archive/master.tar.gz) = bacdb1d81e81db7a95d9e364e9eea73c9cfea43f0901c115bc7b8d3f5eec1ae2
DEBUG: Rule 'com_grail_rules_r' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "bacdb1d81e81db7a95d9e364e9eea73c9cfea43f0901c115bc7b8d3f5eec1ae2"
DEBUG: Call stack for the definition of repository 'com_grail_rules_r' which is a http_archive (rule definition at /home/jenkinsadmin/.cache/bazel/_bazel_jenkinsadmin/b33a4b3c4e6f0c61d58e78df9100fe07/external/bazel_tools/tools/build_defs/repo/http.bzl:292:16):
 - /home/jenkinsadmin/repo/mob/WORKSPACE:391:1
INFO: Call stack for the definition of repository 'r_repositories_bzl' which is a r_repository_list (rule definition at /home/jenkinsadmin/.cache/bazel/_bazel_jenkinsadmin/b33a4b3c4e6f0c61d58e78df9100fe07/external/com_grail_rules_r/R/repositories.bzl:139:21):
 - /home/jenkinsadmin/repo/mob/WORKSPACE:414:1
ERROR: An error occurred during the fetch of repository 'r_repositories_bzl':
   Failed to generate bzl file: 

Error in available.packages(repos = repos, type = type) : 
  unused argument (repos = repos)
Calls: generateWorkspaceMacro ... mergeWithRemote -> as.data.frame -> available.packages
Execution halted
ERROR: no such package '@r_repositories_bzl//': Failed to generate bzl file: 

Error in available.packages(repos = repos, type = type) : 
  unused argument (repos = repos)
Calls: generateWorkspaceMacro ... mergeWithRemote -> as.data.frame -> available.packages
Execution halted
ERROR: no such package '@r_repositories_bzl//': Failed to generate bzl file: 

Error in available.packages(repos = repos, type = type) : 
  unused argument (repos = repos)
Calls: generateWorkspaceMacro ... mergeWithRemote -> as.data.frame -> available.packages
Execution halted
INFO: Elapsed time: 1.323s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)

r_package macro doesn't expose all r_pkg options

Just got bit trying to set stamp & metadata on an r_package, but all it passes through is the srcs & deps.

I'm not sure if there's any particular reason not to just let the r_package take **kwargs & directly pass it along to its underlying r_pkg ?

Though honestly, looking at it, I probably would've written r_package purely in terms of kwargs. Then it's a 100% transparent drop-in replacement, plus you don't need to worry at all about updating the convenience wrapper when the real thing adds new options.

E.g.

def r_package(name, **kwargs):
    r_pkg(
        name = name,
        **kwargs,
    )
    r_library(
        name = "library",
        pkgs = [":%s" % name],
        tags = ["manual"],
    )
