vital-software / monofo-buildkite-plugin

Buildkite dynamic pipeline generator for mono-repos

License: GNU General Public License v3.0

JavaScript 1.67% TypeScript 90.01% Dockerfile 1.08% Shell 7.19% Batchfile 0.02% Ruby 0.04%
monorepo buildkite-plugin buildkite pipeline-generator

monofo-buildkite-plugin's Introduction

monofo

A Buildkite dynamic pipeline generator for monorepos. monofo lets you split your .buildkite/pipeline.yml into multiple components, each of which will only be run if it needs to be (based on what's changed since the last build).

Monofo keeps your pipeline running the same way it always has (e.g. you don't have to split your pipeline and use triggers), while potentially saving heaps of time by only building what you need.

Basic usage

Instead of calling buildkite-agent pipeline upload in the first step of a pipeline, execute monofo pipeline, which outputs the dynamic pipeline on stdout:

npx monofo pipeline | buildkite-agent pipeline upload

To make this easier, Monofo supports configuration as a Buildkite plugin, so an example to generate your pipeline might be:

steps:
  - name: ":pipeline: Generate pipeline"
    command: echo "Monorepo pipeline uploaded"
    plugins:
      - seek-oss/aws-sm#v2.2.1: # for example, but your secret management might be e.g. via S3 bucket or "env" file instead
          env:
            BUILDKITE_API_ACCESS_TOKEN: "global/buildkite-api-access-token"
      - vital-software/monofo#v5.0.12:
          generate: pipeline

Note that Monofo requires the BUILDKITE_API_ACCESS_TOKEN environment variable to be configured so that it can access the Buildkite API. See Configuration for details.

Splitting pipelines using multiple pipeline.yml files

Split your .buildkite/pipeline.yml into whatever components you'd like, and give each a short name, like: pipeline.<component>.yml. Each of these pipeline files can contain its own set of steps and environment variables.

Next, add a monorepo configuration section to the top of each of these component pipelines. Declare any input (expects) and output (produces) artifacts that your pipeline either needs or builds. An example configuration is:

monorepo:
  expects:  node-modules.tar.gz
  produces: app.tar.gz
  matches:
    - serverless.yml
    - app/**.ts

The matches configuration defines a set of (minimatch) glob-style paths to match. If there are any differences on your build that match (when compared to a carefully selected base commit), the component build is fully included in the resulting output pipeline.
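The matching decision can be sketched as follows. Monofo uses the minimatch library; the globToRegExp stand-in below is a simplified, illustrative substitute that only handles `*`, `**`, and literal text:

```typescript
// Decide whether a component pipeline runs, based on the files changed
// since the base commit. globToRegExp is a simplified stand-in for minimatch.
function globToRegExp(glob: string): RegExp {
  const escaped = glob
    .replace(/[.+^${}()|[\]\\]/g, '\\$&') // escape regex specials (not *)
    .replace(/\*\*/g, '\u0000')           // placeholder for globstar
    .replace(/\*/g, '[^/]*')              // * matches within one path segment
    .replace(/\u0000/g, '.*');            // ** matches across segments
  return new RegExp(`^${escaped}$`);
}

function componentShouldRun(matches: string[], changedFiles: string[]): boolean {
  const patterns = matches.map(globToRegExp);
  return changedFiles.some((file) => patterns.some((re) => re.test(file)));
}
```

With the example configuration above, a change to app/handlers/foo.ts would include the component's steps in full, while a change to an unrelated file would not.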

However, if there are no matches for a component, its steps will be replaced by "dummy steps" that will download the artifacts that would have been produced had the component run (these artifacts are downloaded from the base commit's build).

For convenience, if you change pipeline.foo.yml, that change will automatically be considered matching for the foo pipeline, without having to add that pipeline file to the matches array yourself.

Features

Get artifacts for skipped components with expects/produces

A pipeline configuration can define the artifacts that the component build expects in order to run, and those that the build produces if successful. For example:

monorepo:
  expects:  blah.cfg
  produces: output/foo.zip
  [...]

These are used:

  • to put the component pipelines into a dependency order
  • to know what artifacts should be pulled from a previous build when needed (i.e. when a component pipeline can be skipped)
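The ordering use can be sketched like this: a component that expects an artifact must come after the component that produces it. The names and shapes below are illustrative, not monofo's internal API:

```typescript
// Derive a dependency order for components from their expects/produces lists.
interface Component { name: string; expects: string[]; produces: string[] }

function dependencyOrder(components: Component[]): string[] {
  const producerOf = new Map<string, string>();
  for (const c of components) for (const a of c.produces) producerOf.set(a, c.name);

  const byName = new Map(components.map((c) => [c.name, c]));
  const order: string[] = [];
  const visited = new Set<string>();

  const visit = (name: string): void => {
    if (visited.has(name)) return;
    visited.add(name);
    for (const artifact of byName.get(name)!.expects) {
      const producer = producerOf.get(artifact);
      if (producer) visit(producer); // producers must come first
    }
    order.push(name);
  };

  components.forEach((c) => visit(c.name));
  return order;
}
```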

Deflate/inflate artifacts to convenient archive formats

See artifacts for more information

You can use the plugin to upload and download artifacts as compressed tarballs, using fast compression algorithms such as lz4, or even content-addressed caching systems such as desync (a casync implementation).

The following breaks your node_modules/ artifact up into chunks, and caches the chunks locally and on S3:

env:
  MONOFO_DESYNC_STORE: "s3+https://s3.amazonaws.com/some-bucket/desync/store"
  MONOFO_DESYNC_CACHE: "/tmp/monofo/desync-store"

steps:
  - command: yarn install
    plugins:
      - vital-software/monofo#v5.0.12:
          upload:
            node-modules.catar.caibx:
              - "node_modules/"

The resulting node-modules.catar.caibx only contains pointers to the full content chunks, and as a result, is only 200KiB for a 500MB node_modules/ artifact. This means it can upload/download in seconds.

Content-based build skipping (pure)

See content-based build skipping for more information

You can mark a pipeline as pure by setting the monorepo.pure flag to true - this indicates that it doesn't have side-effects other than producing its artifacts, and the only inputs it relies on are listed in its matches.

Doing so enables an extra layer of caching, based on the contents of the input files. For example:

monorepo:
  pure: true
  matches:
    - package.json
    - yarn.lock

In any future build, if package.json and yarn.lock have the same content, this pipeline will be skipped.
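The cache key derivation might look roughly like this: hash the path and contents of every matched file into a single key, and skip the pipeline if a previous successful build recorded the same key. This is a sketch; monofo's actual key format and its DynamoDB storage are not shown. Each entry below is a [path, contents] pair:

```typescript
import { createHash } from 'crypto';

// Derive a content-based cache key from matched files ([path, contents] pairs).
function contentCacheKey(files: string[][]): string {
  const hash = createHash('sha256');
  for (const [path, contents] of [...files].sort()) { // sort for a stable key
    hash.update(path).update(contents);
  }
  return hash.digest('hex');
}
```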

Branch inclusion/exclusion filters

If you require more specificity for what branches do or do not run your pipelines, there is a branch filter that matches the Buildkite step-level branch filtering rules.

monorepo:
  expects:  blah.cfg
  produces: output/foo.zip
  branches: 'main'
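A simplified sketch of how such a filter could be evaluated: patterns are space-separated, `*` is a wildcard, and a leading `!` excludes a branch. The real Buildkite step-level rules are richer than this; the function is illustrative only:

```typescript
// Evaluate a simplified Buildkite-style branch filter against a branch name.
function branchIncluded(filter: string, branch: string): boolean {
  const patterns = filter.split(/\s+/).filter(Boolean);
  const toRegExp = (p: string) =>
    new RegExp(`^${p.replace(/[.+^${}()|[\]\\]/g, '\\$&').replace(/\*/g, '.*')}$`);

  const negated = patterns.filter((p) => p.startsWith('!'));
  const positive = patterns.filter((p) => !p.startsWith('!'));

  if (negated.some((p) => toRegExp(p.slice(1)).test(branch))) return false;
  if (positive.length === 0) return true; // only exclusions were given
  return positive.some((p) => toRegExp(p).test(branch));
}
```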

Controlling what is included

These rules are applied in the order listed here.

PIPELINE_RUN_ALL

If you set the environment variable PIPELINE_RUN_ALL=1, all parts of the pipeline will be output; this is a good way to "force a full build", or disable monofo temporarily.

PIPELINE_RUN_ONLY

If you set PIPELINE_RUN_ONLY=component-name, that component will be included, and others excluded, regardless of matches. Pipeline-level depends_on will still be respected.

PIPELINE_RUN_*, PIPELINE_NO_RUN_*

If you set PIPELINE_RUN_<COMPONENT_NAME>=1, that component will be included, even if it wouldn't ordinarily. And if you set PIPELINE_NO_RUN_<COMPONENT_NAME> that component will never be included, even if it does have matches.
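The three rules above, applied in the order listed, can be sketched as a single decision function. The env-var names come from the docs; the function itself (and the hasMatches flag) is illustrative:

```typescript
// Decide whether a component is included, given its name, whether its
// `matches` fired, and the environment. Rules apply in the documented order.
function included(component: string, hasMatches: boolean, env: Record<string, string>): boolean {
  const key = component.toUpperCase().replace(/-/g, '_');
  if (env.PIPELINE_RUN_ALL === '1') return true;
  if (env.PIPELINE_RUN_ONLY) return env.PIPELINE_RUN_ONLY === component;
  if (env[`PIPELINE_RUN_${key}`] === '1') return true;
  if (env[`PIPELINE_NO_RUN_${key}`]) return false;
  return hasMatches;
}
```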

Configuration

The main required piece of configuration is the BUILDKITE_API_ACCESS_TOKEN environment variable.

Buildkite API access token

When calculating the commit to diff against, monofo uses the Buildkite API to look up the last successful build of the current branch. To do so, monofo needs a Buildkite API access token set as the environment variable BUILDKITE_API_ACCESS_TOKEN. You'd probably set this in your Buildkite build secrets.

The token only needs the read_builds scope. We need an API token, not an agent token.

DynamoDB setup

DynamoDB setup is only required if you intend to use pure mode.

Development

  • yarn commit - Start a commit with formatting
  • yarn test - Runs the tests
  • yarn build - Compiles TypeScript

Command Topics

monofo-buildkite-plugin's People

Contributors

dependabot-preview[bot], dependabot[bot], renovate-bot, semantic-release-bot, teriu, tommoyang, velitheda, wjoneil

Forkers

getditto

monofo-buildkite-plugin's Issues

Git LFS files are always considered changed

When using this plugin in our mono repo with git lfs enabled, it always considers the lfs files to have changed.

The agent which performs the upload doesn't actually perform the git lfs checkout.

When run locally on a git lfs checkout it appears to work correctly (with the lfs checked out or not), so I can only assume it is something about the way buildkite is performing the checkout.

The automated release is failing 🚨

🚨 The automated release from the main branch failed. 🚨

I recommend you give this issue a high priority, so other packages depending on you can benefit from your bug fixes and new features again.

You can find below the list of errors reported by semantic-release. Each one of them has to be resolved in order to automatically publish your package. I’m sure you can fix this 💪.

Errors are usually caused by a misconfiguration or an authentication problem. With each error reported below you will find explanation and guidance to help you to resolve it.

Once all the errors are resolved, semantic-release will release your package the next time you push a commit to the main branch. You can also manually restart the failed CI job that runs semantic-release.

If you are not sure how to resolve this, here are some links that can help you:

If those don’t help, or if this issue is reporting something you think isn’t right, you can always ask the humans behind semantic-release.


Invalid npm token.

The npm token configured in the NPM_TOKEN environment variable must be a valid token allowing to publish to the registry https://registry.npmjs.org/.

If you are using Two Factor Authentication for your account, set its level to "Authorization only" in your account settings. semantic-release cannot publish with the default "Authorization and writes" level.

Please make sure to set the NPM_TOKEN environment variable in your CI with the exact value of the npm token.


Good luck with your project ✨

Your semantic-release bot 📦🚀

Multi-base-build artifacts

The inject-artifacts step can become quite unwieldy with large, or large numbers of, artifacts. It's necessary for a few reasons:

  1. monofo has a single base build that it pulls artifacts from
  2. monofo doesn't attempt to look at what artifacts exist on a build at all
    • it assumes the base build contains all the artifacts
  3. monofo doesn't interfere with how artifacts are downloaded in build steps
    • what works outside of monofo works exactly the same way inside monofo

So, for these reasons, monofo tries to ensure that every build contains every artifact (that's used by another pipeline), so that if a pipeline is skipped, that artifact can be injected and the cached version used.

But inject-artifacts is not a good mechanism. Ideally, it should be unnecessary, if we can work around the above reasons. This requires a few things:

First, break 3

The hardest bit:

  • Make monofo aware of how steps download artifacts (i.e. the artifacts plugin)
    • Make it inject into those steps environment variables like BUILDKITE_PLUGIN_ARTIFACTS_BUILD that tell it to use a specific build ID
    • Monofo can now make a step use an artifact from another build
  • Complication: some plugins we probably won't be able to identify/adjust (e.g. the JUnit plugin does an internal download, or e.g. a custom pre-command hook that does downloading of artifacts in parallel)
    • Solution: provide enough context in env vars for a custom pre-command hook to work out where to get things from (MONOFO_<COMPONENT>_BASE_BUILD_ID?)
    • Solution: also allow opt-out for specified artifacts (e.g. junit.xml if needed), which goes back to using inject-artifacts for them

Then, break 2

  • Make monofo put "skip flags" into Build Metadata when a sub-pipeline that produces an artifact is skipped
    • This metadata is keyed by the artifact name, and the value is the current base build
    • This is a build saying "I didn't build this, but I got it from buildId=X which did"
  • Now monofo can look at the single base build, and identify that it should instead pull artifacts from some other base build for some particular artifact

Then, break 1

  • Make monofo do the lookup of metadata on the base build, then inject the required build IDs into steps that download or depend on that artifact

🎉 That should be it: no more inject-artifacts step, except for things that really need to be in the Buildkite artifacts system for the current build

Action Required: Fix Renovate Configuration

There is an error with this repository's Renovate configuration that needs to be fixed. As a precaution, Renovate will stop PRs until it is resolved.

Error type: undefined. Note: this is a nested preset so please contact the preset author if you are unable to fix it yourself.

`MONOFO_ARTIFACT_<NAME>_SKIP` is unusable for skipping only incremental builds

Goal for original _SKIP

Say you have an artifact named example.catar.caibx and you're trying to do the classic "incremental build" pattern:

monorepo:
  pure: true
  expects: dependency.catar.caibx
  produces: example.catar.caibx
  matches:
    - "some-path/**"

steps:
  - name: "Compile example archive"
    depends_on:
      - dependency
    commands:
      - do-the-compile
      - produce-example-file-list-to-cache  # produces example.list
    env:
      MONOFO_ARTIFACT_EXAMPLE_BUILD_ID: ${MONOFO_ARTIFACT_EXAMPLE_BUILD_ID:-$MONOFO_BASE_BUILD_ID}  # provides the place to get the base artifact for the incremental build from
      MONOFO_ARTIFACT_EXAMPLE_SOFT_FAIL: 1                                                             # says: don't worry if I can't get a base artifact for the incremental build
    plugins:
      - docker-compose#v3.9.0:
          run: node
      - vital-software/monofo#v4.0.0:
          download:
            - dependency.catar.caibx
            - example.catar.caibx   # pre-cache incremental build
          upload:
            example.catar.caibx:
              filesFrom: example.list
              null: true

The plan was to have MONOFO_ARTIFACT_EXAMPLE_SKIP=1 cause just the incremental part of this step to be skipped, so that the example artifact is produced from a "clean" checkout, rather than checkout + previous result.

Why the current design isn't enough

  • Practically, you want an environment variable you can set globally, on all steps in the build
  • But if you set MONOFO_ARTIFACT_EXAMPLE_SKIP=1, any steps in other partial pipelines that want to actually depend on the latest copy of the artifact will also be skipped

Suggested fix

For the incremental use-case, we can detect that the same artifact is being uploaded later in the current step, and then skip the pre-cache download iff we are uploading it.

We should use an additional environment variable for this new behavior, to retain backwards compatibility: suggest MONOFO_ARTIFACT_EXAMPLE_SKIP_PRECACHE=1

Support overriding the default branch

As a simple way to support alternate/multiple default branches, add support for a MONOFO_DEFAULT_BRANCH environment variable that can be used with the same meaning as (but a higher priority than) BUILDKITE_PIPELINE_DEFAULT_BRANCH
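The proposed precedence is a one-liner; MONOFO_DEFAULT_BRANCH is the proposal here, not (yet) an implemented variable, so this sketch just illustrates the intended lookup order:

```typescript
// MONOFO_DEFAULT_BRANCH (proposed) wins over Buildkite's own default-branch var.
function defaultBranch(env: Record<string, string | undefined>): string | undefined {
  return env.MONOFO_DEFAULT_BRANCH ?? env.BUILDKITE_PIPELINE_DEFAULT_BRANCH;
}
```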

Support exclusion from PIPELINE_RUN_ALL

Some sub-pipelines might only be triggered in certain circumstances, e.g. when PIPELINE_RUN_<COMPONENT_NAME> or PIPELINE_RUN_ONLY are given. One use case I've seen for this is 'task' pipelines, run by schedules, where the task wants to reuse some of the monofo build, but doesn't ever want to be pulled in by PIPELINE_RUN_ALL

This might be task: true or runs_on: { all: false } or similar:

monorepo:
  name: some-task
  task: true
  expects:
    - node-modules.tar.lz4
    - typescript.tar.lz4

Replacement of skipped steps should transfer dependencies to dependents

Example setup:

# pipeline.a.yml
steps:
  - key: a
    command: # does something with side effects

# pipeline.b.yml
monorepo:
  produces: b.tar.gz
steps:
  - key: b
    depends_on: a
    command: # creates b.tar.gz

# pipeline.c.yml
monorepo:
  expects: b.tar.gz
steps:
  - key: c
    depends_on: b
    command: # uses b.tar.gz, and relies on side-effects of `a` happening first

If b has no matches and is skipped, currently we:

  • Create the artifact step
  • Add a dependency to any steps that depended on the skipped step (in this case c), for the artifact step (so that b.tar.gz is guaranteed to exist before step c runs).

This logic is fine. However, depending on the nature of the different steps, some builds may have ordering concerns between a and c. Such builds might be relying on the transitive nature of depends_on (as they should be able to) to keep their pipeline simple. In our specific example, a = node_modules, b = typescript, and c = some tests.

Just running c after the replacement of b isn't enough. Instead, it has to run after all of b's own depends_on entries too, if they're still in the build.

Result: if b is skipped, c's depends_on should be [artifact step + (depends on of b) + (depends on of c)].filter(still in build), for a result of: [artifact step, a]
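The rule above can be sketched as a small function; the parameter names are illustrative:

```typescript
// When a skipped step is replaced by an artifact step, a dependent's new
// depends_on is: the artifact step + the skipped step's own deps + the
// dependent's remaining deps, filtered to steps still in the build.
function transferredDependsOn(
  skippedDeps: string[],   // depends_on of the skipped step (b)
  dependentDeps: string[], // depends_on of the dependent (c), minus b itself
  stillInBuild: Set<string>,
  artifactStep: string
): string[] {
  const merged = new Set([artifactStep, ...skippedDeps, ...dependentDeps]);
  return [...merged].filter((k) => k === artifactStep || stillInBuild.has(k));
}
```

For the example pipelines, this yields [artifact step, a] for step c, preserving the transitive ordering through the skipped b.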

Overeager replacement of dependency in pipeline with no `expects`

I have a pipeline that's roughly like this:

monorepo:
  pure: true
  matches:
    - "**/package.json"
    - yarn.lock
    - scripts/yarn-install
  produces: node-modules.tar.lz4
    
steps:
  - name: ":nodejs: Install"
    key: node-modules
    depends_on: node-image
    command: scripts/yarn-install

Even though this pipeline doesn't declare any expects, when we skip the pipeline containing node-image, we also make this step wait for the inject-artifacts step. That must be unnecessary: this pipeline declares no expects, so it can't need to wait for artifacts to be pulled from previous builds (and if it does, it should declare that fact).

Removing this unnecessary depends_on replacement would allow this step to run immediately, rather than waiting for inject-artifacts.

Bad ordering prevents coherent fallback on an integration branch

When using MONOFO_INTEGRATION_BRANCH, if the diff strategy fails, the fallback is used. This implies if you want to e.g. have an integration branch called 'dev', you still need to set MONOFO_DEFAULT_BRANCH=dev. But the ordering here prevents the integration diff from ever triggering in that case:

https://github.com/vital-software/monofo/blob/061f22fb7ec910e7ac909207690c22aaea0e603b/src/diff.ts#L151-L157

I think the solution is to reverse this ordering, so that we can set both MONOFO_INTEGRATION_BRANCH and MONOFO_DEFAULT_BRANCH, and have the integration diff happen first (and the default branch diff only happen if that fails)

Sort order of file arguments to tar needs to be dfs/prefix-based, not name-based

The order should be, e.g.:

./node_modules/
./data-platform/node_modules
./data-platform/checks/node_modules

but it ends up being purely name-based, so:

./data-platform/checks/node_modules
./data-platform/node_modules
./node_modules

which is the exact opposite of what we need!

Maybe GNU tar's --sort=inode is doing this internally, but we need a way to do it for our file list
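One way to produce the desired ordering is to sort shallower paths first, so a parent directory's entry always precedes its descendants, falling back to name order at equal depth. A sketch, not monofo's code:

```typescript
// Order paths for tar: parents before children (shallower first), then by name.
function sortForTar(paths: string[]): string[] {
  const depth = (p: string) => p.split('/').filter(Boolean).length;
  return [...paths].sort((a, b) => depth(a) - depth(b) || a.localeCompare(b));
}
```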

Incomplete support for integration branches that are reset to the default branch

Summary: the integration branch feature MONOFO_DEFAULT_BRANCH, as implied by the name, overwrites the default branch (i.e. in the environment) so that we can treat dev as an integration branch. However, the diffing logic is suboptimal in cases where an integration branch is reset to the default branch regularly, because monofo does not assume any relationship between an integration branch and the default branch.

Problem

  • To begin with, dev and default branch both building fine, small incremental builds
  • We automatically merge main into dev every night, and we reset dev to the default branch every week (a force push, that rewrites history)
  • Engineers merge work to dev too, so it ends up being a long-running integration that lasts for a week, then gets reset
  • The problem happens on Monday, after a reset: suddenly, when the diffing logic looks at the past builds of dev on Buildkite, none of them match up to the ancestors of the current commit (because they all, as far back as the start of the week, happened on a parallel branch of development that is now thrown away by the hard reset)

Ideas

For simplicity, we want to introduce the concept of an integration branch as distinct from the default branch, deprecate MONOFO_DEFAULT_BRANCH, add MONOFO_INTEGRATION_BRANCH, and provide better semantics. Those semantics are:

  • We still use the integration branch when looking up base build commits in Git, and we prefer builds that happened on the branch when matching builds on the Buildkite API, but we also consider default-branch commits that are ancestors of the integration branch.

  • We can use the Buildkite API's multiple-branch filtering to do so, and retain ordering. Example: ?branch[]=master&branch[]=testing returns all builds on the master and testing branches.

Meta: self-host monofo to reduce build time

  • #272 Add support for excluding directories from the pipeline search, or giving a base directory
  • Split the pipeline files into: node-modules, typescript, test, release
  • Replace docker containers with volumes

Support group steps

https://buildkite.com/docs/pipelines/group-step

Group steps should be supported

Initial support only requires that we can analyze depends_on, which might now be within the group's steps themselves; the ones on the group itself should be OK (or maybe we need to merge them down onto the steps). This is bug-level, because we should support all Buildkite step types.

Later support for grouping (they have to be grouped together in the upload dynamically, not the same group given multiple times) would be a nice-to-have

Completely empty YAML file results in error about destructuring null

If a pipeline.yml file contains no contents at all (in this case, it only contained a lengthy comment), then the following error is given and monofo crashes!

TypeError: Cannot destructure property 'monorepo' of 'js_yaml_1.load(...)' as it is null.
    at Function.read (/var/lib/buildkite-agent/.npm/_npx/7612/lib/node_modules/monofo/build/src/config.js:167:17)
    at async Promise.all (index 4)
    at async Function.readAll (/var/lib/buildkite-agent/.npm/_npx/7612/lib/node_modules/monofo/build/src/config.js:229:17)
    at async Function.getAll (/var/lib/buildkite-agent/.npm/_npx/7612/lib/node_modules/monofo/build/src/config.js:237:25)
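A guard along these lines would avoid the crash, since js-yaml's load returns undefined (or null) for an empty or comment-only document. The readConfig helper below is illustrative, not monofo's actual config.ts; `parsed` stands in for the result of js_yaml.load(contents):

```typescript
// Tolerate empty/comment-only pipeline files: destructure against a default
// object instead of the null that js-yaml returns for an empty document.
interface PipelineFile { monorepo?: unknown; steps?: unknown[] }

function readConfig(parsed: unknown): PipelineFile {
  const { monorepo, steps } = (parsed ?? {}) as PipelineFile;
  return { monorepo, steps };
}
```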

Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.

Warning

These dependencies are deprecated:

Datasource | Name                                    | Replacement PR?
npm        | @google/semantic-release-replace-plugin | Unavailable
npm        | @types/log-update                       | Unavailable
npm        | @types/mkdirp                           | Unavailable
npm        | @types/rimraf                           | Unavailable
npm        | @types/tempy                            | Unavailable

Rate-Limited

These updates are currently rate-limited. Click on a checkbox below to force their creation now.

  • chore(deps): update buildkite plugin seek-oss/aws-sm to v2.3.2
  • chore(deps): update buildkite plugin vital-software/monofo to v5.0.12
  • chore(deps): update dependency compare-versions to v4.1.4
  • chore(deps): update buildkite plugin artifacts to v1.9.4
  • chore(deps): update buildkite plugin docker-compose to v3.13.0
  • chore(deps): update buildkite plugin docker-compose to v5
  • chore(deps): update dependency chalk to v5
  • chore(deps): update dependency compare-versions to v6
  • chore(deps): update dependency execa to v9
  • chore(deps): update dependency glob to v11
  • chore(deps): update dependency log-update to v6
  • chore(deps): update dependency mkdirp to v3
  • chore(deps): update dependency pretty-bytes to v6
  • chore(deps): update dependency rimraf to v6
  • chore(deps): update dependency split2 to v4.2.0 (split2, @types/split2)
  • chore(deps): update dependency tiny-async-pool to v2 (tiny-async-pool, @types/tiny-async-pool)
  • chore(deps): update oclif (major) (@oclif/core, @oclif/plugin-autocomplete, @oclif/plugin-commands, @oclif/plugin-help, @oclif/plugin-not-found, @oclif/plugin-version, oclif)
  • chore(deps): update dependency @semantic-release/changelog to v6.0.3
  • chore(deps): update dependency @tsconfig/node14 to v1.0.3
  • chore(deps): update dependency @types/bluebird to v3.5.42
  • chore(deps): update dependency @types/command-exists to v1.2.3
  • chore(deps): update dependency @types/debug to v4.1.12
  • chore(deps): update dependency @types/js-yaml to v4.0.9
  • chore(deps): update dependency @types/tiny-async-pool to v1.0.5
  • chore(deps): update dependency @types/toposort to v2.0.7
  • chore(deps): update js test packages (@types/jest, jest, jest-dynalite, nock, ts-jest)
  • chore(deps): update dependency @google/semantic-release-replace-plugin to v1.2.7
  • chore(deps): update dependency @types/lodash to v4.17.7
  • chore(deps): update dependency pkg to v5.8.1
  • chore(deps): update dependency typescript to v4.9.5
  • chore(deps): update linters (@typescript-eslint/eslint-plugin, @typescript-eslint/parser, eslint, eslint-config-airbnb-typescript, eslint-config-prettier, eslint-plugin-import, eslint-plugin-jest, eslint-plugin-prettier, prettier)
  • chore(deps): update commitlint monorepo to v19 (major) (@commitlint/cli, @commitlint/config-conventional)
  • chore(deps): update dependency @tsconfig/node14 to v14
  • chore(deps): update dependency @types/minimatch to v5
  • chore(deps): update dependency @types/mkdirp to v2
  • chore(deps): update dependency @types/rimraf to v4
  • chore(deps): update dependency husky to v9
  • chore(deps): update dependency typescript to v5
  • chore(deps): update jest monorepo to v29 (major) (@types/jest, jest, ts-jest)
  • chore(deps): update linters (major) (@typescript-eslint/eslint-plugin, @typescript-eslint/parser, eslint, eslint-config-airbnb-typescript, eslint-config-prettier, eslint-plugin-jest, eslint-plugin-prettier, prettier)
  • πŸ” Create all rate-limited PRs at once πŸ”

Edited/Blocked

These updates have been manually edited so Renovate will no longer make changes. To discard all commits and start over, click on a checkbox.

Open

These updates have all been created already. Click a checkbox below to force a retry/rebase of any.

Detected dependencies

buildkite
.buildkite/buildkite/pipeline.yml
  • docker-compose v3.9.0
  • improbable-eng/metahook v0.4.1
  • artifacts v1.5.0
  • improbable-eng/metahook v0.4.1
  • artifacts v1.5.0
  • docker-compose v3.9.0
  • docker-compose v3.9.0
  • improbable-eng/metahook v0.4.1
  • seek-oss/aws-sm v2.3.1
.buildkite/monofo/pipeline.node-modules.yml
  • docker-compose v3.9.0
  • vital-software/monofo v5.0.11
.buildkite/monofo/pipeline.plugin-lint.yml
  • docker-compose v3.9.0
.buildkite/monofo/pipeline.plugin-test.yml
  • docker-compose v3.9.0
.buildkite/monofo/pipeline.release.yml
  • improbable-eng/metahook v0.4.1
  • vital-software/monofo v5.0.11
  • docker-compose v3.9.0
  • seek-oss/aws-sm v2.3.1
.buildkite/monofo/pipeline.test.yml
  • docker-compose v3.9.0
  • vital-software/monofo v5.0.11
  • vital-software/monofo v5.0.11
.buildkite/monofo/pipeline.typescript.yml
  • docker-compose v3.9.0
  • vital-software/monofo v5.0.11
test/projects/kitchen-sink/.buildkite/pipeline.bar.yml
  • artifacts v1.3.0
  • artifacts v1.3.0
test/projects/kitchen-sink/.buildkite/pipeline.baz.yml
  • artifacts v1.3.0
test/projects/kitchen-sink/.buildkite/pipeline.foo.yml
  • artifacts v1.3.0
test/projects/kitchen-sink/.buildkite/pipeline.qux.yml
  • artifacts v1.3.0
test/projects/skipped/.buildkite/pipeline.bar.yml
  • artifacts v1.3.0
  • artifacts v1.3.0
test/projects/skipped/.buildkite/pipeline.foo.yml
  • artifacts v1.3.0
docker-compose
docker-compose.buildkite.yml
docker-compose.local.yml
docker-compose.yml
dockerfile
Dockerfile
npm
package.json
  • @aws-sdk/client-dynamodb 3.80.0
  • @aws-sdk/credential-provider-node 3.80.0
  • @aws-sdk/lib-dynamodb 3.80.0
  • @aws-sdk/types 3.78.0
  • @oclif/core 1.7.0
  • @oclif/plugin-autocomplete 1.2.0
  • @oclif/plugin-help 5.1.12
  • @oclif/plugin-not-found 2.3.1
  • @oclif/plugin-commands 2.1.0
  • @oclif/plugin-version 1.0.4
  • bluebird 3.7.2
  • chalk 4.1.2
  • command-exists 1.2.9
  • compare-versions 4.1.3
  • debug 4.3.4
  • execa 5.1.1
  • glob 8.0.1
  • got 11.8.3
  • js-yaml 4.1.0
  • lodash 4.17.21
  • log-update 4.0.0
  • minimatch 5.0.1
  • mkdirp 1.0.4
  • pretty-bytes 5.6.0
  • rimraf 3.0.2
  • split2 4.1.0
  • tempy 1.0.1
  • tiny-async-pool 1.3.0
  • toposort 2.0.2
  • @commitlint/cli 16.2.4
  • @commitlint/config-conventional 16.2.4
  • @google/semantic-release-replace-plugin 1.1.0
  • @semantic-release/changelog 6.0.1
  • @semantic-release/git 10.0.1
  • @tsconfig/node14 1.0.1
  • @types/bluebird 3.5.36
  • @types/command-exists 1.2.0
  • @types/debug 4.1.7
  • @types/glob 7.2.0
  • @types/jest 27.5.0
  • @types/js-yaml 4.0.5
  • @types/lodash 4.14.182
  • @types/log-update 3.1.0
  • @types/minimatch 3.0.5
  • @types/mkdirp 1.0.2
  • @types/rimraf 3.0.2
  • @types/split2 3.2.1
  • @types/tempy 0.3.0
  • @types/tiny-async-pool 1.0.1
  • @types/toposort 2.0.3
  • @typescript-eslint/eslint-plugin 5.22.0
  • @typescript-eslint/parser 5.22.0
  • cz-conventional-changelog 3.3.0
  • eslint 8.14.0
  • eslint-config-airbnb-typescript 17.0.0
  • eslint-config-prettier 8.5.0
  • eslint-plugin-import 2.26.0
  • eslint-plugin-jest 26.1.5
  • eslint-plugin-prettier 4.0.0
  • husky 7.0.4
  • jest 28.0.3
  • jest-dynalite 3.5.1
  • nock 13.2.4
  • npm-run-all 4.1.5
  • oclif 3.0.1
  • pkg 5.6.0
  • prettier 2.6.2
  • semantic-release 19.0.2
  • stdout-stderr 0.1.13
  • ts-jest 28.0.1
  • typescript 4.6.4
  • node >=14.16
nvm
.nvmrc

  • Check this box to trigger a request for Renovate to run again on this repository

Provide local artifact caching to prevent time spent downloading artifacts repeatedly

Purpose

Background context: In repositories where I'm using Monofo, I often have to pull in a "decompress artifact" and "compress artifact" helper, for dealing with large directories with many files (e.g. for caching node_modules between builds). The artifacts themselves are .tar.lz4 archives, for example.

Monofo itself also uses buildkite-agent artifact download and buildkite-agent artifact upload in the inject-artifacts step. If the artifact download there had a caching layer wrapping it, it wouldn't have to do as much downloading work.

Original Draft Issue

$ buildkite-agent artifact search 'output/coverage-data-platform/cobertura.part.3.xml' . --build 'd31b7e49-95cf-4cea-b091-9756a59e3605'
2021-06-03 14:19:22 INFO   Searching for artifacts: "output/coverage-data-platform/cobertura.part.3.xml"
a408d0f8-807d-467d-96bc-5cdef1238c0b output/coverage-data-platform/cobertura.part.3.xml 2021-06-03T00:14:23Z
$ buildkite-agent artifact shasum 'output/coverage-data-platform/cobertura.part.3.xml' . --build 'd31b7e49-95cf-4cea-b091-9756a59e3605'
2021-06-03 14:21:44 INFO   Searching for artifacts: "output/coverage-data-platform/cobertura.part.3.xml"
f62be7358e7ec7016cd7b53ffee4afac3c4942db

This is more important for big artifacts, obviously. Instead of downloading, get the above info and check a cache dir, using the artifact ID as the file key. Double-check integrity by sha1sum, and fall back to re-downloading if needed. Make sure moves into the file cache are atomic, to prevent most integrity errors.

The automated release is failing 🚨

🚨 The automated release from the main branch failed. 🚨

I recommend you give this issue a high priority, so other packages depending on you could benefit from your bug fixes and new features.

You can find below the list of errors reported by semantic-release. Each one of them has to be resolved in order to automatically publish your package. I’m sure you can resolve this 💪.

Errors are usually caused by a misconfiguration or an authentication problem. With each error reported below you will find explanation and guidance to help you to resolve it.

Once all the errors are resolved, semantic-release will release your package the next time you push a commit to the main branch. You can also manually restart the failed CI job that runs semantic-release.

If you are not sure how to resolve this, here are some links that can help you:

If those don’t help, or if this issue is reporting something you think isn’t right, you can always ask the humans behind semantic-release.


No npm token specified.

An npm token must be created and set in the NPM_TOKEN environment variable on your CI environment.

Please make sure to create an npm token and to set it in the NPM_TOKEN environment variable on your CI environment. The token must allow to publish to the registry https://registry.npmjs.org/.


Good luck with your project ✨

Your semantic-release bot 📦🚀

Improve desync local cache handling

  1. We should store extracted .catar files and their associated indexes in a seed dir, and use them when extracting the .caidx into a .catar
    • This will mean splitting the untar operation and using lower-level commands
  2. We should chop a file into the local cache after storing it in S3
    • This is a small optimization for the particular build agent that produced the artifact
    • This will mean splitting the tar operation and using lower-level commands
    • We can use the seed dir from the first part as an implicit guarantee of what is in the store already, and use that to further improve the performance of the upload

Make sure the npx monofo@latest download command can also be used to pre-cache artifacts from other pipelines; that would mean we can allow users to precache e.g. node-modules.caidx in their agent bootstrap
