
microsoft / azure-pipelines-artifact-caching-tasks


Azure Pipelines Tasks to cache intermediate artifacts from build (e.g. resolved node packages)

License: MIT License

TypeScript 57.08% JavaScript 42.92%

azure-pipelines-artifact-caching-tasks's Introduction

Attention

The tasks contained within this repository are not maintained at this time. Instead, please use the built-in Pipeline Caching tasks for caching intermediate artifacts in Azure Pipelines.

Azure Pipelines Artifact Caching Tasks


This repo contains the tasks that enable the caching of intermediate artifacts from an Azure Pipelines build using Universal Artifacts.

How to use

This build task provides an easy way to cache intermediate build artifacts. To demonstrate, let's examine the following build definition snippet:

- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
  inputs:
    keyfile: "**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
    targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
    vstsFeed: "$(ArtifactFeed)"

- script: |
    yarn install
  displayName: Install Dependencies

- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
  inputs:
    keyfile: "**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
    targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
    vstsFeed: "$(ArtifactFeed)"

Conceptually, this snippet creates a lookup key from the keyfile argument and checks the vstsFeed for a matching entry. If one exists, it is downloaded and unpacked. After node_modules is restored via yarn, the SaveCache task runs and creates a cache entry if one wasn't available previously (if a cache entry was downloaded, this is a no-op).

Inputs:

  • keyfile: The file or pattern of files used to create the cache's lookup key. Because packages under node_modules can carry their own yarn.lock files, this snippet explicitly excludes that pattern to ensure the lookup key is consistent before and after package restoration.
  • targetfolder: The file/folder or pattern of files/folders that you want to cache. The matching files/folders will be represented as the universal package that is uploaded to your Azure DevOps artifact feed.
  • vstsFeed: The GUID of the Azure DevOps artifact feed used to store the build's caches.
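For illustration only, the lookup key can be thought of as a hash over the contents of the matched key files, prefixed with the agent platform (the task's debug logs elsewhere in this page show keys such as win32-<sha256>). The TypeScript sketch below approximates that idea; it is not the task's actual implementation, and the file ordering and prefixing shown are assumptions:

import { createHash } from "crypto";
import { readFileSync } from "fs";
import * as os from "os";

// Minimal sketch: derive a cache lookup key from a set of matched key files.
function computeLookupKey(keyFiles: string[], platformIndependent = false): string {
  const hash = createHash("sha256");
  // Hash the contents of each key file in a stable order.
  for (const file of [...keyFiles].sort()) {
    hash.update(readFileSync(file));
  }
  const digest = hash.digest("hex");
  // Platform-dependent caches prefix the key with the OS, e.g. "win32-" or "linux-";
  // see the platformIndependent input described later.
  return platformIndependent ? digest : `${os.platform()}-${digest}`;
}

// Example: a single yarn.lock at the repository root.
console.log(computeLookupKey(["yarn.lock"]));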

If you do not want to add two build steps to your build definition, you can also use a single task that implicitly adds the SaveCache task at the end of the build. For example:

- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreAndSaveCacheV1.RestoreAndSaveCache@1
  inputs:
    keyfile: "**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
    targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
    vstsFeed: "$(ArtifactFeed)"

- script: |
    yarn install
  displayName: Install Dependencies

Optimistic cache restoration

If a cache was restored successfully, the build variable CacheRestored is set to true. This can provide a further performance boost by optionally skipping package install commands entirely.

In the following example, the yarn install step will only run if there was not a cache hit. Although this can provide faster builds, it may not be suitable for production builds.

- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreAndSaveCacheV1.RestoreAndSaveCache@1
  inputs:
    keyfile: "**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
    targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
    vstsFeed: "$(ArtifactFeed)"

- script: |
    yarn install
  displayName: Install Dependencies
  condition: ne(variables['CacheRestored'], 'true')

Cache aliases

The name of the variable used for optimistic cache restoration defaults to CacheRestored. However, this can be problematic when restoring multiple caches in the same build (e.g. caches for build output and caches for packages). To work around this, you may set the optional alias input to control the naming of the CacheRestored variable.

For example:

- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreAndSaveCacheV1.RestoreAndSaveCache@1
  inputs:
    keyfile: "yarn.lock"
    targetfolder: "node_modules"
    vstsFeed: "$(ArtifactFeed)"
    alias: "Packages"

- script: |
    yarn install
  displayName: Install Dependencies
  condition: ne(variables['CacheRestored-Packages'], 'true')

Platform independent caches

By default, cached archives are platform-dependent to support the small differences that may occur in packages produced for a specific platform. If you are certain that the cached archive is platform-independent, you can set the platformIndependent input to true and all platforms will restore the same archive.

For example:

- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
  inputs:
    keyfile: keyfile
    targetfolder: bin
    vstsFeed: $(ArtifactFeed)
    platformIndependent: true

Onboarding

  1. Install the extension from the marketplace into your Azure DevOps organization.
  2. Ensure Azure Artifacts is enabled for your organization.
  3. Create a new Azure Artifacts feed to store caches in. The feed's GUID is what you reference in your build definition; in the examples above, ArtifactFeed is a build variable set to the GUID of the Azure Artifacts feed.

Note: The GUID for your Azure Artifacts feed can be found either by using the Azure DevOps REST APIs or by creating a build task in the traditional visual designer that references the feed and then selecting "View YAML".
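As a hedged illustration of the REST approach, the Feed Management API can list an organization's feeds together with their GUIDs. The sketch below assumes Node 18+ (for the global fetch), a personal access token with Packaging (read) scope in an AZDO_PAT environment variable, and a placeholder organization name; the api-version shown may need adjusting:

// Minimal sketch: list feed names and GUIDs via the Azure DevOps Feed Management REST API.
const organization = "my-org"; // placeholder organization name
const pat = process.env.AZDO_PAT ?? "";

async function listFeeds(): Promise<void> {
  const url = `https://feeds.dev.azure.com/${organization}/_apis/packaging/feeds?api-version=6.0-preview.1`;
  const response = await fetch(url, {
    // PATs are sent as the password portion of basic auth with an empty user name.
    headers: { Authorization: `Basic ${Buffer.from(`:${pat}`).toString("base64")}` },
  });
  if (!response.ok) {
    throw new Error(`Feed list request failed with status ${response.status}`);
  }
  const body = (await response.json()) as { value: Array<{ id: string; name: string }> };
  for (const feed of body.value) {
    console.log(`${feed.name}: ${feed.id}`); // feed.id is the GUID to use for vstsFeed
  }
}

listFeeds().catch((err) => console.error(err));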

Known limitations

The task is designed to only cache artifacts that are produced within the build's root directory. This works best for packages that follow this convention (e.g. NPM and NuGet), but not for artifacts that are produced outside of the repo's directory (e.g. Maven).

The task skips restoring and saving caches on forked repositories by design. This is a security measure to protect cached artifacts from forked sources and a limitation from the Azure Artifacts permissions model (users of forked repositories don't have access to download these artifacts).

The task can only be used with Azure DevOps Services because Azure DevOps Server does not support Universal Packages.

How to build

Prerequisites: Node and NPM

Windows and macOS: Download and install node from nodejs.org

Linux: Install using package manager

From a terminal ensure at least node 4.2 and npm 5:

$ node -v && npm -v
v4.2.0
5.6.0

To install npm separately:

[sudo] npm install npm@5 -g
npm -v
5.6.0

Note: On Windows, if npm -v still returns 2.x, run where npm and note the hits under Program Files. Rename those two npm files so the npm 5.6.0 install under AppData wins.

Install Dependencies

Once:

npm install

Build

The following instructions demonstrate how to build and test either all tasks or a specific task. The output will be sent to the _build directory. You can then use the tfx client to upload this to your server for testing.

The build will also generate a task.loc.json and an English strings file under Strings in your source tree. You should check these back in. Another localization process will create the other strings files.

To build all tasks:

npm run build

Optionally, you can build a specific task:

node make.js build --task RestoreCacheV1

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

azure-pipelines-artifact-caching-tasks's People

Contributors

dependabot[bot], ethanis, jenniferkerns, jessehouwing, matissehack, microsoftopensource, msftgits, schneiderl


azure-pipelines-artifact-caching-tasks's Issues

Prevent saving cache for PR builds

PR builds often should not be cached as they contain intermediate state and/or are a security risk (in the case of PR builds from public forks).

It would be nice if this task could automatically handle these cases so people don't accidentally cache risky artifacts.

Issue creating tarball: tar: Failed to clean up compressor

I am running the following task on a self-hosted agent with Windows 10, the latest version of Azure CLI (2.1.0), and the latest version of azure-devops extension (0.17.0):

steps:
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreAndSaveCacheV1.RestoreAndSaveCache@1
  displayName: 'Restore and save artifact based on: UserManager/yarn.lock'
  inputs:
    keyfile: UserManager/yarn.lock
    targetfolder: 'UserManager/node_modules'
    vstsFeed: 'af0142af-0b95-4f50-99ed-1ccc89207d06'

The post-job step, which is supposed to save artifacts to the cache, fails with:

Issue creating tarball:
     tar: Failed to clean up compressor

This is what's logged by the task when enabling debug output:

2020-02-26T12:16:48.1005120Z ##[debug]Evaluating condition for step: 'Restore and save artifact based on: UserManager/yarn.lock'
2020-02-26T12:16:48.1007273Z ##[debug]Evaluating: AlwaysNode()
2020-02-26T12:16:48.1007798Z ##[debug]Evaluating AlwaysNode:
2020-02-26T12:16:48.1010038Z ##[debug]=> True
2020-02-26T12:16:48.1010839Z ##[debug]Result: True
2020-02-26T12:16:48.1011602Z ##[section]Starting: Restore and save artifact based on: UserManager/yarn.lock
2020-02-26T12:16:48.1166343Z ==============================================================================
2020-02-26T12:16:48.1166753Z Task         : Restore and save cache
2020-02-26T12:16:48.1167092Z Description  : Restores and saves a folder given a specified key.
2020-02-26T12:16:48.1167395Z Version      : 1.0.18
2020-02-26T12:16:48.1167616Z Author       : Microsoft Corp
2020-02-26T12:16:48.1167838Z Help         : 
2020-02-26T12:16:48.1168127Z ==============================================================================
2020-02-26T12:16:48.4325554Z ##[debug]agent.TempDirectory=C:\agent\_work\_temp
2020-02-26T12:16:48.4396662Z ##[debug]loading inputs and endpoints
2020-02-26T12:16:48.4402189Z ##[debug]loading ENDPOINT_AUTH_PARAMETER_SYSTEMVSSCONNECTION_ACCESSTOKEN
2020-02-26T12:16:48.4413702Z ##[debug]loading ENDPOINT_AUTH_SCHEME_SYSTEMVSSCONNECTION
2020-02-26T12:16:48.4417138Z ##[debug]loading ENDPOINT_AUTH_SYSTEMVSSCONNECTION
2020-02-26T12:16:48.4419231Z ##[debug]loading INPUT_DRYRUN
2020-02-26T12:16:48.4420928Z ##[debug]loading INPUT_FEEDLIST
2020-02-26T12:16:48.4422554Z ##[debug]loading INPUT_KEYFILE
2020-02-26T12:16:48.4424311Z ##[debug]loading INPUT_PLATFORMINDEPENDENT
2020-02-26T12:16:48.4426143Z ##[debug]loading INPUT_TARGETFOLDER
2020-02-26T12:16:48.4427717Z ##[debug]loading INPUT_VERBOSITY
2020-02-26T12:16:48.4432905Z ##[debug]loaded 9
2020-02-26T12:16:48.4445202Z ##[debug]Agent.ProxyUrl=undefined
2020-02-26T12:16:48.4446083Z ##[debug]Agent.CAInfo=undefined
2020-02-26T12:16:48.4446430Z ##[debug]Agent.ClientCert=undefined
2020-02-26T12:16:48.4446825Z ##[debug]Agent.SkipCertValidation=undefined
2020-02-26T12:16:48.6807897Z ##[debug]Agent.ProxyUrl=undefined
2020-02-26T12:16:48.6808301Z ##[debug]Agent.CAInfo=undefined
2020-02-26T12:16:48.6808641Z ##[debug]Agent.ClientCert=undefined
2020-02-26T12:16:48.6810472Z ##[debug]check path : C:\agent\_work\_tasks\RestoreAndSaveCache_50759521-9c5e-4f40-9ae7-8f9876ba9439\1.0.18\node_modules\azure-pipelines-tool-lib\lib.json
2020-02-26T12:16:48.6812219Z ##[debug]adding resource file: C:\agent\_work\_tasks\RestoreAndSaveCache_50759521-9c5e-4f40-9ae7-8f9876ba9439\1.0.18\node_modules\azure-pipelines-tool-lib\lib.json
2020-02-26T12:16:48.6812858Z ##[debug]system.culture=en-US
2020-02-26T12:16:48.7166149Z ##[debug]check path : C:\agent\_work\_tasks\RestoreAndSaveCache_50759521-9c5e-4f40-9ae7-8f9876ba9439\1.0.18\task.json
2020-02-26T12:16:48.7167112Z ##[debug]adding resource file: C:\agent\_work\_tasks\RestoreAndSaveCache_50759521-9c5e-4f40-9ae7-8f9876ba9439\1.0.18\task.json
2020-02-26T12:16:48.7167677Z ##[debug]system.culture=en-US
2020-02-26T12:16:48.7193055Z ##[debug]Agent.JobStatus=Succeeded
2020-02-26T12:16:48.7193462Z ##[debug]System.PullRequest.IsFork=False
2020-02-26T12:16:48.7202043Z ##[debug]keyfile=C:\agent\_work\1\s\UserManager\yarn.lock
2020-02-26T12:16:48.7205446Z ##[debug]targetfolder=C:\agent\_work\1\s\UserManager\node_modules
2020-02-26T12:16:48.7206065Z ##[debug]System.DefaultWorkingDirectory=C:\agent\_work\1\s
2020-02-26T12:16:48.7210485Z ##[debug]defaultRoot: 'C:\agent\_work\1\s'
2020-02-26T12:16:48.7211152Z ##[debug]findOptions.allowBrokenSymbolicLinks: 'false'
2020-02-26T12:16:48.7211643Z ##[debug]findOptions.followSpecifiedSymbolicLink: 'false'
2020-02-26T12:16:48.7212098Z ##[debug]findOptions.followSymbolicLinks: 'false'
2020-02-26T12:16:48.7212906Z ##[debug]matchOptions.debug: 'false'
2020-02-26T12:16:48.7213290Z ##[debug]matchOptions.nobrace: 'true'
2020-02-26T12:16:48.7213665Z ##[debug]matchOptions.noglobstar: 'false'
2020-02-26T12:16:48.7214053Z ##[debug]matchOptions.dot: 'true'
2020-02-26T12:16:48.7216842Z ##[debug]matchOptions.noext: 'false'
2020-02-26T12:16:48.7217217Z ##[debug]matchOptions.nocase: 'true'
2020-02-26T12:16:48.7217583Z ##[debug]matchOptions.nonull: 'false'
2020-02-26T12:16:48.7217986Z ##[debug]matchOptions.matchBase: 'false'
2020-02-26T12:16:48.7218390Z ##[debug]matchOptions.nocomment: 'false'
2020-02-26T12:16:48.7218779Z ##[debug]matchOptions.nonegate: 'false'
2020-02-26T12:16:48.7219161Z ##[debug]matchOptions.flipNegate: 'false'
2020-02-26T12:16:48.7219684Z ##[debug]pattern: 'C:\agent\_work\1\s\UserManager\yarn.lock'
2020-02-26T12:16:48.7243808Z ##[debug]findPath: 'C:\agent\_work\1\s\UserManager\yarn.lock'
2020-02-26T12:16:48.7244247Z ##[debug]statOnly: 'true'
2020-02-26T12:16:48.7245517Z ##[debug]found 1 paths
2020-02-26T12:16:48.7245851Z ##[debug]applying include pattern
2020-02-26T12:16:48.7253636Z ##[debug]1 matches
2020-02-26T12:16:48.7254301Z ##[debug]1 final results
2020-02-26T12:16:48.7254963Z ##[debug]Found key file: C:\agent\_work\1\s\UserManager\yarn.lock
2020-02-26T12:16:48.7255474Z ##[debug]System.DefaultWorkingDirectory=C:\agent\_work\1\s
2020-02-26T12:16:48.7255898Z ##[debug]defaultRoot: 'C:\agent\_work\1\s'
2020-02-26T12:16:48.7256332Z ##[debug]findOptions.allowBrokenSymbolicLinks: 'false'
2020-02-26T12:16:48.7256785Z ##[debug]findOptions.followSpecifiedSymbolicLink: 'false'
2020-02-26T12:16:48.7257235Z ##[debug]findOptions.followSymbolicLinks: 'false'
2020-02-26T12:16:48.7257629Z ##[debug]matchOptions.debug: 'false'
2020-02-26T12:16:48.7258005Z ##[debug]matchOptions.nobrace: 'true'
2020-02-26T12:16:48.7258404Z ##[debug]matchOptions.noglobstar: 'false'
2020-02-26T12:16:48.7258796Z ##[debug]matchOptions.dot: 'true'
2020-02-26T12:16:48.7259156Z ##[debug]matchOptions.noext: 'false'
2020-02-26T12:16:48.7259531Z ##[debug]matchOptions.nocase: 'true'
2020-02-26T12:16:48.7259895Z ##[debug]matchOptions.nonull: 'false'
2020-02-26T12:16:48.7260280Z ##[debug]matchOptions.matchBase: 'false'
2020-02-26T12:16:48.7260660Z ##[debug]matchOptions.nocomment: 'false'
2020-02-26T12:16:48.7261056Z ##[debug]matchOptions.nonegate: 'false'
2020-02-26T12:16:48.7261438Z ##[debug]matchOptions.flipNegate: 'false'
2020-02-26T12:16:48.7261871Z ##[debug]pattern: 'C:\agent\_work\1\s\UserManager\node_modules'
2020-02-26T12:16:48.7262415Z ##[debug]findPath: 'C:\agent\_work\1\s\UserManager\node_modules'
2020-02-26T12:16:48.7262808Z ##[debug]statOnly: 'true'
2020-02-26T12:16:48.7263119Z ##[debug]found 1 paths
2020-02-26T12:16:48.7263430Z ##[debug]applying include pattern
2020-02-26T12:16:48.7263748Z ##[debug]1 matches
2020-02-26T12:16:48.7264031Z ##[debug]1 final results
2020-02-26T12:16:48.7267162Z ##[debug]


-----------------------------
2020-02-26T12:16:48.7267613Z ##[debug]Found target folder: UserManager\node_modules
2020-02-26T12:16:48.7268045Z ##[debug]-----------------------------



2020-02-26T12:16:48.7274226Z ##[debug]Absolute path for pathSegments: C:\agent\_work\1\s\UserManager\yarn.lock = C:\agent\_work\1\s\UserManager\yarn.lock
2020-02-26T12:16:48.7295153Z ##[debug]platformIndependent=false
2020-02-26T12:16:48.7313174Z ##[debug]win32-1b7f9a7f0689b46356c235d9493ede2498f44f3a14b7eea2ffbcab135ec02bea=false
2020-02-26T12:16:48.7325524Z Creating cache entry for:  win32-1b7f9a7f0689b46356c235d9493ede2498f44f3a14b7eea2ffbcab135ec02bea
2020-02-26T12:16:48.7326140Z ##[debug]System.DefaultWorkingDirectory=C:\agent\_work\1\s
2020-02-26T12:16:48.7327326Z ##[debug]testing directory 'C:\agent\_work\1\s\tmp_cache'
2020-02-26T12:16:48.7328746Z ##[debug]testing directory 'C:\agent\_work\1\s'
2020-02-26T12:16:48.7329682Z ##[debug]mkdir 'C:\agent\_work\1\s\tmp_cache'
2020-02-26T12:16:49.0425444Z Issue creating tarball:
2020-02-26T12:16:49.0425900Z     tar: Failed to clean up compressor
2020-02-26T12:16:49.0426083Z 
2020-02-26T12:16:49.0426517Z ##[debug]rm -rf C:\agent\_work\1\s\tmp_cache
2020-02-26T12:16:49.0427198Z ##[debug]removing directory
2020-02-26T12:16:49.0548894Z ##[section]Finishing: Restore and save artifact based on: UserManager/yarn.lock

"where tar" gives me the following and shouldn't be the issue:

c:\Windows\System32\tar.exe

Any idea why this task is failing for me? I appreciate any help.

Artifact base name should be configurable or should take stage name into consideration

I have a multi-stage build pipeline in which each stage relies on node_modules. While generating the cache, the task produces multiple versions of the same artifact in the Artifacts feed. Since these artifacts are nothing but node_modules, the older versions make no sense to keep and should be cleared by the Artifacts feed retention policy. If the user could configure the artifact base name, or if the task took the stage name into consideration when generating the artifact, a retention policy that keeps only the latest (or last two) versions could be enforced easily.

Error running package

Hi,

I'm trying to use this package, but I get this warning during the build and nothing gets stored in the artifact feed:

2020-09-04T12:46:56.2665948Z ##[warning]Error: An unexpected error occurred while trying to download the package. Exit code(19) and error({"@t":"2020-09-04T12:46:56.0739556Z","@m":"ApplicationInsightsTelemetrySender will correlate events with X-TFS-Session 96cce4f8-9fdf-457e-a4bc-67119819c1e1","@i":"4d441561","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-09-04 12:46:56.073Z"}
{"@t":"2020-09-04T12:46:56.2311578Z","@m":"ApplicationInsightsTelemetrySender did not correlate any events with X-TFS-Session 96cce4f8-9fdf-457e-a4bc-67119819c1e1","@i":"759002a2","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-09-04 12:46:56.231Z"}
{"@t":"2020-09-04T12:46:56.2324331Z","@m":"The feed with ID 'facba201-d6aa-478d-87db-7600b209260a' doesn't exist.","@i":"8ef04954","@l":"Error","SourceContext":"ArtifactTool.Program","UtcTimestamp":"2020-09-04 12:46:56.232Z"})
2020-09-04T12:46:56.2718214Z ##[warning]Issue running universal packages tools

Is this because it can't connect to the feed?

I followed the instructions here to create the feed: https://xebia.com/blog/caching-your-node-modules-in-azure-devops/

And the permissions on the feed seem right (the build service has "Contributor" permissions).

Any help would be appreciated.

Thanks,
Dan.

Save Artifact step is skipped.

I have a build pipeline that is running on a Hosted VS2017 build agent. I configured it according to the documentation. My save step is always skipped and the log is mostly empty.

Restore Step Yaml:

steps:
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
  displayName: 'Restore artifact based on: **/package-lock.json'
  inputs:
    keyfile: '**/package-lock.json'
    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
    vstsFeed: 'e8aefeda-ecd2-4565-b851-45926c23210b'

Save Step Yaml:

steps:
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
  displayName: 'Save artifact based on: **/package-lock.json'
  inputs:
    keyfile: '**/package-lock.json'
    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
    vstsFeed: 'e8aefeda-ecd2-4565-b851-45926c23210b'

And this is the log from the save step:

2019-04-22T20:55:20.0033358Z ##[section]Starting: Save artifact based on: **/package-lock.json
2019-04-22T20:55:20.0135291Z ==============================================================================
2019-04-22T20:55:20.0135352Z Task : Save cache
2019-04-22T20:55:20.0135390Z Description : Saves a cache with Universal Artifacts given a specified key.
2019-04-22T20:55:20.0135439Z Version : 1.0.10
2019-04-22T20:55:20.0135472Z Author : Microsoft Corp
2019-04-22T20:55:20.0135503Z Help :
2019-04-22T20:55:20.0135551Z ==============================================================================
2019-04-22T20:55:44.1025528Z ##[section]Finishing: Save artifact based on: **/package-lock.json

Add support for project-scoped feeds

The current implementation only supports organization-scoped feeds, but Azure DevOps recommends, and only allows creating, project-scoped feeds.

https://docs.microsoft.com/en-us/azure/devops/artifacts/concepts/feeds?view=azure-devops#project-scoped-feeds-vs-organization-scoped-feeds

https://docs.microsoft.com/en-us/azure/devops/artifacts/feeds/project-scoped-feeds?view=azure-devops

The workaround is to use the REST API to manually create an organization-scoped feed, but since only project-scoped feeds are created by default, it would be nice to add support for both types.

https://developercommunity.visualstudio.com/content/problem/859583/can-no-longer-create-organization-scoped-feeds.html

Failed to find api location for area: clienttools id: 187ec90d-dd1e-4ec6-8c57-937d979261e5

The task fails on Azure DevOps server 2020
The raw log:

2021-07-07T14:55:13.3488650Z ##[debug]Evaluating condition for step: 'Restore artifact based on: d:_wf\01\24\s/dayforce/UI/Scheduler.UI/package-lock.json'
2021-07-07T14:55:13.3535819Z ##[debug]Evaluating: SucceededNode()
2021-07-07T14:55:13.3543020Z ##[debug]Evaluating SucceededNode:
2021-07-07T14:55:13.3576816Z ##[debug]=> True
2021-07-07T14:55:13.3587066Z ##[debug]Result: True
2021-07-07T14:55:13.3637316Z ##[section]Starting: Restore artifact based on: d:_wf\01\24\s/dayforce/UI/Scheduler.UI/package-lock.json
2021-07-07T14:55:13.4336746Z ==============================================================================
2021-07-07T14:55:13.4337346Z Task : Restore cache
2021-07-07T14:55:13.4337826Z Description : Restore a folder from a cache given a specified key.
2021-07-07T14:55:13.4338137Z Version : 1.0.18
2021-07-07T14:55:13.4338501Z Author : Microsoft Corp
2021-07-07T14:55:13.4338980Z Help :
2021-07-07T14:55:13.4339264Z ==============================================================================
2021-07-07T14:55:14.5098707Z ##[debug]agent.TempDirectory=d:_wf\01_temp
2021-07-07T14:55:14.5100406Z ##[debug]loading inputs and endpoints
2021-07-07T14:55:14.5101295Z ##[debug]loading ENDPOINT_AUTH_PARAMETER_SYSTEMVSSCONNECTION_ACCESSTOKEN
2021-07-07T14:55:14.5107444Z ##[debug]loading ENDPOINT_AUTH_SCHEME_SYSTEMVSSCONNECTION
2021-07-07T14:55:14.5112099Z ##[debug]loading ENDPOINT_AUTH_SYSTEMVSSCONNECTION
2021-07-07T14:55:14.5115278Z ##[debug]loading INPUT_DRYRUN
2021-07-07T14:55:14.5118029Z ##[debug]loading INPUT_FEEDLIST
2021-07-07T14:55:14.5120677Z ##[debug]loading INPUT_KEYFILE
2021-07-07T14:55:14.5123060Z ##[debug]loading INPUT_PLATFORMINDEPENDENT
2021-07-07T14:55:14.5125611Z ##[debug]loading INPUT_TARGETFOLDER
2021-07-07T14:55:14.5128269Z ##[debug]loading INPUT_VERBOSITY
2021-07-07T14:55:14.5132639Z ##[debug]loading SECRET_DBPASSWORD
2021-07-07T14:55:14.5136102Z ##[debug]loading SECRET_SYSTEM_ACCESSTOKEN
2021-07-07T14:55:14.5141502Z ##[debug]loaded 11
2021-07-07T14:55:14.5156327Z ##[debug]Agent.ProxyUrl=undefined
2021-07-07T14:55:14.5157933Z ##[debug]Agent.CAInfo=undefined
2021-07-07T14:55:14.5158324Z ##[debug]Agent.ClientCert=undefined
2021-07-07T14:55:14.5158659Z ##[debug]Agent.SkipCertValidation=undefined
2021-07-07T14:55:14.8009894Z ##[debug]Agent.ProxyUrl=undefined
2021-07-07T14:55:14.8010439Z ##[debug]Agent.CAInfo=undefined
2021-07-07T14:55:14.8010756Z ##[debug]Agent.ClientCert=undefined
2021-07-07T14:55:14.8015694Z ##[debug]check path : d:_wf\01_tasks\RestoreCache_9aea8869-034d-4094-a6ad-880767d0686c\1.0.18\node_modules\azure-pipelines-tool-lib\lib.json
2021-07-07T14:55:14.8020637Z ##[debug]adding resource file: d:_wf\01_tasks\RestoreCache_9aea8869-034d-4094-a6ad-880767d0686c\1.0.18\node_modules\azure-pipelines-tool-lib\lib.json
2021-07-07T14:55:14.8021326Z ##[debug]system.culture=en-US
2021-07-07T14:55:14.8607830Z ##[debug]check path : d:_wf\01_tasks\RestoreCache_9aea8869-034d-4094-a6ad-880767d0686c\1.0.18\task.json
2021-07-07T14:55:14.8610400Z ##[debug]adding resource file: d:_wf\01_tasks\RestoreCache_9aea8869-034d-4094-a6ad-880767d0686c\1.0.18\task.json
2021-07-07T14:55:14.8611833Z ##[debug]system.culture=en-US
2021-07-07T14:55:14.8643903Z ##[debug]Agent.JobStatus=Succeeded
2021-07-07T14:55:14.8645300Z ##[debug]System.PullRequest.IsFork=False
2021-07-07T14:55:14.8655665Z ##[debug]keyFile=d:_wf\01\24\s\dayforce\UI\Scheduler.UI\package-lock.json
2021-07-07T14:55:14.8657900Z ##[debug]System.DefaultWorkingDirectory=d:_wf\01\24\s
2021-07-07T14:55:14.8664328Z ##[debug]defaultRoot: 'd:_wf\01\24\s'
2021-07-07T14:55:14.8666138Z ##[debug]findOptions.allowBrokenSymbolicLinks: 'false'
2021-07-07T14:55:14.8667358Z ##[debug]findOptions.followSpecifiedSymbolicLink: 'false'
2021-07-07T14:55:14.8668554Z ##[debug]findOptions.followSymbolicLinks: 'false'
2021-07-07T14:55:14.8670383Z ##[debug]matchOptions.debug: 'false'
2021-07-07T14:55:14.8671770Z ##[debug]matchOptions.nobrace: 'true'
2021-07-07T14:55:14.8672822Z ##[debug]matchOptions.noglobstar: 'false'
2021-07-07T14:55:14.8677291Z ##[debug]matchOptions.dot: 'true'
2021-07-07T14:55:14.8678468Z ##[debug]matchOptions.noext: 'false'
2021-07-07T14:55:14.8679301Z ##[debug]matchOptions.nocase: 'true'
2021-07-07T14:55:14.8680055Z ##[debug]matchOptions.nonull: 'false'
2021-07-07T14:55:14.8680898Z ##[debug]matchOptions.matchBase: 'false'
2021-07-07T14:55:14.8681666Z ##[debug]matchOptions.nocomment: 'false'
2021-07-07T14:55:14.8682378Z ##[debug]matchOptions.nonegate: 'false'
2021-07-07T14:55:14.8683076Z ##[debug]matchOptions.flipNegate: 'false'
2021-07-07T14:55:14.8683812Z ##[debug]pattern: 'd:_wf\01\24\s\dayforce\UI\Scheduler.UI\package-lock.json'
2021-07-07T14:55:14.8705725Z ##[debug]findPath: 'd:_wf\01\24\s\dayforce\UI\Scheduler.UI\package-lock.json'
2021-07-07T14:55:14.8706884Z ##[debug]statOnly: 'true'
2021-07-07T14:55:14.8708430Z ##[debug]found 1 paths
2021-07-07T14:55:14.8709212Z ##[debug]applying include pattern
2021-07-07T14:55:14.8718251Z ##[debug]1 matches
2021-07-07T14:55:14.8719522Z ##[debug]1 final results
2021-07-07T14:55:14.8720085Z ##[debug]Found key file: d:_wf\01\24\s\dayforce\UI\Scheduler.UI\package-lock.json
2021-07-07T14:55:14.8720601Z ##[debug]System.DefaultWorkingDirectory=d:_wf\01\24\s
2021-07-07T14:55:14.8727597Z ##[debug]Absolute path for pathSegments: d:_wf\01\24\s\dayforce\UI\Scheduler.UI\package-lock.json = d:_wf\01\24\s\dayforce\UI\Scheduler.UI\package-lock.json
2021-07-07T14:55:14.8820600Z ##[debug]platformIndependent=true
2021-07-07T14:55:14.8877912Z ##[debug]System.DefaultWorkingDirectory=d:_wf\01\24\s
2021-07-07T14:55:14.8918914Z ##[debug]testing directory 'd:_wf\01\24\s\tmp_cache'
2021-07-07T14:55:14.8920566Z ##[debug]testing directory 'd:_wf\01\24\s'
2021-07-07T14:55:14.8923396Z ##[debug]mkdir 'd:_wf\01\24\s\tmp_cache'
2021-07-07T14:55:14.8940544Z ##[debug]dryRun=false
2021-07-07T14:55:14.8940893Z ##[debug]alias=null
2021-07-07T14:55:14.8945706Z ##[debug]Getting artifact tool
2021-07-07T14:55:14.8946246Z ##[debug]Getting credentials for local feeds
2021-07-07T14:55:14.8964589Z SYSTEMVSSCONNECTION exists true
2021-07-07T14:55:14.8964985Z ##[debug]SYSTEMVSSCONNECTION exists true
2021-07-07T14:55:14.8965297Z ##[debug]Got auth token
2021-07-07T14:55:14.8966540Z ##[debug]SYSTEMVSSCONNECTION=http://tdc1tfsapp01:8080/tfs/DefaultCollection/
2021-07-07T14:55:14.8970612Z ##[debug]System.ServerType=OnPremises
2021-07-07T14:55:14.8973619Z http://tdc1tfsapp01:8080/tfs/DefaultCollection/
2021-07-07T14:55:14.8978902Z ##[debug]UPack.OverrideArtifactToolPath=undefined
2021-07-07T14:55:14.8981513Z ##[debug]Agent.ProxyUrl=undefined
2021-07-07T14:55:14.9372051Z ##[warning]Error: Failed to find api location for area: clienttools id: 187ec90d-dd1e-4ec6-8c57-937d979261e5
2021-07-07T14:55:14.9382412Z ##[debug]Processed: ##vso[task.issue type=warning;]Error: Failed to find api location for area: clienttools id: 187ec90d-dd1e-4ec6-8c57-937d979261e5
2021-07-07T14:55:14.9383171Z Error initializing artifact tool utility
2021-07-07T14:55:14.9383874Z ##[warning]Issue running universal packages tools
2021-07-07T14:55:14.9384753Z ##[debug]Processed: ##vso[task.issue type=warning;]Issue running universal packages tools
2021-07-07T14:55:14.9385191Z ##[debug]rm -rf d:_wf\01\24\s\tmp_cache
2021-07-07T14:55:14.9385517Z ##[debug]removing directory
2021-07-07T14:55:14.9500140Z ##[section]Finishing: Restore artifact based on: d:_wf\01\24\s/dayforce/UI/Scheduler.UI/package-lock.json

First question: is the task supported on Azure DevOps Server, or is it the same story as with #13937?
The YAML configuration:

- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreAndSaveCacheV1.RestoreAndSaveCache@1
  displayName: 'Cache npm packages'
  inputs:
    keyfile: '$(Build.SourcesDirectory)/dayforce/UI/Scheduler.UI/package-lock.json'
    targetfolder: '$(Build.SourcesDirectory)/dayforce/UI/Scheduler.UI/node_modules'
    vstsFeed: '3f278d58-089c-48a5-88a4-bcc898bffa10'

Artifact feed cannot be found, yet it does exist

I'm trying to run the task as part of a build, and nothing happens because the task reports that the feed does not exist, yet it does exist.

2020-11-22T12:11:46.1418529Z ##[section]Starting: RestoreAndSaveCache
2020-11-22T12:11:46.1425312Z ==============================================================================
2020-11-22T12:11:46.1425561Z Task         : Restore and save cache
2020-11-22T12:11:46.1425798Z Description  : Restores and saves a folder given a specified key.
2020-11-22T12:11:46.1425989Z Version      : 1.0.18
2020-11-22T12:11:46.1426147Z Author       : Microsoft Corp
2020-11-22T12:11:46.1426322Z Help         : 
2020-11-22T12:11:46.1426512Z ==============================================================================
2020-11-22T12:11:46.4368114Z SYSTEMVSSCONNECTION exists true
2020-11-22T12:11:46.5696376Z got service url from area
2020-11-22T12:11:46.5699750Z https://vsblob.dev.azure.com/peculigital/
2020-11-22T12:11:47.0389430Z Downloading: https://08wvsblobprodsu6weus73.vsblob.vsassets.io/artifacttool/artifacttool-linux-x64-Release_0.2.172.zip?sv=2019-02-02&sr=b&sig=78boalUUrRx7boHbSI3l19J6UsiiyBOaMK63HuumEOk%3D&spr=https&se=2020-11-22T13%3A11%3A47Z&sp=r&P1=1606050407&P2=11&P3=2&P4=cmCBBEJ7ONYlxb97GR4GPsUq1YJkahfuCBGMU%2bhWpU0%3d
2020-11-22T12:11:49.9501082Z Caching tool: ArtifactTool 0.2.172 x64
2020-11-22T12:11:50.1373973Z /opt/hostedtoolcache/ArtifactTool/0.2.172/x64/artifacttool
2020-11-22T12:11:50.1412469Z SYSTEMVSSCONNECTION exists true
2020-11-22T12:11:50.2580995Z got service url from area
2020-11-22T12:11:50.5984845Z Downloading package: spotboxmasterbuild, version: 1.0.0-aa9268f0d820e71a3b0109e34dc0dad7b90b7cf52e8b38178a941641d5c0bfcb using feed id: 6214b037-2ff6-4b0d-8f66-15eae64c2683
2020-11-22T12:11:50.5997210Z /opt/hostedtoolcache/ArtifactTool/0.2.172/x64/artifacttool universal download --feed 6214b037-2ff6-4b0d-8f66-15eae64c2683 --service https://dev.azure.com/peculigital/ --package-name spotboxmasterbuild --package-version 1.0.0-aa9268f0d820e71a3b0109e34dc0dad7b90b7cf52e8b38178a941641d5c0bfcb --path /home/vsts/work/1/s/tmp_cache --patvar UNIVERSAL_DOWNLOAD_PAT --verbosity None
2020-11-22T12:11:52.5908006Z {"@t":"2020-11-22T12:11:50.7781767Z","@m":"Ignoring --cache-directory because the cache is not yet supported on OS \"Linux 4.15.0-1098-azure #109~16.04.1-Ubuntu SMP Wed Sep 30 18:53:14 UTC 2020\".","@i":"e0df78aa","@l":"Warning","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-11-22 12:11:50.778Z"}
2020-11-22T12:11:52.5930530Z {"@t":"2020-11-22T12:11:52.3291829Z","@m":"ApplicationInsightsTelemetrySender will correlate events with X-TFS-Session 3ef1c829-d91f-4082-96eb-5b0bef2f1a58","@i":"ad37c4b8","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-11-22 12:11:52.329Z"}
2020-11-22T12:11:52.5964253Z ##[warning]Error: An unexpected error occurred while trying to download the package. Exit code(19) and error({"@t":"2020-11-22T12:11:50.7781767Z","@m":"Ignoring --cache-directory because the cache is not yet supported on OS \"Linux 4.15.0-1098-azure #109~16.04.1-Ubuntu SMP Wed Sep 30 18:53:14 UTC 2020\".","@i":"e0df78aa","@l":"Warning","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-11-22 12:11:50.778Z"}
{"@t":"2020-11-22T12:11:52.3291829Z","@m":"ApplicationInsightsTelemetrySender will correlate events with X-TFS-Session 3ef1c829-d91f-4082-96eb-5b0bef2f1a58","@i":"ad37c4b8","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-11-22 12:11:52.329Z"}
{"@t":"2020-11-22T12:11:52.5832548Z","@m":"ApplicationInsightsTelemetrySender did not correlate any events with X-TFS-Session 3ef1c829-d91f-4082-96eb-5b0bef2f1a58","@i":"17b28a74","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-11-22 12:11:52.583Z"}
{"@t":"2020-11-22T12:11:52.5841694Z","@m":"The feed with ID '6214b037-2ff6-4b0d-8f66-15eae64c2683' doesn't exist.","@i":"7b338b08","@l":"Error","SourceContext":"ArtifactTool.Program","UtcTimestamp":"2020-11-22 12:11:52.584Z"})
2020-11-22T12:11:52.5993315Z {"@t":"2020-11-22T12:11:52.5832548Z","@m":"ApplicationInsightsTelemetrySender did not correlate any events with X-TFS-Session 3ef1c829-d91f-4082-96eb-5b0bef2f1a58","@i":"17b28a74","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-11-22 12:11:52.583Z"}
2020-11-22T12:11:52.5995025Z {"@t":"2020-11-22T12:11:52.5841694Z","@m":"The feed with ID '6214b037-2ff6-4b0d-8f66-15eae64c2683' doesn't exist.","@i":"7b338b08","@l":"Error","SourceContext":"ArtifactTool.Program","UtcTimestamp":"2020-11-22 12:11:52.584Z"}
2020-11-22T12:11:52.5996003Z ##[warning]Issue running universal packages tools
2020-11-22T12:11:52.6181087Z ##[section]Finishing: RestoreAndSaveCache

My YAML for this step looks like this:

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '14.x'
    displayName: 'Install Node.js'
  - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreAndSaveCacheV1.RestoreAndSaveCache@1
    inputs:
      keyfile: '**/package-lock.json, !**/node_modules/**/package-lock.json, !**/.*/**/package-lock.json'
      targetFolder: '**/node_modules, !**/node_modules/**/node_modules'
      vstsFeed: $(ArtifactFeed)
      platformIndependent: true
  - script: |
      npm install
    displayName: 'Install dependencies only if there is no cache available'
    condition: ne(variables['CacheRestored'], 'true')

The variable is set in a top level YAML file:

trigger:
  - master

pool:
  name: 'Azure Pipelines'
  vmImage: 'ubuntu-20.04'

variables:
  artifactFeed: 'SpotboxCache'

image

I can't figure this out, and would appreciate any help.

Restore Cache succeeds with `tar: Error opening archive` message

This log shows the message tar: Error opening archive: Failed to open. Clearly the task did not succeed, but it is marked as if it had.

I believe the fault is with the failure to check the return code when calling tar in cacheUtilities.downloadCaches

shell.exec(`tar -xzf ${tarballPath} -C "${destinationFolder}"`);

try {
    tl.setVariable(hash, "true");
    // Only if tar succeeds should we indicate it succeeded.
    const success = shell.exec(`tar -xzf ${tarballPath} -C "${destinationFolder}"`).code === 0;
    // Set variable to track whether or not we downloaded the cache (i.e. it already existed).
    tl.setVariable(output, success ? "true" : "false");
    return;
} catch (err) {
    console.log(err);
}

Can't get it working with maven

I tried to use caching with a Maven project. I redirected the Maven repo with -Dmaven.repo.local=$(Build.Repository.LocalPath)/cache/.m2. Running ls $(Build.Repository.LocalPath)/cache/.m2 prints files after the first step that downloads dependencies into this directory. But my SaveCache task:

- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
  inputs:
    keyfile: pom.xml
    targetfolder: cache
    vstsFeed: MYGUID

just prints

2019-07-05T10:01:29.0607255Z ##[section]Starting: SaveCache
2019-07-05T10:01:29.0642800Z ==============================================================================
2019-07-05T10:01:29.0643157Z Task         : Save cache
2019-07-05T10:01:29.0643305Z Description  : Saves a cache with Universal Artifacts given a specified key.
2019-07-05T10:01:29.0643482Z Version      : 1.0.13
2019-07-05T10:01:29.0643668Z Author       : Microsoft Corp
2019-07-05T10:01:29.0643834Z Help         : 
2019-07-05T10:01:29.0644112Z ==============================================================================
2019-07-05T10:01:29.5796005Z ##[section]Finishing: SaveCache

or in the more detailed version

2019-07-05T11:01:16.5204340Z ##[debug]pattern: '/home/vsts/work/1/s/cache'
2019-07-05T11:01:16.5204708Z ##[debug]applying include pattern against original list
2019-07-05T11:01:16.5204971Z ##[debug]1 matches
2019-07-05T11:01:16.5205210Z ##[debug]1 final results
2019-07-05T11:01:16.5205674Z ##[debug]


-----------------------------
2019-07-05T11:01:16.5206101Z ##[debug]cache
2019-07-05T11:01:16.5206530Z ##[debug]-----------------------------



2019-07-05T11:01:16.5207009Z ##[debug]Absolute path for pathSegments: /home/vsts/work/1/s/pom.xml = /home/vsts/work/1/s/pom.xml
2019-07-05T11:01:16.5207279Z ##[debug]platformIndependent=false
2019-07-05T11:01:16.5207719Z ##[debug]linux-fooHash=undefined
2019-07-05T11:01:16.5208054Z ##[debug]task result: Skipped
2019-07-05T11:01:16.5209132Z ##[debug]Processed: ##vso[task.complete result=Skipped;]Not caching artifact produced during build: linux-fooHash
2019-07-05T11:01:16.5210289Z ##[section]Finishing: SaveCache

using caching tasks on private build server without tar

Hi,

When creating the archive for the cache, a tar command is executed.

shell.exec(`tar -xzf ${tarballPath} -C "${destinationFolder}"`);

This works fine on hosted 2017 agents but not on our private win2016 core agents.

Looking over the list of tools installed on a hosted VS2017 image, I cannot find tar among them.

Can you point me in the right direction of which set of tools should be installed on our build servers
to get the task to work?

Kind regards,

Hans ter Wal

SaveCache step is not executed

The step is supposed to run, but it looks like it is skipped.

image

with the following raw log:

2021-01-25T23:26:07.3363715Z ##[section]Starting: SaveCache
2021-01-25T23:26:07.3373483Z ==============================================================================
2021-01-25T23:26:07.3374104Z Task : Save cache
2021-01-25T23:26:07.3374584Z Description : Saves a cache with Universal Artifacts given a specified key.
2021-01-25T23:26:07.3375015Z Version : 1.0.18
2021-01-25T23:26:07.3375370Z Author : Microsoft Corp
2021-01-25T23:26:07.3375741Z Help :
2021-01-25T23:26:07.3376144Z ==============================================================================
2021-01-25T23:26:08.0490208Z ##[section]Finishing: SaveCache

If a step is skipped because its condition is not met, that information is printed out like in the image below, so I think this might be an issue with the save cache task itself.
image

restore cache step failed: error initializing artifact tool utility

this is the log:

Starting: RestoreCache

Task : Restore cache
Description : Restore a folder from a cache given a specified key.
Version : 1.0.18
Author : Microsoft Corp
Help :

SYSTEMVSSCONNECTION exists true
got service url from area
https://mssqltools.vsblob.visualstudio.com/
Downloading: https://004vsblobprodeus2116.vsblob.vsassets.io/artifacttool/artifacttool-linux-x64-Release_0.2.181.zip?sv=2019-07-07&sr=b&sig=e7oXCKTrMTp2uLwsKJL9In%2F84fmfwNKu9RIm2AN6iTM%3D&spr=https&se=2021-01-26T00%3A02%3A09Z&sp=r&P1=1611619029&P2=11&P3=2&P4=FmRDql35kTtX0YYO6P3yqFewc9Ttt7suhcmJj7mPR8Q%3d
##[warning]Invalid or unsupported zip format. No END header found
Error initializing artifact tool utility
##[warning]Issue running universal packages tools
Finishing: RestoreCache

Can't get it to work for dotnet project

I'm trying to cache the NuGet packages that we need for our ASP.NET Core project, so far without success.
I have the following definition:

steps:
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreAndSaveCacheV1.RestoreAndSaveCache@1
  inputs:
    keyfile: '**/*.csproj, **/*.sln'
    targetfolder: '$(backendFolder)packages/'
    vstsFeed: '$(ArtifactFeed)'
- script: dotnet restore 'Backend.sln' --packages 'packages/'
  condition: ne(variables['CacheRestored'], 'true')
  displayName: 'dotnet restore'
  workingDirectory: '$(backendFolder)'
- script: dotnet build 'Backend.sln' --configuration $(buildConfiguration) --no-restore
  workingDirectory: '$(backendFolder)'
  displayName: 'dotnet build $(buildConfiguration)'

I tried various combinations, but so far none seems to work. I guess I am making a basic mistake, but can't figure it out, so any help is appreciated!
What I noticed in the build output:

{"@t":"2019-04-28T14:49:13.2701438Z","@m":"Can't find the package 'xy-ci' with version '1.0.0-linux-83d1e3cdf979d3a31a12e7fd49ef4f8dc76ee52086ef493622cf3c747d712369' in feed 'ArtifactCache'.","@i":"23b82bea","@l":"Error","SourceContext":"ArtifactTool.Program","UtcTimestamp":"2019-04-28 14:49:13.270Z"}
Cache miss:  linux-83d1e3cdf979d3a31a12e7fd49ef4f8dc76ee52086ef493622cf3c747d712369

When I check the feed, the package is not there. But how does the caching step come up with that version number if the package does not exist?

Issue with creating the tar package

For the past few days we have had a problem on Azure DevOps and Microsoft-hosted build agents when using the caching task.

Starting: Initialize job
Agent name: 'Azure Pipelines 2'
Agent machine name: 'fv-az623'
Current agent version: '2.163.1'
Current image version: '20200113.1'
Agent running as: 'VssAdministrator'
Prepare build directory.
Set build variables.
Download all required tasks.
Downloading task: RestoreAndSaveCache (1.0.18)
Downloading task: CmdLine (2.163.0)
Downloading task: DeleteFiles (1.154.0)
Start tracking orphan processes.
Finishing: Initialize job

2020-01-21T16:21:12.9236275Z ##[section]Starting: RestoreAndSaveCache
2020-01-21T16:21:13.1494518Z ==============================================================================
2020-01-21T16:21:13.1494591Z Task         : Restore and save cache
2020-01-21T16:21:13.1500992Z Description  : Restores and saves a folder given a specified key.
2020-01-21T16:21:13.1501068Z Version      : 1.0.18
2020-01-21T16:21:13.1501101Z Author       : Microsoft Corp
2020-01-21T16:21:13.1501202Z Help         : 
2020-01-21T16:21:13.1501237Z ==============================================================================
2020-01-21T16:22:17.1646169Z Creating cache entry for:  win32-0b0dbef67d2c2b38649073d487151fdfd167ab15f6d7d1606731ffbb57b4b77f
2020-01-21T16:22:17.4617249Z Issue creating tarball:
2020-01-21T16:22:17.4617739Z     tar: Failed to clean up compressor
2020-01-21T16:22:17.4618034Z 
2020-01-21T16:22:19.0192643Z ##[section]Finishing: RestoreAndSaveCache

Tar Issues - Error opening archive and cleaning up compressor on Windows Image

We are seeing an issue when using the Restore and save cache task on a Microsoft Hosted Agent running a Windows image. The task runs just fine, with the same configuration, when we use a Linux image. We are trying to cache npm packages for our angular application but cannot seem to figure out this tar issue while running on a Windows image. Issue #35 is the same error we are getting but the issue has been closed for a few months now.

The error we get when restoring is: tar: Error opening archive: Failed to open '/D/a/1/s/tmp_cache/aaa579713b25a5a1b4e37b99c571beadb0b1ec3592e007eb503ac631.tar.gz'

The error we get when saving is: Issue creating tarball: tar: Failed to clean up compressor

Windows Image: windows-2019
Key file: RepositoryRoot\AngularApplication\package-lock.json
Target folder: RepositoryRoot\AngularApplication\node_modules

Running on the ubuntu-20.04 image with the same Key file and Target folder, the Restore and save cache task runs as expected.

I think the error when restoring might be a path issue. On both images, it tries to use a Linux path /D/. On the Windows image, I think it needs to be D:/.

Please let me know if more details are needed. Any help would be appreciated!

Originally posted by @temparnak in #35 (comment)

Feed GUID not recognized for project-scoped feeds

We got exit code 19 error when the universal artifact download call is done in the RestoreAndSaveCache task. This happens in a hosted Ubuntu agent with a project-scoped feed with default permissions. It's exactly the same error (feed not found) which we got previously with the beta Cache@2 task until we used the 'project-name/feed-name' syntax for the feed but that trick does not work in this extension. The feed guid was retrieved by using the task wizard and its generated YAML code in the pipeline file.

I would also argue that an error like this should really fail the task without failing the pipeline, so that this kind of problem can be detected more easily. A plain cache miss would of course not fail the task.

Slightly blurred log snippets:
...
/opt/hostedtoolcache/ArtifactTool/0.2.128/x64/artifacttool universal download --feed *** --service https://dev.azure.com/***/ --package-name pipelinesdemoci --package-version 1.0.0-linux-0ded3c5f6f3d1e3ad4d55df155f8233c909403ac42cd19f4c173c3dfe230ed42 --path /home/vsts/work/1/s/tmp_cache --patvar UNIVERSAL_DOWNLOAD_PAT --verbosity Trace
...
{"@t":"2020-04-22T15:23:45.4281724Z","@m":"The feed with ID '***' doesn't exist.","@i":"09f7690a","@l":"Error","SourceContext":"ArtifactTool.Program","UtcTimestamp":"2020-04-22 15:23:45.428Z"}
##[warning]Error: An unexpected error occurred while trying to download the package. Exit code(19) and error({"@t":"2020-04-22T15:23:44.6194844Z","@m":"ArtifactHttpClientFactory.CreateVssHttpClient: DedupStoreHttpClient with BaseUri: https://vsblobprodsu6weu.vsblob.visualstudio.com/***/, MaxRetries:5, SendTimeout:00:15:00","@i":"372f5d96","@l":"Verbose","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-04-22 15:23:44.619Z"}
...

"Step references task '1ESLighthouseEng....' which does not exist." Error

Hi! I read your blog post (linked in the VS Code release notes) and thought it sounded like a thing I should try!

But I get this error:

Job Linux: Step references task '1ESLighthouseEng.PipelineArtifactCaching.RestoreAndSaveCacheV1.RestoreAndSaveCache' at version '1' which does not exist.

image

which makes me think I'm missing some basic beginner step like "installing" the task or something.

Azure build page: https://dev.azure.com/isomorphic-git/isomorphic-git/_build/results?buildId=847&view=results

Got any suggestions?

An unexpected error occurred while trying to download the package Exit code(19)

When trying to cache node_modules in my pipeline, I get the following error:

C:\hostedtoolcache\windows\ArtifactTool\0.2.128\x64\ArtifactTool.exe universal download --feed c4b42050-b3c8-4b6b-99d6-3105da67cf96 --service https://dev.azure.com/[MY REPO]/ --package-name [PROJECT NAME] --package-version 1.0.0-win32-db7976018f205bbf5dfbf04a22a9f9878bdcb7a7b441d4bce45572460492c994 --path d:\a\1\s\tmp_cache --patvar UNIVERSAL_DOWNLOAD_PAT --verbosity None
{"@t":"2020-03-03T15:32:31.9648327Z","@m":"ApplicationInsightsTelemetrySender will correlate events with X-TFS-Session 49ab3bc6-e61b-4267-b774-c1c87c3010d4","@i":"4e909835","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-03-03 15:32:31.964Z"}
{"@t":"2020-03-03T15:32:32.4985816Z","@m":"The feed with ID 'c4b42050-b3c8-4b6b-99d6-3105da67cf96' doesn't exist.","@i":"2cb2cf53","@l":"Error","SourceContext":"ArtifactTool.Program","UtcTimestamp":"2020-03-03 15:32:32.498Z"}
##[warning]Error: An unexpected error occurred while trying to download the package. Exit code(19) and error({"@t":"2020-03-03T15:32:31.9648327Z","@m":"ApplicationInsightsTelemetrySender will correlate events with X-TFS-Session 49ab3bc6-e61b-4267-b774-c1c87c3010d4","@i":"4e909835","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-03-03 15:32:31.964Z"}
{"@t":"2020-03-03T15:32:32.4985816Z","@m":"The feed with ID 'c4b42050-b3c8-4b6b-99d6-3105da67cf96' doesn't exist.","@i":"2cb2cf53","@l":"Error","SourceContext":"ArtifactTool.Program","UtcTimestamp":"2020-03-03 15:32:32.498Z"})
##[debug]Processed: ##vso[task.issue type=warning;]Error: An unexpected error occurred while trying to download the package. Exit code(19) and error({"@t":"2020-03-03T15:32:31.9648327Z","@m":"ApplicationInsightsTelemetrySender will correlate events with X-TFS-Session 49ab3bc6-e61b-4267-b774-c1c87c3010d4","@i":"4e909835","SourceContext":"ArtifactTool.Commands.UPackDownloadCommand","UtcTimestamp":"2020-03-03 15:32:31.964Z"}%0D%0A{"@t":"2020-03-03T15:32:32.4985816Z","@m":"The feed with ID 'c4b42050-b3c8-4b6b-99d6-3105da67cf96' doesn't exist.","@i":"2cb2cf53","@l":"Error","SourceContext":"ArtifactTool.Program","UtcTimestamp":"2020-03-03 15:32:32.498Z"})
##[warning]Issue running universal packages tools
##[debug]Processed: ##vso[task.issue type=warning;]Issue running universal packages tools
##[debug]rm -rf d:\a\1\s\tmp_cache
##[debug]removing directory
Finishing: RestoreAndSaveCache

Pipeline task

- task: RestoreAndSaveCache@1
  inputs:
    keyfile: '**\[MyProjectName]\package-lock.json, !**\[MyProjectName]\node_modules\**\package-lock.json'
    targetfolder: '$**\[MyProjectName]\node_modules, !**\[MyProjectName]\node_modules\**\node_modules'
    vstsFeed: '[feedId]'

I have tried referencing the artifact feed using its name and the myProject/feedName format referenced here, but those only throw error(2) for not being a valid ID.

Azure DevOps Server 2019 Support

On the Marketplace it says that the extension also supports Azure DevOps Server 2019, but it does not work because the server version does not have Universal Packages.

image

I'm a bit confused about that, because the Services version has the Cache@2 task available, which does not exist on Server. I thought this extension would be the alternative, but it does not work. 😕

Cache npm cache directory instead of node_modules

To ensure the CI build uses the package-lock.json, I run npm ci in the build instead of npm install.

npm ci blasts away the node_modules folder and then recreates it from the npm cache. As such, I'd like to cache the npm cache folder instead.

I've added the ability to configure the working directory in pull request #10. Only I can't get the tests to succeed on my own machine or on the server. I must be missing some vital component, or something else has broken.

Requires tar on the build

The task should really bring its own tools IMHO, or the documentation should include a task to have that tool included in the build pipeline. Having to have all tools installed on the agent is a pointless dead exercise.

Furthermore, the task shouldn't succeed if the tar step fails. That's really bad; we had a dev who thought everything was great because everything ran fine, only to find it hadn't.

Given that Universal Packages already do dedupe and compression, is there a need for tar?

Save artifact skipped after npm install

My pipeline runs on a hosted agent and it's not a forked repository, but after the npm install step or npm build step (I tried both) the save artifact step is always skipped. (The same happens if I use the combined restore and save task.)

Step yaml:

steps:
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
  displayName: 'Save artifact based on: **/package-lock.json, !**/node_modules/**/package-lock.json, !**/.*/**/package-lock.json'
  inputs:
    keyfile: '**/package-lock.json, !**/node_modules/**/package-lock.json, !**/.*/**/package-lock.json'
    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
    vstsFeed: 'b43b132c-7213-4f99-86be-e365f9b81320'

I enabled system diagnostics to see more in the log; this is what it says at the end (the log is too long to post fully here):

2020-02-19T13:21:40.6619967Z ##[debug]121098 results
2020-02-19T13:21:40.6620193Z ##[debug]found 121098 paths
2020-02-19T13:21:40.6620368Z ##[debug]applying include pattern
2020-02-19T13:21:41.7090069Z ##[debug]306 matches
2020-02-19T13:21:41.7095332Z ##[debug]pattern: '!**\node_modules\**\node_modules'
2020-02-19T13:21:41.7095502Z ##[debug]trimmed leading '!'. pattern: '**\node_modules\**\node_modules'
2020-02-19T13:21:41.7095696Z ##[debug]after ensurePatternRooted, pattern: 'd:\a\1\s\**\node_modules\**\node_modules'
2020-02-19T13:21:41.7095810Z ##[debug]applying exclude pattern
2020-02-19T13:21:41.7146928Z ##[debug]305 matches
2020-02-19T13:21:41.7151200Z ##[debug]1 final results
2020-02-19T13:21:41.7156704Z ##[debug]


2020-02-19T13:21:41.7156947Z ##[debug]Found target folder: src\clients\workspace\node_modules
2020-02-19T13:21:41.7157360Z ##[debug]-----------------------------

2020-02-19T13:21:41.7184265Z ##[debug]Absolute path for pathSegments: d:\a\1\s\src\clients\workspace\libs\ngx-app-settings\package-lock.json = d:\a\1\s\src\clients\workspace\libs\ngx-app-settings\package-lock.json
2020-02-19T13:21:41.7184611Z ##[debug]Absolute path for pathSegments: d:\a\1\s\src\clients\workspace\package-lock.json = d:\a\1\s\src\clients\workspace\package-lock.json
2020-02-19T13:21:41.7269669Z ##[debug]platformIndependent=false
2020-02-19T13:21:41.8275658Z ##[debug]win32-0a4407a3b3b2b8eda3115a3700c0aa1c2d3e09263cfaf8d2e7d2770566e809fe=undefined
2020-02-19T13:21:41.8276529Z ##[debug]task result: Skipped
2020-02-19T13:21:41.8304250Z ##[debug]Processed: ##vso[task.complete result=Skipped;]Not caching artifact produced during build: win32-0a4407a3b3b2b8eda3115a3700c0aa1c2d3e09263cfaf8d2e7d2770566e809fe
2020-02-19T13:21:42.6530188Z ##[section]Finishing: Save artifact based on: **/package-lock.json, !**/node_modules/**/package-lock.json, !**/.*/**/package-lock.json

Any idea what the problem is here? I just can't figure out what I'm doing wrong.
