
datadog-sync-cli's Introduction

datadog-sync-cli

Datadog CLI tool to sync resources across organizations.

Table of Contents

Quick Start

See the Installing section for guides on how to install and set up the tool.

Run the import command to read the specified resources from the source organization and store them locally as JSON files in the resources/source directory.

Then, run the sync command, which uses the files stored by the previous import command (unless the --force-missing-dependencies flag is passed) to create/modify the resources on the destination organization. The pushed resources are saved in the resources/destination directory.

Note: The tool uses the resources directory as the source of truth for determining what resources need to be created and modified. Hence, this directory should not be removed or corrupted.

Example Usage

# Import resources from parent organization and store them locally
$ datadog-sync import \
    --source-api-key="..." \
    --source-app-key="..." \
    --source-api-url="https://api.datadoghq.com"

> 2024-03-14 14:53:54,280 - INFO - Starting import...
> ...
> 2024-03-14 15:00:46,100 - INFO - Finished import

# Check diff output to see what resources will be created/modified
$ datadog-sync diffs \
    --destination-api-key="..." \
    --destination-app-key="..." \
    --destination-api-url="https://api.datadoghq.eu"

> 2024-03-14 15:46:22,014 - INFO - Starting diffs...
> ...
> 2024-03-14 14:51:15,379 - INFO - Finished diffs

# Sync the resources to the child organization from locally stored files and save the output locally
$ datadog-sync sync \
    --destination-api-key="..." \
    --destination-app-key="..." \
    --destination-api-url="https://api.datadoghq.eu"

> 2024-03-14 14:55:56,535 - INFO - Starting sync...
> ...
> 2024-03-14 14:56:00,797 - INFO - Finished sync: 1 successes, 0 errors

Purpose

The purpose of the datadog-sync-cli package is to provide an easy way to sync Datadog resources across Datadog organizations.

Note: this tool is not intended for, and does not support, migrating intake data such as ingested logs and metrics.

The source organization will not be modified, but the destination organization will have resources created and updated by the sync command.

Installing

Installing from source

Note: Installing from source requires Python >= v3.9

  1. Clone the project repo and cd into the directory: git clone https://github.com/DataDog/datadog-sync-cli.git; cd datadog-sync-cli
  2. Install the datadog-sync-cli tool using pip: pip install .
  3. Invoke the CLI tool: datadog-sync <command> <options>

Installing from Releases

MacOS and Linux

  1. Download the executable from the Releases page
  2. Make the file executable: chmod +x datadog-sync-cli-{system-name}-{machine-type}
  3. Move the executable to your bin directory: sudo mv datadog-sync-cli-{system-name}-{machine-type} /usr/local/bin/datadog-sync
  4. Invoke the CLI tool: datadog-sync <command> <options>

Windows

  1. Download the executable with the .exe extension from the Releases page
  2. Add the directory containing the .exe file to your PATH
  3. Invoke the CLI tool in cmd/powershell using the file name, omitting the extension: datadog-sync-cli-windows-amd64 <command> <options>

Using docker and building the image

  1. Clone the project repo and cd into the directory: git clone https://github.com/DataDog/datadog-sync-cli.git; cd datadog-sync-cli
  2. Build the provided Dockerfile: docker build . -t datadog-sync
  3. Run the Docker image using the entrypoint below:
docker run --rm -v <PATH_TO_WORKING_DIR>:/datadog-sync:rw \
  -e DD_SOURCE_API_KEY=<DATADOG_API_KEY> \
  -e DD_SOURCE_APP_KEY=<DATADOG_APP_KEY> \
  -e DD_SOURCE_API_URL=<DATADOG_API_URL> \
  -e DD_DESTINATION_API_KEY=<DATADOG_API_KEY> \
  -e DD_DESTINATION_APP_KEY=<DATADOG_APP_KEY> \
  -e DD_DESTINATION_API_URL=<DATADOG_API_URL> \
  datadog-sync:latest <command> <options>

The docker run command mounts the specified working directory <PATH_TO_WORKING_DIR> into the container.

Usage

API URL

Available values for the source and destination API URLs are:

  • https://api.datadoghq.com
  • https://api.datadoghq.eu
  • https://api.us5.datadoghq.com
  • https://api.us3.datadoghq.com
  • https://api.ddog-gov.com
  • https://api.ap1.datadoghq.com

For all available regions, see Getting Started with Datadog Sites.

Filtering

Filtering is done on two levels: at the top resource level using the --resources option, and at the individual resource level using the --filter option.

Top resources level filtering

By default, all resources are imported, synced, and so on. If you would like to perform actions on a specific top-level resource, or a subset of resources, use the --resources option. For example, the command datadog-sync import --resources="dashboard_lists,dashboards" will import ALL dashboards and dashboard lists in your Datadog organization. The same flag applies to the other commands, as sketched below.
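A minimal sketch of carrying the same subset through the whole workflow (credential flags elided; the resource names come from the Supported resources table below):

$ datadog-sync import --resources="dashboard_lists,dashboards"
$ datadog-sync diffs --resources="dashboard_lists,dashboards"
$ datadog-sync sync --resources="dashboard_lists,dashboards"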

Per resource level filtering

Individual resources can be further filtered using the --filter option. For example, the command datadog-sync import --resources="dashboards,dashboard_lists" --filter='Type=dashboard_lists;Name=name;Value=My custom list' will import ALL dashboards and ONLY the dashboard lists whose name attribute equals My custom list.

The --filter option accepts a string made up of key=value pairs separated by ;.

--filter 'Type=<resource>;Name=<attribute_name>;Value=<attribute_value>;Operator=<operator>'

Available keys:

  • Type: Resource such as Monitors, Dashboards, and more. [required]
  • Name: Attribute key to filter on. This can be any attribute represented in dot notation (such as attributes.user_count). [required]
  • Value: Regex to filter attribute value by. Note: special regex characters need to be escaped if filtering by raw string. [required]
  • Operator: Available operators are listed below. Any invalid operator defaults to ExactMatch.

By default, if multiple filters are passed for the same resource, OR logic is applied to the filters. This behavior can be adjusted using the --filter-operator option.
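For example, a sketch of combining two filters with AND logic (assuming AND is an accepted value for --filter-operator, which is not documented elsewhere in this README; the filter values are hypothetical):

# Import only monitors matching BOTH filters, instead of the default OR
$ datadog-sync import \
    --resources="monitors" \
    --filter='Type=monitors;Name=name;Value=prod' \
    --filter='Type=monitors;Name=tags;Value=team:sre' \
    --filter-operator="AND"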

SubString and ExactMatch Deprecation

In future releases the SubString and ExactMatch operators will be removed. This is because the Value key supports regex, so both scenarios are covered by writing the appropriate regex. Below is an example:

Take the scenario where you would like to filter for monitors whose name attribute contains the string filter test:

Operator Command
SubString --filter 'Type=monitors;Name=name;Value=filter test;Operator=SubString'
Using Value --filter 'Type=monitors;Name=name;Value=.*filter test.*'
ExactMatch --filter 'Type=monitors;Name=name;Value=filter test;Operator=ExactMatch'
Using Value --filter 'Type=monitors;Name=name;Value=^filter test$'

Config file

A custom config text file can be passed in place of command-line options.

This is an example config file:

# config

destination_api_url="https://api.datadoghq.eu"
destination_api_key="<API_KEY>"
destination_app_key="<APP_KEY>"
source_api_key="<API_KEY>"
source_app_key="<APP_KEY>"
source_api_url="https://api.datadoghq.com"
filter=["Type=Dashboards;Name=title;Value=Test screenboard", "Type=Monitors;Name=tags;Value=sync:true"]

Then, run: datadog-sync import --config config
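Credentials can also be supplied via environment variables. A sketch, assuming the CLI reads the same DD_* variables that the Docker image above uses (this README only documents them for the container):

$ export DD_SOURCE_API_KEY="<API_KEY>"
$ export DD_SOURCE_APP_KEY="<APP_KEY>"
$ export DD_SOURCE_API_URL="https://api.datadoghq.com"
$ datadog-sync import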

Cleanup flag

The tool's sync command provides a cleanup flag (--cleanup). Passing the cleanup flag will delete resources from the destination organization that have been removed from the source organization. The resources to be deleted are determined from the difference between the state files of the source and destination organizations.

For example, suppose ResourceA and ResourceB are imported and synced, and ResourceA is then deleted from the source organization. Running the import command will update the source organization's state file to include only ResourceB. A subsequent sync --cleanup=Force command will then delete ResourceA from the destination organization, as sketched below.
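A minimal sketch of that sequence (credential flags elided; the resource subset is an example):

# Re-import so the local state file reflects the source org again
$ datadog-sync import --resources="monitors"
# Push the state and delete resources that disappeared from the source
$ datadog-sync sync --resources="monitors" --cleanup=Force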

State files

A resources directory is generated in the user's current working directory. This directory contains the JSON mapping of resources between the source and destination organizations. To avoid duplication and loss of mappings, this directory should be retained between tool runs.

When running against multiple destination organizations, a separate working directory should be used per destination to ensure separation of data, as sketched below.
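For example, one working directory (and hence one resources/ state) per destination; the directory and config file names here are hypothetical:

$ mkdir -p sync-to-eu && cd sync-to-eu && datadog-sync import --config ../config-eu && cd ..
$ mkdir -p sync-to-us5 && cd sync-to-us5 && datadog-sync import --config ../config-us5 && cd ..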

Supported resources

Resource Description
authn_mappings Sync Datadog authn mappings.
dashboard_lists Sync Datadog dashboard lists.
dashboards Sync Datadog dashboards.
downtime_schedules Sync Datadog downtimes.
downtimes (deprecated) Sync Datadog downtimes.
host_tags Sync Datadog host tags.
logs_custom_pipelines (deprecated) Sync Datadog logs custom pipelines.
logs_indexes Sync Datadog logs indexes.
logs_indexes_order Sync Datadog logs indexes order.
logs_metrics Sync Datadog logs metrics.
logs_pipelines Sync Datadog logs OOTB integration and custom pipelines.
logs_pipelines_order Sync Datadog logs pipelines order.
logs_restriction_queries Sync Datadog logs restriction queries.
metric_percentiles Sync Datadog metric percentiles.
metric_tag_configurations Sync Datadog metric tag configurations.
metrics_metadata Sync Datadog metric metadata.
monitors Sync Datadog monitors.
notebooks Sync Datadog notebooks.
powerpacks Sync Datadog powerpacks.
restriction_policies Sync Datadog restriction policies.
roles Sync Datadog roles.
service_level_objectives Sync Datadog SLOs.
slo_corrections Sync Datadog SLO corrections.
spans_metrics Sync Datadog spans metrics.
synthetics_global_variables Sync Datadog synthetic global variables.
synthetics_private_locations Sync Datadog synthetic private locations.
synthetics_tests Sync Datadog synthetic tests.
teams Sync Datadog teams (excluding users and permissions).
users Sync Datadog users.

Note: the logs_custom_pipelines resource has been deprecated in favor of the logs_pipelines resource, which supports both logs OOTB integration pipelines and custom pipelines. To migrate to the new resource, rename the existing state files from logs_custom_pipelines.json to logs_pipelines.json for both the source and destination files, as sketched below.
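A sketch of that rename, assuming the default resources/ layout described in the State files section:

$ mv resources/source/logs_custom_pipelines.json resources/source/logs_pipelines.json
$ mv resources/destination/logs_custom_pipelines.json resources/destination/logs_pipelines.json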

Best practices

Many Datadog resources are interdependent. For example, a dashboard can reference roles and contain widgets that use Monitors or Synthetics data. The datadog-sync tool syncs these resources in dependency order to ensure references are not broken.

If importing/syncing a subset of resources, ensure that dependent resources are imported and synced as well.

See the Resource Dependencies table below for potential resource dependencies; an example command follows the table.

Resource Dependencies
authn_mappings roles, teams
dashboard_lists dashboards
dashboards monitors, roles, powerpacks, service_level_objectives
downtime_schedules monitors
downtimes (deprecated) monitors
host_tags -
logs_custom_pipelines (deprecated) -
logs_indexes -
logs_indexes_order logs_indexes
logs_metrics -
logs_pipelines -
logs_pipelines_order logs_pipelines
logs_restriction_queries roles
metric_percentiles -
metric_tag_configurations -
metrics_metadata -
monitors roles, service_level_objectives
notebooks -
powerpacks monitors, service_level_objectives
restriction_policies dashboards, service_level_objectives, notebooks, users, roles
roles -
service_level_objectives monitors, synthetics_tests
slo_corrections service_level_objectives
spans_metrics -
synthetics_global_variables synthetics_tests
synthetics_private_locations -
synthetics_tests synthetics_private_locations, synthetics_global_variables, roles
teams -
users roles
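
For example, to import dashboards together with their direct dependencies per the table above, a sketch (credential flags elided; transitive dependencies such as synthetics_tests for service_level_objectives may also be needed):

$ datadog-sync import \
    --resources="roles,monitors,service_level_objectives,powerpacks,dashboards"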

datadog-sync-cli's People

Contributors

abbasalizaidi, alai97, aldrickdev, github-actions[bot], jdheyburn, nkzou, nouemankhal, skarimo, therve, tim-chaplin-dd


datadog-sync-cli's Issues

Docker File - exit status 1: python setup.py egg_info

Overview of issue:

Steps to reproduce the issue:

Clone the project repo and cd into the directory: git clone https://github.com/DataDog/datadog-sync-cli.git; cd datadog-sync-cli
Build the provided Dockerfile: docker build . -t datadog-sync
Followed from https://github.com/DataDog/datadog-sync-cli

Describe the results you received:

=> ERROR [3/4] RUN pip install /datadog-sync-cli 3.3s

[3/4] RUN pip install /datadog-sync-cli:
#0 1.260 WARNING: Value for scheme.headers does not match. Please report this to pypa/pip#9617
#0 1.260 distutils: /usr/local/include/python3.9/UNKNOWN
#0 1.260 sysconfig: /usr/local/include/python3.9
#0 1.260 WARNING: Additional context:
#0 1.260 user = False
#0 1.260 home = None
#0 1.260 root = None
#0 1.260 prefix = None
#0 1.306 Processing /datadog-sync-cli
#0 1.306 DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default.
#0 1.306 pip 21.3 will remove support for this functionality. You can find discussion regarding this at pypa/pip#7555.
#0 2.984 ERROR: Command errored out with exit status 1:
#0 2.984 command: /usr/local/bin/python -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-req-build-7_t1vd4g/setup.py'"'"'; file='"'"'/tmp/pip-req-build-7_t1vd4g/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(file) if os.path.exists(file) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-z1paamdz
#0 2.984 cwd: /tmp/pip-req-build-7_t1vd4g/
#0 2.984 Complete output (51 lines):
#0 2.984 running egg_info
#0 2.984 creating /tmp/pip-pip-egg-info-z1paamdz/datadog_sync_cli.egg-info
#0 2.984 writing /tmp/pip-pip-egg-info-z1paamdz/datadog_sync_cli.egg-info/PKG-INFO
#0 2.984 writing dependency_links to /tmp/pip-pip-egg-info-z1paamdz/datadog_sync_cli.egg-info/dependency_links.txt
#0 2.984 writing entry points to /tmp/pip-pip-egg-info-z1paamdz/datadog_sync_cli.egg-info/entry_points.txt
#0 2.984 writing requirements to /tmp/pip-pip-egg-info-z1paamdz/datadog_sync_cli.egg-info/requires.txt
#0 2.984 writing top-level names to /tmp/pip-pip-egg-info-z1paamdz/datadog_sync_cli.egg-info/top_level.txt
#0 2.984 writing manifest file '/tmp/pip-pip-egg-info-z1paamdz/datadog_sync_cli.egg-info/SOURCES.txt'
#0 2.984 Traceback (most recent call last):
#0 2.984 File "", line 1, in
#0 2.984 File "/tmp/pip-req-build-7_t1vd4g/setup.py", line 8, in
#0 2.984 setup()
#0 2.984 File "/usr/local/lib/python3.9/site-packages/setuptools/init.py", line 153, in setup
#0 2.984 return distutils.core.setup(**attrs)
#0 2.984 File "/usr/local/lib/python3.9/distutils/core.py", line 148, in setup
#0 2.984 dist.run_commands()
#0 2.984 File "/usr/local/lib/python3.9/distutils/dist.py", line 966, in run_commands
#0 2.984 self.run_command(cmd)
#0 2.984 File "/usr/local/lib/python3.9/distutils/dist.py", line 985, in run_command
#0 2.984 File "/usr/local/lib/python3.9/distutils/dist.py", line 857, in get_command_obj
#0 2.984 klass = self.get_command_class(command)
#0 2.984 File "/usr/local/lib/python3.9/site-packages/setuptools/dist.py", line 790, in get_command_class
#0 2.984 self.cmdclass[command] = cmdclass = ep.load()
#0 2.984 File "/usr/local/lib/python3.9/site-packages/pkg_resources/init.py", line 2450, in load
#0 2.984 return self.resolve()
#0 2.984 File "/usr/local/lib/python3.9/site-packages/pkg_resources/init.py", line 2456, in resolve
#0 2.984 module = import(self.module_name, fromlist=['name'], level=0)
#0 2.984 ModuleNotFoundError: No module named 'setuptools.command.build'
#0 2.984 ----------------------------------------
#0 2.984 WARNING: Discarding file:///datadog-sync-cli. Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
#0 2.985 ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
#0 3.243 WARNING: You are using pip version 21.1; however, version 23.1.2 is available.
#0 3.243 You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.


Dockerfile:5

3 | # Install CLI
4 | COPY . /datadog-sync-cli
5 | >>> RUN pip install /datadog-sync-cli
6 |
7 | VOLUME ["/datadog-sync"]

ERROR: failed to solve: process "/bin/sh -c pip install /datadog-sync-cli" did not complete successfully: exit code: 1

Describe the results you expected:

The container image builds without errors.

Additional information you deem important (e.g. issue happens only occasionally):

datadog-sync doesn't import browser test steps

Overview of issue:
I'm trying to sync a browser test between two orgs. The datadog-sync CLI created the test on the destination organization, but without its associated steps.

Steps to reproduce the issue:

  1. Add the following filter to an external config file: filter=["Type=synthetics_tests;Name=public_id;Value=8j3-p4u-e35"]

  2. Use the parameter --resources synthetics_tests when invoking the datadog-sync import/sync command

  3. The full command is: datadog-sync sync --config config.ini --resources synthetics_tests -v

Describe the results you received:
Creation of a browser test without any steps associated with it.

Describe the results you expected:
Creation of a browser test with the correct steps associated with it.

Getting error while importing synthetics_tests that use global variables

Overview of issue:

When syncing Synthetics global variables, a new ID is generated in the destination, which causes a problem when syncing the Synthetics tests, as the tests refer to the original (source) ID and not the new one.

Steps to reproduce the issue:

  1. Create a global variable in Org A
  2. Create synthetics test and refer the Step 1 variable in the test
  3. Run import command for global variables [Refer to the Results Received section now]
  4. Run the sync command for global variables
  5. Run import command for synthetics tests [Refer to the Results Received section now]
  6. The synthetics tests sync fails, as it won't find the ID of the global variable it refers to

Describe the results you received:

  • In the source folder after the import command, the synthetics_global_variables.json file would contain the correct ID (from Org A)

  • In the destination folder after the import command, the synthetics_global_variables.json file would contain a new ID associated with each global variable.

Below is the error that I get while syncing the synthetics_tests [for every test that refers to global variables]:

Error Message: Error while creating resource synthetics_tests. source ID: xxx-xxx-xxx#12345678 - Error: 400 Bad Request - {"errors":["Global variable ids don't exist: xxxxxxxx-xxxxxxxx-xxxx-xxxxxxxxxx"]}

Describe the results you expected:

The destination synthetics_global_variables.json should ideally have the same ID, so that the synthetics_tests sync can refer to those variables.

Monitors not updating in destination

Overview of issue:
In the documentation, it is stated that the purpose is The source organization will not be modified, but the destination organization will have resources created and updated by the sync command. However, I do not see my monitor updating via this sync tool. I expect that in the source I have resourceA which the sync tool will then create in the destination as resourceB. If I make an edit to resourceA, I expect running the sync command to then update resourceB in my destination. But this does not happen.

Steps to reproduce the issue:

  1. Create resourceA in source
  2. Run sync tool to get equivalent (resourceB) in destination.
  3. Make an edit to the query of monitor in resourceA
  4. Run sync tool
  5. resourceB in destination is unchanged.

Describe the results you received:
resourceB does not update

Describe the results you expected:
I expect resourceB to update with the same changes that I made manually to resourceA

Additional information you deem important (e.g. issue happens only occasionally):

Timeout during import of monitors

Overview of issue:
Timeout during import of monitors

datadog-sync import --config config.txt -v --resources monitors --max-workers 1
2023-04-25 17:20:39,064 - INFO - Importing monitors
2023-04-25 17:20:39,066 - DEBUG - Starting new HTTPS connection (1): xxxxx.datadoghq.com:443
2023-04-25 17:21:09,461 - ERROR - Error while importing resources monitors: HTTPSConnectionPool(host='xxxxx.datadoghq.com', port=443): Read timed out. (read timeout=30)
2023-04-25 17:21:09,462 - INFO - Finished importing monitors: 0 successes, 0 errors

Steps to reproduce the issue:

  1. See command above

Describe the results you received:

  • None, due to timeout

Describe the results you expected:

  • Downloaded monitors

Additional information you deem important (e.g. issue happens only occasionally):

  • Problem occurs since 17.04.2023
  • Download of dashboards works fine

The author of migrated dashboards and notebooks is not the original author

Overview of issue:
The author of migrated dashboards and notebooks is not the original author but the owner of the destination API key.

Steps to reproduce the issue:

  1. Create some dashboards
  2. Import & Sync to destination datadog instance
  3. Check dashboard author fields

Describe the results you received:
All the dashboards are now created by the owner of the API key of the destination Datadog instance

Describe the results you expected:
The original author of each dashboard and notebook is kept.

Cannot connect to host error, ssl related

First timer with the tool.
Provided the credentials using a config file (as described in the README). Tried to pull the resources from Datadog using the command:

$ datadog-sync import --config config

but got the following error:

2024-06-24 13:28:03,544 - ERROR - error while validating api key: Cannot connect to host api.datadoghq.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)')]
Traceback (most recent call last):
  File "aiohttp/connector.py", line 992, in _wrap_create_connection
  File "asyncio/base_events.py", line 1149, in create_connection
  File "asyncio/base_events.py", line 1182, in _create_connection_transport
  File "asyncio/sslproto.py", line 578, in _on_handshake_complete
  File "asyncio/sslproto.py", line 560, in _do_handshake
  File "ssl.py", line 917, in do_handshake
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "datadog_sync/utils/configuration.py", line 67, in init_async
  File "datadog_sync/utils/configuration.py", line 194, in _validate_client
  File "datadog_sync/utils/configuration.py", line 188, in _validate_client
  File "datadog_sync/utils/custom_client.py", line 30, in wrapper
  File "aiohttp/client.py", line 1194, in __aenter__
  File "aiohttp/client.py", line 578, in _request
  File "aiohttp/connector.py", line 544, in connect
  File "aiohttp/connector.py", line 911, in _create_connection
  File "aiohttp/connector.py", line 1235, in _create_direct_connection
  File "aiohttp/connector.py", line 1204, in _create_direct_connection
  File "aiohttp/connector.py", line 994, in _wrap_create_connection
aiohttp.client_exceptions.ClientConnectorCertificateError: Cannot connect to host api.datadoghq.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)')]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "datadog_sync/commands/shared/utils.py", line 35, in run_cmd_async
  File "datadog_sync/utils/configuration.py", line 69, in init_async
AttributeError: 'Configuration' object has no attribute '_exit_cleanup'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "datadog_sync/cli.py", line 23, in <module>
  File "click/core.py", line 1157, in __call__
  File "click/core.py", line 1078, in main
  File "click/core.py", line 1688, in invoke
  File "click/core.py", line 1434, in invoke
  File "click/core.py", line 783, in invoke
  File "datadog_sync/commands/_import.py", line 18, in _import
  File "datadog_sync/commands/shared/utils.py", line 19, in run_cmd
  File "asyncio/runners.py", line 194, in run
  File "asyncio/runners.py", line 118, in run
  File "asyncio/base_events.py", line 687, in run_until_complete
  File "datadog_sync/commands/shared/utils.py", line 57, in run_cmd_async
AttributeError: 'NoneType' object has no attribute 'pbar'
[46242] Failed to execute script 'cli' due to unhandled exception!

Steps to reproduce the issue:

  1. Downloaded the datadog-sync-cli-darwin-arm64 file.
  2. Created the config file:
source_api_key="<API_KEY>"
source_app_key="<APP_KEY>"
source_api_url="https://api.datadoghq.com"
  3. Run the command:
$ datadog-sync import --config config

Running the tool on a MacBook Pro, M1 chip.
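
Not part of the original report, but a common workaround for this class of macOS certificate error is to point Python's SSL machinery at a CA bundle via the standard SSL_CERT_FILE variable; a sketch, assuming the tool honors it (unverified for this binary):

$ pip install certifi
$ export SSL_CERT_FILE="$(python -m certifi)"
$ datadog-sync import --config config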

How to use the --cleanup option

Overview of issue:
Hi, I'm trying to figure out how to use the --cleanup option. I'm assuming this is used to delete resources from the destination org that were previously synced?

Steps to reproduce the issue:

  1. Create config:
destination_api_url="https://api.datadoghq.com"
destination_api_key="redacted"
destination_app_key="redacted"
source_api_key="redacted"
source_app_key="redacted"
source_api_url="https://api.datadoghq.com"
resources="roles,dashboards,dashboard_lists,logs_custom_pipelines,logs_indexes,logs_metrics,logs_restriction_queries"
  2. Run import and sync
./datadog-sync-cli import --config config
./datadog-sync-cli sync --config config
  3. Clean up resources in the destination org
./datadog-sync-cli sync --config config --cleanup true

Describe the results you received:
Resources from destination org are not deleted. For example, I still see all synced dashboards still existing in the destination org.

Describe the results you expected:
Resources in destination org get cleaned up/deleted.

Additional information you deem important (e.g. issue happens only occasionally):
I do not see any output from the cli that indicates cleanup is being done; only the sync logs indicating resources are being created/updated. Let me know if I can add any additional info here.

CLI version: 0.3.0
OS: M1 Mac Ventura 13.3.1 (a)

Failed to build docker image

Failed to build the Docker image.

Steps to reproduce the issue:

  1. Clone the repository
  2. Run docker build . -t docker-sync

Errors

=> ERROR [3/4] RUN pip install /datadog-sync-cli 3.5s

[3/4] RUN pip install /datadog-sync-cli:
#6 1.063 Processing /datadog-sync-cli
#6 1.064 Preparing metadata (setup.py): started
#6 3.356 Preparing metadata (setup.py): finished with status 'error'
#6 3.359 error: subprocess-exited-with-error
#6 3.359
#6 3.359 × python setup.py egg_info did not run successfully.
#6 3.359 │ exit code: 1
#6 3.359 ╰─> [49 lines of output]
#6 3.359 /usr/local/lib/python3.11/site-packages/setuptools/installer.py:27: SetuptoolsDeprecationWarning: setuptools.installer is deprecated. Requirements should be satisfied by a PEP 517 installer.
#6 3.359 warnings.warn(
#6 3.359 Traceback (most recent call last):
#6 3.359 File "", line 2, in
#6 3.359 File "", line 34, in
#6 3.359 File "/datadog-sync-cli/setup.py", line 28, in
#6 3.359 setup(
#6 3.359 File "/usr/local/lib/python3.11/site-packages/setuptools/init.py", line 87, in setup
#6 3.359 return distutils.core.setup(**attrs)
#6 3.359 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#6 3.359 File "/usr/local/lib/python3.11/site-packages/setuptools/_distutils/core.py", line 147, in setup
#6 3.359 _setup_distribution = dist = klass(attrs)
#6 3.359 ^^^^^^^^^^^^
#6 3.359 File "/usr/local/lib/python3.11/site-packages/setuptools/dist.py", line 476, in init
#6 3.359 _Distribution.init(
#6 3.359 File "/usr/local/lib/python3.11/site-packages/setuptools/_distutils/dist.py", line 282, in init
#6 3.359 self.finalize_options()
#6 3.359 File "/usr/local/lib/python3.11/site-packages/setuptools/dist.py", line 900, in finalize_options
#6 3.359 ep(self)
#6 3.359 File "/usr/local/lib/python3.11/site-packages/setuptools/dist.py", line 920, in _finalize_setup_keywords
#6 3.359 ep.load()(self, ep.name, value)
#6 3.359 File "/datadog-sync-cli/.eggs/setuptools_scm-7.1.0-py3.11.egg/setuptools_scm/integration.py", line 91, in version_keyword
#6 3.359 _assign_version(dist, config)
#6 3.359 File "/datadog-sync-cli/.eggs/setuptools_scm-7.1.0-py3.11.egg/setuptools_scm/integration.py", line 60, in _assign_version
#6 3.359 maybe_version = _get_version(config)
#6 3.359 ^^^^^^^^^^^^^^^^^^^^
#6 3.359 File "/datadog-sync-cli/.eggs/setuptools_scm-7.1.0-py3.11.egg/setuptools_scm/init.py", line 153, in _get_version
#6 3.359 parsed_version = _do_parse(config)
#6 3.359 ^^^^^^^^^^^^^^^^^
#6 3.359 File "/datadog-sync-cli/.eggs/setuptools_scm-7.1.0-py3.11.egg/setuptools_scm/init.py", line 87, in _do_parse
#6 3.359 parse_result = _call_entrypoint_fn(config.absolute_root, config, config.parse)
#6 3.359 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#6 3.359 File "/datadog-sync-cli/.eggs/setuptools_scm-7.1.0-py3.11.egg/setuptools_scm/_entrypoints.py", line 49, in _call_entrypoint_fn
#6 3.359 return fn(root)
#6 3.359 ^^^^^^^^
#6 3.359 File "/datadog-sync-cli/setup.py", line 25, in parse_fetch_on_shallow
#6 3.359 return parse(root, pre_parse=fetch_on_shallow)
#6 3.359 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#6 3.359 File "/datadog-sync-cli/.eggs/setuptools_scm-7.1.0-py3.11.egg/setuptools_scm/git.py", line 179, in parse
#6 3.359 wd = get_working_directory(config)
#6 3.359 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#6 3.359 File "/datadog-sync-cli/.eggs/setuptools_scm-7.1.0-py3.11.egg/setuptools_scm/git.py", line 164, in get_working_directory
#6 3.359 return GitWorkdir.from_potential_worktree(config.absolute_root)
#6 3.359 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#6 3.359 File "/datadog-sync-cli/.eggs/setuptools_scm-7.1.0-py3.11.egg/setuptools_scm/git.py", line 54, in from_potential_worktree
#6 3.359 require_command(cls.COMMAND)
#6 3.359 File "/datadog-sync-cli/.eggs/setuptools_scm-7.1.0-py3.11.egg/setuptools_scm/utils.py", line 171, in require_command
#6 3.359 raise OSError("%r was not found" % name)
#6 3.359 OSError: 'git' was not found
#6 3.359 [end of output]
#6 3.359
#6 3.359 note: This error originates from a subprocess, and is likely not a problem with pip.
#6 3.360 error: metadata-generation-failed
#6 3.360
#6 3.360 × Encountered error while generating package metadata.
#6 3.360 ╰─> See above for output.
#6 3.360
#6 3.360 note: This is an issue with the package mentioned above, not pip.
#6 3.360 hint: See above for details.


executor failed running [/bin/sh -c pip install /datadog-sync-cli]: exit code: 1

Expected output
It should successfully build docker image

Note
Looks like a dependency, git, is missing.

logs_pipelines_order: how to use?

Overview of issue:
(I'm the customer that asked support for logs_pipelines_order, thanks for the work.)

We hit some issues using the new logs_pipelines_order resource; this issue is here to help future users, and maybe improve the documentation.

When running diffs before the first sync, we get an irrelevant warning:

2023-10-09 12:07:49,477 - INFO - Skipping resource: logs_pipelines_order with ID: logs-pipeline-order. Failed to connect resource. {'logs_pipelines': ['jVAX8t9cSTyi_d0-MUh1Jg', 'Sa3WHk3XTwmYYZ0OIR7nuQ', 'IG8_j7YlRfKx0SEvpmjVwg', 'fX2bzl2GQU6EZ0OQ4Jagsg', 'ztJVNwC1S9aBUpqrhUipoA', '6P7jGlyzQqmWYUyn_-F1lw', 'niwmTpGBRIa0SVWa4fU5_w', 'TNDMVe05RsiJmoHgFFbWCQ', 'I-8dmnN5Sk6hEUZdoT8Pdw', 'pAlQHQrvTGOTnrMdHsMVAA', '5At6jBy3QdiV_TSkzNA1kg', 'qZ9ZFLDfTMqGt19AWMxT9g', 'c02R-uDlSUqtNFN4yZcc3g', 'KXPC9KXYTjOV-an6gOhxsw', '52UKFMI5T8WePSxnzyBYHg', 'flyg8Z2iQ3qX3xilMLkaVw', '-eC5_z7ASnyRY74WJ8Zp7g', 'fPxzejH1T_qYqyYPEQqppw', 'tdAfnrdtRWmSYgSm1V_ewA', 'dNCvmHB2QUq6wT0tsinwYg', '763j0DksQ8qftv17JFORoQ', 'CIpxRivjQ2GG-rsunYLZgA', 'T2ZbdeZiSN64rGcx_hMj3A', 'jl9IrjFLTYeWytPhKHQY5Q', '7cKr42xYRaSKJEQ_Tv1N9w', 'ngPqhHLcT3CzlQnVpzLHZA', 'ku7hXF7cSnSlIhLTFw667g', 'wJQRgMvmRGyJl30P78aPTw', 'g5Cs5sr2SemXtMu9Kx87mQ', 'v8w8EypcQf6Egrubqsk5mQ', '4zK7IVI7SbOF5CVgKNgWSA', 'm1EY3k75R6W_DjRPVrA-SQ', 'e1aJMuvWQDC6cjfLILdlFA', 'ulZwKxd3TJGK4F17V3HiJQ', 'GcuKoWxhRJeJqEMEzsyZdA', 'OAusavmVRo2MDS2LwvBHFQ', 'gjs9S_ToRPeEKsT3I4rd9g', 'bxzFAkgeR3GYupJiC9C0Gw', 'CIgFm_mdT5uzKUROm0K6vQ', 'sSQhKUSxReO_ypgLSVvf2w', 'JyPx6rEFRkmKAFMBTAeMeA']}

Indeed, after a first sync (of logs_pipelines,logs_pipelines_order), we get a smaller list:

2023-10-09 12:13:00,580 - INFO - Skipping resource: logs_pipelines_order with ID: logs-pipeline-order. Failed to connect resource. {'logs_pipelines': ['6P7jGlyzQqmWYUyn_-F1lw', 'TNDMVe05RsiJmoHgFFbWCQ', 'I-8dmnN5Sk6hEUZdoT8Pdw', 'pAlQHQrvTGOTnrMdHsMVAA', 'c02R-uDlSUqtNFN4yZcc3g', '-eC5_z7ASnyRY74WJ8Zp7g', 'tdAfnrdtRWmSYgSm1V_ewA', '763j0DksQ8qftv17JFORoQ', 'CIpxRivjQ2GG-rsunYLZgA', 'T2ZbdeZiSN64rGcx_hMj3A', 'jl9IrjFLTYeWytPhKHQY5Q', '7cKr42xYRaSKJEQ_Tv1N9w', 'wJQRgMvmRGyJl30P78aPTw', 'g5Cs5sr2SemXtMu9Kx87mQ', '4zK7IVI7SbOF5CVgKNgWSA', 'm1EY3k75R6W_DjRPVrA-SQ', 'e1aJMuvWQDC6cjfLILdlFA', 'ulZwKxd3TJGK4F17V3HiJQ', 'GcuKoWxhRJeJqEMEzsyZdA', 'OAusavmVRo2MDS2LwvBHFQ', 'gjs9S_ToRPeEKsT3I4rd9g', 'bxzFAkgeR3GYupJiC9C0Gw', 'CIgFm_mdT5uzKUROm0K6vQ', 'JyPx6rEFRkmKAFMBTAeMeA']}

The diff between the two is the set of logs_pipelines created during the sync, so the diff is not relevant.

The logs_pipelines_order resource is skipped

We did a first sync for logs_pipelines, then for logs_pipelines_order:

$ datadog-sync sync --config config --max-workers 1 --verbose --resources="logs_pipelines_order"
2023-10-09 12:48:56,885 - DEBUG - Starting new HTTPS connection (1): api.datadoghq.eu:443
2023-10-09 12:48:56,960 - DEBUG - https://api.datadoghq.eu:443 "GET /api/v1/validate HTTP/1.1" 200 15
2023-10-09 12:48:56,961 - INFO - clients validated successfully
2023-10-09 12:48:57,090 - INFO - Starting sync...
2023-10-09 12:48:57,155 - DEBUG - https://api.datadoghq.eu:443 "GET /api/v1/logs/config/pipeline-order HTTP/1.1" 200 None
2023-10-09 12:48:57,157 - INFO - Skipping resource: logs_pipelines_order with ID: logs-pipeline-order. Failed to connect resource. {'logs_pipelines': ['6P7jGlyzQqmWYUyn_-F1lw', 'TNDMVe05RsiJmoHgFFbWCQ', 'I-8dmnN5Sk6hEUZdoT8Pdw', 'pAlQHQrvTGOTnrMdHsMVAA', 'c02R-uDlSUqtNFN4yZcc3g', '-eC5_z7ASnyRY74WJ8Zp7g', 'tdAfnrdtRWmSYgSm1V_ewA', '763j0DksQ8qftv17JFORoQ', 'CIpxRivjQ2GG-rsunYLZgA', 'T2ZbdeZiSN64rGcx_hMj3A', 'jl9IrjFLTYeWytPhKHQY5Q', '7cKr42xYRaSKJEQ_Tv1N9w', 'wJQRgMvmRGyJl30P78aPTw', 'g5Cs5sr2SemXtMu9Kx87mQ', '4zK7IVI7SbOF5CVgKNgWSA', 'm1EY3k75R6W_DjRPVrA-SQ', 'e1aJMuvWQDC6cjfLILdlFA', 'ulZwKxd3TJGK4F17V3HiJQ', 'GcuKoWxhRJeJqEMEzsyZdA', 'OAusavmVRo2MDS2LwvBHFQ', 'gjs9S_ToRPeEKsT3I4rd9g', 'bxzFAkgeR3GYupJiC9C0Gw', 'CIgFm_mdT5uzKUROm0K6vQ', 'JyPx6rEFRkmKAFMBTAeMeA']}
2023-10-09 12:48:57,162 - INFO - Finished sync: 0 successes, 0 errors

=> no sync.

After digging into the code, I understand we need to run sync with --skip-failed-resource-connections=false:

Those pipelines are out-of-the-box pipelines in the source organization that are not present in the destination organization (because logs for those services have not (yet) been emitted in the destination organization): they are skipped by logs_pipelines because they cannot be created via the API. logs_pipelines_order should also skip them.
By default, datadog-sync-cli skips a resource on resource connection failure; for this one special case/resource we want to continue (maybe force an override in the code?).

$ datadog-sync sync --config config.real --max-workers 1 --verbose --resources="logs_pipelines_order" --skip-failed-resource-connections=false
2023-10-09 12:51:56,084 - DEBUG - Starting new HTTPS connection (1): api.datadoghq.eu:443
2023-10-09 12:51:56,169 - DEBUG - https://api.datadoghq.eu:443 "GET /api/v1/validate HTTP/1.1" 200 15
2023-10-09 12:51:56,169 - INFO - clients validated successfully
2023-10-09 12:51:56,285 - INFO - Starting sync...
2023-10-09 12:51:56,354 - DEBUG - https://api.datadoghq.eu:443 "GET /api/v1/logs/config/pipeline-order HTTP/1.1" 200 None
2023-10-09 12:51:56,355 - WARNING - logs_pipelines_order with ID: logs-pipeline-order. Failed to connect resource. {'logs_pipelines': ['6P7jGlyzQqmWYUyn_-F1lw', 'TNDMVe05RsiJmoHgFFbWCQ', 'I-8dmnN5Sk6hEUZdoT8Pdw', 'pAlQHQrvTGOTnrMdHsMVAA', 'c02R-uDlSUqtNFN4yZcc3g', '-eC5_z7ASnyRY74WJ8Zp7g', 'tdAfnrdtRWmSYgSm1V_ewA', '763j0DksQ8qftv17JFORoQ', 'CIpxRivjQ2GG-rsunYLZgA', 'T2ZbdeZiSN64rGcx_hMj3A', 'jl9IrjFLTYeWytPhKHQY5Q', '7cKr42xYRaSKJEQ_Tv1N9w', 'wJQRgMvmRGyJl30P78aPTw', 'g5Cs5sr2SemXtMu9Kx87mQ', '4zK7IVI7SbOF5CVgKNgWSA', 'm1EY3k75R6W_DjRPVrA-SQ', 'e1aJMuvWQDC6cjfLILdlFA', 'ulZwKxd3TJGK4F17V3HiJQ', 'GcuKoWxhRJeJqEMEzsyZdA', 'OAusavmVRo2MDS2LwvBHFQ', 'gjs9S_ToRPeEKsT3I4rd9g', 'bxzFAkgeR3GYupJiC9C0Gw', 'CIgFm_mdT5uzKUROm0K6vQ', 'JyPx6rEFRkmKAFMBTAeMeA']}
2023-10-09 12:51:56,374 - INFO - Running update for logs_pipelines_order with logs-pipeline-order
2023-10-09 12:51:56,601 - DEBUG - https://api.datadoghq.eu:443 "PUT /api/v1/logs/config/pipeline-order HTTP/1.1" 200 None
2023-10-09 12:51:56,617 - INFO - Finished update for logs_pipelines_order with logs-pipeline-order
2023-10-09 12:51:56,622 - INFO - Finished sync: 1 successes, 0 errors

=> sync OK (although the WARNING is still there when re-running diffs or sync afterward)

Not compiled with python 3.9 or 3.10

Overview of issue:
The tool does not compile with Python 3.9 or 3.10.

Steps to reproduce the issue:

  1. Using python 3.9 or 3.10
  2. pip install .

Describe the results you received:
datadog-sync ModuleNotFoundError: No module named 'pyimod02_importers'

Describe the results you expected:
Successfully installed and working properly.

README/help output have unclear information about CLI options.

(to be fleshed out further)

Currently, the help output in the README demonstrates the CLI flags as being global options, regardless of subcommand. This is not the case, and trying to pass them as global options results in error messages about the options being unknown.

We should, ideally:

  • clean up the help output itself to show the command line usage as something closer to datadog-sync [global options] [command] [options] [args]
  • remove the help text in the README itself, since it's a lot of extra noise and is at risk of falling out of sync with the actual help output.

Unable to sync synthetic browser tests containing subtests

Overview of issue:
When trying to sync synthetic tests containing subtests you will receive an error stating invalid subtest-id.

Steps to reproduce the issue:

  1. Create a synthetic browser test with a different test as a subtest
  2. import + sync

Describe the results you received:
Tests without a subtest are being synced as expected.
Tests with a subtest are not being synced.

Describe the results you expected:
All tests are synced

Additional information you deem important (e.g. issue happens only occasionally):
I was able to complete the sync successfully by running it once, updating the "subtestPublicID" in resources/source/synthetics_tests.json with the new PublicID of the subtest on the destination and running the sync again.
The error you receive is the following:
Error while creating resource synthetics_tests. source ID: ***** - Error: 400 Bad Request - {"errors":["Invalid steps data: subtest id"]}
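
A sketch of that manual edit using jq (requires jq >= 1.6 for walk; the IDs and the state-file structure assumed here are hypothetical):

$ jq 'walk(if type == "object" and .subtestPublicID == "old-public-id"
           then .subtestPublicID = "new-public-id" else . end)' \
    resources/source/synthetics_tests.json > tmp.json \
  && mv tmp.json resources/source/synthetics_tests.json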

Document unsupported resources

The supported resources are documented here https://github.com/DataDog/datadog-sync-cli#supported-resources, but I would like to see a list of resources that a user might expect to be handled by this tool but aren't. I found that #189 lists some of these resources.

For example, log views are not migrated, but I think they would be a valid resource for this tool to migrate. However, things like logs and metrics are out of scope for this tool.

Ideally, this would be a table of all of the resources within Datadog, with a note for each of supported/not yet supported (link to GitHub issue)/out of scope.

Eg:

resource status
... ...
logs_indexes supported
logs_views not yet supported #xyz
logs out of scope
... ...
metrics out of scope
metric_tag_configurations supported
metric_metadatas not yet supported #xyz
... ...

Syncing resources multiple times creates duplicates.

Overview of issue:

Steps to reproduce the issue:

  1. Create dashboard in source org.
  2. Use datadog-sync-cli to import, diff, and sync the dashboard to the destination org
  3. Dashboard is created in the destination org.
  4. Rerun the import, diff, and sync commands and a duplicate dashboard is created in the destination org.

Describe the results you received:
Dashboard was duplicated in destination org when the sync command is rerun.

Describe the results you expected:
Dashboard should not be duplicated in destination org when the sync command is rerun

Additional information you deem important (e.g. issue happens only occasionally):

ModuleNotFoundError: No module named 'pyimod02_importers'

Overview of issue:
Using the Windows release version 0.5.0, we get the following error when trying to run:

PS C:\Users\Ewerton\Downloads> .\datadog-sync-cli-windows-amd64.exe
[20960] Module object for pyimod02_importers is NULL!
Traceback (most recent call last):
  File "PyInstaller\loader\pyimod02_importers.py", line 22, in <module>
  File "pathlib.py", line 14, in <module>
  File "urllib\parse.py", line 40, in <module>
ModuleNotFoundError: No module named 'ipaddress'
Traceback (most recent call last):
  File "PyInstaller\loader\pyiboot01_bootstrap.py", line 17, in <module>
ModuleNotFoundError: No module named 'pyimod02_importers'
[20960] Failed to execute script 'pyiboot01_bootstrap' due to unhandled exception!

Steps to reproduce the issue:

  1. Download the datadog-sync-cli-windows-amd64.exe from releases
  2. Run it with powershell

Describe the results you received:
Get the following error:

[29556] Module object for pyimod02_importers is NULL!
Traceback (most recent call last):
  File "PyInstaller\loader\pyimod02_importers.py", line 22, in <module>
  File "pathlib.py", line 14, in <module>
  File "urllib\parse.py", line 40, in <module>
ModuleNotFoundError: No module named 'ipaddress'
Traceback (most recent call last):
  File "PyInstaller\loader\pyiboot01_bootstrap.py", line 17, in <module>
ModuleNotFoundError: No module named 'pyimod02_importers'
[29556] Failed to execute script 'pyiboot01_bootstrap' due to unhandled exception!

Describe the results you expected:
Should see the help message

Additional information you deem important (e.g. issue happens only occasionally):
Tried on two different computers, same issue;
tried versions 0.5.0 and 0.4.2.

Running from source with python worked fine.
