Datadog cli tool to sync resources across organizations.
The purpose of the datadog-sync-cli package is to provide an easy way to sync Datadog resources across Datadog organizations.
Note: this tool is not intended for, nor is it capable of, migrating intake data such as ingested logs, metrics, etc.
The source organization will not be modified, but the destination organization will have resources created and updated by the sync command.
- Python >= v3.9
Resource | Description |
---|---|
roles | Sync Datadog roles. |
users | Sync Datadog users. |
synthetics_private_locations | Sync Datadog synthetics private locations. |
synthetics_tests | Sync Datadog synthetics tests. |
synthetics_global_variables | Sync Datadog synthetics global variables. |
monitors | Sync Datadog monitors. |
downtimes | Sync Datadog downtimes. |
service_level_objectives | Sync Datadog SLOs. |
slo_corrections | Sync Datadog SLO corrections. |
spans_metrics | Sync Datadog spans metrics. |
dashboards | Sync Datadog dashboards. |
dashboard_lists | Sync Datadog dashboard lists. |
logs_pipelines | Sync Datadog logs OOTB integration and custom pipelines. |
logs_pipelines_order | Sync Datadog logs pipelines order. |
logs_custom_pipelines (deprecated) | Sync Datadog logs custom pipelines. |
notebooks | Sync Datadog notebooks. |
host_tags | Sync Datadog host tags. |
logs_indexes | Sync Datadog logs indexes. |
logs_metrics | Sync Datadog logs metrics. |
logs_restriction_queries | Sync Datadog logs restriction queries. |
metric_tag_configurations | Sync Datadog metric tag configurations. |
Note: the `logs_custom_pipelines` resource has been deprecated in favor of the `logs_pipelines` resource, which supports both logs OOTB integration and custom pipelines. To migrate to the new resource, rename the existing state files from `logs_custom_pipelines.json` to `logs_pipelines.json` for both source and destination files.
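The rename step above can be sketched as follows; this assumes the tool's default state directory layout, with the source state under `resources/source` and the destination state under `resources/destination`.

```shell
# Rename the deprecated state files in place. The loop is a no-op for
# files that do not exist, so it is safe to run from any working directory.
for d in resources/source resources/destination; do
  if [ -f "$d/logs_custom_pipelines.json" ]; then
    mv "$d/logs_custom_pipelines.json" "$d/logs_pipelines.json"
  fi
done
```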
- Clone the project repo and cd into the directory
git clone https://github.com/DataDog/datadog-sync-cli.git; cd datadog-sync-cli
- Install datadog-sync-cli tool using pip
pip install .
- Invoke the cli tool using
datadog-sync <command> <options>
- Download the executable from releases page
- Give the downloaded file execute permission
chmod +x datadog-sync-cli-{system-name}-{machine-type}
- Move the executable to your bin directory
sudo mv datadog-sync-cli-{system-name}-{machine-type} /usr/local/bin/datadog-sync
- Invoke the cli tool using
datadog-sync <command> <options>
- Download the executable with the `.exe` extension from the releases page
- Add the directory containing the `.exe` file to your PATH
- Invoke the cli tool in cmd/powershell using the file name, omitting the extension
datadog-sync-cli-windows-amd64 <command> <options>
- Clone the project repo and cd into the directory
git clone https://github.com/DataDog/datadog-sync-cli.git; cd datadog-sync-cli
- Build the provided Dockerfile
docker build . -t datadog-sync
- Run the docker image using entrypoint below:
docker run --rm -v <PATH_TO_WORKING_DIR>:/datadog-sync:rw \
-e DD_SOURCE_API_KEY=<DATADOG_API_KEY> \
-e DD_SOURCE_APP_KEY=<DATADOG_APP_KEY> \
-e DD_SOURCE_API_URL=<DATADOG_API_URL> \
-e DD_DESTINATION_API_KEY=<DATADOG_API_KEY> \
-e DD_DESTINATION_APP_KEY=<DATADOG_APP_KEY> \
-e DD_DESTINATION_API_URL=<DATADOG_API_URL> \
datadog-sync:latest <command> <options>
Note: The docker run command above mounts the specified <PATH_TO_WORKING_DIR> working directory into the container.
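As an alternative to the inline `-e` flags, the same variables can be kept in an env file and passed with docker's `--env-file` option. This is only a sketch: the file name is arbitrary and the placeholder values must be replaced with real keys and the API URLs for your regions.

```shell
# Write the credentials to an env file (placeholders shown).
cat > datadog-sync.env <<'EOF'
DD_SOURCE_API_KEY=<DATADOG_API_KEY>
DD_SOURCE_APP_KEY=<DATADOG_APP_KEY>
DD_SOURCE_API_URL=https://api.datadoghq.com
DD_DESTINATION_API_KEY=<DATADOG_API_KEY>
DD_DESTINATION_APP_KEY=<DATADOG_APP_KEY>
DD_DESTINATION_API_URL=https://api.datadoghq.eu
EOF

# Then run the image with the env file instead of repeated -e flags:
# docker run --rm -v <PATH_TO_WORKING_DIR>:/datadog-sync:rw \
#   --env-file datadog-sync.env datadog-sync:latest <command> <options>
```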
Usage: datadog-sync COMMAND [OPTIONS]
Initialize cli
Options:
--source-api-key TEXT Datadog source organization API key. [required for import]
--source-app-key TEXT Datadog source organization APP key. [required for import]
--source-api-url TEXT Datadog source organization API url.
--destination-api-key TEXT Datadog destination organization API key. [required for sync/diffs]
--destination-app-key TEXT Datadog destination organization APP key. [required for sync/diffs]
--destination-api-url TEXT Datadog destination organization API url.
--validate BOOLEAN Enables validation of the provided API
during client initialization. On import,
only source api key is validated. On
sync/diffs, only destination api key is validated. [default: True]
--http-client-timeout INTEGER The HTTP request timeout period. Defaults to `30s`.
--http-client-retry-timeout INTEGER The HTTP request retry timeout period. Defaults to `60s`.
--resources TEXT Optional comma separated list of resources to
import. All supported resources are imported
by default. See [Filtering] section for more details.
--cleanup [True|False|Force] Cleanup resources from destination org. [default: False]
-v, --verbose Enable verbose logging.
--filter TEXT Filter resources. See [Filtering] section for more details.
--filter-operator TEXT Filter operator when multiple filters are passed. Supports `AND` or `OR`.
--config FILE Read configuration from FILE. See [Config] section for more details.
--max-workers INTEGER Max number of workers when running
operations in multi-threads. Defaults to the number of processors on the machine, multiplied by 5.
--skip-failed-resource-connections BOOLEAN Skip resource if resource connection fails. [default: True] [sync + import only]
--force-missing-dependencies Force importing and syncing resources that
could be potential dependencies to the
requested resources. [sync only]
--help Show this message and exit.
Commands:
diffs Log resource diffs.
import Import Datadog resources.
sync Sync Datadog resources to destination.
Available values for the source and destination API URLs are:
https://api.datadoghq.com
https://api.datadoghq.eu
https://api.us5.datadoghq.com
https://api.us3.datadoghq.com
https://api.ddog-gov.com
https://api.ap1.datadoghq.com
See https://docs.datadoghq.com/getting_started/site/ for all available regions.
Filtering is done on two levels: at the top resource level using `--resources`, and per individual resource using `--filter`.
By default all resources are imported, synced, etc. If you would like to perform actions on a specific top-level resource, or a subset of resources, use the `--resources` option. For example, the command `datadog-sync import --resources="dashboard_lists,dashboards"` will import ALL dashboards and dashboard lists in your Datadog organization.
Individual resources can be further filtered using the `--filter` flag. For example, the command `datadog-sync import --resources="dashboards,dashboard_lists" --filter='Type=dashboard_lists;Name=name;Value=My custom list'` will import ALL dashboards and ONLY dashboard lists with the `name` attribute equal to `My custom list`.
The filter option (`--filter`) accepts a string made up of `key=value` pairs separated by `;`:
--filter 'Type=<resource>;Name=<attribute_name>;Value=<attribute_value>;Operator=<operator>'
Available keys:
- `Type`: Resource e.g. Monitors, Dashboards, etc. [required]
- `Name`: Attribute key to filter on. This can be any attribute represented in dot notation (e.g. `attributes.user_count`). [required]
- `Value`: Regex to filter the attribute value by. Note: special regex characters need to be escaped if filtering by a raw string. [required]
- `Operator`: One of the operators below. All invalid operators default to `ExactMatch`.
  - `SubString`: Substring matching.
  - `ExactMatch`: Exact string match.
By default, if multiple filters are passed for the same resource, `OR` logic is applied to the filters. This behavior can be adjusted using the `--filter-operator` option.
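For instance, the hypothetical invocation below imports only monitors matching both filters by switching the operator to `AND`; the tag and name values are illustrative, not from the tool's documentation.

```shell
datadog-sync import --resources="monitors" \
  --filter='Type=Monitors;Name=tags;Value=team:sre' \
  --filter='Type=Monitors;Name=name;Value=latency;Operator=SubString' \
  --filter-operator="AND"
```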
A custom config text file can be passed in place of options. Example config file:
# config
destination_api_url="https://api.datadoghq.eu"
destination_api_key="<API_KEY>"
destination_app_key="<APP_KEY>"
source_api_key="<API_KEY>"
source_app_key="<APP_KEY>"
source_api_url="https://api.datadoghq.com"
filter=["Type=Dashboards;Name=title;Value=Test screenboard", "Type=Monitors;Name=tags;Value=sync:true"]
Usage: datadog-sync import --config config
The tool's `sync` command provides a cleanup flag (`--cleanup`). Passing the cleanup flag will delete resources from the destination organization which have been removed from the source organization. The resources to be deleted are determined based on the difference between the state files of the source and destination organization.
For example, suppose `ResourceA` and `ResourceB` are imported and synced, and `ResourceA` is then deleted from the source organization. Running the `import` command will update the source organization's state file to only include `ResourceB`. A following `sync --cleanup=Force` command will then delete `ResourceA` from the destination organization.
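The scenario above corresponds to the following command sequence (credentials omitted for brevity):

```shell
# Re-import so the source state file reflects the source org's current
# resources, then remove anything the destination has that the source lacks:
datadog-sync import
datadog-sync sync --cleanup=Force
```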
To use the tool, first run the `import` command, which reads the wanted items from the specified resources and saves them locally into JSON files in the `resources/source` directory. Then run the `sync` command, which uses that local cache (unless `--force-missing-dependencies` is passed) to create the resources on the destination organization, and saves locally what has been pushed.
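A typical end-to-end run might look like the sketch below; `diffs` is optional but useful for previewing what `sync` would change. Keys are elided and the resource selection is illustrative.

```shell
export DD_SOURCE_API_KEY=... DD_SOURCE_APP_KEY=...
export DD_DESTINATION_API_KEY=... DD_DESTINATION_APP_KEY=...

datadog-sync import --resources="monitors,dashboards"  # writes resources/source
datadog-sync diffs                                     # preview pending changes
datadog-sync sync                                      # create/update on destination
```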
Many Datadog resources are interdependent. For example, the Users resource can reference Roles, and Dashboards can include widgets which use Monitors or Synthetics tests. The datadog-sync tool syncs these resources in an order that ensures dependencies are not broken.
When importing/syncing a subset of resources, users should ensure that dependent resources are imported and synced as well.
See the supported resources table below for potential resource dependencies.
Resource | Dependencies |
---|---|
roles | - |
users | roles |
synthetics_private_locations | - |
synthetics_tests | synthetics_private_locations, synthetics_global_variables, roles |
synthetics_global_variables | synthetics_tests |
monitors | roles, service_level_objectives |
downtimes | monitors |
service_level_objectives | monitors, synthetics_tests |
slo_corrections | service_level_objectives |
spans_metrics | - |
dashboards | monitors, roles, service_level_objectives |
dashboard_lists | dashboards |
logs_pipelines | - |
logs_pipelines_order | logs_pipelines |
logs_custom_pipelines (deprecated) | - |
notebooks | - |
host_tags | - |
logs_indexes | - |
logs_metrics | - |
logs_restriction_queries | roles |
metric_tag_configurations | - |
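As a concrete example of the table above: to sync dashboards with their dependencies intact, either list the dependencies explicitly in the import, or rely on `--force-missing-dependencies` at sync time. The resource selection below is illustrative.

```shell
# Explicitly include dashboards' dependencies in the import…
datadog-sync import --resources="dashboards,monitors,roles,service_level_objectives"
# …or force missing dependencies to be imported and synced during sync:
datadog-sync sync --force-missing-dependencies
```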