
Terraform module to provision all the necessary infrastructure to deploy Datadog Lambda forwarders

Home Page: https://cloudposse.com/accelerate

License: Apache License 2.0

Makefile 4.40% HCL 88.52% Go 7.07%
aws aws-lambda datadog forwarder terraform terraform-module

terraform-aws-datadog-lambda-forwarder's Introduction

terraform-aws-datadog-lambda-forwarder


Terraform module to provision all the necessary infrastructure to deploy Datadog Lambda forwarders

Tip

👽 Use Atmos with Terraform

Cloud Posse uses atmos to easily orchestrate multiple environments using Terraform.
Works with GitHub Actions, Atlantis, or Spacelift.

Watch demo of using Atmos with Terraform
Example of running atmos to manage infrastructure from our Quick Start tutorial.

Usage

For a complete example, see examples/complete.

For automated tests of the complete example using bats and Terratest (which tests and deploys the example on AWS), see test.

To enable the Datadog forwarder for RDS Enhanced Monitoring:

module "datadog_lambda_forwarder" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # Cloud Posse recommends pinning every module to a specific version
  # version = "x.x.x"

  forwarder_rds_enabled = true
}

To enable the Datadog forwarder for a CloudTrail S3 bucket:

module "datadog_lambda_forwarder" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # Cloud Posse recommends pinning every module to a specific version
  # version = "x.x.x"

  forwarder_log_enabled = true
  s3_buckets            = ["cloudtrail-audit-bucket"]
  s3_bucket_kms_arns    = ["arn:aws:kms:us-west-2:1234567890:key/b204f3d2-1111-2222-94333332-4444ccc222"]
}

To enable the Datadog forwarder for an S3 bucket with a prefix:

module "datadog_lambda_forwarder" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # Cloud Posse recommends pinning every module to a specific version
  # version = "x.x.x"

  forwarder_log_enabled = true
  s3_buckets_with_prefixes = {
    MyBucketWithPrefix = {bucket_name = "my-bucket-with-prefix", bucket_prefix = "events/"}
    AnotherWithPrefix  = {bucket_name = "another-with-prefix", bucket_prefix = "records/"}
  }
  s3_bucket_kms_arns       = ["arn:aws:kms:us-west-2:1234567890:key/b204f3d2-1111-2222-94333332-4444ccc222"]
}

To enable the Datadog forwarder for RDS authentication CloudWatch logs:

module "datadog_lambda_forwarder" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # Cloud Posse recommends pinning every module to a specific version
  # version = "x.x.x"

  forwarder_log_enabled = true
  cloudwatch_forwarder_log_groups = {
    postgres = {
      name           = "/aws/rds/cluster/pg-main/postgresql"
      filter_pattern = ""
    }
  }
}

To enable the Datadog forwarder for VPC Flow Logs in CloudWatch:

module "datadog_lambda_forwarder" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # Cloud Posse recommends pinning every module to a specific version
  # version = "x.x.x"

  forwarder_vpc_logs_enabled   = true
  vpclogs_cloudwatch_log_group = "/aws/vpc/flowlogs/vpc1"
}

To use a local copy of the Lambda code, you can specify the artifact URL:

module "datadog_lambda_forwarder" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # Cloud Posse recommends pinning every module to a specific version
  # version = "x.x.x"

  forwarder_rds_enabled      = true
  forwarder_rds_artifact_url = file("${path.module}/function.zip")
}

Important

In Cloud Posse's examples, we avoid pinning modules to specific versions to prevent discrepancies between the documentation and the latest released versions. However, for your own projects, we strongly advise pinning each module to the exact version you're using. This practice ensures the stability of your infrastructure. Additionally, we recommend implementing a systematic approach for updating versions to avoid unexpected changes.

Examples

Here is an example of using this module:
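A minimal example (values here are illustrative; see examples/complete for the full, tested configuration):

module "datadog_lambda_forwarder" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # Cloud Posse recommends pinning every module to a specific version
  # version = "x.x.x"

  forwarder_log_enabled = true

  # The SSM parameter name below is a placeholder
  dd_api_key_source = {
    resource   = "ssm"
    identifier = "/datadog/api-key"
  }

  cloudwatch_forwarder_log_groups = {
    postgres = {
      name           = "/aws/rds/cluster/pg-main/postgresql"
      filter_pattern = ""
    }
  }
}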

Makefile Targets

Available targets:

  help                                Help screen
  help/all                            Display help for all targets
  help/short                          This help short screen
  lint                                Lint terraform code

Requirements

Name Version
terraform >= 1.3.0
archive >= 2.2.0
aws >= 3.0

Providers

Name Version
archive >= 2.2.0
aws >= 3.0

Modules

Name Source Version
cloudwatch_event cloudposse/cloudwatch-events/aws 0.6.1
forwarder_log_artifact cloudposse/module-artifact/external 0.8.0
forwarder_log_label cloudposse/label/null 0.25.0
forwarder_log_s3_label cloudposse/label/null 0.25.0
forwarder_rds_artifact cloudposse/module-artifact/external 0.8.0
forwarder_rds_label cloudposse/label/null 0.25.0
forwarder_vpclogs_artifact cloudposse/module-artifact/external 0.8.0
forwarder_vpclogs_label cloudposse/label/null 0.25.0
this cloudposse/label/null 0.25.0

Resources

Name Type
aws_cloudwatch_log_group.forwarder_log resource
aws_cloudwatch_log_group.forwarder_rds resource
aws_cloudwatch_log_group.forwarder_vpclogs resource
aws_cloudwatch_log_subscription_filter.cloudwatch_log_subscription_filter resource
aws_cloudwatch_log_subscription_filter.datadog_log_subscription_filter_rds resource
aws_cloudwatch_log_subscription_filter.datadog_log_subscription_filter_vpclogs resource
aws_iam_policy.datadog_custom_policy resource
aws_iam_policy.lambda_forwarder_log resource
aws_iam_policy.lambda_forwarder_log_s3 resource
aws_iam_policy.lambda_forwarder_rds resource
aws_iam_policy.lambda_forwarder_vpclogs resource
aws_iam_role.lambda_forwarder_log resource
aws_iam_role.lambda_forwarder_rds resource
aws_iam_role.lambda_forwarder_vpclogs resource
aws_iam_role_policy_attachment.datadog_s3 resource
aws_iam_role_policy_attachment.lambda_forwarder_log resource
aws_iam_role_policy_attachment.lambda_forwarder_rds resource
aws_iam_role_policy_attachment.lambda_forwarder_vpclogs resource
aws_lambda_function.forwarder_log resource
aws_lambda_function.forwarder_rds resource
aws_lambda_function.forwarder_vpclogs resource
aws_lambda_permission.allow_eventbridge resource
aws_lambda_permission.allow_s3_bucket resource
aws_lambda_permission.cloudwatch_enhanced_rds_monitoring resource
aws_lambda_permission.cloudwatch_groups resource
aws_lambda_permission.cloudwatch_vpclogs resource
aws_s3_bucket_notification.s3_bucket_notification resource
aws_s3_bucket_notification.s3_bucket_notification_with_prefixes resource
archive_file.forwarder_rds data source
archive_file.forwarder_vpclogs data source
aws_caller_identity.current data source
aws_iam_policy_document.assume_role data source
aws_iam_policy_document.lambda_default data source
aws_iam_policy_document.s3_log_bucket data source
aws_partition.current data source
aws_region.current data source
aws_ssm_parameter.api_key data source

Inputs

Name Description Type Default Required
additional_tag_map Additional key-value pairs to add to each map in tags_as_list_of_maps. Not added to tags or id.
This is for some rare cases where resources want additional configuration of tags
and therefore take a list of maps with tag key, value, and additional configuration.
map(string) {} no
api_key_ssm_arn ARN of the SSM parameter for the Datadog API key.
Passing this removes the need to fetch the key from the SSM parameter store.
This could be the case if the SSM Key is in a different region than the lambda.
string null no
attributes ID element. Additional attributes (e.g. workers or cluster) to add to id,
in the order they appear in the list. New attributes are appended to the
end of the list. The elements of the list are joined by the delimiter
and treated as a single ID element.
list(string) [] no
cloudwatch_forwarder_event_patterns Map of title => CloudWatch Event patterns to forward to Datadog. Event structure from here: https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/CloudWatchEventsandEventPatterns.html#CloudWatchEventsPatterns
Example:
cloudwatch_forwarder_event_patterns = {
  "guardduty" = {
    source      = ["aws.guardduty"]
    detail-type = ["GuardDuty Finding"]
  }
  "ec2-terminated" = {
    source      = ["aws.ec2"]
    detail-type = ["EC2 Instance State-change Notification"]
    detail = {
      state = ["terminated"]
    }
  }
}
map(object({
  version     = optional(list(string))
  id          = optional(list(string))
  detail-type = optional(list(string))
  source      = optional(list(string))
  account     = optional(list(string))
  time        = optional(list(string))
  region      = optional(list(string))
  resources   = optional(list(string))
  detail      = optional(map(list(string)))
}))
{} no
cloudwatch_forwarder_log_groups Map of CloudWatch Log Groups with a filter pattern that the Lambda forwarder will send logs from. For example: { mysql1 = { name = "/aws/rds/maincluster", filter_pattern = "" } }
map(object({
  name           = string
  filter_pattern = string
}))
{} no
context Single object for setting entire context at once.
See description of individual variables for details.
Leave string and numeric variables as null to use default value.
Individual variable settings (non-null) override settings in context object,
except for attributes, tags, and additional_tag_map, which are merged.
any
{
"additional_tag_map": {},
"attributes": [],
"delimiter": null,
"descriptor_formats": {},
"enabled": true,
"environment": null,
"id_length_limit": null,
"label_key_case": null,
"label_order": [],
"label_value_case": null,
"labels_as_tags": [
"unset"
],
"name": null,
"namespace": null,
"regex_replace_chars": null,
"stage": null,
"tags": {},
"tenant": null
}
no
datadog_forwarder_lambda_environment_variables Map of environment variables to pass to the Lambda Function map(string) {} no
dd_api_key_kms_ciphertext_blob CiphertextBlob stored in environment variable DD_KMS_API_KEY used by the lambda function, along with the KMS key, to decrypt Datadog API key string "" no
dd_api_key_source One of: ARN for AWS Secrets Manager (asm) to retrieve the Datadog (DD) api key, ARN for the KMS (kms) key used to decrypt the ciphertext_blob of the api key, or the name of the SSM (ssm) parameter used to retrieve the Datadog API key
object({
resource = string
identifier = string
})
{
"identifier": "",
"resource": ""
}
no
dd_artifact_filename The Datadog artifact filename minus extension string "aws-dd-forwarder" no
dd_forwarder_version Version tag of Datadog lambdas to use. https://github.com/DataDog/datadog-serverless-functions/releases string "3.39.0" no
dd_module_name The Datadog GitHub repository name string "datadog-serverless-functions" no
dd_tags A list of Datadog tags to apply to all logs forwarded to Datadog list(string) [] no
dd_tags_map A map of Datadog tags to apply to all logs forwarded to Datadog. This will override dd_tags. map(string) {} no
delimiter Delimiter to be used between ID elements.
Defaults to - (hyphen). Set to "" to use no delimiter at all.
string null no
descriptor_formats Describe additional descriptors to be output in the descriptors output map.
Map of maps. Keys are names of descriptors. Values are maps of the form
{ format = string, labels = list(string) }
(Type is any so the map values can later be enhanced to provide additional options.)
format is a Terraform format string to be passed to the format() function.
labels is a list of labels, in order, to pass to format() function.
Label values will be normalized before being passed to format() so they will be
identical to how they appear in id.
Default is {} (descriptors output will be empty).
any {} no
enabled Set to false to prevent the module from creating any resources bool null no
environment ID element. Usually used for region e.g. 'uw2', 'us-west-2', OR role 'prod', 'staging', 'dev', 'UAT' string null no
forwarder_iam_path Path to the IAM roles and policies created string "/" no
forwarder_lambda_datadog_host Datadog Site to send data to. Possible values are datadoghq.com, datadoghq.eu, us3.datadoghq.com, us5.datadoghq.com and ddog-gov.com string "datadoghq.com" no
forwarder_lambda_debug_enabled Whether to enable or disable debug for the Lambda forwarder bool false no
forwarder_log_artifact_url The URL for the code of the Datadog forwarder for Logs. It can be a local file, URL or git repo string null no
forwarder_log_enabled Flag to enable or disable Datadog log forwarder bool false no
forwarder_log_layers List of Lambda Layer Version ARNs (maximum of 5) to attach to Datadog log forwarder lambda function list(string) [] no
forwarder_log_retention_days Number of days to retain Datadog forwarder lambda execution logs. One of [0 1 3 5 7 14 30 60 90 120 150 180 365 400 545 731 1827 3653] number 14 no
forwarder_rds_artifact_url The URL for the code of the Datadog forwarder for RDS. It can be a local file, url or git repo string null no
forwarder_rds_enabled Flag to enable or disable Datadog RDS enhanced monitoring forwarder bool false no
forwarder_rds_filter_pattern Filter pattern for Lambda forwarder RDS string "" no
forwarder_rds_layers List of Lambda Layer Version ARNs (maximum of 5) to attach to Datadog RDS enhanced monitoring lambda function list(string) [] no
forwarder_vpc_logs_artifact_url The URL for the code of the Datadog forwarder for VPC Logs. It can be a local file, url or git repo string null no
forwarder_vpc_logs_enabled Flag to enable or disable Datadog VPC flow log forwarder bool false no
forwarder_vpc_logs_layers List of Lambda Layer Version ARNs (maximum of 5) to attach to Datadog VPC flow log forwarder lambda function list(string) [] no
forwarder_vpclogs_filter_pattern Filter pattern for Lambda forwarder VPC Logs string "" no
id_length_limit Limit id to this many characters (minimum 6).
Set to 0 for unlimited length.
Set to null to keep the existing setting, which defaults to 0.
Does not affect id_full.
number null no
kms_key_id Optional KMS key ID to encrypt Datadog Lambda function logs string null no
label_key_case Controls the letter case of the tags keys (label names) for tags generated by this module.
Does not affect keys of tags passed in via the tags input.
Possible values: lower, title, upper.
Default value: title.
string null no
label_order The order in which the labels (ID elements) appear in the id.
Defaults to ["namespace", "environment", "stage", "name", "attributes"].
You can omit any of the 6 labels ("tenant" is the 6th), but at least one must be present.
list(string) null no
label_value_case Controls the letter case of ID elements (labels) as included in id,
set as tag values, and output by this module individually.
Does not affect values of tags passed in via the tags input.
Possible values: lower, title, upper and none (no transformation).
Set this to title and set delimiter to "" to yield Pascal Case IDs.
Default value: lower.
string null no
labels_as_tags Set of labels (ID elements) to include as tags in the tags output.
Default is to include all labels.
Tags with empty values will not be included in the tags output.
Set to [] to suppress all generated tags.
Notes:
The value of the name tag, if included, will be the id, not the name.
Unlike other null-label inputs, the initial setting of labels_as_tags cannot be
changed in later chained modules. Attempts to change it will be silently ignored.
set(string)
[
"default"
]
no
lambda_custom_policy_name Additional IAM policy document that can optionally be passed and merged with the created policy document string "DatadogForwarderCustomPolicy" no
lambda_memory_size Amount of memory in MB your Lambda Function can use at runtime number 128 no
lambda_policy_source_json Additional IAM policy document that can optionally be passed and merged with the created policy document string "" no
lambda_reserved_concurrent_executions Amount of reserved concurrent executions for the lambda function. A value of 0 disables Lambda from being triggered and -1 removes any concurrency limitations. Defaults to Unreserved Concurrency Limits -1 number -1 no
lambda_runtime Runtime environment for Datadog Lambda string "python3.11" no
lambda_timeout Amount of time your Datadog Lambda Function has to run in seconds number 120 no
log_permissions_boundary ARN of the policy that is used to set the permissions boundary for the lambda-log role managed by this module. string null no
name ID element. Usually the component or solution name, e.g. 'app' or 'jenkins'.
This is the only ID element not also included as a tag.
The "name" tag is set to the full id string. There is no tag with the value of the name input.
string null no
namespace ID element. Usually an abbreviation of your organization name, e.g. 'eg' or 'cp', to help ensure generated IDs are globally unique string null no
rds_permissions_boundary ARN of the policy that is used to set the permissions boundary for the lambda-rds role managed by this module. string null no
regex_replace_chars Terraform regular expression (regex) string.
Characters matching the regex will be removed from the ID elements.
If not set, "/[^a-zA-Z0-9-]/" is used to remove all characters other than hyphens, letters and digits.
string null no
s3_bucket_kms_arns List of KMS key ARNs for S3 bucket encryption list(string) [] no
s3_buckets The names of S3 buckets to forward logs to Datadog list(string) [] no
s3_buckets_with_prefixes The names of S3 buckets and prefixes to forward logs to Datadog map(object({ bucket_name : string, bucket_prefix : string })) {} no
security_group_ids List of security group IDs to use when the Lambda Function runs in a VPC list(string) null no
stage ID element. Usually used to indicate role, e.g. 'prod', 'staging', 'source', 'build', 'test', 'deploy', 'release' string null no
subnet_ids List of subnet IDs to use when deploying the Lambda Function in a VPC list(string) null no
tags Additional tags (e.g. {'BusinessUnit': 'XYZ'}).
Neither the tag keys nor the tag values will be modified by this module.
map(string) {} no
tenant ID element _(Rarely used, not included by default)_. A customer identifier, indicating who this instance of a resource is for string null no
tracing_config_mode Can be either PassThrough or Active. If PassThrough, Lambda will only trace the request from an upstream service if it contains a tracing header with 'sampled=1'. If Active, Lambda will respect any tracing header it receives from an upstream service string "PassThrough" no
vpc_logs_permissions_boundary ARN of the policy that is used to set the permissions boundary for the lambda-vpc-logs role managed by this module. string null no
vpclogs_cloudwatch_log_group The name of the CloudWatch Log Group for VPC flow logs string null no

Outputs

Name Description
lambda_forwarder_log_function_arn Datadog Lambda forwarder CloudWatch/S3 function ARN
lambda_forwarder_log_function_name Datadog Lambda forwarder CloudWatch/S3 function name
lambda_forwarder_rds_enhanced_monitoring_function_name Datadog Lambda forwarder RDS Enhanced Monitoring function name
lambda_forwarder_rds_function_arn Datadog Lambda forwarder RDS Enhanced Monitoring function ARN
lambda_forwarder_vpc_log_function_arn Datadog Lambda forwarder VPC Flow Logs function ARN
lambda_forwarder_vpc_log_function_name Datadog Lambda forwarder VPC Flow Logs function name

Related Projects

Check out these related projects.

  • terraform-null-label - Terraform module designed to generate consistent names and tags for resources. Use terraform-null-label to implement a strict naming convention.

References

For additional context, refer to some of these links.

  • Terraform Standard Module Structure - HashiCorp's standard module structure is a file and directory layout we recommend for reusable modules distributed in separate repositories.
  • Terraform Module Requirements - HashiCorp's guidance on all the requirements for publishing a module. Meeting the requirements for publishing a module is extremely easy.
  • Terraform random_integer Resource - The resource random_integer generates random values from a given range, described by the min and max attributes of a given resource.
  • Terraform Version Pinning - The required_version setting can be used to constrain which versions of the Terraform CLI can be used with your configuration

Tip

Use Terraform Reference Architectures for AWS

Use Cloud Posse's ready-to-go terraform architecture blueprints for AWS to get up and running quickly.

✅ We build it with you.
✅ You own everything.
✅ Your team wins.

Request Quote

📚 Learn More

Cloud Posse is the leading DevOps Accelerator for funded startups and enterprises.

Your team can operate like a pro today.

Ensure that your team succeeds by using Cloud Posse's proven process and turnkey blueprints. Plus, we stick around until you succeed.

Day-0: Your Foundation for Success

  • Reference Architecture. You'll get everything you need from the ground up built using 100% infrastructure as code.
  • Deployment Strategy. Adopt a proven deployment strategy with GitHub Actions, enabling automated, repeatable, and reliable software releases.
  • Site Reliability Engineering. Gain total visibility into your applications and services with Datadog, ensuring high availability and performance.
  • Security Baseline. Establish a secure environment from the start, with built-in governance, accountability, and comprehensive audit logs, safeguarding your operations.
  • GitOps. Empower your team to manage infrastructure changes confidently and efficiently through Pull Requests, leveraging the full power of GitHub Actions.

Request Quote

Day-2: Your Operational Mastery

  • Training. Equip your team with the knowledge and skills to confidently manage the infrastructure, ensuring long-term success and self-sufficiency.
  • Support. Benefit from seamless communication over Slack with our experts, ensuring you have the support you need, whenever you need it.
  • Troubleshooting. Access expert assistance to quickly resolve any operational challenges, minimizing downtime and maintaining business continuity.
  • Code Reviews. Enhance your team's code quality with our expert feedback, fostering continuous improvement and collaboration.
  • Bug Fixes. Rely on our team to troubleshoot and resolve any issues, ensuring your systems run smoothly.
  • Migration Assistance. Accelerate your migration process with our dedicated support, minimizing disruption and speeding up time-to-value.
  • Customer Workshops. Engage with our team in weekly workshops, gaining insights and strategies to continuously improve and innovate.

Request Quote

✨ Contributing

This project is under active development, and we encourage contributions from our community.

Many thanks to our outstanding contributors:

For ๐Ÿ› bug reports & feature requests, please use the issue tracker.

In general, PRs are welcome. We follow the typical "fork-and-pull" Git workflow.

  1. Review our Code of Conduct and Contributor Guidelines.
  2. Fork the repo on GitHub
  3. Clone the project to your own machine
  4. Commit changes to your own branch
  5. Push your work back up to your fork
  6. Submit a Pull Request so that we can review your changes

NOTE: Be sure to merge the latest changes from "upstream" before making a pull request!

🌎 Slack Community

Join our Open Source Community on Slack. It's FREE for everyone! Our "SweetOps" community is where you get to talk with others who share a similar vision for how to rollout and manage infrastructure. This is the best place to talk shop, ask questions, solicit feedback, and work together as a community to build totally sweet infrastructure.

📰 Newsletter

Sign up for our newsletter and join 3,000+ DevOps engineers, CTOs, and founders who get insider access to the latest DevOps trends, so you can always stay in the know. Dropped straight into your inbox every week, and usually a 5-minute read.

📆 Office Hours

Join us every Wednesday via Zoom for your weekly dose of insider DevOps trends, AWS news and Terraform insights, all sourced from our SweetOps community, plus a live Q&A that you can't find anywhere else. It's FREE for everyone!

License


Preamble to the Apache License, Version 2.0

Complete license is available in the LICENSE file.

Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements.  See the NOTICE file
distributed with this work for additional information
regarding copyright ownership.  The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License.  You may obtain a copy of the License at

  https://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied.  See the License for the
specific language governing permissions and limitations
under the License.

Trademarks

All other trademarks referenced herein are the property of their respective owners.

Copyrights

Copyright © 2021-2024 Cloud Posse, LLC


terraform-aws-datadog-lambda-forwarder's People

Contributors

aknysh, benbentwo, bendrucker, bwmetcalf, cloudpossebot, dependabot[bot], dudymas, dylanbannon, ebram-va, jamengual, jbrt, joshmyers, kevcube, luigiclemente-awin, max-lobur, natw, nitrocode, nuru, osterman, renovate[bot], woz5999


terraform-aws-datadog-lambda-forwarder's Issues

More partition hardcoding

Describe the Bug

$ grep ':aws:' lambda*.tf
lambda-log.tf:  source_arn    = "arn:aws:s3:::${each.value}"
lambda-log.tf:    resources = concat(formatlist("arn:aws:s3:::%s", var.s3_buckets), formatlist("arn:aws:s3:::%s/*", var.s3_buckets))
lambda-log.tf:  source_arn    = "arn:aws:logs:${local.aws_region}:${local.aws_account_id}:log-group:${each.value}:*"
lambda-rds.tf:  source_arn    = "arn:aws:logs:${local.aws_region}:${local.aws_account_id}:log-group:RDSOSMetrics:*"
lambda-vpc-logs.tf:  source_arn    = "arn:aws:logs:${local.aws_region}:${local.aws_account_id}:log-group:${var.vpclogs_cloudwatch_log_group}:*"

This, among other possible issues, doesn't associate the correct ARN with the Lambda triggers, resulting in:

│ Error: Error creating Cloudwatch log subscription filter: InvalidParameterException: Could not execute the lambda function. Make sure you have given CloudWatch Logs permission to execute your function.
│ 
│   with module.monitoring.module.monitoring_common.module.datadog_forwarder.module.datadog_lambda_forwarder.aws_cloudwatch_log_subscription_filter.cloudwatch_log_subscription_filter["vpclogs"],
│   on .terraform/modules/monitoring.monitoring_common.datadog_forwarder.datadog_lambda_forwarder/lambda-log.tf line 150, in resource "aws_cloudwatch_log_subscription_filter" "cloudwatch_log_subscription_filter":
│  150: resource "aws_cloudwatch_log_subscription_filter" "cloudwatch_log_subscription_filter" {

hashicorp/template deprecated, unable to use this package on darwin-arm64

Found a bug? Maybe our Slack Community can help.

Slack Community

Describe the Bug

Checked out a working project that uses this module on a new M1 machine, ran terraform init, and received:

Error: Incompatible provider version

Provider registry.terraform.io/hashicorp/template v2.2.0 does not have a package available for your current platform, darwin_arm64.

According to hashicorp/template, the provider was deprecated before M1 existed, and they suggest switching to template_file. hashicorp/terraform#30055

Expected Behavior

Be able to use this module on x86 or arm64 darwin machines.

Steps to Reproduce

Steps to reproduce the behavior:

  1. terraform init on M1 mac.
  2. See error

Lambda role policy requires `kms:Decrypt`

Describe the Bug

If the datadog api key is stored in ASM, the lambda role IAM policy requires kms:Decrypt privileges for the CMK used to encrypt the api key.

Expected Behavior

The Lambda function should be able to read the Datadog API key from ASM.

Steps to Reproduce

Add the DD API key to ASM and use this module to create the forwarder Lambda function. Without adding

        {
            "Sid": "AllowKMS",
            "Effect": "Allow",
            "Action": "kms:Decrypt",
            "Resource": [
                "<cmk arn>"
            ]
        }

to the Lambda role policy, the function throws the following error:

[ERROR] ClientError: An error occurred (AccessDeniedException) when calling the GetSecretValue operation: Access to KMS is not allowed
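A possible workaround with the module's current inputs (a sketch; the policy document name and the KMS key reference are placeholders) is to pass the kms:Decrypt statement through lambda_policy_source_json, which the module merges into the generated Lambda policy:

data "aws_iam_policy_document" "datadog_kms_decrypt" {
  statement {
    sid       = "AllowKMS"
    effect    = "Allow"
    actions   = ["kms:Decrypt"]
    resources = [aws_kms_key.datadog.arn] # placeholder reference to the CMK that encrypts the API key
  }
}

module "datadog_lambda_forwarder" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # version = "x.x.x"

  forwarder_log_enabled     = true
  lambda_policy_source_json = data.aws_iam_policy_document.datadog_kms_decrypt.json
}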

Resource-based policy permissions are exceeded when many log group triggers are provided

Found a bug? Maybe our Slack Community can help.

Slack Community

Describe the Bug

When supplying a large number of keys (>50) to cloudwatch_forwarder_log_groups, the Lambda function's resource-based policy exceeds its size limit and AWS throws a PolicyLengthExceededException, with no way to recover. In addition, when the exception is thrown by the AWS SDK, Terraform does not stop attempting to run PutSubscriptionFilter for every log group provided, with up to 25 retries each. This leads to incredibly long terraform apply times without any apparent reason if you don't have TF_LOGS set.

Expected Behavior

Be able to provide any number of cloudwatch_forwarder_log_groups to the module and have it scale.

Potential fix?

Allow user to supply a "catch-all" source arn that can provide permissions to a larger number of log groups.
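As an illustration of that idea (hypothetical; this is not how the module currently behaves), a single catch-all permission could replace the per-log-group permissions, reusing the locals the module already defines:

resource "aws_lambda_permission" "cloudwatch_catch_all" {
  statement_id  = "datadog-forwarder-catch-all-permission"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.forwarder_log[0].function_name
  principal     = "logs.${local.aws_region}.amazonaws.com"
  # A wildcard source ARN keeps the resource-based policy at a single statement
  source_arn    = "${local.arn_format}:logs:${local.aws_region}:${local.aws_account_id}:log-group:*:*"
}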

Steps to Reproduce

Steps to reproduce the behavior:

  1. Add a new data source referencing your account log groups:
data "aws_cloudwatch_log_groups" "user_log_groups" {
  log_group_name_prefix = "/aws/lambda/${var.resource_prefix}"
}
  2. Map the data source to a local variable:
locals {
  log_groups = { for value in data.aws_cloudwatch_log_groups.user_log_groups.log_group_names :
    # Remove invalid key characters
    trim(replace("${value}", "/", "-"), "-") => {
      name           = "${value}"
      filter_pattern = ""
    }
  }
}
  3. Provide it to the cloudposse/datadog-lambda-forwarder/aws module:
cloudwatch_forwarder_log_groups = local.log_groups
  4. Run terraform apply with TF_LOGS set to DEBUG to see the SDK calls lambda/AddPermission and logs/PutSubscriptionFilter failing.

Screenshots

If applicable, add screenshots or logs to help explain your problem.

module.DataDogIntegration.module.datadog_lambda_forwarder.aws_lambda_permission.cloudwatch_groups[REDACTED]: Creating...
2022-09-12T14:23:19.719+0800 [INFO]  provider.terraform-provider-aws_v3.75.2_x5: 2022/09/12 14:23:19 [DEBUG] [aws-sdk-go] DEBUG: Response lambda/AddPermission Details:
---[ RESPONSE ]--------------------------------------
HTTP/2.0 400 Bad Request
Content-Length: 91
Content-Type: application/json
Date: Mon, 12 Sep 2022 06:23:19 GMT
X-Amzn-Errortype: PolicyLengthExceededException
X-Amzn-Requestid: 5e0ad4a0-f35d-4314-8601-fb2c48d7be13


-----------------------------------------------------: timestamp=2022-09-12T14:23:19.719+0800
2022-09-12T14:23:19.719+0800 [INFO]  provider.terraform-provider-aws_v3.75.2_x5: 2022/09/12 14:23:19 [DEBUG] [aws-sdk-go] {"Type":"User","message":"The final policy size (20898) is bigger than the limit (20480)."}: timestamp=2022-09-12T14:23:19.719+0800
2022-09-12T14:23:19.719+0800 [INFO]  provider.terraform-provider-aws_v3.75.2_x5: 2022/09/12 14:23:19 [DEBUG] [aws-sdk-go] DEBUG: Validate Response lambda/AddPermission failed, attempt 0/25, error PolicyLengthExceededException: The final policy size (20898) is bigger than the limit (20480).
{
  RespMetadata: {
    StatusCode: 400,
    RequestID: "5e0ad4a0-f35d-4314-8601-fb2c48d7be13"
  },
  Message_: "The final policy size (20898) is bigger than the limit (20480).",
  Type: "User"
}: timestamp=2022-09-12T14:23:19.719+0800

Environment (please complete the following information):

Anything that will help us triage the bug will help. Here are some ideas:

terraform -v

Terraform v1.2.6
on darwin_arm64
+ provider registry.terraform.io/franckverrot/stripe v1.9.0
+ provider registry.terraform.io/hashicorp/archive v2.2.0
+ provider registry.terraform.io/hashicorp/aws v3.75.2
+ provider registry.terraform.io/hashicorp/external v2.2.2
+ provider registry.terraform.io/hashicorp/local v2.2.3
+ provider registry.terraform.io/hashicorp/random v3.4.3
+ provider registry.terraform.io/hashicorp/template v2.2.0

Additional Context

resource "aws_lambda_permission" "cloudwatch_groups" {
for_each = local.lambda_enabled && var.forwarder_log_enabled ? var.cloudwatch_forwarder_log_groups : {}
statement_id = "datadog-forwarder-${each.key}-permission"
action = "lambda:InvokeFunction"
function_name = aws_lambda_function.forwarder_log[0].function_name
principal = "logs.${local.aws_region}.amazonaws.com"
source_arn = "${local.arn_format}:logs:${local.aws_region}:${local.aws_account_id}:log-group:${each.value.name}:*"
}

Lambda function environment variables

Hi all

I do not understand how to add a new environment variable to the Lambda forwarder configuration.
In my case, for example, I need to set DD_SITE because we use the European Datadog endpoint.

Thanks,

Luigi
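For reference, the module exposes two inputs that appear to cover this case (a sketch; the values shown are illustrative): forwarder_lambda_datadog_host to select the Datadog site, and datadog_forwarder_lambda_environment_variables to pass arbitrary environment variables to the Lambda function.

module "datadog_lambda_forwarder" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # version = "x.x.x"

  forwarder_log_enabled = true

  # Selects the Datadog site used by the forwarder (EU endpoint)
  forwarder_lambda_datadog_host = "datadoghq.eu"

  # Or pass environment variables straight through to the Lambda function
  datadog_forwarder_lambda_environment_variables = {
    DD_SITE = "datadoghq.eu"
  }
}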

Add DLQ for Lambda

Describe the Feature

Add Dead Letter Queue configuration for lambda

Expected Behavior

If I enable the DLQ option for the Lambda function, it gets configured by the module. The module could either create an SQS queue for the DLQ or accept an existing SQS ARN via a dead_letter_config section. Any solution will suit.

Use Case

To be able to pass SOC2/CIS compliance, we need to have a DLQ for all Lambdas. Even if we do not use it, it is better to have the ability to enable it via a flag such as enable_dlq.

Currently, we have to find a workaround, which does not look natural.

Describe Ideal Solution

  • add an option enable_dlq = true/false
  • add a dead_letter_config section to set up the configuration when enable_dlq = true (see the sketch after this list)
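A rough sketch of what the proposal could look like inside the module (hypothetical; enable_dlq and dead_letter_sqs_arn are not existing inputs):

# Hypothetical inputs: var.enable_dlq (bool) and var.dead_letter_sqs_arn (string)
resource "aws_lambda_function" "forwarder_log" {
  # ... existing arguments ...

  dynamic "dead_letter_config" {
    for_each = var.enable_dlq ? [1] : []
    content {
      target_arn = var.dead_letter_sqs_arn
    }
  }
}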

Alternatives Considered

No response

Additional Context

No response

AWS partition should be parameterized

Describe the Bug

The validation checks in variables.tf do not work with GovCloud because aws is hardcoded as the AWS partition name. For example:

  # Check ASM ARN format
  validation {
    condition     = var.dd_api_key_source.resource == "asm" ? can(regex("arn:aws:secretsmanager:.*:secret:.*", var.dd_api_key_source.identifier)) : true
    error_message = "ARN for AWS Secrets Manager (asm) does not appear to be valid format (example: arn:aws:secretsmanager:us-west-2:111122223333:secret:aes128-1a2b3c)."
  }

Expected Behavior

The data source aws_partition should be used to retrieve the partition name.
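A minimal sketch of that approach (the module already declares an aws_partition data source, per the Resources list above; the relaxed regex shown here is only an illustration, since variable validation blocks cannot reference data sources):

data "aws_partition" "current" {}

locals {
  # "aws", "aws-us-gov", or "aws-cn" depending on where Terraform runs
  arn_format = "arn:${data.aws_partition.current.partition}"
}

# The variable validation itself would need to accept any partition, e.g.:
#   can(regex("arn:aws[a-zA-Z-]*:secretsmanager:.*:secret:.*", var.dd_api_key_source.identifier))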

Steps to Reproduce

Use the module in a GovCloud environment and the following error will occur:

 Error: Invalid value for variable
│ 
│   on ../../../../../terraform-modules/general/monitoring/datadog/forwarder/main.tf line 42, in module "datadog_lambda_forwarder":
│   42:   dd_api_key_source               = local.dd_api_key_source
│ 
│ ARN for AWS Secrets Manager (asm) does not appear to be valid format (example: arn:aws:secretsmanager:us-west-2:111122223333:secret:aes128-1a2b3c).

Need unique names for IAM role and policy

Describe the Bug

Enabling forwarder_log_enabled and forwarder_vpc_logs_enabled in different module calls causes both calls to create the IAM role and policy used by the Lambda functions with the same name. The former creates a Lambda function named dev-test-datadog-forwarder-log and the latter dev-test-datadog-forwarder-vpclogs, which lets the functions coexist, but both calls use the name dev-test-datadog-forwarder-lambda for the role and policy, which fails.

Note: it is possible to enable both of the above within the same module call, and they will share the one IAM role and policy.

Expected Behavior

Each type of lambda that can be enabled by this module should use unique names for all resources created.

Steps to Reproduce

Call this module twice with one enabling forwarder_log_enabled and the other enabling forwarder_vpc_logs_enabled.
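Until that is fixed, a possible workaround (a sketch; it assumes the role and policy names are derived from the null-label ID, so distinct attributes produce distinct names) is to give each module call its own label attributes:

module "datadog_forwarder_log" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # version = "x.x.x"

  attributes            = ["log"]
  forwarder_log_enabled = true

  context = module.this.context # assumes the caller uses the standard null-label "this" module
}

module "datadog_forwarder_vpclogs" {
  source = "cloudposse/datadog-lambda-forwarder/aws"
  # version = "x.x.x"

  attributes                   = ["vpclogs"]
  forwarder_vpc_logs_enabled   = true
  vpclogs_cloudwatch_log_group = "/aws/vpc/flowlogs/vpc1"

  context = module.this.context
}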

Bug on upgrade to v1.6.1

Describe the Bug

Hi there 👋,
After upgrading the module to v1.6.1, our logs stopped getting forwarded to Datadog.
I found the following error on CloudWatch:

[ERROR] Runtime.ImportModuleError: Unable to import module 'lambda_function': cannot import name 'formatargspec' from 'inspect' (/var/lang/lib/python3.11/inspect.py)

After downgrading, it worked again.

Expected Behavior

Logs getting forwarded to Datadog properly

Steps to Reproduce

Use the module like this:


module "datadog-lambda-forwarder" {
  source                = "cloudposse/datadog-lambda-forwarder/aws"
  version               = "1.6.1"
  forwarder_log_enabled = true

  dd_tags_map = {
    Environment = "production"
  }

  name               = "datadog_log_forwarder_production"
  lambda_memory_size = 1024
  lambda_timeout     = 900
  dd_api_key_source = {
    resource   = "ssm"
    identifier = "[redacted]"
  }
}

Screenshots

No response

Environment

  • Terraform 1.8.0

Additional Context

No response

Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.

Repository problems

These problems occurred while renovating this repository. View logs.

  • WARN: Base branch does not exist - skipping

Ignored or Blocked

These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below.

Detected dependencies

Branch main
terraform
lambda-log.tf
  • cloudposse/cloudwatch-events/aws 0.6.1
  • cloudposse/module-artifact/external 0.8.0
  • cloudposse/label/null 0.25.0
  • cloudposse/label/null 0.25.0
lambda-rds.tf
  • cloudposse/module-artifact/external 0.8.0
  • cloudposse/label/null 0.25.0
lambda-vpc-logs.tf
  • cloudposse/module-artifact/external 0.8.0
  • cloudposse/label/null 0.25.0
versions.tf
  • archive >= 2.2.0
  • aws >= 3.0
  • hashicorp/terraform >= 1.3.0
Branch release/v0
terraform
lambda-log.tf
  • cloudposse/module-artifact/external 0.7.1
  • cloudposse/label/null 0.25.0
  • cloudposse/label/null 0.25.0
lambda-rds.tf
  • cloudposse/module-artifact/external 0.7.1
  • cloudposse/label/null 0.25.0
lambda-vpc-logs.tf
  • cloudposse/module-artifact/external 0.7.1
  • cloudposse/label/null 0.25.0
versions.tf
  • archive >= 2.2.0
  • aws >= 3.0
  • hashicorp/terraform >= 0.13

