cloudposse / terraform-aws-efs-backup

Terraform module designed to easily backup EFS filesystems to S3 using DataPipeline

Home Page: https://cloudposse.com/accelerate

License: Apache License 2.0

HCL 96.41% Makefile 3.59%
terraform terraform-modules datapipeline aws s3 efs nfs backup snapshot lambda

terraform-aws-efs-backup's Introduction


Terraform module designed to easily backup EFS filesystems to S3 using DataPipeline.

The workflow is simple:

  • Periodically launch a resource (EC2 instance) on a schedule
  • Execute the shell command defined in the activity on the instance
  • Sync data from the production EFS to an S3 bucket using aws-cli
  • Store the execution log of the activity in S3
  • Publish the success or failure of the activity to an SNS topic
  • Automatically rotate the backups using an S3 lifecycle rule
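The rotation step relies on S3 object versioning plus a lifecycle rule. As a hedged sketch of what such a rule looks like (written in the pre-0.12 HCL style this module uses; the bucket name is a placeholder):

```hcl
# Hypothetical sketch of backup rotation via an S3 lifecycle rule: object
# versions superseded by each sync are expired after a configurable number
# of days, matching this module's noncurrent_version_expiration_days input.
resource "aws_s3_bucket" "backups" {
  bucket = "example-efs-backups"

  versioning {
    enabled = true
  }

  lifecycle_rule {
    enabled = true

    noncurrent_version_expiration {
      days = "${var.noncurrent_version_expiration_days}"
    }
  }
}
```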

Tip

πŸ‘½ Use Atmos with Terraform

Cloud Posse uses atmos to easily orchestrate multiple environments using Terraform.
Works with GitHub Actions, Atlantis, or Spacelift.

Watch demo of using Atmos with Terraform
Example of running atmos to manage infrastructure from our Quick Start tutorial.

Usage

Include this module in your existing terraform code:

module "efs_backup" {
  source = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=master"

  name                               = "${var.name}"
  stage                              = "${var.stage}"
  namespace                          = "${var.namespace}"
  vpc_id                             = "${var.vpc_id}"
  efs_mount_target_id                = "${var.efs_mount_target_id}"
  use_ip_address                     = "false"
  noncurrent_version_expiration_days = "${var.noncurrent_version_expiration_days}"
  ssh_key_pair                       = "${var.ssh_key_pair}"
  datapipeline_config                = "${var.datapipeline_config}"
  modify_security_group              = "true"
}

output "efs_backup_security_group" {
  value = "${module.efs_backup.security_group_id}"
}

Integration with EFS

To enable connectivity between the DataPipeline instances and the EFS, use one of the following methods to configure Security Groups:

  1. Explicitly add the DataPipeline SG (the `security_group_id` output of this module) to the list of ingress rules of the EFS SG. For example:
module "elastic_beanstalk_environment" {
  source     = "git::https://github.com/cloudposse/terraform-aws-elastic-beanstalk-environment.git?ref=master"
  namespace  = "${var.namespace}"
  name       = "${var.name}"
  stage      = "${var.stage}"
  delimiter  = "${var.delimiter}"
  attributes = ["${compact(concat(var.attributes, list("eb-env")))}"]
  tags       = "${var.tags}"

  # ..............................
}

module "efs" {
  source     = "git::https://github.com/cloudposse/terraform-aws-efs.git?ref=master"
  namespace  = "${var.namespace}"
  name       = "${var.name}"
  stage      = "${var.stage}"
  delimiter  = "${var.delimiter}"
  attributes = ["${compact(concat(var.attributes, list("efs")))}"]
  tags       = "${var.tags}"

  # Allow EB/EC2 instances and DataPipeline instances to connect to the EFS
  security_groups = ["${module.elastic_beanstalk_environment.security_group_id}", "${module.efs_backup.security_group_id}"]
}

module "efs_backup" {
  source     = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=master"
  name       = "${var.name}"
  stage      = "${var.stage}"
  namespace  = "${var.namespace}"
  delimiter  = "${var.delimiter}"
  attributes = ["${compact(concat(var.attributes, list("efs-backup")))}"]
  tags       = "${var.tags}"
  
  # Important to set it to `false` since we added the `DataPipeline` SG (output of the `efs_backup` module) to the `security_groups` of the `efs` module
  # See NOTE below for more information
  modify_security_group = "false"

  # ..............................
}
  2. Set the `modify_security_group` attribute to `true` so the module will modify the EFS SG to allow the DataPipeline instances to connect to the EFS

NOTE: Do not mix these two methods together. Terraform does not support using a Security Group with in-line rules in conjunction with any Security Group Rule resources. https://www.terraform.io/docs/providers/aws/r/security_group_rule.html

NOTE on Security Groups and Security Group Rules: Terraform currently provides both a standalone Security Group Rule resource (a single ingress or egress rule), and a Security Group resource with ingress and egress rules defined in-line. At this time you cannot use a Security Group with in-line rules in conjunction with any Security Group Rule resources. Doing so will cause a conflict of rule settings and will overwrite rules.
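Method 1 above amounts to a standalone `aws_security_group_rule` attached to the EFS SG, which is safe only when that SG defines no in-line rules. A minimal sketch, assuming the EFS SG ID is held in a placeholder variable:

```hcl
# NFS (port 2049) ingress from the DataPipeline SG into the EFS SG,
# expressed as a standalone rule resource. `var.efs_security_group_id`
# is a hypothetical variable holding the EFS security group ID.
resource "aws_security_group_rule" "efs_ingress_from_datapipeline" {
  type                     = "ingress"
  from_port                = 2049
  to_port                  = 2049
  protocol                 = "tcp"
  security_group_id        = "${var.efs_security_group_id}"
  source_security_group_id = "${module.efs_backup.security_group_id}"
}
```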

Important

In Cloud Posse's examples, we avoid pinning modules to specific versions to prevent discrepancies between the documentation and the latest released versions. However, for your own projects, we strongly advise pinning each module to the exact version you're using. This practice ensures the stability of your infrastructure. Additionally, we recommend implementing a systematic approach for updating versions to avoid unexpected changes.
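For example, instead of `ref=master`, pin the module source to a released tag (the tag below is illustrative; check the releases page for the current version):

```hcl
module "efs_backup" {
  # Pinning to an exact release keeps `terraform init` reproducible.
  source = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=0.9.0"

  # ...remaining arguments as in the Usage example above...
}
```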

Makefile Targets

Available targets:

  help                                Help screen
  help/all                            Display help for all targets
  help/short                          This help short screen
  lint                                Lint terraform code

Requirements

No requirements.

Providers

Name Version
aws n/a

Modules

Name Source Version
backups_label git::https://github.com/cloudposse/terraform-null-label.git tags/0.3.1
datapipeline_label git::https://github.com/cloudposse/terraform-null-label.git tags/0.3.1
label git::https://github.com/cloudposse/terraform-null-label.git tags/0.3.1
logs_label git::https://github.com/cloudposse/terraform-null-label.git tags/0.3.1
resource_role_label git::https://github.com/cloudposse/terraform-null-label.git tags/0.3.1
role_label git::https://github.com/cloudposse/terraform-null-label.git tags/0.3.1
sns_label git::https://github.com/cloudposse/terraform-null-label.git tags/0.3.1

Resources

Name Type
aws_cloudformation_stack.datapipeline resource
aws_cloudformation_stack.sns resource
aws_iam_instance_profile.resource_role resource
aws_iam_role.resource_role resource
aws_iam_role.role resource
aws_iam_role_policy_attachment.resource_role resource
aws_iam_role_policy_attachment.role resource
aws_s3_bucket.backups resource
aws_s3_bucket.logs resource
aws_security_group.datapipeline resource
aws_security_group_rule.datapipeline_efs_ingress resource
aws_ami.amazon_linux data source
aws_efs_mount_target.default data source
aws_iam_policy_document.resource_role data source
aws_iam_policy_document.role data source
aws_region.default data source
aws_subnet_ids.default data source
aws_vpc.default data source

Inputs

Name Description Type Default Required
attributes Additional attributes (e.g. efs-backup) list(string) [] no
datapipeline_config DataPipeline configuration options map(string)
{
"email": "",
"instance_type": "t2.micro",
"period": "24 hours",
"timeout": "60 Minutes"
}
no
datapipeline_security_group Optionally specify a security group to use for the datapipeline instances string "" no
delimiter Delimiter to be used between name, namespace, stage, etc. string "-" no
efs_mount_target_id EFS Mount Target ID (e.g. fsmt-279bfc62) string n/a yes
modify_security_group Should the module modify the EFS security group string "false" no
name The Name of the application or solution (e.g. bastion or portal) any n/a yes
namespace Namespace (e.g. cp or cloudposse) any n/a yes
noncurrent_version_expiration_days S3 object versions expiration period (days) string "35" no
region (Optional) AWS Region. If not specified, will be derived from 'aws_region' data source string "" no
ssh_key_pair SSH key that will be deployed on DataPipeline's instance string n/a yes
stage Stage (e.g. prod, dev, staging) any n/a yes
subnet_id Optionally specify the subnet to use string "" no
tags Additional tags (e.g. map("BusinessUnit","XYZ")) map(string) {} no
use_ip_address If set to true, will use IP address instead of DNS name to connect to the EFS string "false" no
vpc_id VPC ID string "" no

Outputs

Name Description
backups_bucket_name Backups bucket name
datapipeline_ids Datapipeline ids
logs_bucket_name Logs bucket name
security_group_id Security group id
sns_topic_arn Backup notification SNS topic ARN

Related Projects

Check out these related projects.

References

For additional context, refer to some of these links.

Tip

Use Terraform Reference Architectures for AWS

Use Cloud Posse's ready-to-go terraform architecture blueprints for AWS to get up and running quickly.

βœ… We build it together with your team.
βœ… Your team owns everything.
βœ… 100% Open Source and backed by fanatical support.

Request Quote

πŸ“š Learn More

Cloud Posse is the leading DevOps Accelerator for funded startups and enterprises.

Your team can operate like a pro today.

Ensure that your team succeeds by using Cloud Posse's proven process and turnkey blueprints. Plus, we stick around until you succeed.

Day-0: Your Foundation for Success

  • Reference Architecture. You'll get everything you need from the ground up built using 100% infrastructure as code.
  • Deployment Strategy. Adopt a proven deployment strategy with GitHub Actions, enabling automated, repeatable, and reliable software releases.
  • Site Reliability Engineering. Gain total visibility into your applications and services with Datadog, ensuring high availability and performance.
  • Security Baseline. Establish a secure environment from the start, with built-in governance, accountability, and comprehensive audit logs, safeguarding your operations.
  • GitOps. Empower your team to manage infrastructure changes confidently and efficiently through Pull Requests, leveraging the full power of GitHub Actions.

Request Quote

Day-2: Your Operational Mastery

  • Training. Equip your team with the knowledge and skills to confidently manage the infrastructure, ensuring long-term success and self-sufficiency.
  • Support. Benefit from a seamless communication over Slack with our experts, ensuring you have the support you need, whenever you need it.
  • Troubleshooting. Access expert assistance to quickly resolve any operational challenges, minimizing downtime and maintaining business continuity.
  • Code Reviews. Enhance your team’s code quality with our expert feedback, fostering continuous improvement and collaboration.
  • Bug Fixes. Rely on our team to troubleshoot and resolve any issues, ensuring your systems run smoothly.
  • Migration Assistance. Accelerate your migration process with our dedicated support, minimizing disruption and speeding up time-to-value.
  • Customer Workshops. Engage with our team in weekly workshops, gaining insights and strategies to continuously improve and innovate.

Request Quote

✨ Contributing

This project is under active development, and we encourage contributions from our community.

Many thanks to our outstanding contributors:

For πŸ› bug reports & feature requests, please use the issue tracker.

In general, PRs are welcome. We follow the typical "fork-and-pull" Git workflow.

  1. Review our Code of Conduct and Contributor Guidelines.
  2. Fork the repo on GitHub
  3. Clone the project to your own machine
  4. Commit changes to your own branch
  5. Push your work back up to your fork
  6. Submit a Pull Request so that we can review your changes

NOTE: Be sure to merge the latest changes from "upstream" before making a pull request!

🌎 Slack Community

Join our Open Source Community on Slack. It's FREE for everyone! Our "SweetOps" community is where you get to talk with others who share a similar vision for how to rollout and manage infrastructure. This is the best place to talk shop, ask questions, solicit feedback, and work together as a community to build totally sweet infrastructure.

πŸ“° Newsletter

Sign up for our newsletter and join 3,000+ DevOps engineers, CTOs, and founders who get insider access to the latest DevOps trends, so you can always stay in the know. Dropped straight into your Inbox every week β€” and usually a 5-minute read.

πŸ“† Office Hours

Join us every Wednesday via Zoom for your weekly dose of insider DevOps trends, AWS news and Terraform insights, all sourced from our SweetOps community, plus a live Q&A that you can’t find anywhere else. It's FREE for everyone!

License

Preamble to the Apache License, Version 2.0

Complete license is available in the LICENSE file.

Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements.  See the NOTICE file
distributed with this work for additional information
regarding copyright ownership.  The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License.  You may obtain a copy of the License at

  https://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied.  See the License for the
specific language governing permissions and limitations
under the License.

Trademarks

All other trademarks referenced herein are the property of their respective owners.


Copyright Β© 2017-2024 Cloud Posse, LLC


terraform-aws-efs-backup's People

Contributors

abferm, actions-user, aknysh, dylanbannon, josephchoe, max-lobur, osterman, plumdog, rfvermut, s2504s, solairerove, vadim-hleif

terraform-aws-efs-backup's Issues

Resource 'aws_cloudformation_stack.sns' does not have attribute 'outputs.TopicArn' for variable 'aws_cloudformation_stack.sns.outputs.TopicArn'

Hi,

I'm trying to create EFS backups using this module but I keep running into the following error:

* module.efs_backup.output.sns_topic_arn: Resource 'aws_cloudformation_stack.sns' does not have attribute 'outputs.TopicArn' for variable 'aws_cloudformation_stack.sns.outputs.TopicArn'

The current configuration of the module is:

### Backup
module "efs_backup" {
  source = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=0.8.0"

  name                = "${var.name}"
  stage               = "${var.stage}"
  namespace           = "${var.namespace}"
  vpc_id              = "${module.vpc.vpc_id}"
  ssh_key_pair        = "${aws_key_pair.backupadmin.key_name}"
  efs_mount_target_id = "${element(module.efs.mount_target_ids, 0)}"

  datapipeline_config = {
    instance_type = "t2.micro"
    email         = "[email protected]"
    period        = "24 hours"
    timeout       = "60 Minutes"
  }

  modify_security_group = "false"
}

What am I missing/doing wrong?

Add Example Usage

what

  • Add example invocation

why

  • We need this so we can soon enable automated continuous integration testing of module

aws_cloudformation_stack.datapipeline resource does not produce an array

The "datapipeline_ids" output references "aws_cloudformation_stack.datapipeline" as though it produces an array. This hasn't been the case since 8bc76ef.

Error: Error refreshing state: 1 error(s) occurred:

* module.efs_backup.output.datapipeline_ids: __builtin_StringToInt: strconv.ParseInt: parsing "DataPipelineId": invalid syntax in:

${aws_cloudformation_stack.datapipeline.*.outputs["DataPipelineId"]}

"current": [REMOVED] Defaults to current provider region if no other filtering is enabled

The module is failing and I am getting the following error:
Error: module.jenkins_efs_backup.data.aws_region.default: "current": [REMOVED] Defaults to current provider region if no other filtering is enabled

The cause of the issue is in here:

data "aws_region" "default" {
  current = true
}

The solution is to remove the current = true argument, since it is deprecated and throws an error on Terraform v0.11.13.

Reference: https://www.terraform.io/docs/providers/aws/d/region.html
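With the deprecated argument removed, the data source declaration reduces to:

```hcl
# The provider's configured region remains available as
# "${data.aws_region.default.name}".
data "aws_region" "default" {}
```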

Error: Missing resource instance key

Hello,

I want to do some tests with terraform-aws-efs-backup module in version 0.9.0. I have this code:
module "efs_backup" {
  source = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=master"

  name                               = "${var.name}"
  stage                              = "${var.stage}"
  namespace                          = "${var.namespace}"
  vpc_id                             = "${var.vpc_id}"
  efs_mount_target_id                = "${var.efs_mount_target_id}"
  use_ip_address                     = "false"
  noncurrent_version_expiration_days = "${var.noncurrent_version_expiration_days}"
  ssh_key_pair                       = "${var.ssh_key_pair}"
  datapipeline_config                = "${var.datapipeline_config}"
  modify_security_group              = "true"
}

output "efs_backup_security_group" {
  value = "${module.efs_backup.security_group_id}"
}

with these variables:

variable "name" {
  default = "jira-efs"
}

variable "stage" {
  default = "stage"
}

variable "namespace" {
  default = "namespace"
}

variable "vpc_id" {
  default = "vpc-02c0d010183c3c1ce"
}

variable "efs_mount_target_id" {
  default = "fs-2c057de4"
}

variable "noncurrent_version_expiration_days" {
  default = "30"
}

variable "ssh_key_pair" {}

variable "datapipeline_config" {
  type = "map"

  default = {
    instance_type = "t2.micro"
    email         = ""
    period        = "24 hours"
    timeout       = "60 Minutes"
  }
}

When I launched `terraform plan`, I got this error multiple times, for all the attributes:

Error: Missing resource instance key

on .terraform/modules/efs_backup.backups_label/outputs.tf line 2, in output "id":
2: value = "${null_resource.default.triggers.id}"

Because null_resource.default has "count" set, its attributes must be accessed
on specific instances.

For example, to correlate with indices of a referring resource, use:
null_resource.default[count.index]

I am pretty new to Terraform, so I am probably missing something, but after reading some posts I think it could be related to the new version of Terraform (I am using version 0.12).

Thank you very much for your work!

Regards


Add folder path to EFS target

Hello,

I need to back up folders inside the EFS, because the EFS is too big and a full backup takes a lot of time. So it would be great to have an option to specify a target folder in the EFS. Is it possible to add a path inside the EFS to back up from?

I tried to add:

  • in efs.tf:

    data "aws_efs_mount_target_folder" "default" {
      mount_target_folder = "${var.efs_mount_target_folder}"
    }

  • in variables.tf:

    variable "efs_mount_target_folder" {
      type    = "string"
      default = "<target_folder>"
    }

  • in cloudformation.tf:

    myEFSHostFolder = "${var.efs_mount_target_folder}"

  • in templates/datapipeline.yaml:

    add a variable "$4" for the target folder, place it after "$source":/, and add a 4th scriptArgument with the value "#{myEFSHostFolder}"

But after `terraform plan` or `terraform init` I got these errors:

Error: resource 'data.aws_efs_mount_target_folder.default' config: unknown variable referenced: 'efs_mount_target_folder'; define it with a 'variable' block

and

Error: resource 'aws_cloudformation_stack.datapipeline' config: unknown variable referenced: 'efs_mount_target_folder'; define it with a 'variable' block

What did I miss?
Or please add an option to specify folders to back up inside the EFS.

Best Regards.
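For reference, the errors quoted above ask for a `variable` block inside the module itself, not just in the caller; a complete declaration would look like the sketch below. Note also that `aws_efs_mount_target_folder` is not a data source provided by the AWS provider, so the `efs.tf` snippet above would fail regardless.

```hcl
# Hypothetical declaration for the module's variables.tf; the name and
# placeholder default mirror the snippet quoted in this issue.
variable "efs_mount_target_folder" {
  type        = "string"
  default     = ""
  description = "Path inside the EFS filesystem to back up"
}
```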


parameter value for parameter name myKeyPair does not exist

Hi,

I got the following errors when applying (`terraform apply`):

module.efs_backup.aws_cloudformation_stack.datapipeline: 1 error(s) occurred: aws_cloudformation_stack.datapipeline: ROLLBACK_COMPLETE: ["Parameter validation failed: parameter value for parameter name myKeyPair does not exist. Rollback requested by user."]

I have used the following code:

main.tf

module "efs_backup" {
  source = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=master"

  name                               = "${var.name}"
  stage                              = "${var.stage}"
  namespace                          = "${var.namespace}"
  vpc_id                             = "${var.vpc_id}"
  efs_mount_target_id                = "${var.efs_mount_target_id}"
  use_ip_address                     = "false"
  noncurrent_version_expiration_days = "${var.noncurrent_version_expiration_days}"
  ssh_key_pair                       = "${var.ssh_key_pair}"
  datapipeline_config                = "${var.datapipeline_config}"
  modify_security_group              = "true"
}

output "efs_backup_security_group" {
  value = "${module.efs_backup.security_group_id}"
}

terraform.tfvars

namespace = "namespace"

stage = "stage"

name = "efs-backup"

region = "eu-central-1"

vpc_id = "vpc-0123456"

efs_mount_target_id = "fsmt-0123456"

#use_ip_address = "false"

#modify_security_group = "false"

noncurrent_version_expiration_days = "35"

ssh_key_pair = ""

#datapipeline_config = "${map("instance_type", "t2.micro", "email", "", "period", "24 hours", "timeout", "60 Minutes")}"

attributes = []

tags = {}

delimiter = "-"

variables.tf

variable "name" {
  type = "string"
}

variable "namespace" {
  type = "string"
}

variable "stage" {
  type = "string"
}

variable "region" {
  type        = "string"
  default     = ""
  description = "(Optional) AWS Region. If not specified, will be derived from 'aws_region' data source"
}

variable "vpc_id" {
  type = "string"
}

variable "use_ip_address" {
  default = "false"
}

variable "datapipeline_config" {
  type = "map"

  default = {
    instance_type = "t2.micro"
    email         = "[email protected]"
    period        = "24 hours"
    timeout       = "60 Minutes"
  }
}

variable "efs_mount_target_id" {
  type        = "string"
  description = "EFS Mount Target ID (e.g. fsmt-279bfc62)"
}

variable "modify_security_group" {
  default = "false"
}

variable "ssh_key_pair" {
  type = "string"
}

variable "noncurrent_version_expiration_days" {
  default = "35"
}

variable "delimiter" {
  type        = "string"
  default     = "-"
  description = "Delimiter to be used between name, namespace, stage, etc."
}

variable "attributes" {
  type        = "list"
  default     = []
  description = "Additional attributes (e.g. efs-backup)"
}

variable "tags" {
  type        = "map"
  default     = {}
  description = "Additional tags (e.g. map('BusinessUnit,XYZ)"
}

I have set the SSH Public Key in this way:

ssh_key_pair = "ssh-rsa ABCDEF123456"
but got an error on apply:

module.efs_backup.aws_cloudformation_stack.datapipeline: 1 error(s) occurred: aws_cloudformation_stack.datapipeline: ROLLBACK_COMPLETE: ["Parameter validation failed: parameter value ssh-rsa ABCDDEF123456 for parameter name myKeyPair does not exist. Rollback requested by user."]

Then I set the variable to "", but got the following error instead:

module.efs_backup.aws_cloudformation_stack.datapipeline: 1 error(s) occurred: aws_cloudformation_stack.datapipeline: ROLLBACK_COMPLETE: ["Parameter validation failed: parameter value for parameter name myKeyPair does not exist. Rollback requested by user."]

But according to the documentation this variable is optional, so the second approach should actually work.

Do you have an idea how to solve it?

Thanks in advance!

Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.

Repository problems

These problems occurred while renovating this repository. View logs.

  • WARN: Base branch does not exist - skipping

Ignored or Blocked

These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below.

Detected dependencies

terraform
cloudformation.tf
  • github.com/cloudposse/terraform-null-label tags/0.3.1
  • github.com/cloudposse/terraform-null-label tags/0.3.1
iam.tf
  • github.com/cloudposse/terraform-null-label tags/0.3.1
  • github.com/cloudposse/terraform-null-label tags/0.3.1
main.tf
  • github.com/cloudposse/terraform-null-label tags/0.3.1
s3.tf
  • github.com/cloudposse/terraform-null-label tags/0.3.1
  • github.com/cloudposse/terraform-null-label tags/0.3.1

  • Check this box to trigger a request for Renovate to run again on this repository

The following resource(s) failed to create: [Topic]

Hi,

I got the following errors when applying (`terraform apply`):

module.efs_backup.aws_cloudformation_stack.sns: 1 error(s) occurred: aws_cloudformation_stack.sns: ROLLBACK_COMPLETE: ["The following resource(s) failed to create: [Topic]. . Rollback requested by user." "Invalid parameter: Endpoint (Service: AmazonSNS; Status Code: 400; Error Code: InvalidParameter; Request ID: 0f1e8da9-7469-5dfd-8e5e-de14ff3509fa)"]

and applying it once again afterwards produces the following:

module.efs_backup.aws_cloudformation_stack.datapipeline: 1 error(s) occurred: module.efs_backup.aws_cloudformation_stack.datapipeline: At column 39, line 1: map "aws_cloudformation_stack.sns.outputs" does not have any elements so cannot determine type. in: ${aws_cloudformation_stack.sns.outputs["TopicArn"]}

I have used the following code:

main.tf

module "efs_backup" {
  source = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=master"

  name                               = "${var.name}"
  stage                              = "${var.stage}"
  namespace                          = "${var.namespace}"
  vpc_id                             = "${var.vpc_id}"
  efs_mount_target_id                = "${var.efs_mount_target_id}"
  use_ip_address                     = "false"
  noncurrent_version_expiration_days = "${var.noncurrent_version_expiration_days}"
  ssh_key_pair                       = "${var.ssh_key_pair}"
  datapipeline_config                = "${var.datapipeline_config}"
  modify_security_group              = "true"
}

output "efs_backup_security_group" {
  value = "${module.efs_backup.security_group_id}"
}

terraform.tfvars

namespace = "namespace"

stage = "stage"

name = "efs-backup"

region = "eu-central-1"

vpc_id = "vpc-0123456"

efs_mount_target_id = "fsmt-0123456"

#use_ip_address = "false"

#modify_security_group = "false"

noncurrent_version_expiration_days = "35"

ssh_key_pair = "ssh-rsa key"

#datapipeline_config = "${map("instance_type", "t2.micro", "email", "", "period", "24 hours", "timeout", "60 Minutes")}"

attributes = []

tags = {}

delimiter = "-"

variables.tf

variable "name" {
  type = "string"
}

variable "namespace" {
  type = "string"
}

variable "stage" {
  type = "string"
}

variable "region" {
  type        = "string"
  default     = ""
  description = "(Optional) AWS Region. If not specified, will be derived from 'aws_region' data source"
}

variable "vpc_id" {
  type = "string"
}

variable "use_ip_address" {
  default = "false"
}

variable "datapipeline_config" {
  type = "map"

  default = {
    instance_type = "t2.micro"
    email         = ""
    period        = "24 hours"
    timeout       = "60 Minutes"
  }
}

variable "efs_mount_target_id" {
  type        = "string"
  description = "EFS Mount Target ID (e.g. fsmt-279bfc62)"
}

variable "modify_security_group" {
  default = "false"
}

variable "ssh_key_pair" {
  type = "string"
}

variable "noncurrent_version_expiration_days" {
  default = "35"
}

variable "delimiter" {
  type        = "string"
  default     = "-"
  description = "Delimiter to be used between name, namespace, stage, etc."
}

variable "attributes" {
  type        = "list"
  default     = []
  description = "Additional attributes (e.g. efs-backup)"
}

variable "tags" {
  type        = "map"
  default     = {}
  description = "Additional tags (e.g. map('BusinessUnit,XYZ)"
}

Do you have maybe an idea how to solve it?

Thanks in advance!
