roandevs / serverless-backup
A serverless backup script used with AWS Lambda. This repository includes the code, detailed information about the process, and setup instructions.


☁️ Serverless Backup - My personal serverless backup system

Information

  • This is a completely serverless backup system that backs up my everyday files to multiple cloud providers.

  • The idea is that your S3 bucket is auto-mounted on your desktop at boot using rclone and treated like another hard drive on your filesystem. Then, every 24 hours, AWS EventBridge executes this code in Lambda, which connects to your S3 bucket with the right credentials and automatically backs your files up to multiple cloud providers. This works even if your desktop is offline or unavailable, as the S3 bucket is hosted by AWS and the execution through Lambda is handled by AWS.

  • The reason for using multiple providers is so that you don't rely on just one and have at least one guaranteed backup in case a provider goes down or you lose access to an account that hosted your backups. You also don't have to maintain a server for your backups, as AWS's serverless infrastructure takes care of that.

  • Technologies used: AWS S3 buckets for initially storing my files, AWS Lambda for transferring the files to multiple cloud providers & AWS EventBridge for triggering the backup process every 24 hours.
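In sketch form, one backup run boils down to the flow below. This is an illustrative outline, not the repository's actual code: `listKeys` and `download` stand in for calls to the aws-sdk S3 client, and `handlers` stands in for the provider handlers in handlers.js.

```javascript
// Hypothetical sketch of one backup run. `listKeys` and `download` would
// wrap the aws-sdk S3 client; `handlers` would come from handlers.js.
async function runBackup({ listKeys, download, handlers, enabled }) {
  const keys = await listKeys(); // every object key in the S3 bucket
  for (const key of keys) {
    const body = await download(key); // fetch the file's contents from S3
    for (const name of enabled) {
      await handlers[name](key, body); // push to each enabled provider
    }
  }
  return keys.length; // number of files backed up
}
```

Injecting the S3 client and handlers like this keeps the orchestration independent of any one provider, which is what makes the handler system below easy to extend.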

Features & Limitations

  • Built-in handler system that lets you back your files up to different cloud providers. By default, there are handlers for Google Drive, MediaFire and mega.nz. You can of course extend this, or specify in settings.json which providers you do and don't want to use.

  • Configurable settings to specify which providers you want to use and which files you do and don't want backed up within the process.

  • This was originally built with the aim of running with no operating costs, so it comes with AWS' free limit of 5 GB on your S3 bucket; after that it costs around $0.023 per GB you use. More information can be found about that here. Other services such as Lambda have limitations too, but they would not be exceeded in this use-case.

  • Lambda has a maximum execution time of 15 minutes, so if a file is too big and/or you're using a provider with slow download speeds, you may want to use the settings file to specify which files you do and don't want uploaded; you can also make a separate Lambda function to handle the big files on their own.
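A handler registry of the shape below could back the enabledHandlers check described above. The names and structure here are illustrative; the repository's actual handlers.js may differ.

```javascript
// Illustrative handler registry; the real handlers.js may look different.
const handlers = {
  google: async (name, data) => { /* upload to Google Drive */ },
  'mega.nz': async (name, data) => { /* upload to mega.nz */ },
  mediafire: async (name, data) => { /* upload to MediaFire */ },
};

// Resolve the providers enabled in settings.json, stopping execution
// if one of them has no matching handler.
function resolveHandlers(enabled, registry = handlers) {
  return enabled.map((name) => {
    if (!registry[name]) {
      throw new Error(`Handler not found: ${name}`);
    }
    return registry[name];
  });
}
```

Adding your own provider would then just mean adding another entry to the registry and listing its name in settings.json.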

Brief Setup Instructions

  1. Create an AWS account, then create an S3 bucket and set up programmatic access via IAM. See this example for more on how to do this.

  2. Set up rclone to auto-mount your S3 bucket on bootup and treat it as if it were another hard drive. See this example for more on how to do this.

  3. Set up programmatic access to your cloud providers, e.g. for Google Drive you may want to follow this example. For mega.nz and MediaFire, all you have to do is register an account and set the email and password in your environment variables.

  4. Create a Lambda function and set the Runtime to Node.js 16.x. Go to the configuration section and edit the general configuration to use 10240 MB of ephemeral storage, 3008 MB of memory and a timeout of 15 minutes. You can adjust these to your needs, depending on how many files you are backing up and their sizes; if you're unsure, use the values specified here, as they are the maximums.

  5. In the configuration tab, go to the environment variables section and set the environment variables. Below these instructions is a table of all the environment variables used within this project; set the ones your use-case needs.

  6. On your local machine, clone this repository, install Node.js & npm, go to the repository directory and run npm install, then create a zip containing all the files in the directory. Back in the Lambda console, on the Code tab, click upload from a .zip file and attach the zip.

  7. In Lambda, go to the Test tab and click the Test button to check that the backup process works. You can use AWS CloudWatch to monitor which files are currently being uploaded and to discover if anything goes wrong.

  8. If step 7 was successful, head to AWS EventBridge, go to the rules section and create a new rule that runs on a schedule/regular rate, e.g. every 24 hours. Set the target to a Lambda function and choose the function you created earlier. Then create the rule; this will execute your backup process at whatever rate you set.
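For step 8, either form of EventBridge schedule expression works for a daily run; these two are equivalent in spirit (the cron example fires at 03:00 UTC, chosen here arbitrarily):

```
rate(24 hours)
cron(0 3 * * ? *)
```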

Environment Variables (to set in your Lambda function)

You only have to set the environment variables for the providers you are using; e.g. if you are using mega.nz only, you do not have to configure any Google-related or MediaFire-related variables. The S3 variables, however, are always required.

| Variable Name | Value |
| --- | --- |
| S3_BUCKET_NAME | The name of the S3 bucket tied to your account; see this example for how to create & fetch this. |
| S3_REGION | The region your S3 bucket is located in. You would have set this while creating the bucket, but it can also be fetched from AWS S3's management panel. |
| S3_ACCESS_KEY_ID | Your access key ID for your S3 bucket, which can be set up with IAM on AWS; see this example for how to create & fetch this. |
| S3_SECRET_ACCESS_KEY | Your secret access key for your S3 bucket, which can be set up with IAM on AWS; see this example for how to create & fetch this. |
| GOOGLE_CLIENT_ID | Your Google app client ID; see this example for how to create & fetch this. |
| GOOGLE_CLIENT_SECRET | Your Google app client secret; see this example for how to create & fetch this. |
| GOOGLE_DRIVE_FOLDER_ID | The ID of the folder your backed-up files go to. You can create this folder in Google Drive's web UI; once you enter the folder, the ID normally appears in the URL bar after '/drive/folders/'. |
| GOOGLE_REFRESH_TOKEN | Your Google app refresh token; see this example for how to create & fetch this. |
| MEDIAFIRE_EMAIL | The email for your MediaFire account, used within MediaFire's web authentication protocol. See my implementation of interacting with MediaFire's API here. |
| MEDIAFIRE_PASSWORD | The password for your MediaFire account, used within MediaFire's web authentication protocol. See my implementation of interacting with MediaFire's API here. |
| MEGA_EMAIL | The email for your mega.nz account, used within mega.nz's authentication protocol. See the implementation of interacting with mega.nz's API here. |
| MEGA_PASSWORD | The password for your mega.nz account, used within mega.nz's authentication protocol. See the implementation of interacting with mega.nz's API here. |
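Since a missing variable only surfaces once a handler runs, it can help to validate the environment up front. A minimal sketch, assuming a helper of this shape (the function name is hypothetical, not part of the repository):

```javascript
// Hypothetical helper: fail fast if a required variable is missing.
function requireEnv(names, env = process.env) {
  const missing = names.filter((n) => !env[n]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
  return names.reduce((out, n) => ({ ...out, [n]: env[n] }), {});
}

// The S3 variables are always required; provider variables would be
// checked only for the handlers enabled in settings.json.
const s3Config = () =>
  requireEnv([
    'S3_BUCKET_NAME',
    'S3_REGION',
    'S3_ACCESS_KEY_ID',
    'S3_SECRET_ACCESS_KEY',
  ]);
```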

Settings (settings.json)

| Name | Editing Required | Value |
| --- | --- | --- |
| enabledHandlers | Yes | The providers you want to use, as an array, e.g. ['google', 'mega.nz', 'mediafire'], or any of your own that you have implemented. They must be available in handlers.js, or execution will stop once the code detects that a handler is not found. |
| filesToUpload | No | The files you want to upload, as an array, e.g. ['File1', 'File2', 'ImportantFile'], if you want to choose specific ones out of the many files on the S3 bucket. By default, if the array is empty, ALL files are used. |
| filesToExclude | No | The files you do not want to upload, as an array, e.g. ['plaintextpasswords.txt', 'averybigfile.zip'], if you want to exclude specific files on the S3 bucket. By default, if the array is empty, the files that were selected via filesToUpload are used as-is. |
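The selection rules above can be sketched as a pure function. This is an illustration of the documented semantics, not the repository's actual filtering code:

```javascript
// Example settings.json shape (values are illustrative):
const settings = {
  enabledHandlers: ['google', 'mega.nz', 'mediafire'],
  filesToUpload: [],  // empty → upload every file in the bucket
  filesToExclude: [], // empty → exclude nothing
};

// Apply filesToUpload first, then filesToExclude, per the table above.
function selectFiles(allKeys, { filesToUpload, filesToExclude }) {
  const wanted = filesToUpload.length > 0
    ? allKeys.filter((k) => filesToUpload.includes(k))
    : allKeys;
  return wanted.filter((k) => !filesToExclude.includes(k));
}
```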

Plans

  • Make a desktop app with a clean, easy-to-use UI that lets a user specify their configuration and AWS account, then automatically sets up the S3 bucket, installs and uses rclone to mount it, publishes the Lambda function and sets up the EventBridge rule to auto-backup your files, all with minimal user interaction.
