
Versatile yet easy to use command line backup tool for GNU/Linux. Suitable for desktop and servers.

Home Page: http://www.backup-manager.org/

License: GNU General Public License v2.0

Perl 21.39% Shell 66.74% Makefile 3.23% Smarty 8.64%

backup-manager's Introduction

                      Backup Manager

         *    A really simple to use backup tool    *
	
           https://github.com/sukria/Backup-Manager/

Description
---------------------------------------------------------------------

Backup Manager is a command line backup tool for GNU/Linux, designed 
to help you make daily archives of your file system.

Written in Bash and Perl, it can make archives in many formats
and can be run in parallel with different configuration files.

Archives are kept for a given number of days and the upload system can
use ftp or scp to transfer the generated archives to a list of remote 
hosts. 
The configuration file is simple, and gettext is used 
for internationalization.


Installation
---------------------------------------------------------------------

See the file INSTALL


Reporting Bugs
---------------------------------------------------------------------

Use the GitHub bug tracking system: 
https://github.com/sukria/Backup-Manager/issues


backup-manager's People

Contributors

benmur, cypriani, dkogan, ggtools, iangreenleaf, kissifrot, llaumgui, maxyz, mr-greywolf, toubib, utopiabound, zwenna


backup-manager's Issues

Invalid mysqldump option

The current version of MySQL (I work with 5.6) does not accept the password passed through --defaults-extra-file, so instead of a backup I get "mysqldump: Got error: 1045: Access denied for user ....".

Why not just use the standard -p option?
For example: -p$BM_MYSQL_ADMINPASS
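
For illustration, a minimal sketch of the suggested invocation (this is not the current backup-manager code; $database and $dump_file are placeholder names):

# sketch: pass the password with -p instead of --defaults-extra-file
mysqldump -u"$BM_MYSQL_ADMINLOGIN" -p"$BM_MYSQL_ADMINPASS" \
    -h "$BM_MYSQL_HOST" -P "$BM_MYSQL_PORT" "$database" > "$dump_file"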

Upload of incremental.bin files

Hi,

I can't figure out why my incremental files are not uploaded...

I found some fixes but they don't work.
I tried to edit /usr/bin/backup-manager-upload,
adding: while (<$g_root_dir/*incremental*>) { push @{$ra_files}, $_; }

The upload method is FTP and everything works fine except the incremental.bin upload...

Thanks for any help.

System is Debian Jessie

s3 file size limit

Hi,

not sure if you know, but S3 has a 5 GB file size limit, which is a problem if you want to copy a 9 GB backup. Here's what I faced today:

backup-manager --verbose

/ebs/archives/localhost.combinlogs.20091022.master.tar.gz.gpg: ok (134M, 33de5a1a62c515110538824377f65562)
/ebs/archives/localhost.commysql.20091022.master.tar.gz.gpg: ok (9135M, 15b58d23f9aab50b3001bb281edfcf0d)
Trying to upload files to s3 service
Connected to s3.amazon.com
opened /ebs/archives/localhost.com-20091022.md5 of length 174 and will name the key localhost.com-20091022.md5
opened /ebs/archives/localhost.combinlogs.20091022.master.tar.gz.gpg of length 136275275 and will name the key localhost.combinlogs.20091022.master.tar.gz.gpg
Error reported by backup-manager-upload for method "s3", check "/ebs/tmp/bmu-log.uD1361".

cat /ebs/tmp/bmu-log.uD1361

Out of memory!

It is usually addressed by splitting the file before uploading. Since your scripts are mainly Bash based, it may make sense to just use "split" to do the splitting, maybe based on BM_TARBALL_SLICESIZE if dar is not used. In the meantime we'll probably fall back to dar if that works around the issue.
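
A minimal sketch of the split idea (the file name is taken from the log above; the suffix and the fallback slice size are assumptions):

split -b "${BM_TARBALL_SLICESIZE:-1000M}" \
    /ebs/archives/localhost.commysql.20091022.master.tar.gz.gpg \
    /ebs/archives/localhost.commysql.20091022.master.tar.gz.gpg.part-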

Alternative to BM_MYSQL_SAFEDUMPS

BM_MYSQL_SAFEDUMPS adds "--opt" to the command line: https://github.com/sukria/Backup-Manager/blob/0c6c3cf314c1df0d807adb617f6cd9e38f3b5378/lib/backup-methods.sh

But according to the documentation, this option is enabled by default, and can be disabled by using "--skip-opt": http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html#option_mysqldump_opt

So it looks like the BM_MYSQL_SAFEDUMPS option does not have any impact.

Moreover, "--opt" is a shortcut for "--add-drop-table --add-locks --create-options --disable-keys --extended-insert --lock-tables --quick --set-charset". So the tables are locked during the backup, which is really annoying for large databases, as it can take several minutes.

If you are using InnoDB tables, a better option is to disable locks with "--lock-tables=false" and append "--single-transaction": http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html#option_mysqldump_single-transaction as suggested here: http://stackoverflow.com/questions/104612/run-mysqldump-without-locking-tables

With the master version of Backup-Manager, this can be done by using BM_MYSQL_EXTRA_OPTIONS="--lock-tables=false --single-transaction", I guess.

So, 2 suggestions:

  • if I am right, BM_MYSQL_SAFEDUMPS is not useful (at least from MySQL 5.0 on). We can keep it or change the behavior (to append --lock-tables if BM_MYSQL_SAFEDUMPS=false).
  • we could add an option, MYSQL_INNO_DB=true, and in that case replace locks with a single transaction in combination with BM_MYSQL_SAFEDUMPS=true (see the sketch below).
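
A sketch of the resulting InnoDB-friendly dump, using the option names from the MySQL documentation ($database and $dump_file are placeholder names, not backup-manager variables):

mysqldump --single-transaction --lock-tables=false \
    -u"$BM_MYSQL_ADMINLOGIN" -h "$BM_MYSQL_HOST" "$database" | bzip2 > "$dump_file.bz2"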

"make install" creates paths starting with double-slash

"make install" creates paths starting with double-slash, which should not be used unless it is a path on the network.

In the Makefile, there are paths combined from $(DESTDIR)/$(PREFIX) and other similar constructs. The defaults are set as:
DESTDIR?=
PREFIX?="/usr/local"
which results in paths beginning with "//usr/local/...", which is apparently unintended and wrong.

One fix I can suggest is to require DESTDIR to end with a slash (if not empty) and to change every "$(DESTDIR)/$(" to "$(DESTDIR)$(".
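
An illustrative before/after (these lines are a sketch, not verbatim from the project's Makefile):

# before: expands to //usr/local/share/backup-manager when DESTDIR is empty
install -d $(DESTDIR)/$(PREFIX)/share/backup-manager
# after: no separating slash; DESTDIR may be empty or end with a slash
install -d $(DESTDIR)$(PREFIX)/share/backup-manager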

Some files aren't backed up

Hi.

I just accidentally deleted some data and had to restore from a backup made with backup-manager. Most of the data was there, but some files were missing from the backup tarball (the master), for no obvious reason. It's not the filenames: some files named 'Makefile' were skipped and some were not. It's not the paths: some directories had all their subdirectories backed up, but only some of their files. I can clearly tell when something is missing, since lots of the backed-up directories contain a git repo, and the repo knows when files are deleted. Any idea how to track this down?

Thanks

Support SFTP

It would be great if Backup-Manager supported SFTP (not SCP) as an upload protocol, for hosts where SSH terminal access is forbidden.

ftp backup uploads the data two times

Hi,
I am not sure if my problem is related to the configuration; I think I kept it very simple.
When running the backup, it uploads the same file twice.

Getting lock for backup-manager 13814 with /etc/backup-manager.conf
Cleaning /var/archives
Using method "tarball".
/var/archives/bnmmnb-etc.20140506.master.tar.gz: ok (3M, 88830cea5027470d15210a1340951f45)
Using the upload method "ftp".
Trying to upload files with ftp
Logged on 10.xx.xx.xx, in public_html (FTP binary mode)
Cleaning remote directory through FTP
File /var/archives/bnmmnb-20140506.md5 transfered
File /var/archives/bnmmnb-etc.20140506.master.tar.gz transfered
All transfers done, loging out from 10.xx.xx.xx
Logged on 10.xx.xx.xx, in public_html (FTP binary mode)
Cleaning remote directory through FTP
File /var/archives/bnmmnb-20140506.md5 transfered
File /var/archives/bnmmnb-etc.20140506.master.tar.gz transfered
All transfers done, loging out from 10.xx.xx.xx
No burning method used.

configuration file

export BM_REPOSITORY_ROOT="/var/archives"
export BM_TEMP_DIR="/tmp"
export BM_REPOSITORY_SECURE="true"
export BM_REPOSITORY_USER="root"
export BM_REPOSITORY_GROUP="root"
export BM_REPOSITORY_CHMOD="770"
export BM_ARCHIVE_CHMOD="660"
export BM_ARCHIVE_TTL="1"
export BM_REPOSITORY_RECURSIVEPURGE="false"
export BM_ARCHIVE_PURGEDUPS="true"
export BM_ARCHIVE_PREFIX="bnmmnb"
export BM_ARCHIVE_STRICTPURGE="true"
export BM_ARCHIVE_NICE_LEVEL="10"
export BM_ARCHIVE_METHOD="tarball"
export BM_ENCRYPTION_METHOD="false"
export BM_ENCRYPTION_RECIPIENT=""
export BM_TARBALL_NAMEFORMAT="long"
export BM_TARBALL_FILETYPE="tar.gz"
export BM_TARBALL_OVER_SSH="false"
export BM_TARBALL_DUMPSYMLINKS="false"
export BM_TARBALL_DIRECTORIES="/etc"
export BM_TARBALL_BLACKLIST="/var/archives"
export BM_TARBALL_SLICESIZE="1000M"
export BM_TARBALL_EXTRA_OPTIONS=""
export BM_TARBALLINC_MASTERDATETYPE="weekly"
export BM_TARBALLINC_MASTERDATEVALUE="1"
export BM_MYSQL_DATABASES="ALL"
export BM_MYSQL_SAFEDUMPS="true"
export BM_MYSQL_ADMINLOGIN="root"
export BM_MYSQL_ADMINPASS=""
export BM_MYSQL_HOST="localhost"
export BM_MYSQL_PORT="3306"
export BM_MYSQL_FILETYPE="bzip2"
export BM_SVN_REPOSITORIES=""
export BM_SVN_COMPRESSWITH="bzip2"
declare -a BM_PIPE_COMMAND
declare -a BM_PIPE_NAME
declare -a BM_PIPE_FILETYPE
declare -a BM_PIPE_COMPRESS
export BM_PIPE_COMMAND
export BM_PIPE_NAME
export BM_PIPE_FILETYPE
export BM_PIPE_COMPRESS
export BM_UPLOAD_METHOD="ftp"
export BM_UPLOAD_HOSTS="10.xx.xx.xx"
export BM_UPLOAD_DESTINATION="public_html"
export BM_UPLOAD_SSH_USER=""
export BM_UPLOAD_SSH_KEY=""
export BM_UPLOAD_SSH_HOSTS=""
export BM_UPLOAD_SSH_PORT=""
export BM_UPLOAD_SSH_DESTINATION=""
export BM_UPLOAD_SSH_PURGE="true"
export BM_UPLOAD_SSH_TTL=""
export BM_UPLOAD_SSHGPG_RECIPIENT=""
export BM_UPLOAD_FTP_SECURE="false"
export BM_UPLOAD_FTP_PASSIVE="true"
export BM_UPLOAD_FTP_USER="bnmmnb"
export BM_UPLOAD_FTP_PASSWORD="somerandompassword"
export BM_UPLOAD_FTP_HOSTS="10.xx.xx.xx"
export BM_UPLOAD_FTP_PURGE="true"
export BM_UPLOAD_FTP_TTL="14"
export BM_UPLOAD_FTP_DESTINATION=""
export BM_UPLOAD_S3_DESTINATION=""
export BM_UPLOAD_S3_ACCESS_KEY=""
export BM_UPLOAD_S3_SECRET_KEY=""
export BM_UPLOAD_S3_PURGE="false"
export BM_UPLOAD_RSYNC_DIRECTORIES=""
export BM_UPLOAD_RSYNC_DESTINATION=""
export BM_UPLOAD_RSYNC_HOSTS=""
export BM_UPLOAD_RSYNC_DUMPSYMLINKS="false"
export BM_BURNING_METHOD=""
export BM_BURNING_CHKMD5="false"
export BM_BURNING_DEVICE=""
export BM_BURNING_DEVFORCED=""
export BM_BURNING_ISO_FLAGS="-R -J"
export BM_BURNING_MAXSIZE=""
export BM_LOGGER="true"
export BM_LOGGER_LEVEL="info"
export BM_LOGGER_FACILITY="user"
export BM_PRE_BACKUP_COMMAND=""
export BM_POST_BACKUP_COMMAND=""

nice value not set correctly

I have a server configured for MySQL backup with compression, and this starves the CPU even when I run it at nice level 15. I think this is happening because nice needs to be invoked for every command in the pipeline, and right now it seems to be invoked only for mysqldump, not for the following compression command.

We currently have

nice -n nice_value mysqldump .... | bzip2 ...

which could be modified to

nice -n nice_value mysqldump .... | nice -n nice_value bzip2 ...

Use --rsyncable for gzip archives

According to the gzip manpage, the --rsyncable flag improves rsync transfer efficiency:

--rsyncable
    While compressing, synchronize the output occasionally based on the input.
    This increases size by less than 1 percent in most cases, but means that
    the rsync(1) program can take advantage of similarities in the uncompressed
    input when synchronizing two files compressed with this flag. gunzip cannot
    tell the difference between a compressed file created with this option and
    one created without it.
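
A sketch of one way this could be used without patching backup-manager, assuming the system's gzip supports --rsyncable and still honors the (obsolescent) GZIP environment variable for default options:

# pass --rsyncable to every gzip invocation made during the run
GZIP="--rsyncable" backup-manager --verbose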

PostgreSQL dump

Currently BM supports MySQL dumps. Why not also add PostgreSQL dumps?
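
For illustration, a sketch of what such a "pgsql" method might run, by analogy with the existing MySQL method (the BM_PGSQL_* variables mirror the BM_MYSQL_* ones and appear in the devel branch; $database and $dump_file are placeholders):

pg_dump -U "$BM_PGSQL_ADMINLOGIN" -h "$BM_PGSQL_HOST" -p "$BM_PGSQL_PORT" \
    "$database" | bzip2 > "$dump_file.bz2"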

Release

Hi,

is it possible to do a new release with all the changes since 2010?

Thx

send_files_with_s3 just ignores the $repository argument

I am looking at the backup-manager-upload script from the devel branch and found that send_files_with_s3 just ignores the $repository argument, which means BM_UPLOAD_DESTINATION doesn't work for the s3 upload method. Is this expected?

I also noticed that

# The bucket to upload to. This bucket must be dedicated to backup-manager
export BM_UPLOAD_S3_DESTINATION=""

does "dedicated" imply the s3 upload method will take no destination / prefix?

In my opinion, it would be very useful to be able to specify an S3 key prefix in the conf file, so backup-manager can share the bucket with other users (by using different prefixes).
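
A hypothetical configuration, if a key prefix were supported (the "bucket/prefix" form shown here is an assumption, not current behaviour):

export BM_UPLOAD_S3_DESTINATION="my-backup-bucket/$(hostname)"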

Is management of Backup-Manager still active?

I like the tool very much and want to use it on all my servers, but I'm worried that there has been no development since Sep 30, 2014. Is this repository still active?
I've already got a patch almost ready and will create a pull request in a few hours.

Hardcoded paths in backup-manager exec

Some paths in the backup-manager executable are hardcoded:

# All the paths we provide
libdir="/usr/share/backup-manager"
vardir="/var/lib/backup-manager"
bmu="/usr/bin/backup-manager-upload"
bmp="/usr/bin/backup-manager-purge"

It's problematic when BM is installed with PREFIX=/usr/local.
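
One possible fix, sketched here as a suggestion rather than the project's actual Makefile: substitute the prefix into the script at install time instead of hardcoding /usr (the @PREFIX@ placeholder is hypothetical):

# in the script: libdir="@PREFIX@/share/backup-manager", and so on
# in the Makefile install target:
sed -e "s|@PREFIX@|$(PREFIX)|g" backup-manager \
    > $(DESTDIR)$(PREFIX)/sbin/backup-manager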

backup-manager didn't delete old backup files

Config file is:
export BM_REPOSITORY_ROOT="/backups/org/projects"
export BM_TEMP_DIR="/tmp"
export BM_REPOSITORY_SECURE="true"
export BM_REPOSITORY_USER="root"
export BM_REPOSITORY_GROUP="root"
export BM_REPOSITORY_CHMOD="770"
export BM_ARCHIVE_CHMOD="660"
export BM_ARCHIVE_TTL="10"
export BM_ARCHIVE_STRICTPURGE="false"
export BM_ARCHIVE_NICE_LEVEL="10"
export BM_ARCHIVE_METHOD="tarball-incremental"

export BM_TARBALL_NAMEFORMAT="short"
export BM_TARBALL_FILETYPE="tar.gz"
export BM_TARBALL_OVER_SSH="false"
export BM_TARBALL_DUMPSYMLINKS="false"

export BM_TARBALL_DIRECTORIES="/store/org/projects"

export BM_TARBALL_BLACKLIST="*/nobackup"
export BM_TARBALL_EXTRA_OPTIONS=""

export BM_TARBALLINC_MASTERDATETYPE="weekly"
export BM_TARBALLINC_MASTERDATEVALUE="5"

export BM_LOGGER="true"
export BM_LOGGER_FACILITY="user"
export BM_PRE_BACKUP_COMMAND=""
export BM_POST_BACKUP_COMMAND=""

It creates master and incremental archives successfully, but does not remove files older than 10 days.
Version is 0.7.9-3 on Debian Squeeze.

What is wrong in the config file?
How can I debug the whole process? Adding "--verbose" didn't help; the output is only:
When validating the configuration file /etc/backup-manager.conf, 4 warnings were found.

Blacklisted Directories

I have a directory that is a subdirectory of a directory I am backing up with Backup Manager. This subdirectory contains some very large files, and I don't want anything from it included in the backup. Here is the main directory I am backing up:

BM_TARBALL_TARGETS[0]="/home/trails"

And here are a couple different ways I tried to exclude the directory using the blacklist feature without success:

export BM_TARBALL_BLACKLIST="/dev /sys /proc /tmp /home/trails/data"
export BM_TARBALL_BLACKLIST="/dev /sys /proc /tmp /home/trails/data/*"

I'd appreciate any advice on how to set this up correctly to exclude files in the 'data' directory. Thanks in advance for your help!

BM_TARBALL_BLACKLIST could use wildcards

BM_TARBALL_BLACKLIST should support wildcards. A good example is skipping .gvfs directories, which throw access-denied errors for the root user:

BM_TARBALL_BLACKLIST="/home/*/.gvfs"

Currently it needs to be exact:

BM_TARBALL_BLACKLIST="/home/bob/.gvfs /home/george/.gvfs /home/paul/.gvfs /home/john/.gvfs /home/ringo/.gvfs ..."

There is no "make uninstall"

After doing "make install" there is no easy way to uninstall backup-manager.

Good practice in makefiles with "install" is to provide an "uninstall".
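
A minimal sketch of such a target (the installed paths below are assumptions for illustration, not taken from the project's Makefile):

uninstall:
	rm -f $(DESTDIR)$(PREFIX)/sbin/backup-manager
	rm -f $(DESTDIR)$(PREFIX)/bin/backup-manager-upload
	rm -f $(DESTDIR)$(PREFIX)/bin/backup-manager-purge
	rm -rf $(DESTDIR)$(PREFIX)/share/backup-manager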

FreeBSD 9.x

Hello,

Just a question: has anyone tried to get this great script to work on FreeBSD 9? I saw the remark about the group ID on FreeBSD, so I am wondering if that only applied to a much older version from a long time ago.

Cheers

backup-manager omits files unintentionally

Because backup-manager makes excludes relative to the actual target path, tar will leave out identically named directories anywhere under the target.
Example: the target is /var and the exclude is /var/archives; then /var/lib/mailman/archives will also be omitted. To get the intended behaviour, comment out the lines

            pattern="${pattern#$target}"
            length=$(expr length $pattern)
            pattern=$(expr substr $pattern 2 $length)

in lib/backup-methods.sh near line 259 (function __get_flags_relative_blacklist).
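
For illustration, the underlying GNU tar behaviour (a sketch; exclusion patterns are not anchored by default, so a bare component name matches anywhere in the tree):

# skips /var/archives *and* /var/lib/mailman/archives
tar -cf /dev/null --exclude="archives" /var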

tarball-incremental method doesn't apply BM_ARCHIVE_CHMOD to *.incremental.bin

With "tarball-incremental" method, it generates *.tar.gz and *.incremental.bin files. The tar.gz files are chmod'ed with BM_ARCHIVE_CHMOD if BM_REPOSITORY_SECURE is true; but not the incremental.bin ones.

By default they have -rw------- permissions, which breaks remote backup when pulling the archives to another location with a dedicated user.

Temporary fix:
Create a /home/backup/fix-perms.sh script:

#!/bin/bash
chown root:backup /home/backup/*.incremental.bin && chmod g+r /home/backup/*.incremental.bin

Then, in /etc/backup-manager.conf:

export BM_POST_BACKUP_COMMAND="/home/backup/fix-perms.sh"

empty mysql backups

If a ~/.my.cnf file already exists, backup-manager will create an empty sql file (and a 14-byte bz2 file if compression is on). This happens because it uses mysqldump's --defaults-extra-file option, and this overrides everything except ~/.my.cnf. The failure is not reported anywhere (not even in verbose mode).

It should not fail silently: if mysqldump fails, $PIPESTATUS could be used to detect the failure.
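
A minimal sketch of the idea (the pipeline shown is illustrative, not the exact line from backup-methods.sh; $mysql_conffile, $database and $dump_file are placeholders):

mysqldump --defaults-extra-file="$mysql_conffile" "$database" | bzip2 > "$dump_file.bz2"
if [ "${PIPESTATUS[0]}" -ne 0 ]; then
    echo "mysqldump failed for $database" >&2
    exit 1
fi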

Parallel compression using pbzip2 or pigz

Hi,
I'm still using BM on a few servers and, some years ago, I wrote a patch to use pbzip2, if installed, for a parallel compression process.

Unfortunately, I've lost my patch… but I can write it again and submit a pull request. Is this repo still maintained by someone, or do I need to fork it?

I would also like to write a new upload method using hubiC.
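
For illustration, a sketch of the drop-in substitution (assuming pbzip2 is installed; it is command-line compatible with bzip2 for compression):

# parallel bzip2 compression of the kind of pipeline backup-manager runs
tar -c /etc | pbzip2 -c > /var/archives/etc.tar.bz2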

Detailed info about warnings in conf files

I use backup-manager on Debian Squeeze and it sometimes shows warnings in the output:
When validating the configuration file /etc/backup-manager.conf, 1 warnings were found.
When validating the configuration file /etc/backup-manager.conf, 3 warnings were found.

But I can't find which configuration entry is the problem or what is wrong with it.

Adding "--verbose" param to command didn't help.
How I can debug this problems?
Can I see which key, which test is failed or string number where the warning was found?

What happened to the backup-manager.org site?

I haven't been able to open the backup-manager.org site for a very long time.
Is there a hosting/domain problem, or has the project moved to another domain?

Will development of new versions and support for old versions continue, or does the project have problems and development has stopped?

backup-manager is the best tool for backing up data, please don't stop the development!

Debian Update

Hi, from the Debian repository I have version 0.7.10.1-debian1 installed.

How can I update it to the latest version?

I don't know how to disable the "UPLOAD" option: when I write export BM_UPLOAD_METHOD="none", the script waits for "Insert CD-ROM Drive to continue". How can I resolve that problem? I don't have a CD-ROM on my VDS. :)

Hour by hour incremental backup

Hello,

Thanks for your great tool! My servers feel better with it. 👍

Is it possible to make an hourly backup?
I found some references to a BM_ARCHIVE_FREQUENCY variable, but it's not in the docs or anywhere else...

Thanks.
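
If simply running backup-manager every hour is acceptable, a sketch of a system crontab entry (the /usr/sbin path matches the Debian package; adjust as needed):

# /etc/cron.d/backup-manager-hourly
0 * * * *  root  /usr/sbin/backup-manager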

Too many levels of symbolic links

When BM_ARCHIVE_PURGEDUPS is set to true, a symlink is created from the old file to the new one. However, when a configuration file hasn't changed for 8 days, the symlink depth limit is reached and the old file can no longer be accessed by name.

I have no obvious solution for that.

If symlinks are created from the new file to the old file, we'll get in trouble when the real purge comes.
It might be possible to walk back through the previous symlinks and fix them when creating a new one, but then the script could get really complex.
Finally, using hardlinks could change this. However, I'm nearly sure there is also a limit on the number of hardlinks an inode can have.

Pgpass format regexp failed

The regexp test in backup-methods.sh for pgsql fails every time for me:

if ! grep -qE "(${BM_PGSQL_HOST}|[^:]*):(${BM_PGSQL_PORT}|[^:]*):[^:]*:${BM_PGSQL_ADMINLOGIN}:${BM_PGSQL_ADMINPASS}" $pgsql_conffile; then
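
For reference, .pgpass entries have the documented form host:port:database:user:password; an illustrative line (values made up) that the test above is expected to match:

localhost:5432:*:postgres:secret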

Support of asterisk * in BM_TARBALL_BLACKLIST

Hello,

Here's my configuration:

export BM_TARBALL_BLACKLIST="/var/archives/* /dev/* /home/user/no_backup/* /media/* /mnt/* /proc/* /run/* /sys/* /tmp/* /lost+found /boot/lost+found /home/lost+found /var/lost+found"

But after that, files from /home/user/no_backup/* and /proc/* still end up in the backup, although it seems to work for /run/* and /sys/*. It looks completely random; why?

For information, I use the asterisk to avoid having to manually recreate the folders after restoring my system.

Incomplete error messages

eg: The "pgsql" method is chosen, but $pgdump is not found.

When pgdump is actually missing, it displays:
The "pgsql" method is chosen, but is not found.

which is not very helpful

(actually only tested with the French translation)
