
attic's People

Contributors

adept, brodul, c4rlo, dfries, ernest0x, jbms, jborg, jdchristensen, kannes, kljohann, sherbang, thomaswaldmann, tungd


attic's Issues

Deletion failure

tu@cis02:~/Desktop/attic$ attic create -v --stats cis-tanpc11:/home/tu/Desktop/attic/Backup.attic::1st-backup ~/Desktop/Dataset/GeoPicResult/GeoPic_Dataset.tar.gz
Initializing cache...

/home/tu/Desktop/Dataset/GeoPicResult/GeoPic_Dataset.tar.gz

Archive name: 1st-backup
Archive fingerprint: 205b174a2756a1ed58ace08058e154d6fc8e4577f90b5461cd81553f7ea83da1
Start time: Sun Sep 29 10:46:17 2013
End time: Sun Sep 29 11:40:09 2013
Duration: 53 minutes 51.70 seconds
Number of files: 1
Original size: 2803798569 (2.61 GB)
Compressed size: 2804904820 (2.61 GB)
Unique data: 2804904820 (2.61 GB)


tu@cis02:~/Desktop/attic$ attic delete cis-tanpc11:/home/tu/Desktop/attic/Backup.attic::1st-backup
hashindex: /home/tu/.cache/attic/b684cdc0feff87d8b7cd7b26688cbfc65dfb906fd2ac53ba76f7d3b2b7dad23f/chunks.tmp: Incorrect file length
Traceback (most recent call last):
File "/usr/local/bin/attic", line 3, in
main()
File "/usr/local/lib/python3.2/dist-packages/attic/archiver.py", line 481, in main
exit_code = archiver.run(sys.argv[1:])
File "/usr/local/lib/python3.2/dist-packages/attic/archiver.py", line 475, in run
return args.func(args)
File "/usr/local/lib/python3.2/dist-packages/attic/archiver.py", line 193, in do_delete
archive.delete(cache)
File "/usr/local/lib/python3.2/dist-packages/attic/archive.py", line 334, in delete
self.cache.chunk_decref(chunk_id)
File "/usr/local/lib/python3.2/dist-packages/attic/cache.py", line 190, in chunk_decref
del self.chunks[id]
File "hashindex.pyx", line 60, in attic.hashindex.IndexBase.delitem (attic/hashindex.c:1519)
Exception: hashindex_delete failed

The repository is locked while a backup is running

It seems that a repository gets locked while an archive creation is in progress, which means that only one archive at a time can be created on a specific repository.

Is this a known limitation of attic or a bug?

This behavior is very inconvenient, since it is impossible to have many hosts pushing their backups to a central repository in parallel.

Investigate if Backup-Bouncer results can be improved

Lee Joramo reported on the mailing list that Attic fails some Backup-Bouncer tests on OS X.

We should investigate whether these results can be improved.

Current results

Verifying:    basic-permissions ... ok (Critical)
Verifying:           timestamps ... ok (Critical)
Verifying:             symlinks ... ok (Critical)
Verifying:    symlink-ownership ... ok 
Verifying:            hardlinks ... ok (Important)
Verifying:       resource-forks ... 
   Sub-test:             on files ... ok (Critical)
   Sub-test:  on hardlinked files ... ok (Important)
Verifying:         finder-flags ... ok (Critical)
Verifying:         finder-locks ... FAIL 
Verifying:        creation-date ... FAIL 
Verifying:            bsd-flags ... FAIL 
Verifying:       extended-attrs ... 
   Sub-test:             on files ... ok (Important)
   Sub-test:       on directories ... ok (Important)
   Sub-test:          on symlinks ... FAIL 
Verifying: access-control-lists ... 
   Sub-test:             on files ... FAIL (Important)
   Sub-test:              on dirs ... FAIL (Important)
Verifying:                 fifo ... ok 
Verifying:              devices ... ok 
Verifying:          combo-tests ... 
   Sub-test:  xattrs + rsrc forks ... ok 
   Sub-test:     lots of metadata ... FAIL 

https://github.com/n8gray/Backup-Bouncer

Config file with tasks

I would like to see a config file in the .attic directory named "tasks". Right now the backup command can become very long when many paths and excludes are used.
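
Until something like this exists, one way to keep the command line manageable is to store the long argument list in a file and splice it into a small wrapper script. A rough sketch, with a made-up $HOME/.attic/tasks file (nothing like this is built into attic today):

#!/bin/bash
# $HOME/.attic/tasks holds one argument per line, e.g. paths and --exclude options:
#   /home
#   /etc
#   --exclude
#   /home/*/.cache
REPOSITORY=/mnt/backup/attic
mapfile -t TASK_ARGS < "$HOME/.attic/tasks"   # read the per-line arguments into an array
attic create --stats "$REPOSITORY::home-$(date +%Y-%m-%d)" "${TASK_ARGS[@]}"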

Incorrect display of time in archive listings and info

The time displayed by both the attic list and attic info commands is off by one hour on a system with EET as the local time. It should be "Wed Dec 11 16:07:29 2013", but "Wed Dec 11 17:07:29 2013" is displayed instead. It seems to be related to the timezone conversion done by attic: if I do not pass "archive.ts" through "to_localtime()" in archiver.py, the correct time is displayed.

Add option to `attic serve` process to restrict access to specific path

There is a discussion in the mailing list about ways to achieve isolation between hosts that push backups on a central host with repositories.

Besides the obvious, more secure but higher-overhead method of maintaining separate accounts for each client, Jonas suggested an option for attic serve to restrict access to a specific path, which, along with encryption and restricting ssh to a specific command, may be an acceptable solution for securing attic usage. I am copying Jonas' example below:

For example, something like this in /home/attic/.ssh/authorized_keys:

command="attic serve --restrict-to-path /data/clientA" ssh-rsa clientA's key
command="attic serve --restrict-to-path /data/clientB" ssh-rsa clientB's key

Prune command deletes archives that have been created last

Using the following script:

#!/bin/bash
REPOSITORY=/path/to/repository
HOSTNAME=hostname1.example.org

# Backup 
attic create --stats                                       \
    $REPOSITORY::$HOSTNAME-`date +%Y-%m-%d-%H:%M`          \
    /path/to/include                                       \
    --exclude /path/to/exclude

# Use the `prune` subcommand to maintain 7 daily, 4 weekly
# and 6 monthly archives.
attic prune -v $REPOSITORY --daily=7 --weekly=4 --monthly=6

Got this:

Initializing cache...
Analyzing archive: hostname1.example.org-2013-12-10-19:45
Analyzing archive: hostname1.example.org-2013-12-05-20:06
----------------------------------------
Archive name: hostname1.example.org-2013-12-10-19:48
Archive fingerprint: cbf35a8e616ac302acaf2489ac9bbf1ef272bb6203df77d5474b65251c4f9e8c
Start time: Tue Dec 10 19:48:26 2013
End time: Tue Dec 10 20:28:17 2013
Duration: 39 minutes 51.04 seconds
Number of files: 203921
Original size: 8798879751 (8.19 GB)
Compressed size: 2983340404 (2.78 GB)
Unique data: 180336967 (171.98 MB)
----------------------------------------
Initializing cache...
Analyzing archive: hostname1.example.org-2013-12-10-19:45
Analyzing archive: hostname1.example.org-2013-12-05-20:06
Analyzing archive: hostname2.example.org-2013-12-10-19:58
Analyzing archive: hostname1.example.org-2013-12-10-19:48
Keeping archive "hostname2.example.org-2013-12-10-19:58"
Keeping archive "hostname1.example.org-2013-12-05-20:06"
Pruning archive "hostname1.example.org-2013-12-10-19:48"
Pruning archive "hostname1.example.org-2013-12-10-19:45"

Why did it prune the most recent backups?
It even pruned the archive that was created just before the prune command ran!

(Don't be surprised that the hostname2 backup is not mentioned in the analyzing step; it was created just after the backup above.)

BlockingIOError in verbose mode

With the latest attic (commit 50cabd5), as well as with previous commits, in verbose mode:

Traceback (most recent call last):
  File "/usr/local/bin/attic", line 5, in <module>
    pkg_resources.run_script('Attic==0.9-6-g50cabd5', 'attic')
  File "/usr/local/lib/python3.3/site-packages/setuptools-2.0-py3.3.egg/pkg_resources.py", line 483, in run_script
  File "/usr/local/lib/python3.3/site-packages/setuptools-2.0-py3.3.egg/pkg_resources.py", line 1341, in run_script
  File "/usr/local/lib/python3.3/site-packages/setuptools-2.0-py3.3.egg/pkg_resources.py", line 50, in execfile
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/EGG-INFO/scripts/attic", line 3, in <module>
    main()
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 479, in main
    exit_code = archiver.run(sys.argv[1:])
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 473, in run
    return args.func(args)
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 104, in do_create
    self._process(archive, cache, args.excludes, skip_inodes, path, restrict_dev)
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 150, in _process
    os.path.join(path, filename), restrict_dev)
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 150, in _process
    os.path.join(path, filename), restrict_dev)
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 150, in _process
    os.path.join(path, filename), restrict_dev)
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 150, in _process
    os.path.join(path, filename), restrict_dev)
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 150, in _process
    os.path.join(path, filename), restrict_dev)
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 150, in _process
    os.path.join(path, filename), restrict_dev)
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 150, in _process
    os.path.join(path, filename), restrict_dev)
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 150, in _process
    os.path.join(path, filename), restrict_dev)
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 135, in _process
    self.print_verbose(remove_surrogates(path))
  File "/usr/local/lib/python3.3/site-packages/Attic-0.9_6_g50cabd5-py3.3-linux-x86_64.egg/attic/archiver.py", line 42, in print_verbose
    print(msg)
BlockingIOError: [Errno 11] write could not complete without blocking
Exception BlockingIOError: BlockingIOError(11, 'write could not complete without blocking', 0) in <_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'> ignored

I had trouble trying to back up around ~330 GB / 1.5 million files of data to a remote repository, so I enabled verbose mode to see what was happening, and at some point the above error occurred.

The remote repository path on the remote host is under an NFS mount from another remote host (the NFS server), so the setup can be described like this: backup source host --SSH--> remote host --NFS--> repository path. This setup works without problems for other backup source hosts with far less data (up to 7 GB, for example) sending their backups to the same remote host under the same NFS mount, so I suspect that the larger amount of source data, or maybe the longer time needed to send it, may play a role in this.

Any idea what may be causing this?

create directories for mount points

Currently, the --do-not-cross-mountpoints option works great and helps a lot with root filesystem backups.

Unfortunately, it seems to skip the mount points entirely, not just their contents. This results in /run, /boot, /tmp, ... missing from the backup, so recreating a fully working system is not just a matter of unpacking it from the backup.

IMHO, the directories for the mount points should be included (and empty).
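
As a stopgap when restoring such a backup today, the missing mount-point directories can of course be recreated by hand before booting the restored system; a sketch (the restore path and the directory list depend on the system):

# recreate the empty mount points that were skipped by the backup
mkdir -p /mnt/restore/{run,boot,tmp,proc,sys,dev}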

Crash when pruning an archive

I ran into this issue:

Keeping archive "servers-2014-03-24-082109"
Pruning archive "servers-2014-03-22-082916"
Traceback (most recent call last):
  File "Attic-0.12/build/scripts-3.3/attic", line 3, in <module>
    main()
  File "/usr/lib/python3/dist-packages/attic/archiver.py", line 479, in main
    exit_code = archiver.run(sys.argv[1:])
  File "/usr/lib/python3/dist-packages/attic/archiver.py", line 473, in run
    return args.func(args)
  File "/usr/lib/python3/dist-packages/attic/archiver.py", line 329, in do_prune
    archive.delete(cache)
  File "/usr/lib/python3/dist-packages/attic/archive.py", line 315, in delete
    self.cache.chunk_decref(chunk_id)
  File "/usr/lib/python3/dist-packages/attic/cache.py", line 195, in chunk_decref
    count, size, csize = self.chunks[id]
  File "hashindex.pyx", line 140, in attic.hashindex.ChunkIndex.__getitem__ (attic/hashindex.c:2761)
KeyError

exception during "attic create" with repeated files

If the same file is specified on the command line (e.g. if both "/" and "/etc" are listed), then there is an error.

$ mkdir test-repeat

$ touch test-repeat/foo

$ attic init test-repeat.attic
Initializing repository at "test-repeat.attic"
Encryption NOT enabled.
Use the "--encryption=passphrase|keyfile" to enable encryption.

$ attic create test-repeat.attic::1 test-repeat test-repeat
Initializing cache...
Traceback (most recent call last):
  File "/home/jdc/bin/attic", line 3, in <module>
    main()
  File "/home/scratchy/computers/backups/attic/attic/archiver.py", line 608, in main
    exit_code = archiver.run(sys.argv[1:])
  File "/home/scratchy/computers/backups/attic/attic/archiver.py", line 602, in run
    return args.func(args)
  File "/home/scratchy/computers/backups/attic/attic/archiver.py", line 124, in do_create
    self._process(archive, cache, args.excludes, skip_inodes, path, restrict_dev)
  File "/home/scratchy/computers/backups/attic/attic/archiver.py", line 170, in _process
    os.path.join(path, filename), restrict_dev)
  File "/home/scratchy/computers/backups/attic/attic/archiver.py", line 158, in _process
    archive.process_file(path, st, cache)
  File "/home/scratchy/computers/backups/attic/attic/archive.py", line 366, in process_file
    ids = cache.file_known_and_unchanged(path_hash, st)
  File "/home/scratchy/computers/backups/attic/attic/cache.py", line 209, in file_known_and_unchanged
    self.files[path_hash][0] = 0
TypeError: 'tuple' object does not support item assignment

Corrupted my Repo

I seem to have done something to corrupt both my attic repos. It probably didn't help that at some point I deleted .cache.

I tried upgrading from the version installed by pip to the latest version from Git and running attic check, but the result is the same:

attic: Error: /data/backup.attic is not a valid repository

I have another repo that does the same thing

The backup.attic repo may have been in use at the time I had to hard-reboot the system after running into a BTRFS SMP bug in the Debian 3.2.0-4 kernel (I've since built the latest stable kernel from source and have yet to see a recurrence).

Anyway, have I completely screwed the repo, or am I just overlooking something simple?

More informative "attic create -v" output

As suggested by Christian Neukirchen on the mailing list, it would be great if Attic showed more detailed progress during archive creation, for example whether a file was modified or not, and perhaps also the deduplicated size (or ratio).

http://librelist.com/browser//attic/2014/3/16/a-few-questions-on-attic/

Christian's suggestion (like duplicity)


A /tmp/addedfile
M /tmp/changedfile
D /tmp/deletedfile
(I guess "D" is not detected by Attic.)

Another suggestion from Petros:

Changed: /path/to/some/file [original size: 50KB, compressed size: 20KB,
unique data: 2KB]
New: /path/to/another/file [original size: 1MB, compressed size: 600KB,
unique data: 30KB]

Jonas' suggestion:

/tmp
/tmp/addedfile  (2kB, 3%)
/tmp/changedfile  (23MB, 25%)
/dev/zero

Attic and large files

I've recently found out about attic and it seems really interesting, but I'm having trouble backing up large files (i.e. DVD images). Is it possible to do something about it, such as skipping deduplication, decreasing compression, or anything else that would speed things up?
I've tried creating and running attic on a large directory with many small files and it works great, but now I'm creating a new repository from a bunch of home folders, and from the output of lsof I can see that it takes several hours to back up large files.
Update: I'm using encryption, transferring over SSH with the arcfour cipher and compression enabled, over a 10 Mbit line. It seems that the network is the bottleneck.
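
For reference, the cipher and compression settings mentioned in the update are the kind of thing usually set per host in ~/.ssh/config; a sketch (the host name is an example, and arcfour trades security for speed):

Host backupserver
    Ciphers arcfour
    Compression yes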

Feature request: --include

We need --include as a complement to --exclude, to enable e.g.
--exclude /usr --include /usr/local
which is currently awkward to simulate, at best.
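
To illustrate how awkward the simulation currently is: about the best one can do today is enumerate the siblings of the directory to keep and exclude each of them explicitly. A rough sketch (paths are examples only, and it breaks on paths containing whitespace):

# back up / but keep only /usr/local out of /usr
EXCLUDES=""
for d in /usr/*; do
    [ "$d" = /usr/local ] && continue
    EXCLUDES="$EXCLUDES --exclude $d"
done
attic create repo.attic::system / $EXCLUDES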

Two running attics at once - trouble?

Hi,

This report is about attic built from git revision 7be0ad6, running on Debian, backing up from ext3 to smbfs.

I had a scheduled attic invocation in crontab. Then, one day, I ran attic by hand because I had just removed several --exclude switches and wanted to update my archive right away.

This took more time than I expected, and at some point the scheduled attic started as well. When I realized this, I Ctrl-C'ed the attic that I had started, but the attic started by cron terminated shortly afterwards, saying "can't open /one/of/the/files/to/back/up". The file in that error message was in one of the volatile directories (chromium settings, I think), so I was not really worried.

However, since then attic has ceased to back things up. Scheduled runs of "attic create" ended with:

Initializing cache...
Analyzing archive: dimail-2014-02-28_20.00.checkpoint
Analyzing archive: dimail-2014-02-26_20.00
Analyzing archive: dimail-2014-02-27_20.47
Analyzing archive: dimail-2014-02-22_20.00
Analyzing archive: dimail-2014-02-28_23.13
attic: Error: Repository /mnt/backup/attic does not exist

It always stopped after the same line (Analyzing archive: dimail-2014-02-28_23.13).

I made a copy of the repo on the removable disk and ran "attic check -v". Here is what I got:

dimail:~# attic check -v /tmp/disk/attic/
Starting repository check...
Repository check complete, no problems found.
Starting archive consistency check...
Analyzing archive dimail-2014-02-27_20.47 (1/17)
Analyzing archive dimail-2014-02-24_20.00.checkpoint (2/17)
Analyzing archive dimail-2014-02-26_20.00 (3/17)
Analyzing archive dimail-2014-02-28_20.00.checkpoint (4/17)
Analyzing archive dimail-2014-02-25_20.00 (5/17)
Analyzing archive dimail-2014-02-22_20.00 (6/17)
Analyzing archive dimail-2014-03-01_20.00 (7/17)
[some "Missing file chunk detected" errors skipped]
home/adept/.config/chromium/Default/Shortcuts: Missing file chunk detected (Byte 204276-259454)
home/adept/.config/chromium/Default/Visited Links: Missing file chunk detected (Byte 548745-575647)
home/adept/.config/google-chrome/Safe Browsing Bloom: Missing file chunk detected (Byte 2896309-2953547)
Archive metadata damage detected
Archive metadata damage detected
Traceback (most recent call last):
  File "/usr/local/bin/attic", line 3, in <module>
    main()
  File "/usr/local/lib/python3.3/dist-packages/attic/archiver.py", line 612, in main
    exit_code = archiver.run(sys.argv[1:])
  File "/usr/local/lib/python3.3/dist-packages/attic/archiver.py", line 602, in run
    return args.func(args)
  File "/usr/local/lib/python3.3/dist-packages/attic/archiver.py", line 78, in do_check
    if args.phase in ('all', 'archive') and not ArchiveChecker().check(repository, repair=args.repair):
  File "/usr/local/lib/python3.3/dist-packages/attic/archive.py", line 470, in check
    self.rebuild_refcounts()
  File "/usr/local/lib/python3.3/dist-packages/attic/archive.py", line 619, in rebuild_refcounts
    for item in robust_iterator(archive):
  File "/usr/local/lib/python3.3/dist-packages/attic/archive.py", line 599, in robust_iterator
    for item in unpacker:
  File "/usr/local/lib/python3.3/dist-packages/attic/archive.py", line 437, in __next__
    item = next(self._unpacker)
  File "_unpacker.pyx", line 377, in msgpack._unpacker.Unpacker.__next__ (msgpack/_unpacker.cpp:377)
  File "_unpacker.pyx", line 306, in msgpack._unpacker.Unpacker._unpack (msgpack/_unpacker.cpp:306)
MemoryError

Ok. So I ran "attic check --repair -v" and got this:

dimail:~# attic check -v --repair /tmp/disk/attic/
attic: Warning: 'check --repair' is an experimental feature that might result
in data loss.

Type "Yes I am sure" if you understand this and want to continue.

Do you want to continue? Yes I am sure
Starting repository check...
Repository check complete, no problems found.
Starting archive consistency check...
Analyzing archive dimail-2014-02-24_20.00.checkpoint (1/17)
Traceback (most recent call last):
  File "/usr/local/lib/python3.3/dist-packages/attic/repository.py", line 476, in get_fd
    return self.fds[segment]
  File "/usr/local/lib/python3.3/dist-packages/attic/lrucache.py", line 24, in __getitem__
    return super(LRUCache, self).__getitem__(key)
KeyError: 5277

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/attic", line 3, in <module>
    main()
  File "/usr/local/lib/python3.3/dist-packages/attic/archiver.py", line 612, in main
    exit_code = archiver.run(sys.argv[1:])
  File "/usr/local/lib/python3.3/dist-packages/attic/archiver.py", line 602, in run
    return args.func(args)
  File "/usr/local/lib/python3.3/dist-packages/attic/archiver.py", line 78, in do_check
    if args.phase in ('all', 'archive') and not ArchiveChecker().check(repository, repair=args.repair):
  File "/usr/local/lib/python3.3/dist-packages/attic/archive.py", line 470, in check
    self.rebuild_refcounts()
  File "/usr/local/lib/python3.3/dist-packages/attic/archive.py", line 619, in rebuild_refcounts
    for item in robust_iterator(archive):
  File "/usr/local/lib/python3.3/dist-packages/attic/archive.py", line 597, in robust_iterator
    for chunk_id, cdata in zip(items, self.repository.get_many(items)):
  File "/usr/local/lib/python3.3/dist-packages/attic/repository.py", line 348, in get_many
    yield self.get(id_)
  File "/usr/local/lib/python3.3/dist-packages/attic/repository.py", line 342, in get
    return self.io.read(segment, offset, id_)
  File "/usr/local/lib/python3.3/dist-packages/attic/repository.py", line 540, in read
    fd = self.get_fd(segment)
  File "/usr/local/lib/python3.3/dist-packages/attic/repository.py", line 478, in get_fd
    fd = open(self.segment_filename(segment), 'rb')
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/disk/attic/data/0/5277'

This is where it became evident that the warning about data loss was a real one, because the whole repo was basically wiped away:

dimail:~# ls -lR /tmp/disk/attic
/tmp/disk/attic:
total 164064
-rw-r--r-- 1 root root        28 Feb 21 10:08 README
-rw-r--r-- 1 1026 users      148 Feb 25 00:08 config
drwxr-xr-x 3 root root      4096 Feb 21 10:08 data
-rw-r--r-- 1 root root     35499 Mar  4 20:49 hints.0
-rw-r--r-- 1 root root  83886098 Mar  4 20:49 index.0
-rw-r--r-- 1 root root  83886098 Mar  4 20:50 index.tmp

/tmp/disk/attic/data:
total 144
drwxr-xr-x 2 root root 143360 Mar  4 20:50 0

/tmp/disk/attic/data/0:
total 4972
-rw-r--r-- 1 root root 5030282 Mar  4 20:46 0
-rw-r--r-- 1 root root   43760 Mar  4 20:50 1

I made another copy of the repo and went through the process again to confirm that "check --repair" indeed deletes all the data.

What did I do wrong and is it possible to repair this repo, or should I just scrap it and start fresh?

Archive stdin or fifo

(From http://librelist.com/browser//attic/2014/1/2/feature-request-archive-stdin-or-fifo/)

Hi,

first of all thanks to Jonas for developing attic. I have been using attic
for 4 weeks on Debian and it is a great tool for doing backups with
deduplication. I am excited about the dedup ratio (original size vs.
unique data) achieved in my single repository, even though I throw in
different backup sources (tar, mysqldump) from different machines.

I would like to send a feature request here because I don't have a
GitHub account:
Like (all) the other Unix/Linux tools, attic should be able to read from
stdin or a FIFO for backup and dedup the stream until EOF. And of
course it should be possible to extract these archives to stdout.

Greetings,
Matthias.
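
To make the request concrete, the usage being asked for would presumably look something like the following (the "-" pseudo-path and the --stdout extract option are hypothetical; attic supports neither at the time of writing):

# hypothetical: dedup a dump read straight from stdin
mysqldump --all-databases | attic create repo.attic::mysql-$(date +%F) -

# hypothetical: stream the archived dump back out to stdout
attic extract --stdout repo.attic::mysql-2014-01-02 | mysql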

error during verify on a sshfs-mounted repo

Backing up the /home directory (2 TB) to an sshfs-mounted repo.
"attic create" completes successfully, but "attic verify" or info fails with
IOError: [Errno 1] Operation not permitted: '/backup/attic/home.helium/data/13/135691'.

After attic exits, I am able to read the 135691 file from the command line by computing its md5sum via the sshfs mount, without any problem.

My first suspect was that sftp-server on the remote server hits the maximum number of opened files limit. See http://permalink.gmane.org/gmane.comp.file-systems.fuse.sshfs/995

However, increasing this limit from 1024 to 4096 did not help.

Complete error message:

attic: home/xxx/yyy/zzz.cpp: verification failed
Traceback (most recent call last):
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/repository.py", line 349, in get_fd
    return self.fds[segment]
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/lrucache.py", line 24, in __getitem__
    return super(LRUCache, self).__getitem__(key)
KeyError: 135691

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/python-3.2/bin/attic", line 3, in <module>
    main()
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/archiver.py", line 479, in main
    exit_code = archiver.run(sys.argv[1:])
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/archiver.py", line 473, in run
    return args.func(args)
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/archiver.py", line 274, in do_verify
    for item in archive.iter_items(lambda item: not exclude_path(item[b'path'], patterns), preload=True):
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/archive.py", line 142, in iter_items
    for item in self.pipeline.unpack_many(self.metadata[b'items'], filter=filter, preload=preload):
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/archive.py", line 34, in unpack_many
    for data in self.fetch_many(ids):
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/archive.py", line 47, in fetch_many
    for id_, data in zip_longest(ids, self.repository.get_many(ids, is_preloaded=is_preloaded)):
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/repository.py", line 225, in get_many
    yield self.get(id_)
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/repository.py", line 219, in get
    return self.io.read(segment, offset, id)
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/repository.py", line 391, in read
    fd = self.get_fd(segment)
  File "/opt/python-3.2/lib/python3.2/site-packages/attic/repository.py", line 351, in get_fd
    fd = open(self.segment_filename(segment), 'rb')
IOError: [Errno 1] Operation not permitted: '/backup/attic/home.helium/data/13/135691'

Controlling memory usage?

Hi there,

I seem to have been searching for ages to find something like attic, and it seems great so far.

My question involves memory usage: I set it backing up a Red Hat Enterprise 6.5 server last night, with probably ~60 GB of data to back up, but a lot of duplication within that.

Attic ended up using >2 GB of memory (according to 'top'), and unfortunately it gradually ground the server to a halt, as the server started swapping continually.

I've now excluded a bunch of files, but, after maybe 90 minutes of good progress, attic is now using 1 GB.

My question is: is there any way I can keep that memory usage down? (I assume it's using it to keep track of duplicate data; can I perhaps limit that in some way, even at the cost of less de-duplication?)

Many thanks,
Neil.

Tests failing on Arch Linux

Hello! As the title says, some of the Attic tests are failing on Arch Linux. Here's some information:

Arch Linux x86_64 (fully updated)
Python 3.3.2

Attic git master branch (as of today)
python-msgpack 0.4.0
openssl 1.0.1.e-5
python-llfuse 0.39-2

These are the test results:

$ python -m attic.testsuite.run
....Key file "/tmp/tmpt36959/tmp_tmpu3luws" created.
Keep this file safe. Your data will be inaccessible without it.
..Remember your passphrase. Your data will be inaccessible without it.
.Encryption NOT enabled.
Use the "--encryption=passphrase|keyfile" to enable encryption.
..........EEE...E.....EEE...E.....................s
======================================================================
ERROR: test_aes_counter_uniqueness_keyfile (attic.testsuite.archiver.ArchiverTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 287, in test_aes_counter_uniqueness_keyfile
    self.verify_aes_counter_uniqueness('keyfile')
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 274, in verify_aes_counter_uniqueness
    self.create_test_files()
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 111, in create_test_files
    os.chown('input/file1', 100, 200)
PermissionError: [Errno 1] Operation not permitted: 'input/file1'

======================================================================
ERROR: test_aes_counter_uniqueness_passphrase (attic.testsuite.archiver.ArchiverTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 290, in test_aes_counter_uniqueness_passphrase
    self.verify_aes_counter_uniqueness('passphrase')
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 274, in verify_aes_counter_uniqueness
    self.create_test_files()
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 111, in create_test_files
    os.chown('input/file1', 100, 200)
PermissionError: [Errno 1] Operation not permitted: 'input/file1'

======================================================================
ERROR: test_basic_functionality (attic.testsuite.archiver.ArchiverTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 130, in test_basic_functionality
    self.create_test_files()
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 111, in create_test_files
    os.chown('input/file1', 100, 200)
PermissionError: [Errno 1] Operation not permitted: 'input/file1'

======================================================================
ERROR: test_mount (attic.testsuite.archiver.ArchiverTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 242, in test_mount
    self.create_test_files()
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 111, in create_test_files
    os.chown('input/file1', 100, 200)
PermissionError: [Errno 1] Operation not permitted: 'input/file1'

======================================================================
ERROR: test_aes_counter_uniqueness_keyfile (attic.testsuite.archiver.RemoteArchiverTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 287, in test_aes_counter_uniqueness_keyfile
    self.verify_aes_counter_uniqueness('keyfile')
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 274, in verify_aes_counter_uniqueness
    self.create_test_files()
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 111, in create_test_files
    os.chown('input/file1', 100, 200)
PermissionError: [Errno 1] Operation not permitted: 'input/file1'

======================================================================
ERROR: test_aes_counter_uniqueness_passphrase (attic.testsuite.archiver.RemoteArchiverTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 290, in test_aes_counter_uniqueness_passphrase
    self.verify_aes_counter_uniqueness('passphrase')
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 274, in verify_aes_counter_uniqueness
    self.create_test_files()
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 111, in create_test_files
    os.chown('input/file1', 100, 200)
PermissionError: [Errno 1] Operation not permitted: 'input/file1'

======================================================================
ERROR: test_basic_functionality (attic.testsuite.archiver.RemoteArchiverTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 130, in test_basic_functionality
    self.create_test_files()
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 111, in create_test_files
    os.chown('input/file1', 100, 200)
PermissionError: [Errno 1] Operation not permitted: 'input/file1'

======================================================================
ERROR: test_mount (attic.testsuite.archiver.RemoteArchiverTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 242, in test_mount
    self.create_test_files()
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 111, in create_test_files
    os.chown('input/file1', 100, 200)
PermissionError: [Errno 1] Operation not permitted: 'input/file1'

----------------------------------------------------------------------
Ran 58 tests in 11.198s

FAILED (errors=8, skipped=1)

Since all of the failures are due to permission problems, I also tried running the tests with sudo, resulting in this instead:

....Key file "/tmp/tmpiqlgc_/tmp_tmp4wvrh6" created.
Keep this file safe. Your data will be inaccessible without it.
..Remember your passphrase. Your data will be inaccessible without it.
.Encryption NOT enabled.
Use the "--encryption=passphrase|keyfile" to enable encryption.
................F...........F.....................s
======================================================================
FAIL: test_mount (attic.testsuite.archiver.ArchiverTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 247, in test_mount
    self.assert_dirs_equal(self.input_path, os.path.join(mountpoint, 'input'))
  File "/usr/lib/python3.3/site-packages/attic/testsuite/__init__.py", line 44, in assert_dirs_equal
    self._assert_dirs_equal_cmp(diff)
  File "/usr/lib/python3.3/site-packages/attic/testsuite/__init__.py", line 73, in _assert_dirs_equal_cmp
    self.assert_equal(d1, d2)
AssertionError: Lists differ: ['dir2', 16749, 0, 0, 0, 13847... != ['dir2', 16749, 0, 0, 0, 13847...

First differing element 6:
1384796870792234107
1384796870792233943

- ['dir2', 16749, 0, 0, 0, 1384796870792230000, 1384796870792234107, {}]
?                                                               ^^^

+ ['dir2', 16749, 0, 0, 0, 1384796870792230000, 1384796870792233943, {}]
?                                                              ++ ^


======================================================================
FAIL: test_mount (attic.testsuite.archiver.RemoteArchiverTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3.3/site-packages/attic/testsuite/archiver.py", line 247, in test_mount
    self.assert_dirs_equal(self.input_path, os.path.join(mountpoint, 'input'))
  File "/usr/lib/python3.3/site-packages/attic/testsuite/__init__.py", line 44, in assert_dirs_equal
    self._assert_dirs_equal_cmp(diff)
  File "/usr/lib/python3.3/site-packages/attic/testsuite/__init__.py", line 73, in _assert_dirs_equal_cmp
    self.assert_equal(d1, d2)
AssertionError: Lists differ: ['dir2', 16749, 0, 0, 0, 13847... != ['dir2', 16749, 0, 0, 0, 13847...

First differing element 6:
1384796879912256569
1384796879912256479

- ['dir2', 16749, 0, 0, 0, 1384796879912260000, 1384796879912256569, {}]
?                                                               ^^

+ ['dir2', 16749, 0, 0, 0, 1384796879912260000, 1384796879912256479, {}]
?                                                               ^^


----------------------------------------------------------------------
Ran 58 tests in 19.521s

FAILED (failures=2, skipped=1)

Anything I can try to fix this?

S3 support

Would it be possible to add S3 support using a custom backend? I tried s3fs, but something went wrong; I'm guessing attic requires some metadata that wasn't available over that filesystem. How hard would it be to write a plugin that stores files to S3? Is there a "filesystem" abstraction/class one could swap out?

Add repository consistency check subcommand

The verify subcommand can be used to verify that a specific archive is extractable. That's fine for some use cases, but sometimes you would like to be able to make a server-side check of a whole repository to detect bit rot, preferably without requiring any encryption keys.

Installing Attic on Red Hat Enterprise Linux (or CentOS) 5.10?

I'm trying to install Attic on the above OS - but, unfortunately, it only ships with OpenSSL 0.9.8 - and Attic refuses to install, saying it requires >= 1.0 headers.

So I've tried downloading and compiling OpenSSL 1.0.1g (NOT installing it), and pointing Attic towards that compiled version when using pip to install it -- this appears to work; at least the pip installation process completes.

However, when running attic, I get this:

$ attic help
Traceback (most recent call last):
  File "/usr/local/bin/attic", line 2, in <module>
    from attic.archiver import main
  File "/usr/local/lib/python3.4/site-packages/attic/archiver.py", line 13, in <module>
    from attic.archive import Archive, ArchiveChecker
  File "/usr/local/lib/python3.4/site-packages/attic/archive.py", line 7, in <module>
    from attic.key import key_factory
  File "/usr/local/lib/python3.4/site-packages/attic/key.py", line 10, in <module>
    from attic.crypto import pbkdf2_sha256, get_random_bytes, AES, bytes_to_long, long_to_bytes, bytes_to_int, num_aes_blocks
ImportError: /usr/local/lib/python3.4/site-packages/attic/crypto.cpython-34m.so: undefined symbol: PKCS5_PBKDF2_HMAC

Does anyone have any clues for me as to how I might be able to get this working? Any suggestions gratefully appreciated!
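
One hedged guess: the undefined PKCS5_PBKDF2_HMAC symbol suggests the crypto extension was compiled against the 1.0.x headers but is resolving libcrypto from the system's 0.9.8 at run time. A sketch of how a build is usually pointed at a privately built OpenSSL (the /opt/openssl-1.0.1g prefix is only an example):

export CFLAGS="-I/opt/openssl-1.0.1g/include"
export LDFLAGS="-L/opt/openssl-1.0.1g/lib -Wl,-rpath,/opt/openssl-1.0.1g/lib"
pip install attic
# or, if rpath is not baked in, make the runtime linker find the newer libcrypto:
export LD_LIBRARY_PATH=/opt/openssl-1.0.1g/lib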

scan only updated files on create

Currently attic scans and works through all files. This takes a noticeable amount of time on large files (for example VM disk files) that weren't touched since the last backup run.

Maybe it's possible to provide a reference backup for attic to check the timestamps against (somewhat like how incremental tars work) and not bother running the whole file through the dedup engine if the timestamps match. The file should still be in the new backup set, but maybe a cheap (repo-side) reference would be enough.

Make it possible to fuse mount an entire repository

It is currently only possible to fuse mount a specific archive. For situations where the exact archive is not known, it would be useful to be able to mount an entire repository.

The main reason why this is not already implemented is that Attic needs to read the archive metadata to be able to make it mountable. So if a repository contains many archives it would be both time consuming and consume a lot of memory to process it all at mount time.

One possible solution is to initially only present one top level folder containing the archive names and then fetch and process metadata for individual archives on demand when they are first accessed.

http://librelist.com/browser//attic/2014/3/24/mount-repo/
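
The usage being asked for would presumably look something like this (mounting a bare repository is hypothetical; today attic mount only accepts a repo::archive argument):

# hypothetical: mount the whole repository, one directory per archive
attic mount /path/to/repo.attic /mnt/attic
ls /mnt/attic
ls /mnt/attic/hostname1.example.org-2013-12-10-19:48/home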

Use logging lib for messages

Instead of just calling print() to print messages, I would suggest using the logging library for better logging capabilities, such as different logging levels (info, warning, error, debug), timestamps, logging to syslog or other handlers, etc. For a start, a simple handler that writes to sys.stdout could be implemented.

What do you think?

IOError while analyzing unfinished archive (checkpoint)

I tried to create a backup archive from a source with ~350 GB of data, and I am getting the following error while analyzing the (unfinished) archive. To get the error I am using attic info path/to/repository/2013-12-22-00:00.checkpoint, but it is also triggered when trying to create a "new" archive, since it is actually trying to read chunks from the unfinished one by first going through the analyzing step. The error is reproducible and it is always the same:

Initializing cache...
Analyzing archive: 2013-12-22-00:00.checkpoint
Traceback (most recent call last):
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/repository.py", line 332, in get_fd
    return self.fds[segment]
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/lrucache.py", line 24, in __getitem__
    return super(LRUCache, self).__getitem__(key)
KeyError: 38122

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/attic", line 5, in <module>
    pkg_resources.run_script('Attic==0.8.1-3-g29d184d-dirty', 'attic')
  File "/usr/lib/python3/dist-packages/pkg_resources.py", line 499, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/usr/lib/python3/dist-packages/pkg_resources.py", line 1236, in run_script
    exec(compile(open(script_filename).read(), script_filename, 'exec'), namespace, namespace)
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/EGG-INFO/scripts/attic", line 3, in <module>
    main()
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/archiver.py", line 479, in main
    exit_code = archiver.run(sys.argv[1:])
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/archiver.py", line 473, in run
    return args.func(args)
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/archiver.py", line 284, in do_info
    cache = Cache(repository, key, manifest)
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/cache.py", line 35, in __init__
    self.sync()
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/cache.py", line 160, in sync
    for id, chunk in zip_longest(archive[b'items'], self.repository.get_many(archive[b'items'])):
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/repository.py", line 220, in get_many
    yield self.get(id)
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/repository.py", line 214, in get
    return self.io.read(segment, offset, id)
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/repository.py", line 374, in read
    fd = self.get_fd(segment)
  File "/usr/local/lib/python3.2/dist-packages/Attic-0.8.1_3_g29d184d_dirty-py3.2-linux-x86_64.egg/attic/repository.py", line 334, in get_fd
    fd = open(self.segment_filename(segment), 'rb')
IOError: [Errno 2] No such file or directory: 'path/to/repository/data/3/38122'

The current list of archives in the repository given with attic list path/to/repository is:

2013-12-13-19:08.checkpoint Sat Dec 21 01:17:34 2013
2013-12-22-00:00.checkpoint Thu Dec 26 01:39:51 2013

It seems that I am stuck in a situation of cache inconsistency from which attic cannot recover. I also suspect that I got there because of the somewhat strange setup I have: the repository is stored on an NFS mount, mounted with the 'async' option (which I think is the default). Moreover, the NFS server is off-site and is accessed through a WiFi link, which is unreliable by nature and which indeed had some disconnections lately; during those, though, attic seemed to pause and then continue without a problem.

So, considering the above, some questions arise:

  1. Is it possible that a setup like the above could lead to situations like this?
  2. If the filesystem part of a chunk is not there, but the index exists, shouldn't it be possible to recover from that situation, even at the cost of reindexing all files that include that chunk?
  3. Maybe in setups like this it would help to disregard the performance cost and mount the NFS export with the 'sync' option?
  4. Are there any thoughts on possible mechanisms that could be implemented to avoid or recover from inconsistencies when storing to storage with availability issues, such as network-attached storage, cloud storage services, etc.?

Backup file verification

Is it currently possible to verify that the backup files are all OK? The situation I am thinking of is my backup location getting a bad sector and losing everything (or at least some things) while I'm completely unaware of it.

Would it be possible to add hashes/checksums to the final, encrypted data so I could run attic verify-hashes on the server, have the server read/re-checksum all the data (without requiring the password) and tell me if all the files can be read successfully? I could make a cron command out of this and have it e-mail me on any errors, or (optionally) discard the damaged blocks and re-upload them next time, if possible.
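
Until something like that exists, a crude server-side approximation can be built with standard tools, since the repository's data segments are plain files: record their checksums after each backup and re-verify them from cron. A sketch (paths and the mail address are examples; this only detects on-disk corruption, does not prove the archives are extractable, and the manifest must be regenerated after every backup because segment files come and go):

# after each backup run, record checksums of the segment files
find /data/backup.attic/data -type f -exec sha256sum {} + > /var/lib/attic-checksums

# from cron: re-verify and alert on any mismatch
sha256sum -c --quiet /var/lib/attic-checksums || \
    echo "attic repository checksum mismatch" | mail -s "attic checksum errors" admin@example.org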

Two questions for the documentation

I have two questions that are (possibly) frequently wondered about, and which you may want to address in the documentation:

  1. When backing up to remote servers, is data encrypted before leaving the local machine, or do I have to trust that the remote server isn't malicious?
  2. If a backup stops mid-way, does the already-backed-up data stay there? I.e. does attic resume backups?

traceback when running "attic list" on an existing path which is not an attic repository

When running the attic list command on a non-existent directory, it returns attic: Error: Repository not found, which is absolutely normal. I think this error, or an equivalent one, should also be the expected behavior when running "attic list" on a directory that does exist but is not initialized as an attic repository. Instead, it currently returns a traceback like the following:

Traceback (most recent call last):
  File "/usr/local/bin/attic", line 3, in <module>
    main()
  File "/usr/local/lib/python3.2/dist-packages/attic/archiver.py", line 479, in main
    exit_code = archiver.run(sys.argv[1:])
  File "/usr/local/lib/python3.2/dist-packages/attic/archiver.py", line 473, in run
    return args.func(args)
  File "/usr/local/lib/python3.2/dist-packages/attic/archiver.py", line 226, in do_list
    repository = self.open_repository(args.src)
  File "/usr/local/lib/python3.2/dist-packages/attic/archiver.py", line 29, in open_repository
    repository = Repository(location.path, create=create)
  File "/usr/local/lib/python3.2/dist-packages/attic/repository.py", line 45, in __init__
    self.open(path)
  File "/usr/local/lib/python3.2/dist-packages/attic/repository.py", line 74, in open
    self.lock_fd = open(os.path.join(path, 'config'), 'r')
IOError: [Errno 2] No such file or directory: '/some/existing/path/config'

The size of unique data is disproportionate to the size of the changes in the backup source

I noticed that the size of unique data when creating an archive from a backup source that has changed by only a few bytes is disproportionate to the size of the changes. The backup source contains 4-5 files. To test this, I added a one-line text file with 60 random characters/bytes and ran attic create again, only to see that the unique data was as high as 300 KB! I even copied that text file to a file with a different name, and ~20 KB of unique data was reported on the next run. Why is this happening? It does not look normal.

Even if there are no changes at all, ~540 bytes of unique data is still reported each time I run attic create.
