gilesknap / gphotos-sync
Google Photos and Albums backup with the Google Photos Library API
License: Apache License 2.0
Your modifications to the original script are useful, but I have an enhancement request.
I think it would be even better if you sorted the photos backed up to the drive
folder into a year/month/day folder hierarchy like those in the picasa
folder. This does not need to persist back to Google Drive, but having it as a local organization would greatly increase the usability of my local backup!
Do you think this functionality would fit in with the utility, or is it out of scope? I might try to tackle this if I have some time. Would you like me to submit a pull request if I do?
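The requested layout could be sketched like this, using a hypothetical `dated_path` helper and assuming each item's create date is available:

```python
from datetime import datetime
from pathlib import Path

# Sketch of the suggested local organization: map an item's create date to
# a year/month/day folder hierarchy under the backup root.
def dated_path(root: Path, create_date: datetime, filename: str) -> Path:
    return root / create_date.strftime("%Y/%m/%d") / filename

p = dated_path(Path("photos"), datetime(2016, 12, 30), "IMG_0001.jpg")
# p is photos/2016/12/30/IMG_0001.jpg
```

As the request says, this need not persist back to Google Drive; it only changes where the local copy lands.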
Need to record the order in which items come out of the API into the DB.
Some of the Google Photos generated 'MOVIES' are showing up twice in my downloaded folders.
For example in 'albums/2016/12 30 Science!', MOVIE (9) and (10) are duplicates.
Looking in the DB it appears that Picasa API reported 2 files called MOVIE in this folder, with sizes 27,427,686 and 49,566,183 and with unique IDs. The downloaded duplicates are both size 10,873,913. Note that these files are not reported in 'Auto Backup' and in fact I think the duplication happened when they slipped below the 10,000th file in 'Auto Backup'.
Looking at the album Science! in Google Photos, there are two MOVIES, the first of which is not shown in our local 'Science' but has also been downloaded to picasa/2016/12 TWICE as MOVIE (2) and (8) size 7,290,759 (but reported size in DB is (1,004,578 and 209,888 respectively).
There is a THIRD copy of MOVIE (9), MOVIE (10) called MOVIE in picasa/2016/12 with the same size but reported in DB as size 1,082,710.
Something is up with Picasa indexing on videos.
I know that the picasa ID will change between looking at 'Auto Backup' and 'Science!' but, (a) for that reason we do not index by ID, (b) this issue persists after a --flush-index
This is due to using pickleDB. Install https://github.com/olivierlemoal/pickledb to resolve.
Ran it again on latest master with picasa skipped (because it was not moving at all) and I get a new error:
pi@raspberrypi:~/pyenvs/gphotos-sync $ gphotos-sync --skip-picasa --db-path=var/ /media/toshiba/
gphotos.drive: Indexing Drive Folders ...
gphotos.drive: Resolving paths ...
gphotos.drive: Drive Folders scanned.
gphotos.drive: Indexing Drive Files ...
gphotos.data: Saving Database ...
gphotos.data: Database Saved.
gphotos.drive: Downloading Drive Files ...
gphotos: Done.
gphotos.data: Saving Database ...
gphotos.data: Database Saved.
Traceback (most recent call last):
File "/home/pi/pyenvs/gphotos-sync/bin/gphotos-sync", line 7, in <module>
g.main()
File "/home/pi/pyenvs/gphotos-sync/local/lib/python2.7/site-packages/gphotos/Main.py", line 204, in main
self.start(args)
File "/home/pi/pyenvs/gphotos-sync/local/lib/python2.7/site-packages/gphotos/Main.py", line 182, in start
self.drive_sync.download_drive_media()
File "/home/pi/pyenvs/gphotos-sync/local/lib/python2.7/site-packages/gphotos/GoogleDriveSync.py", line 242, in download_drive_media
log.info('downloading {} ...'.format(media.local_full_path))
UnicodeEncodeError: 'ascii' codec can't encode character u'\xa0' in position 27: ordinal not in range(128)
Posted from mobile, hope formatting is ok (I'm not with my computer and ran ssh commands with JuiceSSH)
If I run the comparison against gphotos-sync's own download folder there are 820 'extra' files listed. There should be 0, since the folder should contain all files that are indexed (or perhaps one for each bad ID, of which there are only 28).
Not sure why, but all tests are failing locally and in Travis.
Currently, because the mediaItem results are only delivered in reverse date order I do not save the 'most recently indexed' date unless a complete index has occurred (i.e. back to the earliest file in the photos library).
However, all subsequent scans could do the following to make sure all items are eventually scanned.
This double scan approach negates the need for a completed scan before any incremental scans.
CAVEAT: I have noticed there is a difference between using mediaItems.list() and mediaItems.find() with no filters. For some reason, the latter misses a few (obscure format?) files.
CAVEAT2: I have also seen an issue with my library that a date range search always stops at 2010/09 (probably some strange meta on a photo)
TODO - try this approach and investigate how many files I miss in my photos library.
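The double-scan idea could be sketched like this (names and data shapes are illustrative, not the real indexing code): consume items newest-first, stop once the previous checkpoint is reached, then advance the checkpoint.

```python
# Sketch only: `items` yields (item_id, create_date) pairs newest-first, and
# `checkpoint` is the newest create date reached by the last complete scan.
def incremental_scan(items, checkpoint):
    """Index items newer than checkpoint, then advance the checkpoint."""
    indexed, newest = [], checkpoint
    for item_id, date in items:
        if date <= checkpoint:
            break  # everything older was covered by the previous scan
        indexed.append(item_id)
        newest = max(newest, date)
    return indexed, newest
```

Because the checkpoint only advances when the scan reaches items the previous pass already covered, an interrupted incremental scan is safe to repeat.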
I would like:
(a) the discovery code to build all the types declared in the discovery document (maybe as NamedTuples?) so that parameters can be easily built without resorting to dictionaries and responses can be interpreted without raw JSON;
(b) my IDE (PyCharm) to auto-complete REST API code. This would probably require generating source for it to look at.
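Item (a) could be sketched roughly like this, using a hypothetical `schema_to_tuple` helper and a made-up discovery fragment:

```python
from collections import namedtuple

# Sketch only: build a namedtuple class from one entry of a discovery
# document's 'schemas' section, so responses can be addressed as attributes
# instead of raw dictionaries. All fields are made optional.
def schema_to_tuple(name, schema):
    fields = list(schema.get("properties", {}))
    cls = namedtuple(name, fields)
    cls.__new__.__defaults__ = (None,) * len(fields)  # every field optional
    return cls

# hypothetical fragment of a discovery document
MediaItem = schema_to_tuple("MediaItem", {
    "properties": {"id": {"type": "string"}, "filename": {"type": "string"}}})
```

Generated classes like this could also be written out as source files, which would give PyCharm something concrete to auto-complete against, addressing (b).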
Hi,
Using it in a virtualenv and running the script like this: python gphotos-sync /syncdir/
I get the auth URL and enter the code, authentication succeeds, then I get this error:
Authentication successful.
Traceback (most recent call last):
File "gphotos-sync", line 7, in
g.main()
File "/home/ark/gphotos-sync/Main.py", line 168, in main
self.setup(args)
File "/home/ark/gphotos-sync/Main.py", line 120, in setup
credentials_json=credentials_file)
File "/home/ark/gphotos-sync/GoogleDriveSync.py", line 59, in __init__
self._g_auth.CommandLineAuth()
File "/home/ark/gphotos-sync/venv/local/lib/python2.7/site-packages/pydrive/auth.py", line 127, in _decorated
self.SaveCredentials()
File "/home/ark/gphotos-sync/venv/local/lib/python2.7/site-packages/pydrive/auth.py", line 327, in SaveCredentials
self.SaveCredentialsFile()
File "/home/ark/gphotos-sync/venv/local/lib/python2.7/site-packages/pydrive/auth.py", line 349, in SaveCredentialsFile
raise InvalidCredentialsError('Credentials file cannot be symbolic link')
pydrive.auth.InvalidCredentialsError: Credentials file cannot be symbolic link
thx
Using the newest version (b04a7e8) gphotos-sync won't download any files.
It creates all the albums with symlinks just fine, but the symlinks point to non-existing files.
Sqlite database (table "SyncFiles") lists all my photos correctly but the logfile lists them all as skipped. E.g.:
01-17 10:44:46 gphotos.Photos DEBUG Skipped 16 photos/2019/01/IMG_20190116_130626.jpg
Any idea how I can help debug this?
@pierrerigal
This morning, the log had stopped at
2017-11-29 22:01:38,814 gphotos.drive: downloading /media/toshiba/drive/Google Photos/2016/20160527_112445.jpg ...
a few minutes after my last check :(
No error message, and no process running in the background.
I just started it again.
Use the fields parameter to reduce the size of responses from the API calls.
See https://developers.google.com/photos/library/guides/performance-tips
To do this we will need to parse the 'parameters' (peer to 'resources') section of the API discovery document. This appears to hold parameters that are valid for all Methods. Currently, restclient.py only looks for parameters that it parsed from 'resources/*/methods'.
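The parsing change above could look something like this sketch (names are illustrative, not the actual restclient.py API): merge the document's global parameters with each method's own.

```python
# Sketch: the top-level 'parameters' section of a discovery document holds
# parameters valid for all methods (including 'fields'); merge it with each
# method's own 'parameters' so callers can pass either kind.
def method_parameters(discovery_doc, method):
    params = dict(discovery_doc.get("parameters", {}))  # global parameters
    params.update(method.get("parameters", {}))         # per-method entries
    return params
```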
Currently the DB stores absolute paths as do the symlinks from album folders to drive/picasa folders.
This means that the sync directory is not portable.
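One way to make the tree portable is to store symlink targets relative to the link's own directory; a minimal sketch:

```python
import os

# Sketch of a portable alternative: store the symlink target relative to
# the link's own directory, so the whole sync root can move between mounts.
def make_relative_symlink(target: str, link_path: str) -> None:
    rel = os.path.relpath(target, start=os.path.dirname(link_path))
    os.symlink(rel, link_path)
```

An existing tree could be migrated by reading each absolute link with os.readlink() and rewriting it with a helper like this; the DB would need a similar pass to store paths relative to the sync root.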
local folder comparison code tests
shared folder handling tests
Even though I have clicked on 'add to library'
See my album 'Reading Brick Show'
I'm getting false positives on JPGs with no UID and the same file name. Adding filesize to the final matching clause would fix this.
See report from @pierrerigal below
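A minimal sketch of the proposed fix, using sqlite3 with illustrative column names (`FileName`, `FileSize`, `Uid`) rather than the exact schema:

```python
import sqlite3

# Sketch: add FileSize to the final matching clause so two different JPGs
# that share a name (and have no UID) no longer match each other.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE SyncFiles (FileName TEXT, FileSize INT, Uid TEXT)")
db.executemany("INSERT INTO SyncFiles VALUES (?, ?, ?)",
               [("Picture 045.jpg", 1000, None),
                ("Picture 045.jpg", 2000, None)])

def matches(name, size):
    # match on name AND size instead of name alone
    return db.execute(
        "SELECT COUNT(*) FROM SyncFiles WHERE FileName = ? AND FileSize = ?",
        (name, size)).fetchone()[0]
```

With name alone both rows match; with size included each file matches only itself.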
The cause is described here https://stackoverflow.com/questions/5530708/can-i-redirect-unicode-output-from-the-console-directly-into-a-file
There is something weird: I can't redirect stdout. Each time I run your script trying to redirect stdout to a log file, it fails with an error:
pi@raspberrypi:~/pyenvs/gphotos-sync/bin $ /home/pi/pyenvs/gphotos-sync/bin/gphotos-sync /media/toshiba/ &> /home/pi/pyenvs/gphotos-sync/var/gphotos.log
Then I have this in log file :
pi@raspberrypi:/media/toshiba $ tail -f /home/pi/pyenvs/gphotos-sync/var/gphotos.log
Indexing Drive Folders ...
root_id = 0AJE60RrTQzeIUk9PVA
Resolving paths ...
Drive Folders scanned.
Indexing Drive Files ...
Done.
Saving Database ...
Database Saved.
Traceback (most recent call last):
File "/home/pi/pyenvs/gphotos-sync/bin/gphotos-sync", line 7, in <module>
g.main()
File "/home/pi/pyenvs/gphotos-sync/local/lib/python2.7/site-packages/gphotos/Main.py", line 168, in main
self.start(args)
File "/home/pi/pyenvs/gphotos-sync/local/lib/python2.7/site-packages/gphotos/Main.py", line 137, in start
self.drive_sync.index_drive_media()
File "/home/pi/pyenvs/gphotos-sync/local/lib/python2.7/site-packages/gphotos/GoogleDriveSync.py", line 214, in index_drive_media
self.write_media(media, msg, False)
File "/home/pi/pyenvs/gphotos-sync/local/lib/python2.7/site-packages/gphotos/GoogleDriveSync.py", line 153, in write_media
print(msg)
UnicodeEncodeError: 'ascii' codec can't encode character u'\xa0' in position 40: ordinal not in range(128)
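A possible workaround for this class of error (a sketch, not a patch to the actual code): make the output encoding explicit instead of inheriting ASCII from the redirected stream.

```python
import io

# Sketch: wrap a binary output stream in an explicit UTF-8 writer so that
# NBSP and other non-ASCII characters survive redirection to a file instead
# of raising UnicodeEncodeError under the locale's ASCII default.
def utf8_writer(binary_stream):
    return io.TextIOWrapper(binary_stream, encoding="utf-8",
                            errors="replace")
```

In Python 3 this would be applied as sys.stdout = utf8_writer(sys.stdout.buffer) at startup; under Python 2 (as in the traceback above) the simplest equivalent is setting PYTHONIOENCODING=UTF-8 in the environment before running the script.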
The albums folder is not generated unless --flush-index is used (or a previously unused root)
Last night I started the sync again with
gphotos-sync --db-path=var/ /media/toshiba/ >> var/gphotos.log &
This morning, the program is not running and the log stopped at:
gphotos.picasa: Added 8574 /media/toshiba/picasa/2016/07/20160716_135918 (3).mp4
And when starting it again, it restarts from the beginning; it looks like the database has not been updated ^^
(gphotos.sqlite-journal was still present)
Maybe a forced "database save" every 1K photos would be a good idea!
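The "save every 1K photos" idea could look like this sketch (illustrative table name, not the real write path):

```python
import sqlite3

# Sketch: commit the index every N items so an interrupted run leaves a
# usable database instead of a stale journal, allowing a restart to resume.
SAVE_EVERY = 1000

def record_items(db, items):
    for count, (item_id, path) in enumerate(items, start=1):
        db.execute("INSERT INTO SyncFiles (Id, Path) VALUES (?, ?)",
                   (item_id, path))
        if count % SAVE_EVERY == 0:
            db.commit()  # periodic "database save"
    db.commit()          # save the tail of the batch
```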
After syncing 100,000 files I checked against the files synced by Google on Windows, all good.
2nd Sync indexed as expected but brought down about 100 extra files into the Picasa folders, these files are also in drive and should not have been copied.
Looking in the database these files are correctly matched against their drive counterparts so the indexing worked ok.
3rd, 4th, 5th sync worked as expected and verified correctly.
(giles see FirstSyncLogs.txt for log of this behaviour)
Not yet sure what shape this should take...
I am setting an external HD as my backup device. The backup runs to completion fine, but when I eject the HD and reinsert it later to back up again, gphotos-sync
detects that all of the files have been modified and re-downloads them.
I know that you are not actively maintaining this, but do you know about this issue/have an idea why it might be happening?
Ability to synchronize 2+ separate Photos accounts without having to reauthorize requests.
Hi,
Correct install, API enabled, virtualenv, Python 2.7, all on Debian GNU/Linux. What am I doing wrong? It fails with TypeError: new_request() takes exactly 1 argument (4 given).
Thank you very much
Petr
vanous@tc-vanek:/tmp/gp2$ gphotos-sync download photos/
saving metadata ...
Traceback (most recent call last):
File "/tmp/gp2/bin/gphotos-sync", line 519, in
main()
File "/tmp/gp2/bin/gphotos-sync", line 501, in main
pi.get_albums()
File "/tmp/gp2/bin/PhotoInfo.py", line 15, in get_albums
albums = self.gdata_client.GetUserFeed()
File "/tmp/gp2/local/lib/python2.7/site-packages/gdata/photos/service.py", line 235, in GetUserFeed
return self.GetFeed(uri, limit=limit)
File "/tmp/gp2/local/lib/python2.7/site-packages/gdata/photos/service.py", line 178, in GetFeed
return self.Get(uri, converter=gdata.photos.AnyFeedFromString)
File "/tmp/gp2/local/lib/python2.7/site-packages/gdata/service.py", line 1069, in Get
headers=extra_headers)
File "/tmp/gp2/local/lib/python2.7/site-packages/atom/__init__.py", line 93, in optional_warn_function
return f(*args, **kwargs)
File "/tmp/gp2/local/lib/python2.7/site-packages/atom/service.py", line 186, in request
data=data, headers=all_headers)
File "/tmp/gp2/local/lib/python2.7/site-packages/atom/http_interface.py", line 148, in perform_request
return http_client.request(operation, url, data=data, headers=headers)
TypeError: new_request() takes exactly 1 argument (4 given)
In some cases, photos with name clashes seem to be downloading more than once and replacing some of their name clashes.
e.g.
Filename 'Picture 045.jpg':-
I think I understand the thinking behind what the reported "Already Downloaded" number denotes.
But I find it quite unintuitive. I have a complete backup of my Google Photos collection and when I run gphotos-sync and there are no changes I get:
Downloaded 0 Items, Failed 0, Already Downloaded 0
This always has me thinking "wait, there are 0 items? But I already had 22k last time. What happened?"
Having "Already Downloaded" print the full number of mediaitems that are already downloaded would be more intuitive to me. Even if you want to stick to the current logic behind this I'd find outputting library size (in number of items) helpful (mainly for reassuring that everything is backed up).
I just accidentally ran a sync with my personal backup folder but with my test account token. This has copied test account files into my backup tree.
A fix could be to put the name of the account in the globals table and refuse to run against a different account.
NOTE: would need to read the account BEFORE resetting the DB on --flush-index.
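The proposed guard could be sketched like this, assuming the globals table mentioned above with an illustrative `Account` column:

```python
import sqlite3

# Sketch: remember the account in the globals table on first run, and
# refuse to sync a tree that was built with a different account's token.
def check_account(db, account):
    row = db.execute("SELECT Account FROM Globals").fetchone()
    if row is None:
        db.execute("INSERT INTO Globals (Account) VALUES (?)", (account,))
        db.commit()  # first run: record which account owns this tree
    elif row[0] != account:
        raise SystemExit("DB belongs to %s, refusing to sync as %s"
                         % (row[0], account))

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE Globals (Account TEXT)")
check_account(db, "backup@example.com")  # first run records the account
```

Per the NOTE above, this check would have to read the stored account before --flush-index resets the DB, or the guard is lost exactly when it is needed.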
Google has seen fit to not provide location data with the images downloaded by Google Photos API.
I guess I could use beautiful soup and the ProductUrl to scrape this info and insert into our index - plus into the JPG EXIF as required.
Hello,
I'm trying to install it on a freshly installed raspbian on a pi zero (sync will be done over wifi on a NAS) but it failed and I don't know what to do.
pi@raspberrypi:~ $ python -V
Python 2.7.13
pi@raspberrypi:~ $ git clone https://github.com/gilesknap/gphotos-sync.git
Cloning into 'gphotos-sync'...
remote: Counting objects: 979, done.
remote: Total 979 (delta 0), reused 0 (delta 0), pack-reused 979
Receiving objects: 100% (979/979), 357.15 KiB | 96.00 KiB/s, done.
Resolving deltas: 100% (733/733), done.
pi@raspberrypi:~ $ cd gphotos-sync/
pi@raspberrypi:~/gphotos-sync $ python setup.py
Installed /home/pi/gphotos-sync/.eggs/pbr-3.1.1-py2.7.egg
[pbr] Generating ChangeLog
usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]
or: setup.py --help [cmd1 cmd2 ...]
or: setup.py --help-commands
or: setup.py cmd --help
error: no commands supplied
I then tried these commands after some googling on how to use setup :
Build :
pi@raspberrypi:~/gphotos-sync $ python setup.py build
running build
running build_py
creating build
creating build/lib.linux-armv6l-2.7
creating build/lib.linux-armv6l-2.7/test
copying test/test_setup.py -> build/lib.linux-armv6l-2.7/test
copying test/test_system.py -> build/lib.linux-armv6l-2.7/test
copying test/__init__.py -> build/lib.linux-armv6l-2.7/test
copying test/test_utils.py -> build/lib.linux-armv6l-2.7/test
copying test/test_match.py -> build/lib.linux-armv6l-2.7/test
copying test/test_database.py -> build/lib.linux-armv6l-2.7/test
running egg_info
creating gphotos_sync.egg-info
writing pbr to gphotos_sync.egg-info/pbr.json
writing requirements to gphotos_sync.egg-info/requires.txt
writing gphotos_sync.egg-info/PKG-INFO
writing top-level names to gphotos_sync.egg-info/top_level.txt
writing dependency_links to gphotos_sync.egg-info/dependency_links.txt
[pbr] Processing SOURCES.txt
writing manifest file 'gphotos_sync.egg-info/SOURCES.txt'
[pbr] In git context, generating filelist from git
warning: no previously-included files found matching '.gitreview'
warning: no previously-included files matching '*.pyc' found anywhere in distribution
writing manifest file 'gphotos_sync.egg-info/SOURCES.txt'
copying test/testDb1.sqlite -> build/lib.linux-armv6l-2.7/test
creating build/lib.linux-armv6l-2.7/test/test_credentials
copying test/test_credentials/client_secret.json -> build/lib.linux-armv6l-2.7/test/test_credentials
copying test/test_credentials/credentials.json -> build/lib.linux-armv6l-2.7/test/test_credentials
running build_scripts
creating build/scripts-2.7
copying and adjusting gphotos-sync -> build/scripts-2.7
changing mode of build/scripts-2.7/gphotos-sync from 644 to 755
Install :
pi@raspberrypi:~/gphotos-sync $ sudo python setup.py install
running install
[pbr] Writing ChangeLog
[pbr] Generating ChangeLog
[pbr] ChangeLog complete (0.1s)
[pbr] Generating AUTHORS
[pbr] AUTHORS complete (0.2s)
running build
running build_py
running egg_info
writing pbr to gphotos_sync.egg-info/pbr.json
writing requirements to gphotos_sync.egg-info/requires.txt
writing gphotos_sync.egg-info/PKG-INFO
writing top-level names to gphotos_sync.egg-info/top_level.txt
writing dependency_links to gphotos_sync.egg-info/dependency_links.txt
[pbr] Processing SOURCES.txt
[pbr] In git context, generating filelist from git
warning: no previously-included files found matching '.gitreview'
warning: no previously-included files matching '*.pyc' found anywhere in distribution
writing manifest file 'gphotos_sync.egg-info/SOURCES.txt'
running build_scripts
running install_lib
copying build/lib.linux-armv6l-2.7/test/testDb1.sqlite -> /usr/local/lib/python2.7/dist-packages/test
copying build/lib.linux-armv6l-2.7/test/test_credentials/credentials.json -> /usr/local/lib/python2.7/dist-packages/test/test_credentials
copying build/lib.linux-armv6l-2.7/test/test_credentials/client_secret.json -> /usr/local/lib/python2.7/dist-packages/test/test_credentials
copying build/lib.linux-armv6l-2.7/test/test_setup.py -> /usr/local/lib/python2.7/dist-packages/test
copying build/lib.linux-armv6l-2.7/test/test_system.py -> /usr/local/lib/python2.7/dist-packages/test
copying build/lib.linux-armv6l-2.7/test/__init__.py -> /usr/local/lib/python2.7/dist-packages/test
copying build/lib.linux-armv6l-2.7/test/test_utils.py -> /usr/local/lib/python2.7/dist-packages/test
copying build/lib.linux-armv6l-2.7/test/test_match.py -> /usr/local/lib/python2.7/dist-packages/test
copying build/lib.linux-armv6l-2.7/test/test_database.py -> /usr/local/lib/python2.7/dist-packages/test
byte-compiling /usr/local/lib/python2.7/dist-packages/test/test_setup.py to test_setup.pyc
byte-compiling /usr/local/lib/python2.7/dist-packages/test/test_system.py to test_system.pyc
byte-compiling /usr/local/lib/python2.7/dist-packages/test/__init__.py to __init__.pyc
byte-compiling /usr/local/lib/python2.7/dist-packages/test/test_utils.py to test_utils.pyc
byte-compiling /usr/local/lib/python2.7/dist-packages/test/test_match.py to test_match.pyc
byte-compiling /usr/local/lib/python2.7/dist-packages/test/test_database.py to test_database.pyc
running install_egg_info
removing '/usr/local/lib/python2.7/dist-packages/gphotos_sync-0.9.0.dev140.egg-info' (and everything under it)
Copying gphotos_sync.egg-info to /usr/local/lib/python2.7/dist-packages/gphotos_sync-0.9.0.dev140.egg-info
running install_scripts
copying build/scripts-2.7/gphotos-sync -> /usr/local/bin
changing mode of /usr/local/bin/gphotos-sync to 755
Then I tried :
pi@raspberrypi:~/gphotos-sync $ gphotos-sync /media/toshiba/
Traceback (most recent call last):
File "/usr/local/bin/gphotos-sync", line 3, in <module>
from Main import GooglePhotosSyncMain
ImportError: No module named Main
Need help :)
The date anomaly in AVI files is because they have incorrect metadata, see the example below. Conversely, the Google Photos metadata is somehow correct. When I do a local folder comparison pass I extract the create date using ffprobe and write it into the DB, thus trashing the order. The actual metadata comparison is a good way to identify whether two files are copies of the same original, so I probably want to keep this ffprobe step.
Solutions?
$ ffprobe -v quiet -print_format json -show_entries stream=index,codec_type:stream_tags=creation_time:format_tags=creation_time '/media/Data/GilesPhotos/photos/2000/05/Y2000 M05 D06 Canada - DSCF0051.AVI'
{
"programs": [
],
"streams": [
{
"index": 0,
"codec_type": "video",
"tags": {
}
},
{
"index": 1,
"codec_type": "audio",
"tags": {
"creation_time": "2015-11-05T02:03:03.000000Z"
}
}
],
"format": {
"tags": {
"creation_time": "2015-11-05T02:03:03.000000Z"
}
}
}
Originally posted by @gilesknap in #57 (comment)
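Given the JSON shape shown above, consuming the ffprobe output could look like this sketch (not the project's actual comparison code):

```python
import json

# Sketch: collect creation_time from the format-level tags and every
# stream's tags, then take the earliest. ISO-8601 timestamps compare
# correctly as strings.
def earliest_creation_time(ffprobe_json):
    info = json.loads(ffprobe_json)
    times = [info.get("format", {}).get("tags", {}).get("creation_time")]
    for stream in info.get("streams", []):
        times.append(stream.get("tags", {}).get("creation_time"))
    times = [t for t in times if t]
    return min(times) if times else None
```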
I'm on f95c72c.
After downloading a few hundred files I get a crash with the following log:
01-17 11:53:08 gphotos.utils DEBUG retry 0 failed: <bound method Method.do_execute of <gphotos.restclient.Method object at 0x7f26d587d0f0>>
01-17 11:53:08 gphotos.utils DEBUG retry 1 failed: <bound method Method.do_execute of <gphotos.restclient.Method object at 0x7f26d587d0f0>>
01-17 11:53:09 gphotos.utils DEBUG retry 2 failed: <bound method Method.do_execute of <gphotos.restclient.Method object at 0x7f26d587d0f0>>
01-17 11:53:10 gphotos.utils DEBUG retry 3 failed: <bound method Method.do_execute of <gphotos.restclient.Method object at 0x7f26d587d0f0>>
01-17 11:53:11 gphotos.utils DEBUG retry 4 failed: <bound method Method.do_execute of <gphotos.restclient.Method object at 0x7f26d587d0f0>>
01-17 11:53:11 gphotos.Photos ERROR failure in batch get of ['ACRh3gqHl1JT7bI0JBwAZS7tb8X7lyAKstKZ-qulC_p_iUTs593VZRFqVUuRLB6zPYxUAkU5AXCtTPTk-6o3G1Gi_GfIfXo1xg', 'ACRh3grwr4QLizR3okwzE6o3eMIflopt2WM1fic_PqGqn3SXBpQIF3yHYjHTGLNBm1m_2CL-rPUE_UnaVFh6P19x_MrPRxQXdg', 'ACRh3goT1pZL18jyksVWlwxwR78F0LiufYrmASx8DhMNuH3D8vdYmKtqiLsLdW14-S-TW4RUJM-guwM2Ptbddlrra_VoTuZJ5w', 'ACRh3gpoWhwl6nPKM-x3NGHdCozYeK2-kvhre8FSD2BvYtaMjFARLyUz99uyL1C53cPiEi_l9Oh6cqeD1BmZLyhYPJptaHlWQg', 'ACRh3gqXInOYSiiOc4piYTCONw5TObVmRFvlkDOSZYOgWpjdevMUT4Xc5KJe9QWAgH0zgfFvraZEbKqqJ1unVpOwaAqSRF4ALw', 'ACRh3gqFt0d1v7B89jAMJfsjmD3rQ1gmWdkSD2iRKaxuOxlWnP1tPO7fWMzC9L4ExIs_7dCdcJ-pJ2QrtMYkFZuZiSNDhrxXLg', 'ACRh3gpQFqmjoVvqho9T6JTQjvRCwR9De8f13e6sr0FYHJd2V_AjmR2UKoCpyH5Aau5dw0hse6dH3YF297Kk3EVMLtnlJXdxpA', 'ACRh3goOqk8j-Cd1YlFOVsq1kHvTgx2ArkHlaGqNGCKsEFWnfrzVEcDN9XqerKXSWeeTx-2fWkisgxMb4-D0ixtWK8qqKzS1Vg', 'ACRh3grmrooSV-dHeAigKCCipInH7mjRyeOrJNjUDoR8X1deqlFJXMRfJyrj_17ZbZBT1dfYQ1ciLjqeW3LVqcJdzFkQPygfAw', 'ACRh3gozoapWXPx3rkAl3NIiI_Qu9WYSCTHwnO2eRwAMnZFIsqzOmJzShM405dFhvRGxxexZifQx0KInMP2P7tE4V4wn6zQ28Q', 'ACRh3goTb8UG0tbhs165u_ASH-mCU7rZtT0vetyX5_kEzOwzacEAh2Fdq5xX03_cFo8MJoBNcLApG7MxYfa1FmjJRUxgRxDCfg', 'ACRh3gr99yaFQivyGCfLGXQnyX1UXgJxbTtHKa9BmSYcYeKjI3qOSCq1hCzoId5hP-izFk_QqvSIy4sEGTBWTq5-0ShPEBcMNw', 'ACRh3goIMfJ8V1UkEWwS6yv3iQqEsfrcN2gldfgaIDnIPGbWMyqdajXIQsRAO9Oru01zLBQjYTk0GB_1wGy06alClQ1y-vcsgg', 'ACRh3gq9n4gO1QKID9YOUqmSE0ky56T7Zs5WJKEzJC-NQ2GYyFw0PeUhEZsVwCpJNWctXZ3MQKrgPPrLfvuEWnLZ8oeiVxTz-A', 'ACRh3gr4eRK1mT6LfOKlceCUcV6Kf081gCWO4zeRzI_ON3vB7Jg2JNN5PKHKU5DkIvXOYoc9CJymHzwYy8bPOhnn5k6TrZ_d6Q', 'ACRh3gqvU6qMb1ZGlrixH45hXGLyG-afjt7OlM_-nYA1PfeYYmEs_PINYank5Hyv2W9NjZGiLs87bn_z0wdTgECUMqmPP758Sw', 'ACRh3goy6Wh56ox976n4nB3-Phe99D2nuzdgznvRZ4bRgMg1C0JlXA9mkCOM2L25KUpyWa9LlWBxmOnF6hv1aT-WhRkjlRQzfA', 'ACRh3greqz-yqX5_F9KgwXiIrGAVMGCR_BNQlmmBm2PVjhprWgamYSUzUTL46FticjkMJQO5fXNB8gbJ4YZlsKrIJYxuQgvL8g', 'ACRh3gpHVjdhpCuvnLPzYpUhalJtpwRNmldOj87lwIIaSUaqqy6uDNiFNcAdg1WP2qdj8LrysVlw_Edbk-6g_tLfPTHt1BXVIg', 
'ACRh3gqAdLYp0qRNQtDf2-fG-vQNlLl4AdSq0aGdFZREgvHfGrq2I23N6K1rEtvQ1l-y_kJbVwNEXAPKGyUqLXvphu-mImmrIA']
01-17 11:53:11 gphotos.data INFO Saving Database ...
01-17 11:53:11 gphotos.data INFO Database Saved.
01-17 11:53:11 gphotos ERROR
Process failed.
Traceback (most recent call last):
File "/home/stefan/gphotos-sync/gphotos/Main.py", line 209, in main
self.start(args)
File "/home/stefan/gphotos-sync/gphotos/Main.py", line 170, in start
self.google_photos_sync.download_photo_media()
File "/home/stefan/gphotos-sync/gphotos/GooglePhotosSync.py", line 293, in download_photo_media
response = self._api.mediaItems.batchGet.execute(mediaItemIds=batch_ids)
File "/home/stefan/gphotos-sync/gphotos/restclient.py", line 42, in execute
r = Utils.retry(5, self.do_execute, body, path, query_args)
File "/home/stefan/gphotos-sync/gphotos/Utils.py", line 63, in retry
raise last_e
File "/home/stefan/gphotos-sync/gphotos/Utils.py", line 54, in retry
res = func(*arg, **k_arg)
File "/home/stefan/gphotos-sync/gphotos/restclient.py", line 48, in do_execute
result.raise_for_status()
File "/home/stefan/gphotos-sync/venv/lib/python3.7/site-packages/requests/models.py", line 940, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: https://photoslibrary.googleapis.com/v1/mediaItems:batchGet?mediaItemIds=ACRh3gqHl1JT7bI0JBwAZS7tb8X7lyAKstKZ-qulC_p_iUTs593VZRFqVUuRLB6zPYxUAkU5AXCtTPTk-6o3G1Gi_GfIfXo1xg&mediaItemIds=ACRh3grwr4QLizR3okwzE6o3eMIflopt2WM1fic_PqGqn3SXBpQIF3yHYjHTGLNBm1m_2CL-rPUE_UnaVFh6P19x_MrPRxQXdg&mediaItemIds=ACRh3goT1pZL18jyksVWlwxwR78F0LiufYrmASx8DhMNuH3D8vdYmKtqiLsLdW14-S-TW4RUJM-guwM2Ptbddlrra_VoTuZJ5w&mediaItemIds=ACRh3gpoWhwl6nPKM-x3NGHdCozYeK2-kvhre8FSD2BvYtaMjFARLyUz99uyL1C53cPiEi_l9Oh6cqeD1BmZLyhYPJptaHlWQg&mediaItemIds=ACRh3gqXInOYSiiOc4piYTCONw5TObVmRFvlkDOSZYOgWpjdevMUT4Xc5KJe9QWAgH0zgfFvraZEbKqqJ1unVpOwaAqSRF4ALw&mediaItemIds=ACRh3gqFt0d1v7B89jAMJfsjmD3rQ1gmWdkSD2iRKaxuOxlWnP1tPO7fWMzC9L4ExIs_7dCdcJ-pJ2QrtMYkFZuZiSNDhrxXLg&mediaItemIds=ACRh3gpQFqmjoVvqho9T6JTQjvRCwR9De8f13e6sr0FYHJd2V_AjmR2UKoCpyH5Aau5dw0hse6dH3YF297Kk3EVMLtnlJXdxpA&mediaItemIds=ACRh3goOqk8j-Cd1YlFOVsq1kHvTgx2ArkHlaGqNGCKsEFWnfrzVEcDN9XqerKXSWeeTx-2fWkisgxMb4-D0ixtWK8qqKzS1Vg&mediaItemIds=ACRh3grmrooSV-dHeAigKCCipInH7mjRyeOrJNjUDoR8X1deqlFJXMRfJyrj_17ZbZBT1dfYQ1ciLjqeW3LVqcJdzFkQPygfAw&mediaItemIds=ACRh3gozoapWXPx3rkAl3NIiI_Qu9WYSCTHwnO2eRwAMnZFIsqzOmJzShM405dFhvRGxxexZifQx0KInMP2P7tE4V4wn6zQ28Q&mediaItemIds=ACRh3goTb8UG0tbhs165u_ASH-mCU7rZtT0vetyX5_kEzOwzacEAh2Fdq5xX03_cFo8MJoBNcLApG7MxYfa1FmjJRUxgRxDCfg&mediaItemIds=ACRh3gr99yaFQivyGCfLGXQnyX1UXgJxbTtHKa9BmSYcYeKjI3qOSCq1hCzoId5hP-izFk_QqvSIy4sEGTBWTq5-0ShPEBcMNw&mediaItemIds=ACRh3goIMfJ8V1UkEWwS6yv3iQqEsfrcN2gldfgaIDnIPGbWMyqdajXIQsRAO9Oru01zLBQjYTk0GB_1wGy06alClQ1y-vcsgg&mediaItemIds=ACRh3gq9n4gO1QKID9YOUqmSE0ky56T7Zs5WJKEzJC-NQ2GYyFw0PeUhEZsVwCpJNWctXZ3MQKrgPPrLfvuEWnLZ8oeiVxTz-A&mediaItemIds=ACRh3gr4eRK1mT6LfOKlceCUcV6Kf081gCWO4zeRzI_ON3vB7Jg2JNN5PKHKU5DkIvXOYoc9CJymHzwYy8bPOhnn5k6TrZ_d6Q&mediaItemIds=ACRh3gqvU6qMb1ZGlrixH45hXGLyG-afjt7OlM_-nYA1PfeYYmEs_PINYank5Hyv2W9NjZGiLs87bn_z0wdTgECUMqmPP758Sw&mediaItemIds=ACRh3goy6Wh56ox976n4nB3-Phe99D2nuzdgznvRZ4bRgMg1C0JlXA9
mkCOM2L25KUpyWa9LlWBxmOnF6hv1aT-WhRkjlRQzfA&mediaItemIds=ACRh3greqz-yqX5_F9KgwXiIrGAVMGCR_BNQlmmBm2PVjhprWgamYSUzUTL46FticjkMJQO5fXNB8gbJ4YZlsKrIJYxuQgvL8g&mediaItemIds=ACRh3gpHVjdhpCuvnLPzYpUhalJtpwRNmldOj87lwIIaSUaqqy6uDNiFNcAdg1WP2qdj8LrysVlw_Edbk-6g_tLfPTHt1BXVIg&mediaItemIds=ACRh3gqAdLYp0qRNQtDf2-fG-vQNlLl4AdSq0aGdFZREgvHfGrq2I23N6K1rEtvQ1l-y_kJbVwNEXAPKGyUqLXvphu-mImmrIA
01-17 11:53:11 gphotos INFO Done.
01-17 11:53:11 gphotos INFO Elapsed time = 0:00:05.487580
It always crashes with this exact request (same mediaItemIds).
Looks like a great tool, and appears to already address yannrouillard/gphotos-sync#9
Your points about this using a deprecated Picasa API are well taken. Do you have any plans for moving to the now-official Google Photos API (https://developers.google.com/photos/)?
My cron job is overwriting the logs from my tests.
Doh. It turns out that in jumping from Google Drive v2 to the Google Photos API I missed the fact that Google has its own REST discovery client for Python. So replace restclient.py with:
from apiclient.discovery import build
So that file comparison of previous syncs only shows changes. Since the album links are always re-created, they currently look newer when doing a file comparison.
I'm not too familiar with Python environments on Windows. I can make it work in PyCharm but need standalone instructions.
Getting some errors now on install:
gomachan:googlephotos gomachan$ python setup.py build
ERROR:root:Error parsing
Traceback (most recent call last):
File "/Users/gomachan/Documents/_temp/googlephotos/.eggs/pbr-5.1.2-py2.7.egg/pbr/core.py", line 96, in pbr
attrs = util.cfg_to_args(path, dist.script_args)
File "/Users/gomachan/Documents/_temp/googlephotos/.eggs/pbr-5.1.2-py2.7.egg/pbr/util.py", line 256, in cfg_to_args
pbr.hooks.setup_hook(config)
File "/Users/gomachan/Documents/_temp/googlephotos/.eggs/pbr-5.1.2-py2.7.egg/pbr/hooks/__init__.py", line 25, in setup_hook
metadata_config.run()
File "/Users/gomachan/Documents/_temp/googlephotos/.eggs/pbr-5.1.2-py2.7.egg/pbr/hooks/base.py", line 27, in run
self.hook()
File "/Users/gomachan/Documents/_temp/googlephotos/.eggs/pbr-5.1.2-py2.7.egg/pbr/hooks/metadata.py", line 26, in hook
self.config['name'], self.config.get('version', None))
File "/Users/gomachan/Documents/_temp/googlephotos/.eggs/pbr-5.1.2-py2.7.egg/pbr/packaging.py", line 849, in get_version
name=package_name))
Exception: Versioning for this project requires either an sdist tarball, or access to an upstream git repository. It's also possible that there is a mismatch between the package name in setup.cfg and the argument given to pbr.version.VersionInfo. Project name gphotos-sync was given, but was not able to be found.
error in gphotos-sync setup command: Error parsing /Users/gomachan/Documents/_temp/googlephotos/setup.cfg: Exception: Versioning for this project requires either an sdist tarball, or access to an upstream git repository. It's also possible that there is a mismatch between the package name in setup.cfg and the argument given to pbr.version.VersionInfo. Project name gphotos-sync was given, but was not able to be found.
and then also
mock 2.0.0 has requirement six>=1.9, but you'll have six 1.4.1 which is incompatible.
and then
gomachan:googlephotos gomachan$ sudo python setup.py install
ERROR:root:Error parsing
Traceback (most recent call last):
File "/Library/Python/2.7/site-packages/pbr/core.py", line 96, in pbr
attrs = util.cfg_to_args(path, dist.script_args)
File "/Library/Python/2.7/site-packages/pbr/util.py", line 256, in cfg_to_args
pbr.hooks.setup_hook(config)
File "/Library/Python/2.7/site-packages/pbr/hooks/__init__.py", line 25, in setup_hook
metadata_config.run()
File "/Library/Python/2.7/site-packages/pbr/hooks/base.py", line 27, in run
self.hook()
File "/Library/Python/2.7/site-packages/pbr/hooks/metadata.py", line 26, in hook
self.config['name'], self.config.get('version', None))
File "/Library/Python/2.7/site-packages/pbr/packaging.py", line 849, in get_version
name=package_name))
Exception: Versioning for this project requires either an sdist tarball, or access to an upstream git repository. It's also possible that there is a mismatch between the package name in setup.cfg and the argument given to pbr.version.VersionInfo. Project name gphotos-sync was given, but was not able to be found.
error in gphotos-sync setup command: Error parsing /Users/gomachan/Documents/_temp/googlephotos/setup.cfg: Exception: Versioning for this project requires either an sdist tarball, or access to an upstream git repository. It's also possible that there is a mismatch between the package name in setup.cfg and the argument given to pbr.version.VersionInfo. Project name gphotos-sync was given, but was not able to be found.
On google drive, it is possible to place a file in multiple folders.
In this instance, the local version will only have one copy. This makes it look like some files are missing although they are backed up somewhere.
Todo - make all copies a symlink to the first instance found.
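The TODO could be sketched like this, assuming a hypothetical helper that receives all local paths believed to be copies of one Drive file:

```python
import os

# Sketch: keep the first path found as the real file and turn every later
# copy into a relative symlink pointing at it, so nothing appears missing
# while only one copy of the bytes is stored.
def link_duplicates(paths):
    first = paths[0]
    for dup in paths[1:]:
        os.remove(dup)
        os.symlink(os.path.relpath(first, os.path.dirname(dup)), dup)
```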
Just noting it here so I don't forget :) but it could be useful for debugging.
Currently, we do not support albums containing duplicate file names. The above error is reported and a link to the first file only is added to the album folder.
Perhaps a good fix is to promote album links to the SyncFiles table and have a GoogleMedia-derived class for them. Duplicate handling would then be managed in the base class.
Now that I'm running gphotos as a cron job this issue is noticeable.
If a single photo is added then the rescan of the album info is done, which is OK. But the attempted re-link of all local album links also traverses the filesystem for any missing links. This is quite expensive for a background operation.
It could possibly be avoided since the DB could know what has changed and only try to update those.
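A sketch of the DB-driven approach, with illustrative table and column names:

```python
import sqlite3

# Sketch: ask the DB which albums changed since the last cron run and
# relink only those, instead of walking every album folder on disk.
def albums_to_relink(db, last_run):
    return [r[0] for r in db.execute(
        "SELECT AlbumName FROM Albums WHERE ModifiedDate > ?", (last_run,))]
```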
Split larger files (GooglePhotosSync at least)
Do away with the property setter/getters in GooglePhotosSync
Use Python 3 type hints and comment (all?) functions
Would love to use this script to do the following:
I'd like to be able to point it at a root folder and index all the photos in there, then compare these against the Google Photos library and create a folder of symlinks that point to the files that would need to be uploaded to make the library match.
The upload itself would need to be done by an official Google Tool (Windows or Web). This is because of the quota issue.
This is in lieu of being able to do proper sync. It will help me restore my Insync trashed Photos Library. More importantly, it will serve as a proxy for two way sync.
The feature would be designed so that it can point at its own download folder and hence almost give two-way sync. I would suggest that the folder hierarchy by create date needs to be enforced for this, otherwise I'll have a very complicated problem and will probably do an Insync on someone's filesystem eventually.
Add a flag to the SyncFiles table to indicate if the media item has been synchronised to disk.
Do not attempt to download files that are marked as 'clean' (downloaded)
This will allow the local copy to be reorganized if required BUT only if the DB is kept safe, --flush-db would cause all moved/renamed files to be downloaded again.
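A minimal sketch of the flag, assuming an illustrative `Downloaded` column on the SyncFiles table rather than the real schema:

```python
import sqlite3

# Sketch: mark a row once its bytes are safely on disk, and skip marked
# rows on subsequent runs so a reorganized local copy is not re-fetched.
def rows_to_download(db):
    return [r[0] for r in db.execute(
        "SELECT Id FROM SyncFiles WHERE Downloaded = 0")]

def mark_downloaded(db, item_id):
    db.execute("UPDATE SyncFiles SET Downloaded = 1 WHERE Id = ?",
               (item_id,))
    db.commit()
```

As the caveat above notes, this only holds while the DB is kept safe; --flush-db discards the flags and everything moved or renamed would be downloaded again.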