merginmaps / db-sync

A tool for two-way synchronization between Mergin Maps and a PostGIS database

Home Page: https://merginmaps.com

License: MIT License

Python 98.53% Dockerfile 1.06% Batchfile 0.41%
geodiff gis merginmaps postgis postgresql

db-sync's Issues

Geodiff error for certain column types

The base schema and the output GPKG do not exist yet, going to initialize them ...
GEODIFF: Error: unknown value type: character varying(20)

Traceback (most recent call last):
  File "dbsync_daemon.py", line 65, in <module>
    main()
  File "dbsync_daemon.py", line 37, in main
    dbsync.dbsync_init(mc, from_gpkg=False)
  File "/mergin-db-sync/dbsync.py", line 633, in dbsync_init
    _geodiff_make_copy(config.db_driver, config.db_conn_info, config.db_schema_modified,
  File "/mergin-db-sync/dbsync.py", line 199, in _geodiff_make_copy
    _run_geodiff([config.geodiffinfo_exe, "makeCopy", src_driver, src_conn_info, src, dst_driver, dst_conn_info, dst])
  File "/mergin-db-sync/dbsync.py", line 153, in _run_geodiff
    raise DbSyncError("geodiffinfo failed!\n" + str(cmd))
dbsync.DbSyncError: geodiffinfo failed!

Network issues may freeze sync

It seems that the Mergin Python client does not have any timeouts set. In case of a temporary network connection issue, the sync daemon may freeze and hang forever. It would be good to add a timeout, but not one so short that it could prematurely kill legitimate requests (e.g. a push finish with a lot of data).
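A minimal sketch of the kind of fix, assuming the client sends requests through a urllib opener as the tracebacks on this page suggest; the 30-second value is an arbitrary placeholder:

import urllib.request

REQUEST_TIMEOUT = 30  # seconds; placeholder, should stay generous enough for large pushes

def open_with_timeout(opener: urllib.request.OpenerDirector, request: urllib.request.Request):
    # With an explicit timeout, a dropped connection raises urllib.error.URLError /
    # socket.timeout instead of blocking the daemon forever, so the daemon can
    # catch it and retry on the next loop iteration.
    return opener.open(request, timeout=REQUEST_TIMEOUT)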

Use Mergin webhooks

Currently we poll Mergin at regular intervals to see if there are any changes to the project. Optionally, there should be a way to listen to Mergin webhooks so that sync is triggered when new data is pushed to the project.
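A minimal sketch of the listener side, just to illustrate the idea; the payload shape, the port, and the trigger_sync() helper are all hypothetical since the Mergin webhook format is not defined here:

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def trigger_sync():
    # Hypothetical hook: e.g. set an event that wakes the sync loop immediately.
    pass

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        if payload.get("project") == "john/my_project":  # placeholder project name
            trigger_sync()
        self.send_response(200)
        self.end_headers()

HTTPServer(("0.0.0.0", 8000), WebhookHandler).serve_forever()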

File/image sync

I'm curious if there are any plans to implement/integrate related file/image sync within this tool?

There would be a wide range of places people would want files to go, but if we started with a local file system it should be easy to branch out from there.

I'd put my hand up for working on an Azure Blob Storage connector.

Document how to restart failed sync

Sometimes users modify the structure of their database and this naturally breaks db-sync. It would be good to document how to restart db-sync when this happens.

If the "source" is a geopackage in Mergin project (where the database structure change happened), this boils down to these steps:

  1. delete "base" and "main" schemas in PostgreSQL
  2. start mergin-db-sync again with --init-from-gpkg

If the source is PostgreSQL:

  1. delete "base" schema in PostgreSQL and the geopackage in the Mergin project
  2. start mergin-db-sync again with --init-from-db
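
A minimal sketch of the schema cleanup step with psycopg2, using the schema names and connection string style that appear elsewhere on this page as placeholders; note that CASCADE also drops the tables inside the schemas, so make sure the data you want to keep lives in the geopackage (or in the main schema) first:

import psycopg2

conn = psycopg2.connect("host=myhost.com dbname=mergin_dbsync user=postgres password=top_secret")
conn.autocommit = True
with conn.cursor() as cur:
    cur.execute("DROP SCHEMA IF EXISTS sync_base CASCADE")
    cur.execute("DROP SCHEMA IF EXISTS sync_main CASCADE")  # only for the --init-from-gpkg case
conn.close()

After that, restart db-sync with the appropriate --init-from-gpkg or --init-from-db option as listed above.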

One docker container for multiple syncs

Currently, you need to run one Docker container per schema to sync between PostgreSQL and Mergin Maps. It would be good to have an option to support multiple schemas and projects running within one Docker container.

Mark base schema as invalid if init fails

If there are any issues with the init procedure, we list the problems, but if the user skips them and restarts the tool, the base schema exists but our special comment is missing, and users get this error: "Base schema exists but missing which project ...."

It would be good to set a comment even after a failed init that would explicitly tell us that init failed, so that the next time the tool is restarted we can give the user the correct error message. (Maybe it would also be good to run the "create_changeset_dr" comparison again to show the unwanted differences.)

As a bonus point, when "create_changeset_dr" returns a non-empty changeset, the console output may be long and users may not see the original error message before the diff. So it may be good to rearrange things so that we first output the diff and only at the end raise a one-line error which makes it clear things went wrong.

Problem with datatypes

I get the following errors when I try to initialise my gpkg file:

The base schema and the output GPKG do not exist yet, going to initialize them ...
Warn: Converting PostgreSQL type character varying to base type unsuccessful, using text.
GEODIFF: Error: unknown value type: character varying
GEODIFF: Error: unknown value type: bigint

Unneeded logins from daemon

When running, every time we get the project status we also log in. That is quite a waste of time, since we should already have a token available; only if it has expired should we get a new one.
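A minimal sketch of the idea, caching one client instance instead of logging in on every poll; the import path and constructor arguments follow the tracebacks on this page, while the reset-on-auth-failure part is a hypothetical hook:

import config                       # db-sync config module, as used in the tracebacks
from mergin.client import MerginClient

_client = None   # module-level cache so every poll reuses the same session/token

def get_client():
    global _client
    if _client is None:
        _client = MerginClient(config.mergin_url,
                               login=config.mergin_username,
                               password=config.mergin_password)
    return _client

def reset_client():
    # Call this when a request fails with an authentication error (expired
    # token) so that the next get_client() logs in again.
    global _client
    _client = None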

Init: do nothing if gpkg+schemas already exist and in sync

When we run init of the sync tool (e.g. from gpkg), the tool terminates with an error if the database schemas already exist, and it is necessary to remove them to make the init pass. It would be better if init checked the content of the destination schemas when they already exist, and if they are fully in sync, did nothing.

PostgreSQL inserts are slow

With a remote PostgreSQL server, when doing --init-from-db the initial copy from the main schema to the base schema takes a very long time.

Check that base schema has no changes to gpkg at the end of init

We have had issues where, after the init, there were minor differences between the base schema and the synced geopackage. This would only be discovered on another run of the init, when the initial verification would fail. We should add a sanity check at the end of init so there are no surprises later, and raise an error if there is a non-zero diff between the two, as this may mean a bug in geodiff that could cause issues later.

Add an option to notify admin when sync breaks

It would be useful if the daemon would somehow notify the admin when things go wrong (e.g. the database schema has changed, or the project cannot be accessed); currently it just silently stops.

Maybe sending an email to the admin would help, or sending a notification to Slack...
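A minimal sketch of the email variant using only the standard library; the SMTP host and addresses are placeholders:

import smtplib
from email.message import EmailMessage

def notify_admin(subject, body):
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "db-sync@example.com"      # placeholder sender
    msg["To"] = "admin@example.com"          # placeholder recipient
    msg.set_content(body)
    with smtplib.SMTP("localhost") as smtp:  # placeholder SMTP relay
        smtp.send_message(msg)

The daemon could call notify_admin("db-sync stopped", str(exception)) from the spot where it currently gives up silently.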

Failed to restart the db-sync

Logging in to Mergin...
Connecting to the database...
Modified and base schemas already exist
Downloading version v8 of Mergin project saber/survey_postgis to /tmp/dbsync
GEODIFF: Error: Cannot create driver host=xps dbname=survey user=postgres password=postgres
Error: Failed to create a copy of base source for driver host=xps dbname=survey user=postgres password=password

Traceback (most recent call last):
  File "dbsync_daemon.py", line 65, in <module>
    main()
  File "dbsync_daemon.py", line 37, in main
    dbsync.dbsync_init(mc, from_gpkg=False)
  File "/mergin-db-sync/dbsync.py", line 608, in dbsync_init
    summary_modified = _compare_datasets(config.db_conn_info, config.db_schema_modified,
  File "/mergin-db-sync/dbsync.py", line 211, in _compare_datasets
    _geodiff_create_changeset_dr(src_driver, src_conn_info, src, dst_driver, dst_conn_info, dst, tmp_changeset)
  File "/mergin-db-sync/dbsync.py", line 203, in _geodiff_create_changeset_dr
    _run_geodiff([config.geodiffinfo_exe, "createChangesetDr", src_driver, src_conn_info, src, dst_driver, dst_conn_info, dst, changeset])
  File "/mergin-db-sync/dbsync.py", line 153, in _run_geodiff
    raise DbSyncError("geodiffinfo failed!\n" + str(cmd))
dbsync.DbSyncError: geodiffinfo failed!
['/geodiff/build/geodiffinfo', 'createChangesetDr', 'host=xps dbname=survey user=postgres password=password', 'photos', 'sqlite', '', '/tmp/dbsync/photos.gpkg', 'postgres', '/tmp/tukTXjIf']

Rewriting of QGIS project files

It would be useful to have functionality that would take an existing QGIS project file (.qgs or .qgz) and convert references to map layer data providers from database to geopackage or the other way around - so that users do not need to do that manually.

Docker container on hub.docker.com

It would be useful to have a ready-to-use container for db-sync on Docker Hub, so that people do not need to build the container themselves and can just run it.

Use listen/notify for database

Currently we poll the database at regular intervals. It would be good to optionally make use of PostgreSQL's LISTEN/NOTIFY commands to avoid regular database checks and make things more efficient.
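A minimal sketch of the listening side with psycopg2; it assumes a trigger in the database calls pg_notify('dbsync', ...) on changes (not shown), and the channel name, connection string and run_sync_iteration() helper are placeholders:

import select
import psycopg2

def run_sync_iteration():
    pass  # hypothetical: one pull+push cycle of the daemon

conn = psycopg2.connect("host=myhost.com dbname=mergin_dbsync user=postgres password=top_secret")
conn.autocommit = True
with conn.cursor() as cur:
    cur.execute("LISTEN dbsync")

while True:
    # Wait up to 60 s for a notification instead of polling tables on a timer.
    if select.select([conn], [], [], 60) == ([], [], []):
        continue                  # timeout, nothing changed
    conn.poll()
    while conn.notifies:
        conn.notifies.pop(0)      # drain queued notifications
    run_sync_iteration()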

"Output GPKG file does not exist"

Hello

I have tried to follow the Docker setup instructions on my Synology NAS but receive an error when I try to initialize it from gpkg.
Below are screenshots of my container setup; I also mounted the /tmp/dbsync folder. I think my configuration is able to log in to both the PostgreSQL DB and my local instance of Mergin. Not sure where I am going wrong? I have also created the schemas in PostgreSQL using pgAdmin; not sure if that should be in the env variables as well.

(Screenshots attached: container ENV, Mergin project, error msg.)

Once again, thank you in advance for the assistance.

error trying to run tool from docker hub

Hi Guys

very sorry to bother again.

When trying to run the container from Docker Hub (lutraconsulting/mergin-db-sync), I get the following issues:

  1. Just running "sudo docker run --name mergin_db_sync -it lutraconsulting/mergin-db-sync:latest" throws a manifest error (screenshot).

I then explicitly state the tag instead, "sudo docker run --name mergin_db_sync -it lutraconsulting/mergin-db-sync:1.0.2", which works and pulls the image.

  2. The built container's default command is /bin/bash (screenshot).

So after the container starts I am dropped inside the container and get the following errors when trying to set the environment variables (-e) and run the daemon. Complete command and error:

(screenshot)

Any help will be appreciated 🙂
Thank you

support multiple GPKG files

We recommend having one GPKG file per QGIS layer (for better conflict resolution). But to use multiple GPKGs with db-sync, we currently need a separate PG schema and a separate sync process for each GPKG.

Initialization fails when base geopackage contains a polygon layer

Good day

I tried to set up syncing between my Mergin project and Postgres and found that the script fails when I include a polygon layer in the base geopackage. I tried the following:

  1. Fix polygon geometries - still fails
  2. Convert polygon layer from multipart to single parts - still fails
  3. Reproject polygon layer from EPSG:3857 to EPSG:4326 - still fails
  4. Remove polygon layer from geopackage - success, syncing initialized and tested successfully
  • Screenshot of error:
    mergin-db-sync polygon error

Init should not fail if the "modified" schema has pending changes

Right now, init expects that there is zero diff between the source gpkg and the base+modified schemas in the DB. This is a bit stricter than necessary: if users make changes to the "modified" schema (but no structure changes), this should not be considered an issue; as soon as we do dbsync_push, those changes can simply be picked up and pushed to the Mergin server...

tool fails to restart after it's stopped

Hi guys

After building the Docker container everything works great, but once it's stopped and I restart the container using:
docker start container_name -a
I get the error shown in the screenshot. This is a test project, so it's only me working on it, and I did not make any changes, so there shouldn't be any differences between the gpkg and the DB.

(screenshot: tool-restart-error)

LoginError on gateway timeout

At some point when Mergin was not available, the daemon crashed with the following error:

Trying to pull
Traceback (most recent call last):
  File "/home/ubuntu/.local/lib/python3.6/site-packages/mergin/client.py", line 210, in login
    resp = self.opener.open(request)
  File "/usr/lib/python3.6/urllib/request.py", line 532, in open
    response = meth(req, response)
  File "/usr/lib/python3.6/urllib/request.py", line 642, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python3.6/urllib/request.py", line 570, in error
    return self._call_chain(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 504, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.6/urllib/request.py", line 650, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 504: GATEWAY_TIMEOUT

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "dbsync_daemon.py", line 28, in <module>
    dbsync.dbsync_pull()
  File "/home/ubuntu/mergin-db-sync/dbsync.py", line 176, in dbsync_pull
    mc = MerginClient(config.mergin_url, login=config.mergin_username, password=config.mergin_password)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/mergin/client.py", line 80, in __init__
    self.login(login, password)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/mergin/client.py", line 217, in login
    raise LoginError(e.read().decode("utf-8"))
mergin.common.LoginError

Keep track of the project version in base schema

It would be good to keep the Mergin project version metadata somewhere in the database; then we could:

  • detect whether the base schema has been modified (by comparing it to the gpkg at the stored project version)
  • ask Mergin to give us an older version of the project (rather than the latest one) to first get to a consistent state before doing further pull/push actions

Maybe we could use the "comment" functionality in PostgreSQL to store the metadata:
https://www.postgresql.org/docs/9.1/sql-comment.html
Probably best to store a simple JSON (e.g. project name + project version) so that we can add more metadata in the future if needed.
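
A minimal sketch of storing and reading such JSON metadata as a schema comment with psycopg2; the schema name, project name/version and connection info are placeholders:

import json
import psycopg2
from psycopg2 import sql

conn = psycopg2.connect("host=myhost.com dbname=mergin_dbsync user=postgres password=top_secret")
conn.autocommit = True

def write_metadata(schema, project_name, version):
    comment = json.dumps({"project": project_name, "version": version})
    with conn.cursor() as cur:
        # psycopg2 interpolates the parameter client-side, so the comment is
        # sent as a plain SQL literal.
        cur.execute(sql.SQL("COMMENT ON SCHEMA {} IS %s").format(sql.Identifier(schema)),
                    (comment,))

def read_metadata(schema):
    with conn.cursor() as cur:
        cur.execute("SELECT obj_description(%s::regnamespace, 'pg_namespace')", (schema,))
        value = cur.fetchone()[0]
    return json.loads(value) if value else None

write_metadata("sync_base", "john/my_project", "v8")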

A failure in push/pull may crash sync completely

If an exception is thrown during push or pull in the sync daemon, the daemon crashes with an unhandled exception. We should catch those exceptions, print them and then continue syncing, hoping that the error is just temporary (e.g. the server is not available).
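A minimal sketch of the loop shape, using the entry points and exception class that appear in the tracebacks on this page; the sleep interval is a placeholder:

import time
import traceback
import dbsync

while True:
    try:
        dbsync.dbsync_pull()
        dbsync.dbsync_push()
    except dbsync.DbSyncError as err:
        # Likely temporary (network, server unavailable) - log and keep going.
        print("Sync iteration failed:", err)
    except Exception:
        traceback.print_exc()
    time.sleep(10)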

Optionally ignore some tables

Currently db-sync requires that the database schema and the geopackage in the Mergin project contain the same tables. But sometimes the database contains tables that should not be synchronised to the Mergin project. It would be good to have an option in db-sync to ignore some tables and not sync them.

(This will probably need to be implemented on geodiff level)

Too many open files

After running for some time, the daemon crashed with this error:

Trying to push
Traceback (most recent call last):
  File "dbsync_daemon.py", line 31, in <module>
    dbsync.dbsync_push()
  File "/home/ubuntu/mergin-db-sync/dbsync.py", line 329, in dbsync_push
    _geodiff_create_changeset(config.db_driver, config.db_conn_info, config.db_schema_base, config.db_schema_modified, tmp_changeset_file)
  File "/home/ubuntu/mergin-db-sync/dbsync.py", line 112, in _geodiff_create_changeset
    _run_geodiff([config.geodiffinfo_exe, "createChangesetEx", driver, conn_info, base, modified, changeset])
  File "/home/ubuntu/mergin-db-sync/dbsync.py", line 103, in _run_geodiff
    res = subprocess.run(cmd, stderr=subprocess.PIPE)
  File "/usr/lib/python3.6/subprocess.py", line 423, in run
    with Popen(*popenargs, **kwargs) as process:
  File "/usr/lib/python3.6/subprocess.py", line 729, in __init__
    restore_signals, start_new_session)
  File "/usr/lib/python3.6/subprocess.py", line 1254, in _execute_child
    errpipe_read, errpipe_write = os.pipe()
OSError: [Errno 24] Too many open files

Confusing errors reported when db-sync starts

When a user starts db-sync, one of the geodiff calls produces a bunch of errors that are reported to the user. These errors are not actual problems; they just confuse users into thinking that something went wrong. I believe the failing queries come from the libgpkg library; it would be good to find a way to silence them:

GEODIFF: Error: SQLITE3: (1)no such table: main.gpkg_contents in "SELECT count(*) FROM "main".gpkg_contents WHERE data_type LIKE 'features'"
Error: SQLITE3: (1)no such table: main.gpkg_contents in "SELECT count(*) FROM "main".gpkg_contents WHERE data_type LIKE 'tiles'"
Error: SQLITE3: (1)no such table: main.gpkg_contents in "SELECT count(*) FROM "main".gpkg_contents WHERE data_type LIKE 'tiles'"
Error: SQLITE3: (1)no such table: main.gpkg_contents in "SELECT count(*) FROM "main".gpkg_contents WHERE data_type LIKE 'features'"
Error: SQLITE3: (1)no such table: main.gpkg_contents in "SELECT count(*) FROM "main".gpkg_contents WHERE data_type LIKE 'tiles'"
Error: SQLITE3: (1)no such table: main.gpkg_contents in "SELECT count(*) FROM "main".gpkg_contents WHERE data_type LIKE 'tiles'"
Error: SQLITE3: (1)no such table: main.gpkg_contents in "SELECT count(*) FROM "main".gpkg_contents WHERE data_type LIKE 'features'"
Error: SQLITE3: (1)no such table: main.gpkg_contents in "SELECT count(*) FROM "main".gpkg_contents WHERE data_type LIKE 'tiles'"
Error: SQLITE3: (1)no such table: main.gpkg_contents in "SELECT count(*) FROM "main".gpkg_contents WHERE data_type LIKE 'tiles'"

GEODIFF: Error: Conflicts encountered while applying changes

Hi Lutra Team

For some reason I've started getting the error below when I delete records in a table in the main schema in Postgres. See the two examples below.

The setup is as follows:

  1. mergin-db-sync v1.0.5 keeping the DB in sync with the main mergin project (running in docker).
  2. mergin work packages syncing back to main mergin project using the mergin-wp tool.

Everything syncs and works fine except when I try to delete features from Postgres and have the deletion sync back to Mergin. When I then restore the deleted row, the sync tool runs as normal. Any idea why this would happen, please?

2021-07-20 09:27:17.111166
Trying to pull
No changes on Mergin.
Trying to push
Changes:
fruit_monitor 0 0 2
Writing DB changes to working dir...
Warn: CONFLICT: delete_nothing:
{
"table": "fruit_monitor",
"type": "delete",
"changes": [
{
"column": 0,
"old": 284
},
{
"column": 1,
"old": "R1AAAeYQAAABAQAAANPKNcpI4DJAfGZLGc35QMA="
},
{
"column": 2,
"old": "{af58e6bd-eb99-4f03-9cb4-f06c7cfb16f2}"
},
{
"column": 3,
"old": "C86"
},
{
"column": 4,
"old": "F203"
},
{
"column": 5,
"old": "B6223"
},
{
"column": 6,
"old": null
},
{
"column": 7,
"old": "{a86e3af0-1f56-476a-8bf4-17a335456015}"
},
{
"column": 8,
"old": "Branch"
},
{
"column": 9,
"old": 13
},
{
"column": 10,
"old": 8
},
{
"column": 11,
"old": 8
},
{
"column": 12,
"old": 5
},
{
"column": 13,
"old": 6
},
{
"column": 14,
"old": 6
},
{
"column": 15,
"old": 6
},
{
"column": 16,
"old": 11
},
{
"column": 17,
"old": 6
},
{
"column": 18,
"old": 6
},
{
"column": 19,
"old": 7
},
{
"column": 20,
"old": 5
},
{
"column": 21,
"old": 6
},
{
"column": 22,
"old": 7
},
{
"column": 23,
"old": 7
},
{
"column": 24,
"old": 6
},
{
"column": 25,
"old": 7
},
{
"column": 26,
"old": 7
},
{
"column": 27,
"old": 7
},
{
"column": 28,
"old": 7
},
{
"column": 29,
"old": 8
},
{
"column": 30,
"old": 8
},
{
"column": 31,
"old": 7
},
{
"column": 32,
"old": 7
},
{
"column": 33,
"old": 7
},
{
"column": 34,
"old": 11
},
{
"column": 35,
"old": 13
},
{
"column": 36,
"old": 14
},
{
"column": 37,
"old": 12
},
{
"column": 38,
"old": 7
},
{
"column": 39,
"old": 7
},
{
"column": 40,
"old": "JPEG_20210716_160349_7910095317185585251.jpg"
},
{
"column": 41,
"old": "Comment"
},
{
"column": 42,
"old": 7.6299999999999999
},
{
"column": 43,
"old": "2021-07-16T16:18:16Z"
}
]
}
Warn: CONFLICT: delete_nothing:
{
"table": "fruit_monitor",
"type": "delete",
"changes": [
{
"column": 0,
"old": 285
},
{
"column": 1,
"old": "R1AAAeYQAAABAQAAAM3oqWU03zJA0UvvAZP5QMA="
},
{
"column": 2,
"old": "{4c592fa2-67b7-4cd7-9c26-9ca51621e7bd}"
},
{
"column": 3,
"old": "C86"
},
{
"column": 4,
"old": "F202"
},
{
"column": 5,
"old": "B6217"
},
{
"column": 6,
"old": null
},
{
"column": 7,
"old": "{8d908bf5-3527-4c1b-8234-5cabfcfa9a6d}"
},
{
"column": 8,
"old": "Tree"
},
{
"column": 9,
"old": 4
},
{
"column": 10,
"old": 10
},
{
"column": 11,
"old": 9
},
{
"column": 12,
"old": 9
},
{
"column": 13,
"old": 9
},
{
"column": 14,
"old": 9
},
{
"column": 15,
"old": 11
},
{
"column": 16,
"old": 8
},
{
"column": 17,
"old": 8
},
{
"column": 18,
"old": 11
},
{
"column": 19,
"old": 10
},
{
"column": 20,
"old": null
},
{
"column": 21,
"old": null
},
{
"column": 22,
"old": null
},
{
"column": 23,
"old": null
},
{
"column": 24,
"old": null
},
{
"column": 25,
"old": null
},
{
"column": 26,
"old": null
},
{
"column": 27,
"old": null
},
{
"column": 28,
"old": null
},
{
"column": 29,
"old": null
},
{
"column": 30,
"old": null
},
{
"column": 31,
"old": null
},
{
"column": 32,
"old": null
},
{
"column": 33,
"old": null
},
{
"column": 34,
"old": null
},
{
"column": 35,
"old": null
},
{
"column": 36,
"old": null
},
{
"column": 37,
"old": null
},
{
"column": 38,
"old": null
},
{
"column": 39,
"old": null
},
{
"column": 40,
"old": "JPEG_20210719_081756_6679607430012651497.jpg"
},
{
"column": 41,
"old": "Comment"
},
{
"column": 42,
"old": 9.4000000000000004
},
{
"column": 43,
"old": "2021-07-19T08:17:52Z"
}
]
}
Error: apply changeset failed!
GEODIFF: Error: Conflicts encountered while applying changes! Total 2

Error: geodiff failed!
['/geodiff/build/geodiff', 'apply', '--driver', 'sqlite', '', '/tmp/dbsync/client_field_apps_db.gpkg', '/tmp/dbsync-push-base2our']
Going to sleep

And on another table:

Trying to push
Changes:
water_flowmeter_monitor 0 0 1
Writing DB changes to working dir...
Warn: CONFLICT: delete_nothing:
{
"table": "water_flowmeter_monitor",
"type": "delete",
"changes": [
{
"column": 0,
"old": 87
},
{
"column": 1,
"old": "R1AAAeYQAAABAQAAACkRQqBK3zJAvrf49aj5QMA="
},
{
"column": 2,
"old": "{a310059b-1cde-4324-9ac5-c008b2bb1658}"
},
{
"column": 3,
"old": "C86"
},
{
"column": 4,
"old": "F203"
},
{
"column": 5,
"old": "Test flowmeter"
},
{
"column": 6,
"old": "2021-07-19T10:20:21Z"
},
{
"column": 7,
"old": 63838282
},
{
"column": 8,
"old": "JPEG_20210719_102027_8074284550380744558.jpg"
},
{
"column": 9,
"old": "Note"
},
{
"column": 10,
"old": null
},
{
"column": 11,
"old": null
},
{
"column": 12,
"old": "2021-07-19T10:20:21Z"
}
]
}
Error: apply changeset failed!
GEODIFF: Error: Conflicts encountered while applying changes! Total 1

Error: geodiff failed!
['/geodiff/build/geodiff', 'apply', '--driver', 'sqlite', '', '/tmp/dbsync/client_field_apps_db.gpkg', '/tmp/dbsync-push-base2our']
Going to sleep
2021-07-20 12:25:00.543165
Trying to pull
No changes on Mergin.
Trying to push
Changes:
water_flowmeter_monitor 0 0 1
Writing DB changes to working dir...
Warn: CONFLICT: delete_nothing:
{
"table": "water_flowmeter_monitor",
"type": "delete",
"changes": [
{
"column": 0,
"old": 87
},
{
"column": 1,
"old": "R1AAAeYQAAABAQAAACkRQqBK3zJAvrf49aj5QMA="
},
{
"column": 2,
"old": "{a310059b-1cde-4324-9ac5-c008b2bb1658}"
},
{
"column": 3,
"old": "C86"
},
{
"column": 4,
"old": "F203"
},
{
"column": 5,
"old": "Test flowmeter"
},
{
"column": 6,
"old": "2021-07-19T10:20:21Z"
},
{
"column": 7,
"old": 63838282
},
{
"column": 8,
"old": "JPEG_20210719_102027_8074284550380744558.jpg"
},
{
"column": 9,
"old": "Note"
},
{
"column": 10,
"old": null
},
{
"column": 11,
"old": null
},
{
"column": 12,
"old": "2021-07-19T10:20:21Z"
}
]
}
Error: apply changeset failed!
GEODIFF: Error: Conflicts encountered while applying changes! Total 1

Error: geodiff failed!
['/geodiff/build/geodiff', 'apply', '--driver', 'sqlite', '', '/tmp/dbsync/client_field_apps_db.gpkg', '/tmp/dbsync-push-base2our']
Going to sleep

running DB sync with Docker can use all Linux Inodes

The /var/lib/docker/overlay2/$long-string$/diff/tmp/dbsyc/.mergin/work-packages directory is not cleaned in the Docker container and can easily reach a million+ diff files, which uses up inodes on Linux. So although I have, for example, 100 GiB of SSD space of which only 33% is used, I only have 3.7 million inodes, of which 90% are used by mergin-db-sync Docker containers, which stops the Linux machine from working. A good fix would be to delete previous diffs when a new one is downloaded.
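
A minimal sketch of the suggested cleanup, pruning everything but the newest files in the work-packages cache; the path and the number of files to keep are placeholders:

import os

def prune_old_diffs(cache_dir, keep=1):
    files = [os.path.join(cache_dir, name) for name in os.listdir(cache_dir)]
    files = [path for path in files if os.path.isfile(path)]
    files.sort(key=os.path.getmtime, reverse=True)
    for old in files[keep:]:
        os.remove(old)   # delete everything except the 'keep' newest diffs

prune_old_diffs("/tmp/dbsync/.mergin/work-packages")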

Add "force" option for init

When doing init from gpkg or from db, it would be useful to have an option to force:

  • removing the working directory if it already exists
  • dropping the destination schemas if they already exist

Right now, when restarting db-sync, these steps need to be done manually; a sketch of what such a flag could do is shown below.
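
A minimal sketch of what a --force flag could do before init; the flag name, paths and the drop_sync_schemas() helper are hypothetical, and the schema drop would follow the same psycopg2 pattern sketched earlier on this page:

import argparse
import shutil

def drop_sync_schemas():
    pass  # hypothetical helper: DROP SCHEMA sync_base / sync_main CASCADE

parser = argparse.ArgumentParser()
parser.add_argument("--force", action="store_true",
                    help="remove existing working dir and drop destination schemas before init")
args = parser.parse_args()

if args.force:
    shutil.rmtree("/tmp/dbsync", ignore_errors=True)   # working directory (placeholder path)
    drop_sync_schemas()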

Add Dockerfile

This will make it easier to get DB sync running: just set up a few env variables (Mergin project, username, password, DB conn info, DB schemas) and get things running with a single command.

db sync with Mergin CE

Hello

I have set up Mergin CE as a Docker container on my server. I have also installed db-sync in Docker. How do I point db-sync to the local installation of the Mergin server?

The "Running with Docker" command asks for MERGIN_USERNAME and MERGIN_PASSWORD. I assume this refers to the public cloud Mergin server:

sudo docker run -it \
  -e DB_CONN_INFO="host=myhost.com dbname=mergin_dbsync user=postgres password=top_secret" \
  -e DB_SCHEMA_MODIFIED=sync_main \
  -e DB_SCHEMA_BASE=sync_base \
  -e MERGIN_USERNAME=john \
  -e MERGIN_PASSWORD=myStrongPassword \
  -e MERGIN_PROJECT_NAME=john/my_project \
  -e MERGIN_SYNC_FILE=sync_db.gpkg \
  lutraconsulting/mergin-db-sync:latest \
  python3 dbsync_daemon.py --init-from-gpkg

Would really appreciate some help.

Basic auto tests

We need a basic set of tests that would initialize DB sync and then exercise the synchronization both ways.
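A minimal sketch of what such a round-trip test could look like, reusing the dbsync entry points visible in the tracebacks on this page; the mergin_client fixture and the edit steps are hypothetical and would need to be filled in against a test server and database:

import dbsync

def test_round_trip(mergin_client):
    # Initialize from a test geopackage, then exercise both directions.
    dbsync.dbsync_init(mergin_client, from_gpkg=True)
    # ... make an edit on the Mergin side, then verify it lands in the DB:
    dbsync.dbsync_pull()
    # ... make an edit in the "modified" schema, then verify it reaches Mergin:
    dbsync.dbsync_push()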

Set user agent

Just like the Mergin QGIS plugin sets a user agent when using the Mergin client library, the DB sync tool should ideally do that too, so it is possible to see it on the server. It should be just a matter of setting an extra argument when creating the MerginClient object...

Handle push errors to avoid broken sync

Sometimes it may happen that pushing a new version of a project fails for some reason (e.g. a network issue or a temporary server problem). Currently, if that happens, the local directory is left with modifications, so the next time the daemon wakes up to do pull+push, the pull stage does not like the fact that there are pending changes, and the whole sync breaks.

If push fails, we should clean up the local directory; there's already a to-do left in the code:
https://github.com/lutraconsulting/mergin-db-sync/blob/master/dbsync.py#L486
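
A minimal sketch of the cleanup idea: if push fails, discard the local working copy so the next iteration starts clean. The path is a placeholder and the re-download comment is illustrative, not the actual dbsync.py logic:

import shutil
import dbsync

def safe_push():
    try:
        dbsync.dbsync_push()
    except dbsync.DbSyncError as err:
        print("Push failed, resetting local working directory:", err)
        shutil.rmtree("/tmp/dbsync", ignore_errors=True)
        # The next daemon iteration would then re-download the project before
        # pulling again, instead of tripping over leftover changes.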
