
momy's Introduction


Momy

Momy is a simple CLI tool for replicating MongoDB to MySQL in realtime.

  • Run SQL queries against data stored in a NoSQL database
  • Make the data accessible from Excel / Access


Installation

Install via npm:

$ npm install -g momy

Or use docker:

$ docker run -it --rm -v $(pwd):/workdir cognitom/momy

You might want to create an alias, for example

$ echo 'alias momy="docker run -it --rm -v $(pwd):/workdir cognitom/momy"' >> ~/.bashrc

See the Docker configuration details below.

Preparation

MongoDB

Momy uses MongoDB's Replica Set feature, but you don't actually need to replicate between MongoDB instances. Just follow the steps below.

Start a new mongo instance with no data:

$ mongod --replSet "rs0" --oplogSize 100

Open another terminal, and go to MongoDB Shell:

$ mongo
....
> rs.initiate()

The rs.initiate() command prepares the collections needed for replication.

MySQL

Launch a MySQL instance and create a new database to use. Tables are created or updated while syncing. You'll also see a mongo_to_mysql table: it stores the information needed for syncing, so don't remove it.

Configuration

Create a new momyfile.json file like this:

{
  "src": "mongodb://localhost:27017/dbname",
  "dist": "mysql://root:password@localhost:3306/dbname",
  "prefix": "t_",
  "case": "camel",
  "collections": {
    "collection1": {
      "_id": "number",
      "createdAt": "DATETIME",
      "field1": "number",
      "field2": "string",
      "field3": "boolean",
      "field4.subfield": "string"
    },
    "collection2": {
      "_id": "string",
      "createdAt": "DATETIME",
      "field1": "number",
      "field2": "string",
      "field3": "boolean",
      "field4": "TEXT"
    }
  }
}
  • src: the URL of the MongoDB server
  • dist: the URL of the MySQL server
  • prefix: optional prefix for table name. The name of the table would be t_collection1 in the example above.
  • fieldCase: optional. snake or camel. See the section below.
  • exclusions: optional. Chars or a range of chars to exclude: "\uFFFD"
  • inclusions: optional. Chars or a range of chars to include: "\u0000-\u007F"
  • collections: set the collections and fields to sync

The _id field is required for each collection and must be a string or number.

Field names and types

"<field_name>": "<field_tipe>"

or field_name can be dot-concatenated:

"<field_name>.<sub_name>": "<field_type>"

For example, if you have { a: { b: { c: 'hey!' } } } then use "a.b.c": "string"
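A minimal sketch of how such a dot path resolves against a document (illustrative only, not momy's actual code):

```javascript
// Sketch only: walks a dot-concatenated path like "a.b.c"
// down into a nested MongoDB document.
function getByPath(doc, path) {
  return path.split('.').reduce((v, key) => (v == null ? undefined : v[key]), doc);
}

getByPath({ a: { b: { c: 'hey!' } } }, 'a.b.c'); // → 'hey!'
```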

Currently these native types are supported:

  • BIGINT
  • TINYINT
  • VARCHAR
  • DATE
  • DATETIME
  • TIME
  • TEXT

There are also some aliases:

  • number => BIGINT
  • boolean => TINYINT
  • string => VARCHAR
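As a sketch, the alias lookup amounts to a one-line table (resolveType is a hypothetical name, not momy's API):

```javascript
// Hypothetical sketch: the documented aliases map onto native MySQL types;
// native type names pass through unchanged.
const TYPE_ALIASES = { number: 'BIGINT', boolean: 'TINYINT', string: 'VARCHAR' };

function resolveType(t) {
  return TYPE_ALIASES[t] || t;
}
```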

Field name normalization: fieldCase

Some systems, like Microsoft Access, don't allow dot-concatenated field names, so address.street would cause an error. For such cases, use fieldCase:

  • snake: address.street --> address_street
  • camel: address.street --> addressStreet

Note: if you set a fieldCase value, the _id field is also renamed to id, without the underscore.
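A sketch of the documented behavior (illustrative only; normalizeFieldName is not part of momy's API):

```javascript
// Sketch only: flattens a dot-concatenated field name the way
// the fieldCase option is documented to behave.
function normalizeFieldName(name, fieldCase) {
  const parts = name.replace(/^_/, '').split('.'); // "_id" becomes "id"
  if (fieldCase === 'snake') return parts.join('_');
  // camel: keep the first segment, capitalize the rest
  return parts[0] + parts.slice(1)
    .map(p => p.charAt(0).toUpperCase() + p.slice(1))
    .join('');
}
```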

Usage

On the first run, import all the data from MongoDB:

$ momy --config momyfile.json --import

Then start the daemon to stream data:

$ momy --config momyfile.json

or

$ forever momy --config momyfile.json

Usage with Docker

First things first, create a network for your containers:

$ docker network create my-net

Then, launch database servers:

$ docker run \
    --name my-mongod \
    --detach --rm \
    --network my-net \
    --mount type=volume,source=my-mongo-store,target=/data/db \
    mongo --replSet "rs0"
$ docker run \
    --name my-mysqld \
    --detach --rm \
    --network my-net \
    --mount type=volume,source=my-mysql-store,target=/var/lib/mysql \
    --env MYSQL_ALLOW_EMPTY_PASSWORD=yes \
    mysql

If this is the first time you've run the containers above, initialize them:

$ docker exec my-mongod mongo --eval 'rs.initiate()'
$ docker exec my-mysqld mysql -e 'CREATE DATABASE momy;'

Create momyfile.json like this:

{
  "src": "mongodb://my-mongod:27017/momy",
  "dist": "mysql://root@my-mysqld:3306/momy",
  "collections": {...}
}

Note: change the username, password, port, etc. to fit your environment.

OK, let's run momy with the --import option:

$ docker run \
    --interactive --tty --rm \
    --network my-net \
    --mount type=bind,source=$(pwd),target=/workdir \
    cognitom/momy --import

Everything went well? Then stop the container (Ctrl + C). Now you can run it as a daemon:

$ docker run \
    --detach --rm \
    --restart unless-stopped \
    --init \
    --network my-net \
    --mount type=bind,source=$(pwd),target=/workdir \
    cognitom/momy

For contributors

See dev directory.

License

MIT

This library was originally made by @doubaokun as MongoDB-to-MySQL and rewritten by @cognitom.

momy's People

Contributors

ardinusawan, chemitaxis, cognitom, doubaokun, odede-perfecto


momy's Issues

migrate into two database

{
    "src": "mongodb://localhost:27017/logs",
    "dist": "mysql://root:password@localhost:3306/admin",
    "collections": {
      "audit-logs": {
        "_id": "string",
        "message": "string",
        "createdAt": "string",
        "updatedAt": "DATE"
      },
      "error-logs": {
        "_id": "string",
        "createdAt": "DATETIME",
        "message": "string",
        "updatedAt": "DATETIME"
      }
    }
  }

this is my momy.json file.

  • Here I want to migrate to more than one database, so I'd like to pass the dist link as an array.

Like this:

    "dist":["mysql://root:password@localhost:3306/admin","mysql://root:password@localhost:3306/newdb"]

ER_PARSE_ERROR

Hello, I'm trying to have Mongo keep my MySQL up to date, but I'm getting a bad SQL error:

root@e69c090f6462:/# momy --config momyfile.json --import
25 Apr 10:04:27 - Connect to MySQL...
25 Apr 10:04:28 - { Error: ER_PARSE_ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near ')' at line 1
    at Query.Sequence._packetToError (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/sequences/Sequence.js:52:14)    at Query.ErrorPacket (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/sequences/Query.js:77:18)
    at Protocol._parsePacket (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:279:23)
    at Parser.write (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/Parser.js:76:12)
    at Protocol.write (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:39:16)
    at Socket.<anonymous> (/usr/lib/node_modules/momy/node_modules/mysql/lib/Connection.js:103:28)
    at Socket.emit (events.js:180:13)
    at addChunk (_stream_readable.js:274:12)
    at readableAddChunk (_stream_readable.js:261:11)
    at Socket.Readable.push (_stream_readable.js:218:10)
    --------------------
    at Protocol._enqueue (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:145:48)
    at Connection.query (/usr/lib/node_modules/momy/node_modules/mysql/lib/Connection.js:208:25)
    at Promise (/usr/lib/node_modules/momy/lib/mysql.js:177:10)
    at new Promise (<anonymous>)
    at MySQL.query (/usr/lib/node_modules/momy/lib/mysql.js:175:12)
    at query.then.then (/usr/lib/node_modules/momy/lib/mysql.js:116:24)
    at <anonymous>
    at process._tickCallback (internal/process/next_tick.js:182:7)
  code: 'ER_PARSE_ERROR',
  errno: 1064,
  sqlMessage: 'You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near \')\' at line 1',
  sqlState: '42000',
  index: 1,
  sql: 'DROP TABLE IF EXISTS `t_paste`; CREATE TABLE `t_paste` ();' }
25 Apr 10:04:28 - Bye
root@e69c090f6462:/#

Here my config file

{
  "src": "mongodb://localhost:27017/paste_db",
  "dist": "mysql://root:removed@localhost:3306/mongod",
  "prefix": "t_",
  "case": "camel",
  "collections": {
    "paste": {
      "_id": "_id",
      "TEXT": "db_keywords",
      "TEXT": "emails",
      "TEXT": "hashes",
      "TEXT": "num_emals",
      "TEXT": "num_hashes",
      "VARCHAR": "url"
    }
  }
}

The password has been removed; the MySQL server in use allows out-of-network connections.
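The config above has field names and types swapped, which leaves momy with no recognizable columns (hence the empty CREATE TABLE `t_paste` ();). Per the "<field_name>": "<field_type>" convention in the README, the collection block would presumably look like this (field names taken from the values above, kept as written):

```json
"collections": {
  "paste": {
    "_id": "string",
    "db_keywords": "TEXT",
    "emails": "TEXT",
    "hashes": "TEXT",
    "num_emals": "TEXT",
    "num_hashes": "TEXT",
    "url": "VARCHAR"
  }
}
```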

Todo: more tests

  • insert a single document
  • update a single document
  • convert values to SQL for each field type
  • insert multiple documents
  • insert/update many many documents
  • fieldCase
  • document deleting
  • CLI testing via spawn maybe later
  • importing bulk docs
  • prefix

datetime is null in mysql

"collections": {
"class": {
"_id": "string",
"tests": "string",
"lastUpdate": "DATETIME"

In MongoDB, "lastUpdate" : ISODate("2019-01-03T13:01:01Z"), but in MySQL lastUpdate is null. Why?

mongo -> mysql import, ER_TOO_LONG_KEY

If the _id is treated as a string, the CREATE TABLE command will use a VARCHAR(255), which is too large for a MySQL string index when using the utf8mb4_general_ci collation: utf8mb4 reserves 4 bytes per character, so a 255-character key needs 1020 bytes, over the 767-byte index-key limit.

The fix for this was to change the type defines to VARCHAR(50) instead of VARCHAR(255).

Future releases should allow a variable VARCHAR size in the JSON.

Stack trace

9 Mar 21:07:05 - Connect to MySQL...
9 Mar 21:07:05 - { Error: ER_TOO_LONG_KEY: Specified key was too long; max key length is 767 bytes
    at Query.Sequence._packetToError (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/sequences/Sequence.js:52:14)
    at Query.ErrorPacket (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/sequences/Query.js:77:18)
    at Protocol._parsePacket (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:280:23)
    at Parser.write (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/Parser.js:75:12)
    at Protocol.write (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:39:16)
    at Socket.<anonymous> (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/Connection.js:103:28)
    at emitOne (events.js:96:13)
    at Socket.emit (events.js:189:7)
    at readableAddChunk (_stream_readable.js:176:18)
    at Socket.Readable.push (_stream_readable.js:134:10)
    --------------------
    at Protocol._enqueue (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:141:48)
    at Connection.query (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/Connection.js:208:25)
    at Promise (/usr/local/lib/node_modules/momy/lib/mysql.js:177:10)
    at MySQL.query (/usr/local/lib/node_modules/momy/lib/mysql.js:175:12)
    at query.then.then (/usr/local/lib/node_modules/momy/lib/mysql.js:116:24)
    at process._tickCallback (internal/process/next_tick.js:103:7)
  code: 'ER_TOO_LONG_KEY',
  errno: 1071,
  sqlState: '42000',
  index: 1 }

Error When Streaming

The import was successful, but when streaming I got this error. What is it? When I changed data in MongoDB, it was not synchronized; maybe that was caused by this error?

javan@javan-desktop:~/workspace/simpel/application/shell$ momy --config momyfile.json
29 May 12:48:31 - Connect to MySQL...
the options [auto_reconnect] is not supported
(node:24053) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'ts' of null
at db.collection.find.sort.limit.nextObject.then.item (/usr/local/lib/node_modules/momy/lib/tailer.js:127:25)
at
at process._tickCallback (internal/process/next_tick.js:188:7)
(node:24053) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:24053) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

Allows sub-key like "field.subfield"

For example, if the collection has fields like this:

{
  "_id": 0,
  "name": "Mike",
  "address": {
    "zip": "1550033",
    "city": "Setagaya"
  }
}

I'd like to make it possible to specify the sub-field in momyfile.json:

{
  "_id": "number",
  "name": "string",
  "address.zip": "string",
  "address.city": "string"
}

No more documents in tailed cursor

After a clean installation of momy, and having to set the "timestamp" field of the "mongo_to_mysql" table to something other than 0, I'm stuck with this error.

ubuntu:~$ momy

16 Apr 20:36:52 - Connect to MySQL...
16 Apr 20:36:52 - Connect to MongoDB...
16 Apr 20:36:52 - Bigin to watch... (from 1)
16 Apr 20:36:52 - Stream closed....
16 Apr 20:36:52 - { MongoError: No more documents in tailed cursor
    at Function.MongoError.create (/home/ubuntu/.nvm/versions/node/v6.10.2/lib/node_modules/momy/node_modules/mongodb-core/lib/error.js:31:11)
    at nextFunction (/home/ubuntu/.nvm/versions/node/v6.10.2/lib/node_modules/momy/node_modules/mongodb-core/lib/cursor.js:644:50)
    at /home/ubuntu/.nvm/versions/node/v6.10.2/lib/node_modules/momy/node_modules/mongodb-core/lib/cursor.js:593:7
    at queryCallback (/home/ubuntu/.nvm/versions/node/v6.10.2/lib/node_modules/momy/node_modules/mongodb-core/lib/cursor.js:232:18)
    at /home/ubuntu/.nvm/versions/node/v6.10.2/lib/node_modules/momy/node_modules/mongodb-core/lib/connection/pool.js:461:18
    at _combinedTickCallback (internal/process/next_tick.js:73:7)
    at process._tickCallback (internal/process/next_tick.js:104:9)
  name: 'MongoError',
  message: 'No more documents in tailed cursor',
  tailable: true,
  awaitData: true }
16 Apr 20:36:52 - Bye

Data not updated in MySQL after change in MongoDB

Hi.

I'm not sure how long this has been going on, but records are not being updated after the initial insert. After a record is created it is imported, but subsequent changes are not reflected in MySQL.

I assume it is related to the _id=undefined in the logs, so momy does not know which record to update? Does anyone have any clue why this is happening? Is this issue related to my config, or a bug in the application?

MongoDB version 5.0

LOG:

13 Dec 15:41:40 - Replace a record in rocketchat.rocketchat_room (_id=undefined)
13 Dec 15:43:14 - Insert a new record into rocketchat.rocketchat_message
13 Dec 15:43:14 - Replace a record in rocketchat.rocketchat_room (_id=undefined)
13 Dec 15:43:37 - Insert a new record into rocketchat.rocketchat_message
13 Dec 15:43:37 - Replace a record in rocketchat.rocketchat_room (_id=undefined)
13 Dec 15:44:42 - Insert a new record into rocketchat.rocketchat_uploads
13 Dec 15:44:42 - Replace a record in rocketchat.rocketchat_uploads (_id=undefined)
13 Dec 15:44:42 - Replace a record in rocketchat.rocketchat_uploads (_id=undefined)
13 Dec 15:44:42 - Replace a record in rocketchat.rocketchat_uploads (_id=undefined)
13 Dec 15:44:42 - Insert a new record into rocketchat.rocketchat_uploads
13 Dec 15:44:42 - Replace a record in rocketchat.rocketchat_uploads (_id=undefined)

CONFIG:

{
  "src": "REDACTED",
  "dist": "REDACTED",
  "prefix": "t_",
  "case": "camel",
  "fieldCase": "camel",
  "collections": {
    "rocketchat_room": {
      "_id": "string",
      "name": "string",
      "fname": "string",
      "t": "string",
      "ts": "DATETIME",
      "_updatedAt": "DATETIME",
      "usersCount": "number",
      "u.username": "string",
      "customFields.department": "string",
      "customFields.project": "string",
      "u.username": "string"
    },
    "rocketchat_uploads": {
      "_id": "string",
      "rid": "string",
      "name": "string",
      "description": "string",
      "extension": "string",
      "userId": "string",
      "size": "number",
      "uploadedAt": "DATETIME",
      "url": "string"
    },
    "rocketchat_message": {
      "_id": "string",
      "rid": "string",
      "msg": "TEXT",
      "_updatedAt": "DATETIME",
      "ts": "DATETIME",
      "u.username": "string",
      "t": "string",
      "file._id": "string",
      "file.name": "string",
      "file.type": "string"
    }
    
  }
}

operation exceeded time limit

When I run momy, it eventually crashes and stops instead of starting a new session. Since it's impossible to revert to the last "known" state, I have to re-import all 15 million records...

16 Oct 08:17:52 - Insert a new record into database.trades
16 Oct 08:17:55 - Stream closed....
16 Oct 08:17:55 - { MongoError: operation exceeded time limit
at Function.MongoError.create (/usr/local/lib/node_modules/momy/node_modules/mongodb-core/lib/error.js:31:11)
at /usr/local/lib/node_modules/momy/node_modules/mongodb-core/lib/connection/pool.js:497:72
at authenticateStragglers (/usr/local/lib/node_modules/momy/node_modules/mongodb-core/lib/connection/pool.js:443:16)
at Connection.messageHandler (/usr/local/lib/node_modules/momy/node_modules/mongodb-core/lib/connection/pool.js:477:5)
at Socket.<anonymous> (/usr/local/lib/node_modules/momy/node_modules/mongodb-core/lib/connection/connection.js:333:22)
at Socket.emit (events.js:182:13)
at addChunk (_stream_readable.js:283:12)
at readableAddChunk (_stream_readable.js:264:11)
at Socket.Readable.push (_stream_readable.js:219:10)
at TCP.onread (net.js:638:20)
name: 'MongoError',
message: 'operation exceeded time limit',
ok: 0,
errmsg: 'operation exceeded time limit',
code: 50 }
16 Oct 08:17:56 - Bye

Tests on Travis fail

I'm not sure why. Until Nov 10 it was OK, but now it doesn't pass the tests on Travis, even though all tests pass locally.

FATAL ERROR: MarkCompactCollector: semi-space copy, fallback in old gen Allocation failed - JavaScript heap out of memory

It seems that mongodb-runner runs out of heap memory. I've changed Node's memory size, but it didn't work 😢

  • efe328e: --max_old_space_size=4096
  • 48129e9: --max_old_space_size=2560
  • f5e7f07: --max_old_space_size=2048
  • cf0daeb: --max_semi_space_size=16 --max_old_space_size=1024 --max_executable_size=512
  • f5e7f07: --max_semi_space_size=2 --max_old_space_size=256 --max_executable_size=192

dynamic addition of collections to sync

What would be the best way to add a dynamic sync, i.e.

  1. automatically detect new, unsynced collections in mongo db
  2. add all fields of a collection without having to specify in momyfile.json
  3. re-read momyfile.json regularly, possibly from a remote location (http), with auto --import on detecting new collections

With at least 3 implemented, 1 and 2 would basically come for free, as one could write an application- or MongoDB-side script that publishes a new momyfile when needed, without having to restart momy.

Replica set with data

I followed the instructions for setting up a replica set with no data, and everything seemed fine. But now that I have data, when I reboot the mongo instance using mongod --replSet "rs0" --oplogSize 100, it fails with the following errors:

forms-db_1 | 2018-09-14T04:39:26.741+0000 W REPL [replexec-0] Locally stored replica set configuration does not have a valid entry for the current node; waiting for reconfig or remote heartbeat; Got "NodeNotFound: No host described in new configuration 1 for replica set rs0 maps to this node" while validating { _id: "rs0", version: 1, protocolVersion: 1, writeConcernMajorityJournalDefault: true, members: [ { _id: 0, host: "9f7ecacad1f6:27017", arbiterOnly: false, buildIndexes: true, hidden: false, priority: 1.0, tags: {}, slaveDelay: 0, votes: 1 } ], settings: { chainingAllowed: true, heartbeatIntervalMillis: 2000, heartbeatTimeoutSecs: 10, electionTimeoutMillis: 10000, catchUpTimeoutMillis: -1, catchUpTakeoverDelayMillis: 30000, getLastErrorModes: {}, getLastErrorDefaults: { w: 1, wtimeout: 0 }, replicaSetId: ObjectId('5b9ae14fb9795b68247cb92f') } }
forms-db_1 | 2018-09-14T04:39:26.741+0000 I REPL [replexec-0] New replica set config in use: { _id: "rs0", version: 1, protocolVersion: 1, writeConcernMajorityJournalDefault: true, members: [ { _id: 0, host: "9f7ecacad1f6:27017", arbiterOnly: false, buildIndexes: true, hidden: false, priority: 1.0, tags: {}, slaveDelay: 0, votes: 1 } ], settings: { chainingAllowed: true, heartbeatIntervalMillis: 2000, heartbeatTimeoutSecs: 10, electionTimeoutMillis: 10000, catchUpTimeoutMillis: -1, catchUpTakeoverDelayMillis: 30000, getLastErrorModes: {}, getLastErrorDefaults: { w: 1, wtimeout: 0 }, replicaSetId: ObjectId('5b9ae14fb9795b68247cb92f') } }
forms-db_1 | 2018-09-14T04:39:26.741+0000 I REPL [replexec-0] This node is not a member of the config
forms-db_1 | 2018-09-14T04:39:26.742+0000 I REPL [replexec-0] transition to REMOVED from STARTUP
forms-db_1 | 2018-09-14T04:39:26.742+0000 I REPL [replexec-0] Starting replication storage threads
forms-db_1 | 2018-09-14T04:39:26.743+0000 I REPL [replexec-0] Starting replication fetcher thread
forms-db_1 | 2018-09-14T04:39:26.743+0000 I REPL [replexec-0] Starting replication applier thread
forms-db_1 | 2018-09-14T04:39:26.744+0000 I REPL [replexec-0] Starting replication reporter thread
forms-db_1 | 2018-09-14T04:39:26.744+0000 I REPL [rsSync-0] Starting oplog application
forms-db_1 | 2018-09-14T04:44:26.737+0000 I NETWORK [LogicalSessionCacheRefresh] Starting new replica set monitor for rs0/9f7ecacad1f6:27017
forms-db_1 | 2018-09-14T04:44:26.745+0000 W NETWORK [LogicalSessionCacheRefresh] Unable to reach primary for set rs0
forms-db_1 | 2018-09-14T04:44:26.745+0000 I NETWORK [LogicalSessionCacheRefresh] Cannot reach any nodes for set rs0. Please check network connectivity and the status of the set. This has happened for 1 checks in a row.
forms-db_1 | 2018-09-14T04:44:27.247+0000 W NETWORK [LogicalSessionCacheRefresh] Unable to reach primary for set rs0

Any help would be appreciated.

Thanks.

MongoDB cannot login, always Authentication failed

The MongoDB username and password are correct. I am using mlab.com.
If src in momyfile.json is my localhost, this code runs without problems.
But if src is another IP address (in this case mlab.com), I get this error message:

(node:4177) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): MongoError: Authentication failed.

Maybe this code authenticates using MONGODB-CR, not SCRAM-SHA-1?

Regards
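One thing worth trying (an assumption, not a verified fix): the authentication mechanism can be forced through standard MongoDB connection-string options in the src URL, e.g.

```json
"src": "mongodb://user:password@host:port/dbname?authMechanism=SCRAM-SHA-1"
```

where user, password, host, and port are placeholders for your mLab credentials.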

Error if datetime is type object

Hi, my MongoDB collection is returning an object instead of a number or string, so I have changed the code:

DATETIME: {
    type: 'DATETIME',
    convert: val => {
      if (typeof val === 'string') val = getValueOfDate(val);
      if (typeof val === 'object') {
        val = moment(val).format('YYYY-MM-DD HH:mm:ss');
        return `"${val}"`;
      }
      if (typeof val !== 'number') return 'NULL';
      val = moment(val).format('YYYY-MM-DD HH:mm:ss');
      return `"${val}"`;
    }
  },

I think this error could happen with other date formats too, but for DATETIME this fixes it for me. I have created a PR; please check. Thanks, and excellent work ;)

First record empty and data not synced

I have a bug when I import the data: the first record is always empty. Then when I launch the daemon I get this log:

1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
...

If I update some data in Mongo, nothing happens in MySQL.

Where could this issue come from?

ns field has bson.D value that is not string or valid MongoDb

Thanks for this great work! I'm getting this error; has anyone seen it?

name: 'MongoError', message: 'ns field has bson.D value that is not string or valid MongoDb RegEx: Error parsing value [{$in [db.collection db.collection2 db.collection2]}] to RegEx: Must specify $regex field', ok: 0, errmsg: 'ns field has bson.D value that is not string or valid MongoDb RegEx: Error parsing value [{$in [db.collection db.collection2 db.collection2]}] to RegEx: Must specify $regex field', code: 8000, codeName: 'AtlasError' }

Custom converters or filters

I've started thinking about custom converters or filters to solve #2 and #3.
Here's just my brainstorming...

Solution 1

Add these global fields to momyfile.json:

  • "exclusions": "\uFFFD": if this field is set, these chars will be automatically removed.
  • "inclusions": "\x00-\x7F": if this field is set, any other chars will be automatically removed.

Some pros and cons:

  • pros: simple and enough for the issues at this point
  • cons: not flexible
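A sketch of how Solution 1 could behave (illustrative only, not an implemented API): treat both options as regex character ranges applied to string values.

```javascript
// Illustrative sketch of the proposed "exclusions" / "inclusions" options:
// "exclusions" strips the listed chars; "inclusions" strips everything else.
function applyCharFilters(val, { exclusions, inclusions } = {}) {
  if (exclusions) val = val.replace(new RegExp(`[${exclusions}]`, 'g'), '');
  if (inclusions) val = val.replace(new RegExp(`[^${inclusions}]`, 'g'), '');
  return val;
}
```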

Solution 2

Make momyfile scriptable:

  • use momyfile.js for more complex setting
  • "globalConverter": (val, type, fieldName, colName) => {/** do something **/}

Some pros and cons:

  • pros: max flexibility. Users can freely extend it as they like.
  • cons: too complex
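And a sketch of Solution 2's converter in action (hypothetical; the signature follows the proposal above):

```javascript
// Hypothetical globalConverter from a momyfile.js (Solution 2 sketch).
// Receives every value plus its type and names, returns the value to store.
const globalConverter = (val, type, fieldName, colName) => {
  if (type === 'VARCHAR' || type === 'TEXT') {
    // e.g. drop the Unicode replacement character, like "exclusions" would
    return String(val).replace(/\uFFFD/g, '');
  }
  return val;
};
```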
