cognitom / momy
MongoDB to MySQL replication
Home Page: https://www.npmjs.com/package/momy
License: MIT License
After a clean installation of momy, and after setting the "timestamp" field of the "mongo_to_mysql" table to something other than 0, I'm stuck on this error.
ubuntu:~$ momy
16 Apr 20:36:52 - Connect to MySQL...
16 Apr 20:36:52 - Connect to MongoDB...
16 Apr 20:36:52 - Bigin to watch... (from 1)
16 Apr 20:36:52 - Stream closed....
16 Apr 20:36:52 - { MongoError: No more documents in tailed cursor
at Function.MongoError.create (/home/ubuntu/.nvm/versions/node/v6.10.2/lib/node_modules/momy/node_modules/mongodb-core/lib/error.js:31:11)
at nextFunction (/home/ubuntu/.nvm/versions/node/v6.10.2/lib/node_modules/momy/node_modules/mongodb-core/lib/cursor.js:644:50)
at /home/ubuntu/.nvm/versions/node/v6.10.2/lib/node_modules/momy/node_modules/mongodb-core/lib/cursor.js:593:7
at queryCallback (/home/ubuntu/.nvm/versions/node/v6.10.2/lib/node_modules/momy/node_modules/mongodb-core/lib/cursor.js:232:18)
at /home/ubuntu/.nvm/versions/node/v6.10.2/lib/node_modules/momy/node_modules/mongodb-core/lib/connection/pool.js:461:18
at _combinedTickCallback (internal/process/next_tick.js:73:7)
at process._tickCallback (internal/process/next_tick.js:104:9)
name: 'MongoError',
message: 'No more documents in tailed cursor',
tailable: true,
awaitData: true }
16 Apr 20:36:52 - Bye
I have a bug when I import the data: the first record is always empty. Then when I launch the daemon I get this log:
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
1 Feb 10:02:51 - Replace a record in statistics (_id=undefined)
...
If I update some data in MongoDB, nothing happens in MySQL.
Where could this issue come from?
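One plausible mechanism behind the _id=undefined lines (this sketches the standard MongoDB oplog entry shape, not momy's actual code): for update entries, the target document's _id lives in the o2 field, while o holds only the modification. Reading the _id from the wrong field yields undefined.

```javascript
// Sketch of a standard oplog 'u' (update) entry, NOT momy's code:
// the target _id is in o2; the change itself (o) carries no _id.
const oplogUpdate = {
  op: 'u',
  ns: 'mydb.statistics',
  o2: { _id: 'abc123' },       // target document's _id is here
  o: { $set: { count: 42 } }   // the modification has no _id
}

const wrong = oplogUpdate.o._id                    // undefined -> "(_id=undefined)"
const right = oplogUpdate.o2 && oplogUpdate.o2._id // 'abc123'
console.log(wrong, right)
```

If the _id is resolved from o for update entries, $set-style updates would log undefined and the resulting MySQL UPDATE would never match a row, which would be consistent with MySQL not changing.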
I'm not sure why. Until Nov 10 it was OK, but now it doesn't pass the tests on Travis, even though all the tests pass locally.
FATAL ERROR: MarkCompactCollector: semi-space copy, fallback in old gen Allocation failed - JavaScript heap out of memory
It seems that mongodb-runner runs out of heap memory. I've increased the memory size of Node, but it didn't work 😢
{
"src": "mongodb://localhost:27017/logs",
"dist": "mysql://root:password@localhost:3306/admin",
"collections": {
"audit-logs": {
"_id": "string",
"message": "string",
"createdAt": "string",
"updatedAt": "DATE"
},
"error-logs": {
"_id": "string",
"createdAt": "DATETIME",
"message": "string",
"updatedAt": "DATETIME"
}
}
}
This is my momy.json file.
Like this:
"dist":["mysql://root:password@localhost:3306/admin","mysql://root:password@localhost:3306/newdb"]
I've started thinking about custom converters or filters to solve #2 and #3.
Here's just my brainstorming...
Add these global fields to momyfile.json:
- "exclusions": "\uFFFD": if this field is set, these chars will be automatically removed.
- "inclusions": "\x00-\x7F": if this field is set, any other chars will be automatically removed.
Some pros and cons:
Make momyfile scriptable: use momyfile.js for more complex settings:
"globalConverter": (val, type, fieldName, colName) => {/** do something **/}
Some pros and cons:
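To make the brainstorming concrete, here is a hypothetical momyfile.js using the proposed globalConverter signature. This API is only an idea from this thread, not a shipped momy feature; the converter below implements the "inclusions": "\x00-\x7F" idea by stripping non-ASCII characters from string fields.

```javascript
// Hypothetical momyfile.js; "globalConverter" is the PROPOSED (not yet
// existing) hook. Strips any character outside the ASCII range \x00-\x7F
// from string values before they would be written to MySQL.
const config = {
  src: 'mongodb://localhost:27017/logs',
  dist: 'mysql://root:password@localhost:3306/admin',
  globalConverter: (val, type, fieldName, colName) => {
    if (typeof val === 'string') return val.replace(/[^\x00-\x7F]/g, '')
    return val
  },
  collections: {
    'audit-logs': { _id: 'string', message: 'string' }
  }
}

module.exports = config
console.log(config.globalConverter('caf\u00e9\uFFFD', 'string', 'message', 'audit-logs'))
```

Non-string values pass through unchanged, so number and date fields are unaffected.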
I followed the instructions for setting up a replica set with no data and everything seemed fine. Now that I have data, when I reboot the mongo instance using mongod --replSet "rs0" --oplogSize 100, it seems to fail with the following errors:
forms-db_1 | 2018-09-14T04:39:26.741+0000 W REPL [replexec-0] Locally stored replica set configuration does not have a valid entry for the current node; waiting for reconfig or remote heartbeat; Got "NodeNotFound: No host described in new configuration 1 for replica set rs0 maps to this node" while validating { _id: "rs0", version: 1, protocolVersion: 1, writeConcernMajorityJournalDefault: true, members: [ { _id: 0, host: "9f7ecacad1f6:27017", arbiterOnly: false, buildIndexes: true, hidden: false, priority: 1.0, tags: {}, slaveDelay: 0, votes: 1 } ], settings: { chainingAllowed: true, heartbeatIntervalMillis: 2000, heartbeatTimeoutSecs: 10, electionTimeoutMillis: 10000, catchUpTimeoutMillis: -1, catchUpTakeoverDelayMillis: 30000, getLastErrorModes: {}, getLastErrorDefaults: { w: 1, wtimeout: 0 }, replicaSetId: ObjectId('5b9ae14fb9795b68247cb92f') } }
forms-db_1 | 2018-09-14T04:39:26.741+0000 I REPL [replexec-0] New replica set config in use: { _id: "rs0", version: 1, protocolVersion: 1, writeConcernMajorityJournalDefault: true, members: [ { _id: 0, host: "9f7ecacad1f6:27017", arbiterOnly: false, buildIndexes: true, hidden: false, priority: 1.0, tags: {}, slaveDelay: 0, votes: 1 } ], settings: { chainingAllowed: true, heartbeatIntervalMillis: 2000, heartbeatTimeoutSecs: 10, electionTimeoutMillis: 10000, catchUpTimeoutMillis: -1, catchUpTakeoverDelayMillis: 30000, getLastErrorModes: {}, getLastErrorDefaults: { w: 1, wtimeout: 0 }, replicaSetId: ObjectId('5b9ae14fb9795b68247cb92f') } }
forms-db_1 | 2018-09-14T04:39:26.741+0000 I REPL [replexec-0] This node is not a member of the config
forms-db_1 | 2018-09-14T04:39:26.742+0000 I REPL [replexec-0] transition to REMOVED from STARTUP
forms-db_1 | 2018-09-14T04:39:26.742+0000 I REPL [replexec-0] Starting replication storage threads
forms-db_1 | 2018-09-14T04:39:26.743+0000 I REPL [replexec-0] Starting replication fetcher thread
forms-db_1 | 2018-09-14T04:39:26.743+0000 I REPL [replexec-0] Starting replication applier thread
forms-db_1 | 2018-09-14T04:39:26.744+0000 I REPL [replexec-0] Starting replication reporter thread
forms-db_1 | 2018-09-14T04:39:26.744+0000 I REPL [rsSync-0] Starting oplog application
forms-db_1 | 2018-09-14T04:44:26.737+0000 I NETWORK [LogicalSessionCacheRefresh] Starting new replica set monitor for rs0/9f7ecacad1f6:27017
forms-db_1 | 2018-09-14T04:44:26.745+0000 W NETWORK [LogicalSessionCacheRefresh] Unable to reach primary for set rs0
forms-db_1 | 2018-09-14T04:44:26.745+0000 I NETWORK [LogicalSessionCacheRefresh] Cannot reach any nodes for set rs0. Please check network connectivity and the status of the set. This has happened for 1 checks in a row.
forms-db_1 | 2018-09-14T04:44:27.247+0000 W NETWORK [LogicalSessionCacheRefresh] Unable to reach primary for set rs0
Any help would be appreciated.
Thanks.
Hi.
I'm not sure how long this has been going on, but records are not being updated after the initial insert. After a record is created, it is imported, but no subsequent changes are reflected in MySQL.
I assume it is related to the _id=undefined in the logs, so it does not know which record to update? Does anyone have any clue why this is happening? Is this issue related to config, or a bug in the application?
MongoDB version 5.0
LOG:
13 Dec 15:41:40 - Replace a record in rocketchat.rocketchat_room (_id=undefined)
13 Dec 15:43:14 - Insert a new record into rocketchat.rocketchat_message
13 Dec 15:43:14 - Replace a record in rocketchat.rocketchat_room (_id=undefined)
13 Dec 15:43:37 - Insert a new record into rocketchat.rocketchat_message
13 Dec 15:43:37 - Replace a record in rocketchat.rocketchat_room (_id=undefined)
13 Dec 15:44:42 - Insert a new record into rocketchat.rocketchat_uploads
13 Dec 15:44:42 - Replace a record in rocketchat.rocketchat_uploads (_id=undefined)
13 Dec 15:44:42 - Replace a record in rocketchat.rocketchat_uploads (_id=undefined)
13 Dec 15:44:42 - Replace a record in rocketchat.rocketchat_uploads (_id=undefined)
13 Dec 15:44:42 - Insert a new record into rocketchat.rocketchat_uploads
13 Dec 15:44:42 - Replace a record in rocketchat.rocketchat_uploads (_id=undefined)
CONFIG:
{
"src": "REDACTED",
"dist": "REDACTED",
"prefix": "t_",
"case": "camel",
"fieldCase": "camel",
"collections": {
"rocketchat_room": {
"_id": "string",
"name": "string",
"fname": "string",
"t": "string",
"ts": "DATETIME",
"_updatedAt": "DATETIME",
"usersCount": "number",
"u.username": "string",
"customFields.department": "string",
"customFields.project": "string",
"u.username": "string"
},
"rocketchat_uploads": {
"_id": "string",
"rid": "string",
"name": "string",
"description": "string",
"extension": "string",
"userId": "string",
"size": "number",
"uploadedAt": "DATETIME",
"url": "string"
},
"rocketchat_message": {
"_id": "string",
"rid": "string",
"msg": "TEXT",
"_updatedAt": "DATETIME",
"ts": "DATETIME",
"u.username": "string",
"t": "string",
"file._id": "string",
"file.name": "string",
"file.type": "string"
}
}
}
If the _id is treated as a string, the CREATE TABLE command will use a VARCHAR(255), which is too large for a MySQL string index when using the utf8mb4_general_ci collation.
The fix for this was to change the type definitions to VARCHAR(50) instead of VARCHAR(255).
Future releases should allow a variable VARCHAR size in the JSON.
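For reference, the arithmetic behind the error: InnoDB's classic index-key limit is 767 bytes, and utf8mb4 reserves up to 4 bytes per character, so a VARCHAR(255) key cannot fit, while 191 characters is the largest size that can.

```javascript
// Index-key math for utf8mb4 under InnoDB's classic 767-byte key limit
// (COMPACT/REDUNDANT row formats).
const BYTES_PER_CHAR = 4 // utf8mb4 worst case
const KEY_LIMIT = 767

const bytes = chars => chars * BYTES_PER_CHAR
console.log(bytes(255))                             // 1020 -> exceeds 767: ER_TOO_LONG_KEY
console.log(bytes(50))                              // 200  -> fits
console.log(Math.floor(KEY_LIMIT / BYTES_PER_CHAR)) // 191, largest safe VARCHAR length
```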
Stack trace
9 Mar 21:07:05 - Connect to MySQL...
9 Mar 21:07:05 - { Error: ER_TOO_LONG_KEY: Specified key was too long; max key length is 767 bytes
at Query.Sequence._packetToError (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/sequences/Sequence.js:52:14)
at Query.ErrorPacket (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/sequences/Query.js:77:18)
at Protocol._parsePacket (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:280:23)
at Parser.write (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/Parser.js:75:12)
at Protocol.write (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:39:16)
at Socket.<anonymous> (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/Connection.js:103:28)
at emitOne (events.js:96:13)
at Socket.emit (events.js:189:7)
at readableAddChunk (_stream_readable.js:176:18)
at Socket.Readable.push (_stream_readable.js:134:10)
--------------------
at Protocol._enqueue (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:141:48)
at Connection.query (/usr/local/lib/node_modules/momy/node_modules/mysql/lib/Connection.js:208:25)
at Promise (/usr/local/lib/node_modules/momy/lib/mysql.js:177:10)
at MySQL.query (/usr/local/lib/node_modules/momy/lib/mysql.js:175:12)
at query.then.then (/usr/local/lib/node_modules/momy/lib/mysql.js:116:24)
at process._tickCallback (internal/process/next_tick.js:103:7)
code: 'ER_TOO_LONG_KEY', errno: 1071, sqlState: '42000', index: 1 }
MongoDB username and password are correct. I am using mlab.com.
If src in momyfile.json is my localhost, this code runs without problems.
But if src is another IP address (in this case I use mlab.com), I get this error message:
(node:4177) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): MongoError: Authentication failed.
Maybe this code authenticates using MONGODB-CR, not SCRAM-SHA-1?
Regards
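If the server only accepts SCRAM-SHA-1, one thing worth trying is forcing the mechanism through the standard MongoDB connection-string option in src (the host and credentials below are placeholders):

```
mongodb://user:password@ds012345.mlab.com:27017/mydb?authMechanism=SCRAM-SHA-1
```

authMechanism is a standard MongoDB URI option; whether it resolves this depends on which mechanism the installed driver version negotiates by default.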
What would be the best way to add a dynamic sync, i.e.
With at least 3. implemented, 1. & 2. would basically come for free, as one could write an application- or mongodb-side script to publish a new momyfile when needed, without having to restart momy.
Oops..., I'll fix it asap.
collection.insert({a:1, b:2})
// OK
collection.update({a:1}, {$set:{b:3}})
// NG
collection.update({a:1}, {a:1, b:3})
Hello, I'm trying to have Mongo keep my MySQL up to date, but I'm getting a bad SQL error.
root@e69c090f6462:/# momy --config momyfile.json --import
25 Apr 10:04:27 - Connect to MySQL...
25 Apr 10:04:28 - { Error: ER_PARSE_ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near ')' at line 1
at Query.Sequence._packetToError (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/sequences/Sequence.js:52:14) at Query.ErrorPacket (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/sequences/Query.js:77:18)
at Protocol._parsePacket (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:279:23)
at Parser.write (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/Parser.js:76:12)
at Protocol.write (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:39:16)
at Socket.<anonymous> (/usr/lib/node_modules/momy/node_modules/mysql/lib/Connection.js:103:28)
at Socket.emit (events.js:180:13)
at addChunk (_stream_readable.js:274:12)
at readableAddChunk (_stream_readable.js:261:11)
at Socket.Readable.push (_stream_readable.js:218:10)
--------------------
at Protocol._enqueue (/usr/lib/node_modules/momy/node_modules/mysql/lib/protocol/Protocol.js:145:48)
at Connection.query (/usr/lib/node_modules/momy/node_modules/mysql/lib/Connection.js:208:25)
at Promise (/usr/lib/node_modules/momy/lib/mysql.js:177:10)
at new Promise (<anonymous>)
at MySQL.query (/usr/lib/node_modules/momy/lib/mysql.js:175:12)
at query.then.then (/usr/lib/node_modules/momy/lib/mysql.js:116:24)
at <anonymous>
at process._tickCallback (internal/process/next_tick.js:182:7)
code: 'ER_PARSE_ERROR',
errno: 1064,
sqlMessage: 'You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near \')\' at line 1',
sqlState: '42000',
index: 1,
sql: 'DROP TABLE IF EXISTS `t_paste`; CREATE TABLE `t_paste` ();' }
25 Apr 10:04:28 - Bye
root@e69c090f6462:/#
Here is my config file:
{
"src": "mongodb://localhost:27017/paste_db",
"dist": "mysql://root:removed@localhost:3306/mongod",
"prefix": "t_",
"case": "camel",
"collections": {
"paste": {
"_id": "_id",
"TEXT": "db_keywords",
"TEXT": "emails",
"TEXT": "hashes",
"TEXT": "num_emals",
"TEXT": "num_hashes",
"VARCHAR": "url"
}
}
}
The password has been removed. The MySQL server in use allows out-of-network connections.
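For what it's worth, the other configs in this thread key each collection by field name with the type as the value; the paste block above has them reversed, which would leave momy with no recognizable columns and could explain the empty CREATE TABLE `t_paste` (); in the error. A corrected sketch (field names taken from the original; "string" is assumed to map to VARCHAR, as in the other configs here):

```json
{
  "collections": {
    "paste": {
      "_id": "string",
      "db_keywords": "TEXT",
      "emails": "TEXT",
      "hashes": "TEXT",
      "num_emals": "TEXT",
      "num_hashes": "TEXT",
      "url": "string"
    }
  }
}
```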
How should I represent the array field in the json file?
Hi, my MongoDB collection is returning an object instead of a number or string, so I have changed the code:
DATETIME: {
type: 'DATETIME',
convert: val => {
if (typeof val === 'string') val = getValueOfDate(val);
if (typeof val === 'object') {
val = moment(val).format('YYYY-MM-DD HH:mm:ss');
return `"${val}"`;
}
if (typeof val !== 'number') return 'NULL';
val = moment(val).format('YYYY-MM-DD HH:mm:ss');
return `"${val}"`;
}
},
I think this error could happen with other date formats too, but with DATETIME it is fixed for me. I have created a PR; please check. Thanks, and excellent work ;)
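To sanity-check that fallback logic without pulling in moment, here is a standalone re-implementation of the same idea using plain Date (a hypothetical helper, not momy's code): Date objects and epoch-millisecond numbers are formatted, anything else becomes the SQL literal NULL. Note that, unlike moment, this formats in UTC.

```javascript
// Same fallback behavior as the patched DATETIME converter, dependency-free:
// Date objects and epoch-millisecond numbers are formatted as a quoted
// "YYYY-MM-DD HH:mm:ss" literal (UTC); anything else becomes NULL.
function toDatetime (val) {
  const fmt = d => `"${d.toISOString().slice(0, 19).replace('T', ' ')}"`
  if (val instanceof Date) return fmt(val)
  if (typeof val !== 'number') return 'NULL'
  return fmt(new Date(val))
}

console.log(toDatetime(new Date(Date.UTC(2019, 0, 3, 13, 1, 1)))) // "2019-01-03 13:01:01"
console.log(toDatetime(undefined))                                // NULL
```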
fieldCase
spawn
prefix
The import was successful, but while streaming I got this error. What is it? When I changed data in MongoDB, it was not synchronized; maybe that was caused by this error?
javan@javan-desktop:~/workspace/simpel/application/shell$ momy --config momyfile.json
29 May 12:48:31 - Connect to MySQL...
the options [auto_reconnect] is not supported
(node:24053) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'ts' of null
at db.collection.find.sort.limit.nextObject.then.item (/usr/local/lib/node_modules/momy/lib/tailer.js:127:25)
at
at process._tickCallback (internal/process/next_tick.js:188:7)
(node:24053) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:24053) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
Thanks for this great work!! I'm getting this error; has anyone seen it?
name: 'MongoError', message: 'ns field has bson.D value that is not string or valid MongoDb RegEx: Error parsing value [{$in [db.collection db.collection2 db.collection2]}] to RegEx: Must specify $regex field', ok: 0, errmsg: 'ns field has bson.D value that is not string or valid MongoDb RegEx: Error parsing value [{$in [db.collection db.collection2 db.collection2]}] to RegEx: Must specify $regex field', code: 8000, codeName: 'AtlasError' }
For example, if the collection has fields like this:
{
"_id": 0,
"name": "Mike",
"address": {
"zip": "1550033",
"city": "Setagaya"
}
}
I'd like to make it possible to specify the sub-field in momyfile.json:
{
"_id": "number",
"name": "string",
"address.zip": "string",
"address.city": "string"
}
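At its core, supporting "address.zip"-style keys needs a dot-path lookup over the document. A hypothetical helper (not momy's actual implementation) could look like:

```javascript
// Hypothetical dot-path lookup: walk the document one key at a time,
// returning undefined if any intermediate value is missing.
function getByPath (doc, path) {
  return path.split('.').reduce(
    (val, key) => (val == null ? undefined : val[key]),
    doc
  )
}

const doc = { _id: 0, name: 'Mike', address: { zip: '1550033', city: 'Setagaya' } }
console.log(getByPath(doc, 'address.zip'))  // 1550033
console.log(getByPath(doc, 'missing.zip'))  // undefined
```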
"collections": {
"class": {
"_id": "string",
"tests": "string",
"lastUpdate": "DATETIME"
In MongoDB, "lastUpdate" is ISODate("2019-01-03T13:01:01Z"), but in MySQL, lastUpdate is null. Why?
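This looks related to the DATETIME-object issue reported earlier in this thread: the Node driver delivers ISODate values as JS Date objects, and a converter that only accepts numbers or strings would fall through to NULL. A minimal demonstration (not momy's code):

```javascript
// An ISODate field arrives in Node as a Date object, whose typeof is
// 'object', so a converter guarding on typeof number/string rejects it.
const val = new Date('2019-01-03T13:01:01Z')
const kind = typeof val                  // 'object'
const accepted = typeof val === 'number' // false -> stock converter returns 'NULL'
console.log(kind, accepted)
```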
When I am running momy, it will eventually crash and stop instead of starting a new session. Since it's impossible to revert to the last "known" state, I have to re-import all 15 million records...
16 Oct 08:17:52 - Insert a new record into database.trades
16 Oct 08:17:55 - Stream closed....
16 Oct 08:17:55 - { MongoError: operation exceeded time limit
at Function.MongoError.create (/usr/local/lib/node_modules/momy/node_modules/mongodb-core/lib/error.js:31:11)
at /usr/local/lib/node_modules/momy/node_modules/mongodb-core/lib/connection/pool.js:497:72
at authenticateStragglers (/usr/local/lib/node_modules/momy/node_modules/mongodb-core/lib/connection/pool.js:443:16)
at Connection.messageHandler (/usr/local/lib/node_modules/momy/node_modules/mongodb-core/lib/connection/pool.js:477:5)
at Socket.<anonymous> (/usr/local/lib/node_modules/momy/node_modules/mongodb-core/lib/connection/connection.js:333:22)
at Socket.emit (events.js:182:13)
at addChunk (_stream_readable.js:283:12)
at readableAddChunk (_stream_readable.js:264:11)
at Socket.Readable.push (_stream_readable.js:219:10)
at TCP.onread (net.js:638:20)
name: 'MongoError',
message: 'operation exceeded time limit',
ok: 0,
errmsg: 'operation exceeded time limit',
code: 50 }
16 Oct 08:17:56 - Bye