
Comments (10)

gjiang1 avatar gjiang1 commented on May 19, 2024

I started ZooKeeper, the Kafka server, Kafka REST, and the Schema Registry one by one, and they look like they are up and running, but when I check the status, they all show down. This confused me: should I start them one by one, or run "/confluent-4.1.1/bin/confluent start" to bring up all six together?

/confluent-4.1.1/bin/confluent status

ksql-server is [DOWN]
connect is [DOWN]
kafka-rest is [DOWN]
schema-registry is [DOWN]
kafka is [DOWN]
zookeeper is [DOWN]

But actually, they are running.
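A note on why this can happen (a sketch, assuming Confluent CLI 4.x behaviour, not verified against this exact setup): "confluent status" only reports services that the CLI itself launched, which it tracks under the CONFLUENT_CURRENT directory; services started manually with the individual start scripts are never recorded there, so they report [DOWN] even while the processes are running.

```shell
# Sketch: where the Confluent CLI keeps its service-tracking state
# (assumption: CLI 4.x falls back to a temp dir when CONFLUENT_CURRENT is unset).
CONFLUENT_CURRENT="${CONFLUENT_CURRENT:-/tmp}"
echo "CLI tracking dir: ${CONFLUENT_CURRENT}"
# Manually started services are invisible to the CLI; check the processes instead:
ps -ef | grep -E 'zookeeper|kafka' | grep -v grep || echo "no kafka processes found"
```

So the two startup styles are not interchangeable for status reporting: either start everything via "confluent start" and query it via "confluent status", or start services manually and verify them with ps or a port check.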

Thanks for your help!

from kafka-connect-mongodb.

hpgrahsl avatar hpgrahsl commented on May 19, 2024

Hi @gjiang1 !

First I would like to ask which article you are referring to and which instructions you are following :)

Apart from that, what's a bit confusing to me is that the log snippet you posted contains a completely different package declaration, namely org.radarcns.connect.mongodb. Could it be that you are referring to a different MongoDB sink connector, e.g. this one: https://github.com/RADAR-base/MongoDb-Sink-Connector? Taking a closer look at the log messages, the following file is most likely what you are actually running: https://github.com/RADAR-base/MongoDb-Sink-Connector/blob/master/src/main/java/org/radarcns/connect/mongodb/MongoDbSinkTask.java

Meanwhile I'm waiting for your hopefully clarifying response. THX in advance!

gjiang1 avatar gjiang1 commented on May 19, 2024

Hi hpgrahsl, thank you very much for your response.

I followed https://github.com/RADAR-base/MongoDb-Sink-Connector. I tried two ways to start Kafka. The first is "/confluent-4.1.1/bin/confluent start"; as you can see below, everything is UP:
Using CONFLUENT_CURRENT: /tmp/confluent.NFBX0ijJ
Starting zookeeper
zookeeper is [UP]
Starting kafka
kafka is [UP]
Starting schema-registry
schema-registry is [UP]
Starting kafka-rest
kafka-rest is [UP]
Starting connect
connect is [UP]
Starting ksql-server
ksql-server is [UP]
Then I ran ./bin/connect-standalone ./etc/schema-registry/connect-avro-standalone.properties ./etc/sink-connect-mongodb.properties; however, I got errors. Please see the attachment. I checked the Confluent status and the six services above are still UP.
kafka connector error file1.docx

I killed all Confluent sessions and started ZooKeeper, the Kafka server, the Schema Registry, and Kafka REST one by one following the instructions above (for example, ./bin/zookeeper-server-start ./etc/kafka/zookeeper.properties). I kept the terminals open and ran "./bin/connect-standalone ./etc/schema-registry/connect-avro-standalone.properties ./etc/sink-connect-mongodb.properties" and

curl -X POST -H "Content-Type: application/vnd.kafka.avro.v2+json" \
  -H "Accept: application/vnd.kafka.v2+json" \
  --data '{"value_schema": "{\"type\": \"record\", \"name\": \"User\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}", "records": [{"value": {"name": "testUser"}}]}' \
  "http://localhost:8082/topics/avrotest"

but I cannot see an avrotest collection in MongoDB.
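One frequent pitfall with this kind of request (a general sketch, not a claim about this particular setup): value_schema must be a JSON string, so every quote inside the embedded Avro schema needs a backslash escape; otherwise the REST Proxy receives malformed JSON and no record is ever produced. Building the body in a variable first makes the escaping visible:

```shell
# Sketch: REST Proxy request body with the Avro schema embedded as an
# escaped JSON string (topic name and port taken from this thread).
BODY='{"value_schema": "{\"type\": \"record\", \"name\": \"User\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}", "records": [{"value": {"name": "testUser"}}]}'
echo "$BODY"
# Then send it (requires a running REST Proxy, so commented out here):
# curl -X POST -H "Content-Type: application/vnd.kafka.avro.v2+json" \
#      -H "Accept: application/vnd.kafka.v2+json" \
#      --data "$BODY" http://localhost:8082/topics/avrotest
```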

Attached please see connect-avro-standalone.properties and sink-connect-mongodb.properties files.
mongodb, confluent, kafka installation and setup.docx
mongodb installation in linux environment.docx

I am not sure which part I am missing or have set wrong. I will follow your suggestions and try again tomorrow. If you find something wrong in my files, please let me know.

Thank you so much for your time and help!

gw

gjiang1 avatar gjiang1 commented on May 19, 2024

I will show you what I did step by step last time, then please help me to find what is the problem I had.
Thanks a lot!
gw

gjiang1 avatar gjiang1 commented on May 19, 2024

Hello Hpgrahsl,

Below is what I did; please help me figure out the problem. I am a database person, not a programmer or Java person. This is my first time hearing of Kafka and Confluent, and I am learning how to set up a source DB <> Confluent/Kafka <> sink DB connection for a potential future project. I very much appreciate your help!

  1. I did not do the Docker installation using the “radarbase/kafka-connect-mongodb-sink” Docker image, since I already have a Docker container in an EC2 Linux environment with MongoDB installed there. I installed Confluent-4.1.1 and the Confluent JDBC Connector (to be used for the source Oracle database connection).
  2. I downloaded kafka-connect-mongodb-sink-0.2.2-javadoc.jar from https://github.com/RADAR-base/MongoDb-Sink-Connector/releases and put this jar file to /confluent-4.1.1/share/java/
  3. export CLASSPATH=/confluent-4.1.1/share/java/kafka-connect-mongodb-sink-0.2.2.jar
  4. Start Confluent :
  5. [root@7fa5e286b664 bin]# /confluent-4.1.1/bin/confluent start
    Result:
    Using CONFLUENT_CURRENT: /tmp/confluent.NFBX0ijJ
    Starting zookeeper
    zookeeper is [UP]
    Starting kafka
    kafka is [UP]
    Starting schema-registry
    schema-registry is [UP]
    Starting kafka-rest
    kafka-rest is [UP]
    Starting connect
    connect is [UP]
    Starting ksql-server
    ksql-server is [UP]
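Independent of what "confluent status" reports, one can verify that each of the six services is actually listening (a sketch; the ports are the Confluent Platform defaults and may differ in a customized setup):

```shell
# Sketch: probe the default ports of the six services started above
# (2181 ZooKeeper, 9092 Kafka, 8081 Schema Registry, 8082 REST Proxy,
#  8083 Connect, 8088 KSQL). Prints open/closed per port (bash /dev/tcp trick).
for p in 2181 9092 8081 8082 8083 8088; do
  if (exec 3<>"/dev/tcp/localhost/$p") 2>/dev/null; then
    echo "port $p open"
  else
    echo "port $p closed"
  fi
done
```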
  6. Create “sink-connect-mongodb.properties” file in /confluent-4.1.1/etc:

# Kafka consumer configuration
name=kafka-connector-mongodb-sink

# Kafka connector configuration
connector.class=org.radarcns.connect.mongodb.MongoDbSinkConnector
tasks.max=1

# Topics that will be consumed
topics=avrotest

# MongoDB server
mongo.host=localhost
mongo.port=27017

# MongoDB configuration
mongo.username=
mongo.password=
mongo.database=mydb

# Collection name for putting data into the MongoDB database. The {$topic} token
# will be replaced by the Kafka topic name.
#mongo.collection.format={$topic}

# Factory class to do the actual record conversion
#record.converter.class=org.radarcns.connect.mongodb.serialization.RecordConverterFactory
7. Modify the connect-avro-standalone.properties file in /confluent-4.1.1/etc/schema-registry/ to set plugin.path:
bootstrap.servers=localhost:9092
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
offset.storage.file.filename=/tmp/connect.offsets

#plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,

# Replace the relative path below with an absolute path if you are planning to start
# Kafka Connect from within a directory other than the home directory of Confluent Platform.
plugin.path=/confluent-4.1.1/share/java/
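Two things worth noting about the setup above (a sketch of the usual Kafka Connect plugin layout, not verified against this machine): Connect scans each plugin.path entry for subdirectories, one per connector, so the sink jar normally goes into its own folder rather than onto the CLASSPATH; and a *-javadoc.jar contains only documentation, while the runtime artifact is the plain kafka-connect-mongodb-sink-0.2.2.jar.

```shell
# Sketch: give the connector its own subdirectory under a plugin.path entry.
# PLUGIN_ROOT mirrors the path quoted in this thread; adjust to your install.
PLUGIN_ROOT="${PLUGIN_ROOT:-$(mktemp -d)}"   # e.g. /confluent-4.1.1/share/java
mkdir -p "$PLUGIN_ROOT/kafka-connect-mongodb-sink"
# cp kafka-connect-mongodb-sink-0.2.2.jar "$PLUGIN_ROOT/kafka-connect-mongodb-sink/"
ls "$PLUGIN_ROOT"
```

With this layout, the CLASSPATH export from step 3 above should be unnecessary; Connect discovers the jar through plugin.path at worker startup.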

  8. Run the connector:
    [root@7fa5e286b664 confluent-4.1.1]# ./bin/connect-standalone ./etc/schema-registry/connect-avro-standalone.properties ./etc/sink-connect-mongodb.properties

But the process stopped because of the errors in the log file that I sent to you.
9. If I manually start zookeeper, kafka-server, kafka-rest and schema-registry separately from different terminals, I can see they are running, but with errors.
10. [root@7fa5e286b664 confluent-4.1.1]# curl -X POST -H "Content-Type: application/vnd.kafka.avro.v2+json" \
    -H "Accept: application/vnd.kafka.v2+json" \
    --data '{"value_schema": "{\"type\": \"record\", \"name\": \"User\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}", "records": [{"value": {"name": "testUser"}}]}' \
    "http://localhost:8082/topics/avrotest"
I can see the following response (which looks correct):
{"offsets":[{"partition":0,"offset":0,"error_code":null,"error":null}],"key_schema_id":null,"value_schema_id":21}
11. Check the Kafka topics:
[root@7fa5e286b664 bin]# ./kafka-topics --list --zookeeper localhost:2181
__confluent.support.metrics
__consumer_offsets
_schemas
test-oracle-jdbc-USERS
avrotest
12. Open the Mongo shell and check the database, but I cannot see an “avrotest” collection in Mongo:

show collections
customers
month
month1
people
students
users

  13. When I tried to set up the source JDBC connector, I got some errors too.

I saw your MongoDbSinkTask.java file; how and where can I use it? Did I set plugin.path correctly? Is there anything I am missing? I did not use a Docker image to set up Kafka and MongoDB; do you think that is an issue?

Thanks a lot for your time and help!

gjiang1 avatar gjiang1 commented on May 19, 2024

I modified plugin.path in "connect-avro-standalone.properties" :
plugin.path=/etc/alternatives, /var/lib/alternatives, /usr/share/java, /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.171-8.b10.el7_5.x86_64/jre/bin, /usr/bin, /etc/pki/java, /etc/java, /etc/pki/ca-trust/extracted/java, /usr/lib
then started zookeeper, kafka-server, kafka-rest and schema-registry one by one. They are still running. I ran the MongoDB-Sink-Connector again:

./bin/connect-standalone ./etc/schema-registry/connect-avro-standalone.properties ./etc/sink-connect-mongodb.properties

It is still running, but I can see this error:

[2018-06-27 16:15:08,196] ERROR WorkerSinkTask{id=sink-connect-mongodb-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:173)
[2018-06-27 16:15:08,203] INFO Stopping MongoDBSinkTask (org.radarcns.connect.mongodb.MongoDbSinkTask:163)
[2018-06-27 16:15:08,203] INFO Stopped MongoDBSinkTask (org.radarcns.connect.mongodb.MongoDbSinkTask:185)

not sure what's wrong.
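"Task is being killed" is only the symptom; the real cause is usually the first ERROR with a stack trace earlier in the worker output. A sketch of pulling those lines out of the log (demonstrated on the excerpt quoted in this thread; when started via the CLI, the real worker log lives under the CONFLUENT_CURRENT directory):

```shell
# Sketch: collect ERROR lines from a Connect worker log. The sample content is
# the log excerpt quoted above; point LOG at your actual worker log instead.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
[2018-06-27 16:15:08,196] ERROR WorkerSinkTask{id=sink-connect-mongodb-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:173)
[2018-06-27 16:15:08,203] INFO Stopping MongoDBSinkTask (org.radarcns.connect.mongodb.MongoDbSinkTask:163)
EOF
grep -n 'ERROR' "$LOG"   # the earliest ERROR is where to look for the root cause
```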

Thanks for your help!

from kafka-connect-mongodb.

hpgrahsl avatar hpgrahsl commented on May 19, 2024

Hi again @gjiang1!

Thanks for all your detailed comments. However, as they show, you are not using my sink connector project but a different one from https://github.com/RADAR-base, namely https://github.com/RADAR-base/MongoDb-Sink-Connector

So I would highly recommend you post your "issue" - which is actually more a set of questions about how to set up and run their project - in their repository instead of mine.

I will therefore close this issue since it is not related to my project. Wishing you all the best getting the other project up and running. If you can't, you are welcome to give my project a try :)

Greetings!

from kafka-connect-mongodb.

gjiang1 avatar gjiang1 commented on May 19, 2024

Ok, thanks.
