
Comments (4)

hpgrahsl avatar hpgrahsl commented on June 2, 2024

THX for your question @PushUpek!
As also mentioned in the documentation, this is the standard behaviour regarding write models:

The default behaviour of the connector, whenever documents are written to MongoDB collections, is to use a ReplaceOneModel in upsert mode and to create the filter document based on the _id field, which results from applying the configured DocumentIdAdder to the value structure of the sink document.

May I ask what you would like to achieve and how you'd want this behaviour to differ? There is in fact already some flexibility when you look at the options offered by the PostProcessor chain settings, the DocumentIdAdder, as well as more customized write models, all of which are briefly described in the README.
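For illustration, here is a minimal plain-Java sketch of that replace-with-upsert semantics. It uses a `HashMap` as a stand-in for a real MongoDB collection rather than the driver's actual `ReplaceOneModel` API; the class and method names below are hypothetical, not part of the connector:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in: a "collection" keyed by _id.
class UpsertReplaceDemo {
    static final Map<Object, Map<String, Object>> collection = new HashMap<>();

    // Mimics ReplaceOneModel with upsert=true: the filter is the _id taken
    // from the sink document's value structure (as the DocumentIdAdder produces it).
    static void replaceOneUpsert(Map<String, Object> sinkDoc) {
        Object id = sinkDoc.get("_id");
        collection.put(id, sinkDoc); // replaces if present, inserts if absent
    }

    public static void main(String[] args) {
        replaceOneUpsert(Map.of("_id", 1, "name", "first"));   // no match: insert
        replaceOneUpsert(Map.of("_id", 1, "name", "updated")); // match: full replace
        System.out.println(collection.get(1).get("name")); // prints "updated"
        System.out.println(collection.size());             // prints 1
    }
}
```

The key property is that a sink record always ends up in the collection, whether or not a matching document existed beforehand.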

from kafka-connect-mongodb.

PushUpek avatar PushUpek commented on June 2, 2024

I just want to update existing documents in the collection without adding new ones. I don't see an option to do this with post-processing.


hpgrahsl avatar hpgrahsl commented on June 2, 2024

So what you'd like is that when a Kafka record is read from a topic, only an update is applied, e.g. based on the _id field? If so, I'd say this is a rather special use case and has never been requested since I launched this project many moons ago :)

There are two main reasons why this has not been implemented yet:

  1. Imagine you start with an empty MongoDB collection. Only ever trying to update documents will never write anything, so the collection will never receive any data.

  2. Even if you have existing documents in the collection, any Kafka record read from the topic for which there is no matching MongoDB document would be dropped/skipped, which again is typically not something people have wanted in the past.

The bottom line is: if you really have this requirement, you can always fork the project and add your own custom write model. This is the interface to implement: https://github.com/hpgrahsl/kafka-connect-mongodb/blob/master/src/main/java/at/grahsl/kafka/connect/mongodb/writemodel/strategy/WriteModelStrategy.java

There are several examples in the project which you can look at for inspiration, in this package: https://github.com/hpgrahsl/kafka-connect-mongodb/tree/master/src/main/java/at/grahsl/kafka/connect/mongodb/writemodel/strategy
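As a rough illustration of what such an update-only strategy would do, here is a self-contained sketch. It simulates the collection with a `HashMap` instead of using the real `WriteModelStrategy` interface and MongoDB driver types, so all names here are hypothetical analogues, not the connector's actual API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical analogue of a write-model strategy interface.
interface WriteStrategy {
    // Returns the resulting document, or empty if the record was skipped.
    Optional<Map<String, Object>> apply(Map<Object, Map<String, Object>> collection,
                                        Map<String, Object> sinkDoc);
}

// Update-only semantics: like an UpdateOneModel without upsert — sink records
// with no matching _id are dropped instead of inserted.
class UpdateOnlyStrategy implements WriteStrategy {
    @Override
    public Optional<Map<String, Object>> apply(Map<Object, Map<String, Object>> collection,
                                               Map<String, Object> sinkDoc) {
        Object id = sinkDoc.get("_id");
        if (!collection.containsKey(id)) {
            return Optional.empty(); // no match: record is skipped (point 2 above)
        }
        collection.get(id).putAll(sinkDoc); // apply field updates in place
        return Optional.of(collection.get(id));
    }
}

class UpdateOnlyDemo {
    public static void main(String[] args) {
        Map<Object, Map<String, Object>> collection = new HashMap<>();
        WriteStrategy strategy = new UpdateOnlyStrategy();

        // Empty collection: nothing is ever written (point 1 above).
        System.out.println(strategy.apply(collection, Map.of("_id", 1, "a", "b")).isPresent()); // prints false

        // With a matching existing document, the update is applied.
        collection.put(1, new HashMap<>(Map.of("_id", 1, "a", "old")));
        strategy.apply(collection, Map.of("_id", 1, "a", "new"));
        System.out.println(collection.get(1).get("a")); // prints "new"
    }
}
```

A real implementation of the linked `WriteModelStrategy` interface would return a driver `WriteModel` (e.g. an `UpdateOneModel` without upsert) built from the sink document instead of mutating a map directly.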

