bwaldvogel / mongo-java-server

Fake implementation of MongoDB in Java that speaks the wire protocol.

License: BSD 3-Clause "New" or "Revised" License

Topics: mongodb, netty, java, in-memory

mongo-java-server's Introduction


MongoDB Java Server

Fake implementation of the core MongoDB server in Java that can be used for integration tests.

Think of H2/HSQLDB/SQLite but for MongoDB.

The MongoDB Wire Protocol is implemented with Netty. Different backends are available, and custom backends can be added.

In-Memory backend

The in-memory backend is the default backend that is typically used to fake MongoDB for integration tests. It supports most CRUD operations, commands and the aggregation framework. Some features are not yet implemented, such as transactions, full-text search or map/reduce.

Add the following Maven dependency to your project:

<dependency>
    <groupId>de.bwaldvogel</groupId>
    <artifactId>mongo-java-server</artifactId>
    <version>1.45.0</version>
</dependency>

Example

class SimpleTest {

    private MongoCollection<Document> collection;
    private MongoClient client;
    private MongoServer server;

    @BeforeEach
    void setUp() {
        server = new MongoServer(new MemoryBackend());

        // optionally:
        // server.enableSsl(key, keyPassword, certificate);
        // server.enableOplog();

        // bind on a random local port
        String connectionString = server.bindAndGetConnectionString();

        client = MongoClients.create(connectionString);
        collection = client.getDatabase("testdb").getCollection("testcollection");
    }

    @AfterEach
    void tearDown() {
        client.close();
        server.shutdown();
    }

    @Test
    void testSimpleInsertQuery() throws Exception {
        assertThat(collection.countDocuments()).isZero();

        // creates the database and collection in memory and inserts the document
        Document obj = new Document("_id", 1).append("key", "value");
        collection.insertOne(obj);

        assertThat(collection.countDocuments()).isEqualTo(1L);
        assertThat(collection.find().first()).isEqualTo(obj);
    }

}
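
If a test needs a fixed port instead of a random one, the server can also be bound explicitly, as in the backend examples further below. A minimal sketch (port 27017 is arbitrary):

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;

import de.bwaldvogel.mongo.MongoServer;
import de.bwaldvogel.mongo.backend.memory.MemoryBackend;

class FixedPortExample {

    public static void main(String[] args) {
        MongoServer server = new MongoServer(new MemoryBackend());
        // bind to a fixed local address instead of a random port
        server.bind("localhost", 27017);

        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            // ... use the client as in the test above ...
        }

        server.shutdown();
    }
}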

Example with SpringBoot

@RunWith(SpringRunner.class)
@SpringBootTest(classes={SimpleSpringBootTest.TestConfiguration.class})
public class SimpleSpringBootTest {

    @Autowired private MyRepository repository;

    @Before
    public void setUp() {
        // initialize your repository with some test data
        repository.deleteAll();
        repository.save(...);
    }

    @Test
    public void testMyRepository() {
        // test your repository ...
        ...
    }

    @Configuration
    @EnableMongoTestServer
    @EnableMongoRepositories(basePackageClasses={MyRepository.class})
    protected static class TestConfiguration {
        // test bean definitions ...
        ...
    }
}

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Import(MongoTestServerConfiguration.class)
public @interface EnableMongoTestServer {

}

public class MongoTestServerConfiguration {
	@Bean
	public MongoTemplate mongoTemplate(MongoDatabaseFactory mongoDbFactory) {
		return new MongoTemplate(mongoDbFactory);
	}

	@Bean
	public MongoDatabaseFactory mongoDbFactory(MongoServer mongoServer) {
		String connectionString = mongoServer.getConnectionString();
		return new SimpleMongoClientDatabaseFactory(connectionString + "/test");
	}

	@Bean(destroyMethod = "shutdown")
	public MongoServer mongoServer() {
		MongoServer mongoServer = new MongoServer(new MemoryBackend());
		mongoServer.bind();
		return mongoServer;
	}
}

H2 MVStore backend

The H2 MVStore backend connects the server to an MVStore that can be either in-memory or on-disk.

<dependency>
    <groupId>de.bwaldvogel</groupId>
    <artifactId>mongo-java-server-h2-backend</artifactId>
    <version>1.45.0</version>
</dependency>

Example

public class Application {

    public static void main(String[] args) throws Exception {
        MongoServer server = new MongoServer(new H2Backend("database.mv"));
        server.bind("localhost", 27017);
    }

}
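
For a purely in-memory MVStore, a sketch along the following lines should work. Note that H2Backend.inMemory() and the package name are assumptions about the backend module's API, not taken from this README, and should be verified against its sources:

import de.bwaldvogel.mongo.MongoServer;
import de.bwaldvogel.mongo.backend.h2.H2Backend;

public class InMemoryApplication {

    public static void main(String[] args) throws Exception {
        // assumption: the H2 backend offers an in-memory factory method
        MongoServer server = new MongoServer(H2Backend.inMemory());
        server.bind("localhost", 27017);
    }
}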

PostgreSQL backend

The PostgreSQL backend is a proof-of-concept implementation that connects the server to a database in a running PostgreSQL 9.5+ instance. Each MongoDB database is mapped to a schema in Postgres and each MongoDB collection is stored as a table.

<dependency>
    <groupId>de.bwaldvogel</groupId>
    <artifactId>mongo-java-server-postgresql-backend</artifactId>
    <version>1.45.0</version>
</dependency>

Example

public class Application {

    public static void main(String[] args) throws Exception {
        org.postgresql.jdbc3.Jdbc3PoolingDataSource dataSource = new org.postgresql.jdbc3.Jdbc3PoolingDataSource();
        dataSource.setDatabaseName(…);
        dataSource.setUser(…);
        dataSource.setPassword(…);
        MongoServer server = new MongoServer(new PostgresqlBackend(dataSource));
        server.bind("localhost", 27017);
    }

}

Building a "fat" JAR that contains all dependencies

If you want to build a version that is not on Maven Central you can do the following:

  1. Build a "fat" JAR that includes all dependencies using "./gradlew shadowJar"
  2. Copy build/libs/mongo-java-server-[version]-all.jar to your project, e.g. to the libs directory.
  3. Import that folder (e.g. via Gradle using testCompile fileTree(dir: 'libs', include: '*.jar'))

Contributing

Please read the contributing guidelines if you want to contribute code to the project.

If you want to thank the author for this library or want to support the maintenance work, we are happy to receive a donation.


Ideas for other backends

Faulty backend

A faulty backend could randomly fail queries or cause timeouts. This could be used to test the client for error resilience.

Fuzzy backend

Fuzzing the wire protocol could be used to check the robustness of client drivers.

Transactions

Please note that transactions are currently not supported. Please see the discussion in issue #143.

When using mongo-java-server for integration tests, you can use Testcontainers or Embedded MongoDB instead to spin up a real MongoDB that has full transaction support.
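
For illustration, a minimal Testcontainers setup might look like the following sketch (the image tag and class names here are ours, not from this project; Testcontainers' MongoDB module is a separate test dependency):

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;

import org.bson.Document;
import org.testcontainers.containers.MongoDBContainer;
import org.testcontainers.utility.DockerImageName;

class RealMongoWithTestcontainers {

    public static void main(String[] args) {
        // starts a real MongoDB in Docker as a single-node replica set, so transactions work
        try (MongoDBContainer mongo = new MongoDBContainer(DockerImageName.parse("mongo:5.0"))) {
            mongo.start();
            try (MongoClient client = MongoClients.create(mongo.getReplicaSetUrl())) {
                MongoCollection<Document> collection =
                    client.getDatabase("testdb").getCollection("testcollection");
                collection.insertOne(new Document("_id", 1).append("key", "value"));
            }
        }
    }
}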

Related Work

  • Testcontainers

    • Can be used to spin up a real MongoDB instance in a Docker container
  • Embedded MongoDB

    • Spins up a real MongoDB instance
  • fongo

    • focuses on unit testing
    • no wire protocol implementation
    • intercepts the Java MongoDB driver
    • currently used in nosql-unit

mongo-java-server's People

Contributors

advptr, agiannone, astha-textiq, bsautel, bwaldvogel, dowinter, epheatt, gaff, hossomi, ikus060, jed204, jmoghisi, jturin, justinchuch, lucas-c, marchpig, msmerc, ncomet, nfnitloop, rgruber1, snava10, talaverete

mongo-java-server's Issues

networkless version of the server (or MongoDatabase)

Hello,

We're using mongo-java-server for many in-memory tests (with MemoryBackend). For each test a new TCP connection is created and closed, which uses unnecessary system resources.

The reason it happens for each test is that we prefer @Rule over @ClassRule to avoid static fields in JUnit.

Are there any plans to create a lightweight version of the server (or MongoDatabase) without networking?

Regards.

1.13.0 compatibility with mongo-java-driver and mongo db version

When I update mongo-java-server to version 1.13.0, I am no longer able to create a mongo connection with MongoDB version 3.4 (mongo-java-driver 3.4.3). I see connection timeout errors. Is the 1.13.0 jar compatible with these versions? I have no problems connecting with version 1.6.0, which was being used earlier.

Unexpected number of upserts: 2

With the latest 1.11.0, bulkWrite with multiple ReplaceOneModel and upsert=true throws an "Unexpected number of upserts: xx" exception (1.9.7 works as expected).

Example Code:

List<ReplaceOneModel<T>> models = new ArrayList<>(size);
for (T entity : entities) {
    Object id = mongo.codecs.id(entity);
    models.add(new ReplaceOneModel<>(Filters.eq("_id", id), entity, new ReplaceOptions().upsert(true)));
}
collection().bulkWrite(models, new BulkWriteOptions().ordered(false));

I found that mongo-java-server added these lines in 1.11.0 (de.bwaldvogel.mongo.backend.AbstractMongoDatabase, line 307):

        if (!upserts.isEmpty()) {
            if (upserts.size() != 1) {
                throw new IllegalStateException("Unexpected number of upserts: " + upserts.size());
            }
            response.put("upserted", upserts);
        }

Is this expected? From debugging, I think upserts can contain more than one entry if multiple upserts occurred.

Thanks

no such cmd: listIndexes

listIndexes is not yet implemented, I guess :)

07:44:12.982 INFO  [nioEventLoopGroup-3-4]: adding unique index [key] for collection test
07:44:12.987 ERROR [nioEventLoopGroup-3-4]: unknown query: { "listIndexes" : "test"}
07:44:12.992 ERROR [nioEventLoopGroup-3-4]: unknown command: MongoQuery(header: MessageHeader(request: 42, responseTo: 0), collection: test_feeds.$cmd, query: { "listIndexes" : "test"}, returnFieldSelector: { })
de.bwaldvogel.mongo.exception.NoSuchCommandException: no such cmd: listIndexes
    at de.bwaldvogel.mongo.backend.AbstractMongoDatabase.handleCommand(AbstractMongoDatabase.java:148) ~[mongo-java-server-core-1.4.1.jar:na]
    at de.bwaldvogel.mongo.backend.AbstractMongoBackend.handleCommand(AbstractMongoBackend.java:130) ~[mongo-java-server-core-1.4.1.jar:na]
    at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.handleCommand(MongoDatabaseHandler.java:147) [mongo-java-server-core-1.4.1.jar:na]
    at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.handleQuery(MongoDatabaseHandler.java:97) [mongo-java-server-core-1.4.1.jar:na]
    at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.channelRead0(MongoDatabaseHandler.java:73) [mongo-java-server-core-1.4.1.jar:na]
    at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.channelRead0(MongoDatabaseHandler.java:37) [mongo-java-server-core-1.4.1.jar:na]

Delete by id query

The MongoDB Java driver is intelligent and sets the OP_DELETE flag to 1 if it is a "delete by id" query (that is, a query which looks like {"_id": "<identifier>"}):

if ( keys.size() == 1 && keys.iterator().next().equals( "_id" ) && _query.get( keys.iterator().next() ) instanceof ObjectId)
    writeInt( 1 );
else
    writeInt( 0 );

However, mongo-java-server for some reason does not support OP_DELETE flags other than zero. Hence "delete by id" queries trigger an exception in the server.

renaming a collection is not possible

this.mongoDB.getCollection("test_1434609693762").rename("test", true);

results in:

com.mongodb.CommandFailureException: { "serverUsed" : "localhost:35233" , "errmsg" : "no such cmd: renameCollection" , "bad cmd" : { "renameCollection" : "testdb.test_1434609693762" , "to" : "testdb.test" , "dropTarget" : true} , "code" : 59 , "ok" : 0}
at com.mongodb.CommandResult.getException(CommandResult.java:76)
at com.mongodb.CommandResult.throwOnError(CommandResult.java:140)
at com.mongodb.DBCollection.rename(DBCollection.java:1314)
at com.mongodb.DBCollection.rename(DBCollection.java:1296)

Using:

    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>mongo-java-driver</artifactId>
        <version>2.13.0</version>
    </dependency>
    <dependency>
        <groupId>de.bwaldvogel</groupId>
        <artifactId>mongo-java-server</artifactId>
        <version>1.4.1</version>
        <scope>test</scope>
    </dependency>

Any ideas?

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder"

Getting the following warning trace when using mongo-java-server as a dependency and not using SLF4J.

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

I think it could be fixed by adding slf4j-simple as a dependency in the server pom.xml.

PostgreSQL backend

Hello!
I was thinking about a PostgreSQL backend, using JSONB, json_agg and other PG built-in features to expose tables as collections.

Could you help me with architecture?
What is the best place to start?

My first approach:

  • .createCollection() = CREATE TABLE name (id serial, data JSONB)
  • .find() = SELECT jsonb_agg( ROW_TO_JSON(*) ) WHERE ???
  • .insert([a,b,c]) = INSERT INTO name (data) VALUES (a),(b),(c)

Error getting aggregate

Is the command aggregate not implemented?

com.mongodb.MongoCommandException: Command failed with error 59: 'no such cmd: aggregate' on server localhost:27117. The full response is { "$err" : "no such cmd: aggregate", "errmsg" : "no such cmd: aggregate", "code" : 59, "bad cmd" : { "aggregate" : "devices", "pipeline" : [{ "$match" : { } }, { "$project" : { "_id" : 1 } }, { "$group" : { "_id" : null, "ids" : { "$addToSet" : "$_id" } } }], "cursor" : { } }, "ok" : 0 }

java.lang.UnsupportedOperationException: can't sort class java.util.UUID

Test Case:

  1. Store two documents in the database with UUID-typed ids
  2. Perform a distinct operation with UUID.class as the result type

Currently the test fails with the following error:

java.lang.UnsupportedOperationException: can't sort class java.util.UUID
at de.bwaldvogel.mongo.backend.ValueComparator.getTypeOrder(ValueComparator.java:141)
at de.bwaldvogel.mongo.backend.ValueComparator.compare(ValueComparator.java:65)
at java.util.TreeMap.put(TreeMap.java:552)
at java.util.TreeSet.add(TreeSet.java:255)
at de.bwaldvogel.mongo.backend.AbstractMongoCollection.handleDistinct(AbstractMongoCollection.java:776)
at de.bwaldvogel.mongo.backend.AbstractMongoDatabase.handleCommand(AbstractMongoDatabase.java:141)
at de.bwaldvogel.mongo.backend.AbstractMongoBackend.handleCommand(AbstractMongoBackend.java:193)
at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.handleCommand(MongoDatabaseHandler.java:149)
at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.handleQuery(MongoDatabaseHandler.java:91)

Test code:

package my.test;

import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
import de.bwaldvogel.mongo.MongoServer;
import de.bwaldvogel.mongo.backend.memory.MemoryBackend;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.Document;

import java.util.ArrayList;
import java.util.List;
import java.util.UUID;

import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.containsInAnyOrder;

public class DistinctUUIDTest {
    private MongoServer server;
    private MongoClient client;
    private MongoTemplate mongoTemplate;

    @Before
    public void init() {
        server = new MongoServer(new MemoryBackend());
        client = new MongoClient(new ServerAddress(server.bind()));
        mongoTemplate = new MongoTemplate(client, "uuidDatabase");
    }


    @After
    public void cleanup() {
        client.close();
        server.shutdown();
    }

    @Test
    public void systemSupportsDistinctWithUUIDResultType() {
        final TestEntity first = mongoTemplate.save(new TestEntity(UUID.randomUUID(), "first"), "test");
        final TestEntity second = mongoTemplate.save(new TestEntity(UUID.randomUUID(), "second"), "test");

        final List<UUID> ids = mongoTemplate.getCollection("test").distinct("_id", UUID.class).into(new ArrayList<>());
        assertThat(ids, containsInAnyOrder(first.getId(), second.getId()));
    }

    @Document(collection = "test")
    public static class TestEntity {
        @Id
        private UUID id;
        private String name;

        public TestEntity() {
        }

        public TestEntity(UUID id, String name) {
            this.id = id;
            this.name = name;
        }

        public UUID getId() {
            return id;
        }

        public String getName() {
            return name;
        }

        public void setId(UUID id) {
            this.id = id;
        }

        public void setName(String name) {
            this.name = name;
        }
    }
}

It looks like the UUID type is not supported in the ValueComparator class. Could you fix it or explain why it cannot work? We used Fongo earlier and the test worked there.

DECIMAL128 is not supported

mongo-java-driver contains logic to read Decimal128 values: https://github.com/mongodb/mongo-java-driver/blob/master/bson/src/main/org/bson/AbstractBsonWriter.java#L873

But https://github.com/bwaldvogel/mongo-java-server/blob/master/core/src/main/java/de/bwaldvogel/mongo/wire/BsonDecoder.java does not support it.

As a result, a query like
{ "singleNumberField" : { "$eq" : { "$numberDecimal" : "1" } } }
will fail with

Caused by: java.io.IOException: unknown type: 0x13
	at de.bwaldvogel.mongo.wire.BsonDecoder.decodeValue(BsonDecoder.java:99)
	at de.bwaldvogel.mongo.wire.BsonDecoder.decodeBson(BsonDecoder.java:38)
	at de.bwaldvogel.mongo.wire.BsonDecoder.decodeValue(BsonDecoder.java:54)
	at de.bwaldvogel.mongo.wire.BsonDecoder.decodeBson(BsonDecoder.java:38)
	at de.bwaldvogel.mongo.wire.MongoWireProtocolHandler.handleQuery(MongoWireProtocolHandler.java:175)
	at de.bwaldvogel.mongo.wire.MongoWireProtocolHandler.decode(MongoWireProtocolHandler.java:88)
	at de.bwaldvogel.mongo.wire.MongoWireProtocolHandler.decode(MongoWireProtocolHandler.java:26)
	at io.netty.handler.codec.LengthFieldBasedFrameDecoder.decode(LengthFieldBasedFrameDecoder.java:343)

InvalidDataAccessApiUsageException when do count on springboot

Hello,

In my Spring project, I migrated from fongo to mongo-java-server.

But now, I encounter a problem with my count queries.

My requests look like this:

    @CountQuery(value = "{'value.$id':'?0', 'object.value1':'?1', 'object.value2':'?2'}")
    int countItemsAffectedByValueAndValue1Value2(String value, String value1, String value2);

In my unit tests, the requests throw an error 59:

org.springframework.dao.InvalidDataAccessApiUsageException: Command failed with error 59 (CommandNotFound): 'no such command: '$query'' on server 127.0.0.1:50778. The full response is { "$err" : "no such command: '$query'", "errmsg" : "no such command: '$query'", "code" : 59, "codeName" : "CommandNotFound", "bad cmd" : { "$query" : { "count" : "items", "query" : { "value.$id" : "any-id", "object.value1" : "FR", "object.value2" : "en" } }, "$readPreference" : { "mode" : "primaryPreferred" } }, "ok" : 0 }; nested exception is com.mongodb.MongoCommandException: Command failed with error 59 (CommandNotFound): 'no such command: '$query'' on server 127.0.0.1:50778. The full response is { "$err" : "no such command: '$query'", "errmsg" : "no such command: '$query'", "code" : 59, "codeName" : "CommandNotFound", "bad cmd" : { "$query" : { "count" : "products", "query" : { "value.$id" : "any-id", "objet.value1" : "FR", "objet.value2" : "en" } }, "$readPreference" : { "mode" : "primaryPreferred" } }, "ok" : 0 }

My test configuration file looks like this:

    @Bean(destroyMethod = "shutdown")
    public MongoServer mongoServer() {
        MongoServer mongoServer = new MongoServer(new MemoryBackend());
        mongoServer.bind();
        return mongoServer;
    }

    @Bean(destroyMethod = "close")
    public MongoClient mongoClient(MongoServer mongoServer) {
        return new MongoClient(new ServerAddress(mongoServer.getLocalAddress()));
    }

    @Bean
    @Qualifier("idpMongoCollection")
    public MongoCollection<Document> idpMongoCollection() {
        return mongoClient(mongoServer()).getDatabase("idp_repository").getCollection("idp_repository");
    }

To be honest, I'm not sure if it's an issue or if I missed something.

Do you have an idea about what's wrong with my tests?

I remain at your disposal for any further information.

Regards

$not and $size don't seem to work together

As I explained in #31, I am trying to replace fongo with mongo-java-server. Here is the second blocking issue I encountered.

In a request, I am using the $size operator combined with the $not operator on a UUID array field. It matches documents that should not be matched.

Here is the query I execute, as logged by the driver (I want to match objects in which uuidArray contains the given value plus other ones, not only this one):

12:10:28.839 [main] DEBUG org.mongodb.driver.protocol.command - Sending command '{ "update" : "collection", "ordered" : true, "updates" : [{ "q" : { "$and" : [{ "uuidArray" : { "$binary" : { "base64" : "NETg3mmBuifOvuvTONgxhg==", "subType" : "03" } } }, { "uuidArray" : { "$not" : { "$size" : 1 } } }] }, "u" : { "$pull" : { "uuidArray" : { "$binary" : { "base64" : "NETg3mmBuifOvuvTONgxhg==", "subType" : "03" } } } }, "multi" : true }] }' with request id 42 to database test on connection [connectionId{localValue:6}] to server localhost:38239

The log file contains multiple errors with the #30 error message because I am using UUID types; this may hide some log messages that could help me understand what happens internally.

I noticed that if I remove the $not operator, the $size one seems to behave as expected. (I tried to match arrays containing a value and whose size is two, and it works, but in my use case I can have arrays of any size and, as far as I know, I need the $not operator to describe what I want.)

Implement $replaceRoot aggregation stage

I'm trying to use $replaceRoot but it seems it's not implemented yet. (version: 1.11.1)

de.bwaldvogel.mongo.exception.MongoServerError: [Error 40324] Unrecognized pipeline stage name: '$replaceRoot'

Implement ELEM_MATCH query operator

When using a filter with elemMatch
val filter = elemMatch("materials", eq("materialId", query.materialId))

🔥 an exception is thrown
java.lang.IllegalArgumentException: unhandled query operator: ELEM_MATCH
    at de.bwaldvogel.mongo.backend.DefaultQueryMatcher.checkExpressionMatch(DefaultQueryMatcher.java:472)
    at de.bwaldvogel.mongo.backend.DefaultQueryMatcher.checkMatchesValue(DefaultQueryMatcher.java:329)

Removing non-existent intermediate sub-key should not throw an exception

The current behavior for removing a sub-key is this:

Object subObject = getFieldValueListSafe(document, mainKey);
if (subObject instanceof Document || subObject instanceof List<?>) {
   return removeSubdocumentValue(subObject, subKey, matchPos);
} else {
   throw new MongoServerException("failed to remove subdocument");
}

With the code above, if an intermediate sub-key is missing, an exception will be thrown. It is perfectly reasonable that an intermediate sub-key does not exist on the document in question. If it doesn't, the operation (remove or rename) should be ignored for that document.

BinData fields don't match when queried

When you have a document with a binary field like "bin" : { "$binary" : "dBZZjw==" , "$type" : 0}, queries filtering by that binary field don't yield any result.

It looks like mongo-java-server tries to compare the filter and document bin fields using equals, but it turns out bin fields are represented by byte arrays and hence are not comparable with Object.equals(..).
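
A minimal reproduction sketch (the collection setup is omitted and the field name is made up; the binary value is the one from the report):

import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;

import org.bson.Document;
import org.bson.types.Binary;

class BinDataReproduction {

    static void reproduce(MongoCollection<Document> collection) {
        // "dBZZjw==" from the report decodes to these four bytes
        byte[] bytes = new byte[] { 0x74, 0x16, 0x59, (byte) 0x8f };

        collection.insertOne(new Document("_id", 1).append("bin", new Binary(bytes)));

        // expected: the inserted document is returned;
        // reported: no result, apparently because the raw byte arrays are compared with equals()
        Document found = collection.find(Filters.eq("bin", new Binary(bytes))).first();
        System.out.println(found);
    }
}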

Filtering fields doesn't work as expected

Filtering the returned document fields doesn't work as expected.

    @Test
    public void test() throws MadeValidationException {

        {
            final DBCollection collection = this.exporterDB.getCollection("test");
            collection.insert(new BasicDBObject(ImmutableMap.of("order", 1, "visits", 2, "eid",
                    RandomStringUtils.randomNumeric(12))));

        }
        {
            final DBCursor cur = this.exporterDB.getCollection("test").find(new BasicDBObject(Maps.newHashMap()));
            while (cur.hasNext()) {
                System.out.println(cur.next());
            }
            cur.close();
//{ "_id" : { "$oid" : "558821dfccf26bea9312754c"} , "order" : 1 , "visits" : 2 , "eid" : "033628837086"}
//OK
        }
        {
            BasicDBObject fieldsMap = new BasicDBObject();
            fieldsMap.put("_id", 0);
            final DBCursor cur = this.exporterDB.getCollection("test").find(new BasicDBObject(Maps.newHashMap()),
                    new BasicDBObject(fieldsMap));
            while (cur.hasNext()) {
                System.out.println(cur.next());
            }
            cur.close();
        }
//{ }
//NOT OK
        {
            BasicDBObject fieldsMap = new BasicDBObject();
            fieldsMap.put("visits", 0);
            final DBCursor cur = this.exporterDB.getCollection("test").find(new BasicDBObject(Maps.newHashMap()),
                    new BasicDBObject(fieldsMap));
            while (cur.hasNext()) {
                System.out.println(cur.next());
            }
            cur.close();
        }
//{ "_id" : { "$oid" : "558821dfccf26bea9312754c"}}
// NOT OK
    }

Unsupported $$ROOT aggregation expression

As suggested here, I gave mongo-java-server a try to replace fongo, which seems to be no longer maintained and does not work with the 3.8 MongoDB Java driver.

First, I would like to thank you @bwaldvogel and all the other contributors for working on this project. Using the wire protocol seems to be a more reliable approach than mocking the Java MongoDB driver as Fongo does, since each implementation change in the driver can break the tool.

I ran all my tests using mongo-java-server and I noticed two blocking issues. I will report the other one in another ticket.

In an aggregation pipeline, I use the grouping operator and the $$ROOT expression, which references the whole object, unlike ${field}, which references a field of the object. It is documented here.

I get no error telling me that it is unsupported, but the field which should contain the result of this expression is null.

I tried to use the ${field} syntax and it seems to be supported. It would probably be necessary to extend this syntax to support the $$ROOT expression and maybe the other $$ specific expressions detailed in the documentation.

invalid operator: $elemMatch

I'm having an odd issue. I'm using the Vertx MongoDB client and whenever I call a query containing $elemMatch more than once, I get:
com.mongodb.MongoCommandException: Command failed with error 10068: 'invalid operator: $elemMatch' on server 127.0.0.1:31623. The full response is { "$err" : "invalid operator: $elemMatch", "errmsg" : "invalid operator: $elemMatch", "code" : 10068, "ok" : 0 }

or when performing a normal find:
com.mongodb.MongoQueryException: Query failed with error code 10068 and error message 'invalid operator: $elemMatch' on server 127.0.0.1:1945

The error does not occur in the production database (running mongo 3.6.2), only when using the mock server.
Example queries:
{"id":{"$elemMatch":{"$in":["test"]}}}

As the code seems to be running fine on the real database, I'm thinking that maybe something's going wrong in the mock?

Connect with SSL?

Can you provide the option to connect to a MongoServer instance with SSL?
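
For reference, the README's setUp() example above shows a commented-out server.enableSsl(key, keyPassword, certificate) call. A sketch along those lines, with the parameter types assumed here and to be checked against the MongoServer API:

import java.security.PrivateKey;
import java.security.cert.X509Certificate;

import de.bwaldvogel.mongo.MongoServer;
import de.bwaldvogel.mongo.backend.memory.MemoryBackend;

class SslServerSketch {

    // the key material comes from the caller; the exact enableSsl signature is an
    // assumption and should be verified against the MongoServer javadoc
    static String startWithSsl(PrivateKey key, char[] keyPassword, X509Certificate certificate) {
        MongoServer server = new MongoServer(new MemoryBackend());
        server.enableSsl(key, keyPassword, certificate);
        return server.bindAndGetConnectionString();
    }
}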

UUID type not supported in the Document.toString() method

When using fields with UUID type, I get some errors in the log output such as this one:

SLF4J: Failed toString() invocation on an object of type [de.bwaldvogel.mongo.bson.Document]
Reported exception:
java.lang.IllegalArgumentException: Unknown value: 6fbb008f-9af3-4132-b72c-c68198c1964b
	at de.bwaldvogel.mongo.bson.Document.toJsonValue(Document.java:162)
	at de.bwaldvogel.mongo.bson.Document.lambda$toString$0(Document.java:133)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.Iterator.forEachRemaining(Iterator.java:116)
	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
	at de.bwaldvogel.mongo.bson.Document.toString(Document.java:134)
	at de.bwaldvogel.mongo.bson.Document.toJsonValue(Document.java:151)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
	at de.bwaldvogel.mongo.bson.Document.toJsonValue(Document.java:160)
	at de.bwaldvogel.mongo.bson.Document.lambda$toString$0(Document.java:133)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.Iterator.forEachRemaining(Iterator.java:116)
	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
	at de.bwaldvogel.mongo.bson.Document.toString(Document.java:134)
	at org.slf4j.helpers.MessageFormatter.safeObjectAppend(MessageFormatter.java:299)
	at org.slf4j.helpers.MessageFormatter.deeplyAppendParameter(MessageFormatter.java:271)
	at org.slf4j.helpers.MessageFormatter.arrayFormat(MessageFormatter.java:233)
	at org.slf4j.helpers.MessageFormatter.arrayFormat(MessageFormatter.java:173)
	at ch.qos.logback.classic.spi.LoggingEvent.getFormattedMessage(LoggingEvent.java:293)
	at ch.qos.logback.classic.spi.LoggingEvent.prepareForDeferredProcessing(LoggingEvent.java:206)
	at ch.qos.logback.core.OutputStreamAppender.subAppend(OutputStreamAppender.java:223)
	at ch.qos.logback.core.OutputStreamAppender.append(OutputStreamAppender.java:102)
	at ch.qos.logback.core.UnsynchronizedAppenderBase.doAppend(UnsynchronizedAppenderBase.java:84)
	at ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51)
	at ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270)
	at ch.qos.logback.classic.Logger.callAppenders(Logger.java:257)
	at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421)
	at ch.qos.logback.classic.Logger.filterAndLog_2(Logger.java:414)
	at ch.qos.logback.classic.Logger.debug(Logger.java:490)
	at de.bwaldvogel.mongo.wire.MongoWireProtocolHandler.handleQuery(MongoWireProtocolHandler.java:203)
	at de.bwaldvogel.mongo.wire.MongoWireProtocolHandler.decode(MongoWireProtocolHandler.java:92)
	at de.bwaldvogel.mongo.wire.MongoWireProtocolHandler.decode(MongoWireProtocolHandler.java:26)
	at io.netty.handler.codec.LengthFieldBasedFrameDecoder.decode(LengthFieldBasedFrameDecoder.java:343)
	at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:489)
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:428)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:265)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1434)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:965)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:628)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:563)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)

Using the debugger I could verify that the type of the value in this case is UUID. I could also see that the UUID type is not supported in the toJsonValue method.

Note that this does not break the test execution. It looks like this issue appears when the logger calls toString on the log message parameter (a Document in this case). Instead of logging the original log message, it logs this error.

Query result for embedded document is different

Hi,

I found that the query result for an embedded document is different from the real DB.
The following test fails because the 1st document is not found by the query.

@Test
public void test() {
    mongoTemplate.insert(new A()); // 1st
    mongoTemplate.insert(new A(new B())); // 2nd
    mongoTemplate.insert(new A(new B(new C()))); // 3rd

    List<A> list = mongoTemplate.find(Query.query(Criteria.where("b.c").is(null)), A.class);

    Assert.assertEquals(2, list.size()); // Failed here [Expected:2  / Actual:1]
    list.forEach(a -> Assert.assertTrue(a.b == null || a.b.c == null));
}
public class A {
    public B b;

    public A() {}
    public A(B b) {
        this.b = b;
    }
}
public class B {
    public C c;

    public B() {}
    public B(C c) {
        this.c = c;
    }
}
public class C {
}

Thanks!

Doesn't support nested unique index.

Creating a unique index on a sub-document field using dot notation doesn't work properly. No errors are reported in the log.

dbCollection.createIndex(new BasicDBObject("action.actionid", 1), new BasicDBObject("unique", true));

It seems UniqueIndex#getKeyValue(BSONObject) makes a direct call to BSONObject#get(), which doesn't support dot notation. I suggest calling Utils#getSubdocumentValue() instead.

MongoSocketReadException at testing insertMany

I got the following error with the mongo-java 3.0.2 and 3.0.3 drivers when using insertMany.

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.14:test (default-test) on project presto-mongodb: Execution default-test of goal org.apache.maven.plugins:maven-surefire-plugin:2.14:test failed: There was an error in the forked process
[ERROR] org.testng.TestNGException:
[ERROR] Cannot instantiate class com.facebook.presto.mongodb.TestMongodbDistributedQueries
[ERROR] at org.testng.internal.ObjectFactoryImpl.newInstance(ObjectFactoryImpl.java:38)
[ERROR] at org.testng.internal.ClassHelper.createInstance1(ClassHelper.java:387)
[ERROR] at org.testng.internal.ClassHelper.createInstance(ClassHelper.java:299)
[ERROR] at org.testng.internal.ClassImpl.getDefaultInstance(ClassImpl.java:110)
[ERROR] at org.testng.internal.ClassImpl.getInstances(ClassImpl.java:186)
[ERROR] at org.testng.internal.TestNGClassFinder.<init>(TestNGClassFinder.java:120)
[ERROR] at org.testng.TestRunner.initMethods(TestRunner.java:409)
[ERROR] at org.testng.TestRunner.init(TestRunner.java:235)
[ERROR] at org.testng.TestRunner.init(TestRunner.java:205)
[ERROR] at org.testng.TestRunner.<init>(TestRunner.java:153)
[ERROR] at org.testng.SuiteRunner$DefaultTestRunnerFactory.newTestRunner(SuiteRunner.java:522)
[ERROR] at org.testng.SuiteRunner.init(SuiteRunner.java:157)
[ERROR] at org.testng.SuiteRunner.<init>(SuiteRunner.java:111)
[ERROR] at org.testng.TestNG.createSuiteRunner(TestNG.java:1299)
[ERROR] at org.testng.TestNG.createSuiteRunners(TestNG.java:1286)
[ERROR] at org.testng.TestNG.runSuitesLocally(TestNG.java:1140)
[ERROR] at org.testng.TestNG.run(TestNG.java:1057)
[ERROR] at org.apache.maven.surefire.testng.TestNGExecutor.run(TestNGExecutor.java:77)
[ERROR] at org.apache.maven.surefire.testng.TestNGDirectoryTestSuite.executeMulti(TestNGDirectoryTestSuite.java:189)
[ERROR] at org.apache.maven.surefire.testng.TestNGDirectoryTestSuite.execute(TestNGDirectoryTestSuite.java:105)
[ERROR] at org.apache.maven.surefire.testng.TestNGProvider.invoke(TestNGProvider.java:117)
[ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[ERROR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR] at java.lang.reflect.Method.invoke(Method.java:483)
[ERROR] at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray2(ReflectionUtils.java:208)
[ERROR] at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:158)
[ERROR] at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:86)
[ERROR] at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
[ERROR] at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:95)
[ERROR] Caused by: java.lang.reflect.InvocationTargetException
[ERROR] at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
[ERROR] at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
[ERROR] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[ERROR] at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
[ERROR] at org.testng.internal.ObjectFactoryImpl.newInstance(ObjectFactoryImpl.java:29)
[ERROR] ... 29 more
[ERROR] Caused by: com.mongodb.MongoSocketReadException: Prematurely reached end of stream
[ERROR] at com.mongodb.connection.SocketStream.read(SocketStream.java:88)
[ERROR] at com.mongodb.connection.InternalStreamConnection.receiveResponseBuffers(InternalStreamConnection.java:491)
[ERROR] at com.mongodb.connection.InternalStreamConnection.receiveMessage(InternalStreamConnection.java:221)
[ERROR] at com.mongodb.connection.UsageTrackingInternalConnection.receiveMessage(UsageTrackingInternalConnection.java:102)
[ERROR] at com.mongodb.connection.DefaultConnectionPool$PooledConnection.receiveMessage(DefaultConnectionPool.java:416)
[ERROR] at com.mongodb.connection.WriteCommandProtocol.receiveMessage(WriteCommandProtocol.java:184)
[ERROR] at com.mongodb.connection.WriteCommandProtocol.execute(WriteCommandProtocol.java:76)
[ERROR] at com.mongodb.connection.InsertCommandProtocol.execute(InsertCommandProtocol.java:66)
[ERROR] at com.mongodb.connection.InsertCommandProtocol.execute(InsertCommandProtocol.java:37)
[ERROR] at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:155)
[ERROR] at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:219)
[ERROR] at com.mongodb.connection.DefaultServerConnection.insertCommand(DefaultServerConnection.java:108)
[ERROR] at com.mongodb.operation.MixedBulkWriteOperation$Run$2.executeWriteCommandProtocol(MixedBulkWriteOperation.java:416)
[ERROR] at com.mongodb.operation.MixedBulkWriteOperation$Run$RunExecutor.execute(MixedBulkWriteOperation.java:604)
[ERROR] at com.mongodb.operation.MixedBulkWriteOperation$Run.execute(MixedBulkWriteOperation.java:363)
[ERROR] at com.mongodb.operation.MixedBulkWriteOperation$1.call(MixedBulkWriteOperation.java:148)
[ERROR] at com.mongodb.operation.MixedBulkWriteOperation$1.call(MixedBulkWriteOperation.java:141)
[ERROR] at com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:186)
[ERROR] at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:177)
[ERROR] at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:141)
[ERROR] at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:72)
[ERROR] at com.mongodb.Mongo.execute(Mongo.java:747)
[ERROR] at com.mongodb.Mongo$2.execute(Mongo.java:730)
[ERROR] at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:294)
[ERROR] at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:282)

Issue with $exists using dot notation

It seems that any numbers following a dot (as well as any trailing dots) are ignored for $exists queries.

Document doc = new Document("_id", new ObjectId()).append("values", Arrays.asList("A", "B", "C"));
mongoCollection.insertOne(doc);

Document exists = new Document("$exists", true);
Document doc1 = mongoCollection.find(new Document("values", exists)).first(); //exists
Document doc2 = mongoCollection.find(new Document("values.", exists)).first(); //exists
Document doc3 = mongoCollection.find(new Document("values.1", exists)).first(); //exists
Document doc4 = mongoCollection.find(new Document("values.1...", exists)).first(); //exists
Document doc5 = mongoCollection.find(new Document("values.111", exists)).first(); //exists
Document doc6 = mongoCollection.find(new Document("values111", exists)).first(); //null
Document doc7 = mongoCollection.find(new Document("values.....", exists)).first(); //exists
Document doc8 = mongoCollection.find(new Document("values.....111", exists)).first(); //illegal key
Document doc9 = mongoCollection.find(new Document("values.abc", exists)).first(); //null

invalid codeName attribute for DuplicateKeyException

When inserting a document with a duplicate key I get the following error:

Command failed with error 11000 (Location11000): 
'E11000 duplicate key error collection: testDB.entity index: _id_ dup key: { : "e1" }' 
on server localhost:50769. The full response is 
{"$err": "E11000 duplicate key error collection: testDB.entity index: _id_ dup key: { : \"e1\" }", 
"errmsg": "E11000 duplicate key error collection: testDB.entity index: _id_ dup key: { : \"e1\" }",
 "code": 11000, "codeName": "Location11000",
 "ok": 0}

MongoDB returns codeName: DuplicateKey (instead of Location11000).

Regex Support on _id field

Hello,

we are currently switching from fongo to mongo-java-server for our unit testing. So far it works pretty well, but we have an issue related to a search with IgnoreCase, where we use a regex query like this:

@Query(value = "{'_id': {$regex : '^?0$', $options: 'i'}}")
UserDocument findByIdIgnoreCase(String userId);

As a result it always returns null. Is there regex support for the in-memory DB?

Update: I analyzed the behaviour and it seems that regex works on fields that are not the _id of the document. In case of a regex on the _id field it always returns null for me.

update with upsert wrong behaviour regarding nMatches vs nModified

When updating a document with the upsert option, I've experienced wrong behaviour with mongo-java-server.
According to mongo documentation:

If upsert is true and no document matches the query criteria, update() inserts a single document. The update creates the new document with either ...

But looking at the code, it checks whether nModified == 0.

This is not the expected behavior; it should check whether nMatched == 0 instead.

Issue while creating In memory DB

Hi,

I am new to MongoDB.
I have a requirement where I have to create an in-memory MongoDB. I was trying to use the SimpleTest from the repository. I didn't modify anything. I'm getting a timeout error while executing the test case.

com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=localhost:54689, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketReadTimeoutException: Timeout while receiving message}, caused by {java.net.SocketTimeoutException: Read timed out}}]
at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:375)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:104)
at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:75)
at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:71)
at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:203)
at com.mongodb.operation.CountOperation.execute(CountOperation.java:207)
at com.mongodb.operation.CountOperation.execute(CountOperation.java:54)
at com.mongodb.Mongo.execute(Mongo.java:818)
at com.mongodb.Mongo$2.execute(Mongo.java:805)
at com.mongodb.MongoCollectionImpl.count(MongoCollectionImpl.java:185)
at com.mongodb.MongoCollectionImpl.count(MongoCollectionImpl.java:165)
at com.cisco.SimpleTest.testSimpleInsertQuery(SimpleTest.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:678)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)

Thanks in advance.

Support mongo driver 3

When trying to use the MongoDB v3 Java client with v1.4.1, the following exceptions are thrown (at least in our local tests):

ERROR de.bwaldvogel.mongo.wire.MongoWireProtocolHandler - unknown command: MongoQuery(header: MessageHeader(request: 3, responseTo: 0), collection: admin.$cmd, query: { "getlasterror" : 1}, returnFieldSelector: null)
de.bwaldvogel.mongo.exception.NoSuchCommandException: no such cmd: getlasterror
at de.bwaldvogel.mongo.backend.AbstractMongoBackend.handleAdminCommand(AbstractMongoBackend.java:92) ~[mongo-java-server-core-1.4.1.jar:na]

ERROR de.bwaldvogel.mongo.wire.MongoWireProtocolHandler - unknown command: MongoQuery(header: MessageHeader(request: 8, responseTo: 0), collection: testdb.$cmd, query: { "create" : "basicTest" , "capped" : false}, returnFieldSelector: null)
de.bwaldvogel.mongo.exception.NoSuchCommandException: no such cmd: create
at de.bwaldvogel.mongo.backend.AbstractMongoDatabase.handleCommand(AbstractMongoDatabase.java:148) ~[mongo-java-server-core-1.4.1.jar:na]

$and with $all and $nin not working

For the known reasons I have tried to switch from fongo to mongo-java-server.

Except for the following case, all tests ran correctly. Here is the demo test class to reproduce the problem (find all entities that have tag "A" but not tag "B" or "C" -> expect no entities at all):

@RunWith(SpringRunner.class)
@SpringBootTest(classes={Issue.TestConfiguration.class})
public class Issue {

	@Autowired private TestRepository repository;

	@Test
	public void testTags() throws Exception {
		repository.deleteAll();
		repository.insert(new TestEntity("ID_1", "A", "B"));
		repository.insert(new TestEntity("ID_2", "A", "C"));

		List<TestEntity> entitiesByTags = repository.findByTags(Arrays.asList("A"), Arrays.asList("B", "C"));
		assertEquals(0, entitiesByTags.size());
	}

	@Configuration
	@EnableMongoRepositories(basePackageClasses={TestRepository.class})
	protected static class TestConfiguration {
		@Bean
		public MongoTemplate mongoTemplate(MongoClient mongoClient) {
			return new MongoTemplate(mongoDbFactory(mongoClient));
		}

		@Bean
		public MongoDbFactory mongoDbFactory(MongoClient mongoClient) {
			return new SimpleMongoDbFactory(mongoClient, "test");
		}

		@Bean(destroyMethod="shutdown")
		public MongoServer mongoServer() {
			MongoServer mongoServer = new MongoServer(new MemoryBackend());
			mongoServer.bind();
			return mongoServer;
		}

		@Bean(destroyMethod="close")
		public MongoClient mongoClient(MongoServer mongoServer) {
			return new MongoClient(new ServerAddress(mongoServer.getLocalAddress()));
		}
	}
}
@Document(collection="test")
public class TestEntity {
	@Id private String id;
	@Indexed private Set<String> tags = new HashSet<>();
	public TestEntity() {
	}
	public TestEntity(String id, String... tags) {
		this.id = id;
		this.tags = new HashSet<>(Arrays.asList(tags));
	}
	public String getId() {
		return id;
	}
	public Set<String> getTags() {
		return tags;
	}
}
public interface TestRepository extends MongoRepository<TestEntity, String> {
	/**
	 * @param all https://docs.mongodb.com/manual/reference/operator/query/all/
	 * @param nin https://docs.mongodb.com/manual/reference/operator/query/nin/
	 * @return a list of {@link TestEntity} that contains {@code $all} the specified tags but not the tags specified in {@code $nin}.
	 */
	@Query("{$and:[{'tags':{$all:?0}},{'tags':{$nin:?1}}]}")
	public List<TestEntity> findByTags(Collection<String> all, Collection<String> nin);
}

deleteAll

Hi all,

we are upgrading from Spring 4 to 5 and also from Fongo to mongo-java-server.

My Gradle dependencies are:

compile "org.springframework.session:spring-session-data-mongodb:2.1.2.RELEASE"
compile "org.springframework.data:spring-data-mongodb:2.1.4.RELEASE"
testCompile "de.bwaldvogel:mongo-java-server:1.14.0"

I create the TEST DB and connection as:

@Configuration
@Profile("test")
public class MongoDbTestConfig extends AbstractMongoConfiguration {
    private MongoClient client;

    @Override
    protected String getDatabaseName() {
        return "uam-test";
    }

    @Override
    public MongoClient mongoClient() {

        MongoClientOptions mongoClientOptions = new MongoClientOptions.Builder().maxConnectionIdleTime(0).build();

        MongoServer server = new MongoServer(new MemoryBackend());
        InetSocketAddress serverAddress = server.bind();

        client = new MongoClient(new ServerAddress(serverAddress), mongoClientOptions);

        return client;
    }
}

And I have some JUnit tests in a single class that run the following code after each test:

@After
public void tearDown() throws Exception {
    userRepository.deleteAll();
}

Unfortunately, the first few calls to deleteAll will succeed while all subsequent ones will fail with this exception:

2019-04-12 15:37:51,945 ERROR [mongo-server-worker2] d.b.m.w.MongoExceptionHandler: exception for client e5988b4d 
java.lang.IllegalArgumentException: illegal field: subject
	at de.bwaldvogel.mongo.backend.Utils.hasFieldValueListSafe(Utils.java:294)
	at de.bwaldvogel.mongo.backend.Utils.hasSubdocumentValue(Utils.java:258)
	at de.bwaldvogel.mongo.backend.Utils.hasSubdocumentValue(Utils.java:253)
	at de.bwaldvogel.mongo.backend.AbstractUniqueIndex.lambda$hasNoValueForKeys$0(AbstractUniqueIndex.java:36)
	at java.util.stream.MatchOps$1MatchSink.accept(MatchOps.java:90)
	at java.util.ArrayList$ArrayListSpliterator.tryAdvance(ArrayList.java:1359)
	at java.util.stream.ReferencePipeline.forEachWithCancel(ReferencePipeline.java:126)
	at java.util.stream.AbstractPipeline.copyIntoWithCancel(AbstractPipeline.java:498)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:485)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.MatchOps$MatchOp.evaluateSequential(MatchOps.java:230)
	at java.util.stream.MatchOps$MatchOp.evaluateSequential(MatchOps.java:196)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.noneMatch(ReferencePipeline.java:459)
	at de.bwaldvogel.mongo.backend.AbstractUniqueIndex.hasNoValueForKeys(AbstractUniqueIndex.java:36)
	at de.bwaldvogel.mongo.backend.AbstractUniqueIndex.remove(AbstractUniqueIndex.java:41)
	at de.bwaldvogel.mongo.backend.AbstractMongoCollection.removeDocument(AbstractMongoCollection.java:901)
	at de.bwaldvogel.mongo.backend.AbstractMongoCollection.deleteDocuments(AbstractMongoCollection.java:699)
	at de.bwaldvogel.mongo.backend.AbstractMongoDatabase.deleteDocuments(AbstractMongoDatabase.java:858)
	at de.bwaldvogel.mongo.backend.AbstractMongoDatabase.commandDelete(AbstractMongoDatabase.java:333)
	at de.bwaldvogel.mongo.backend.AbstractMongoDatabase.handleCommand(AbstractMongoDatabase.java:133)
	at de.bwaldvogel.mongo.backend.AbstractMongoBackend.handleCommand(AbstractMongoBackend.java:193)
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.handleCommand(MongoDatabaseHandler.java:149)
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.handleQuery(MongoDatabaseHandler.java:91)
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.channelRead0(MongoDatabaseHandler.java:71)
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.channelRead0(MongoDatabaseHandler.java:37)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:337)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:337)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1408)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:677)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:612)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:529)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:491)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
	at java.lang.Thread.run(Thread.java:748)
2019-04-12 15:37:51,946 WARN  [main] o.m.d.connection: Got socket exception on connection [connectionId{localValue:2}] to localhost:46421. All connections to localhost:46421 will be closed. 
2019-04-12 15:37:51,946 INFO  [mongo-server-worker2] d.b.m.w.MongoWireProtocolHandler: channel [id: 0xe5988b4d, L:/127.0.0.1:46421 ! R:/127.0.0.1:44430] closed 
2019-04-12 15:37:51,947 INFO  [main] o.m.d.connection: Closed connection [connectionId{localValue:2}] to localhost:46421 because there was a socket exception raised by this connection. 

org.springframework.data.mongodb.UncategorizedMongoDbException: Prematurely reached end of stream; nested exception is com.mongodb.MongoSocketReadException: Prematurely reached end of stream

	at org.springframework.data.mongodb.core.MongoExceptionTranslator.translateExceptionIfPossible(MongoExceptionTranslator.java:138)
	at org.springframework.data.mongodb.core.MongoTemplate.potentiallyConvertRuntimeException(MongoTemplate.java:2774)
	at org.springframework.data.mongodb.core.MongoTemplate.execute(MongoTemplate.java:540)
	at org.springframework.data.mongodb.core.MongoTemplate.doRemove(MongoTemplate.java:1682)
	at org.springframework.data.mongodb.core.MongoTemplate.remove(MongoTemplate.java:1658)
	at org.springframework.data.mongodb.repository.support.SimpleMongoRepository.deleteAll(SimpleMongoRepository.java:185)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.springframework.data.repository.core.support.RepositoryComposition$RepositoryFragments.invoke(RepositoryComposition.java:359)
	at org.springframework.data.repository.core.support.RepositoryComposition.invoke(RepositoryComposition.java:200)
	at org.springframework.data.repository.core.support.RepositoryFactorySupport$ImplementationMethodExecutionInterceptor.invoke(RepositoryFactorySupport.java:644)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
	at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:608)
	at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.lambda$invoke$3(RepositoryFactorySupport.java:595)
	at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:595)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
	at org.springframework.data.projection.DefaultMethodInvokingMethodInterceptor.invoke(DefaultMethodInvokingMethodInterceptor.java:59)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
	at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:93)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
	at org.springframework.data.repository.core.support.SurroundingTransactionDetectorMethodInterceptor.invoke(SurroundingTransactionDetectorMethodInterceptor.java:61)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
	at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
	at com.sun.proxy.$Proxy91.deleteAll(Unknown Source)
	at com.swisscom.uam.test.service.UserServiceTest.tearDown(UserServiceTest.java:76)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:75)
	at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:86)
	at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:84)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:251)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:97)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
	at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:70)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:190)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
Caused by: com.mongodb.MongoSocketReadException: Prematurely reached end of stream
	at com.mongodb.internal.connection.SocketStream.read(SocketStream.java:92)
	at com.mongodb.internal.connection.InternalStreamConnection.receiveResponseBuffers(InternalStreamConnection.java:554)
	at com.mongodb.internal.connection.InternalStreamConnection.receiveMessage(InternalStreamConnection.java:425)
	at com.mongodb.internal.connection.InternalStreamConnection.receiveCommandMessageResponse(InternalStreamConnection.java:289)
	at com.mongodb.internal.connection.InternalStreamConnection.sendAndReceive(InternalStreamConnection.java:255)
	at com.mongodb.internal.connection.UsageTrackingInternalConnection.sendAndReceive(UsageTrackingInternalConnection.java:99)
	at com.mongodb.internal.connection.DefaultConnectionPool$PooledConnection.sendAndReceive(DefaultConnectionPool.java:444)
	at com.mongodb.internal.connection.CommandProtocolImpl.execute(CommandProtocolImpl.java:72)
	at com.mongodb.internal.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:200)
	at com.mongodb.internal.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:269)
	at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:131)
	at com.mongodb.operation.MixedBulkWriteOperation.executeCommand(MixedBulkWriteOperation.java:418)
	at com.mongodb.operation.MixedBulkWriteOperation.executeBulkWriteBatch(MixedBulkWriteOperation.java:256)
	at com.mongodb.operation.MixedBulkWriteOperation.access$700(MixedBulkWriteOperation.java:67)
	at com.mongodb.operation.MixedBulkWriteOperation$1.call(MixedBulkWriteOperation.java:200)
	at com.mongodb.operation.MixedBulkWriteOperation$1.call(MixedBulkWriteOperation.java:191)
	at com.mongodb.operation.OperationHelper.withReleasableConnection(OperationHelper.java:424)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:191)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:67)
	at com.mongodb.client.internal.MongoClientDelegate$DelegateOperationExecutor.execute(MongoClientDelegate.java:193)
	at com.mongodb.client.internal.MongoCollectionImpl.executeSingleWriteRequest(MongoCollectionImpl.java:960)
	at com.mongodb.client.internal.MongoCollectionImpl.executeDelete(MongoCollectionImpl.java:940)
	at com.mongodb.client.internal.MongoCollectionImpl.deleteMany(MongoCollectionImpl.java:551)
	at org.springframework.data.mongodb.core.MongoTemplate$9.doInCollection(MongoTemplate.java:1722)
	at org.springframework.data.mongodb.core.MongoTemplate$9.doInCollection(MongoTemplate.java:1682)
	at org.springframework.data.mongodb.core.MongoTemplate.execute(MongoTemplate.java:538)
	... 52 more

However, replacing the deleteAll() call with single deletes:

@After
public void tearDown() throws Exception {
    for (User u : userRepository.findAll()) {
        userRepository.delete(u);
    }
}

fixes it.

I am quite new to mongo-java-server, so I am unsure what exactly is going on. Any hints?

Thanks in advance and have a nice day

'$pull' does not work well

Hello:
I wrote a test case that executes a '$pull' operation using your stub MongoDB, but it does not work. My code is as follows.

 BasicDBObject obj = json("_id: 1");
 collection.insert(obj);
 collection.update(obj, json("$set: {field: [{'key1': 'value1', 'key2': 'value2'}]}"));
 collection.update(obj, json("$pull: {field: {'key1': 'value1'}}"));

The 'field' array should be empty after the '$pull', but the element is still there. If I use

$pull: {field: {'key1': 'value1', 'key2': 'value2'}}

it works as expected. However, I think the match conditions of '$pull' should be combined with 'OR' rather than 'AND'.
Could you help me resolve this? Thanks a lot.
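
For reference, a minimal sketch of the expected behaviour written with the standard sync driver instead of the legacy API used above (setup omitted; `collection` is a `MongoCollection<Document>` backed by the in-memory server, `Document` is `org.bson.Document`, `Filters` comes from `com.mongodb.client.model`):

collection.insertOne(new Document("_id", 1)
        .append("field", Arrays.asList(new Document("key1", "value1").append("key2", "value2"))));

// $pull takes a query condition, so matching on key1 alone should be enough to remove the element
collection.updateOne(Filters.eq("_id", 1),
        new Document("$pull", new Document("field", new Document("key1", "value1"))));

// expected: the "field" array of the stored document is now empty
Document result = collection.find(Filters.eq("_id", 1)).first();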

Searching via DBRef $id and $exists condition does not work

package test;

import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
import de.bwaldvogel.mongo.MongoServer;
import de.bwaldvogel.mongo.backend.memory.MemoryBackend;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.hamcrest.Matchers;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.query.BasicQuery;

import java.util.List;

import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;

public class DBRefSearch {


    private MongoServer server;
    private MongoClient client;
    private MongoTemplate mongoTemplate;

    @Before
    public void init() {
        server = new MongoServer(new MemoryBackend());
        client = new MongoClient(new ServerAddress(server.bind()));
        mongoTemplate = new MongoTemplate(client, "dbRefDatabase");
    }

    @After
    public void cleanup() {
        client.close();
        server.shutdown();
    }

    @Test
    public void searchViaDbRefId() {
        mongoTemplate.save(new TestEntity(1, "Entity without status", null));
        mongoTemplate.save(new TestEntity(2, "Entity with status", new Status(1, "active")));

        //It works as expected
        final List<TestEntity> fromDbFirstCase = mongoTemplate.find(new BasicQuery("{\"status\" : {\"$exists\" : false}} "), TestEntity.class);
        assertThat(fromDbFirstCase, Matchers.hasSize(1));
        assertThat(fromDbFirstCase.get(0).getId(), is(1));

        // Nothing is found if $id is added to the query condition. Why?
        final List<TestEntity> fromDbSecondCase = mongoTemplate.find(new BasicQuery("{\"status.$id\" : {\"$exists\" : false}} "), TestEntity.class);
        assertThat(fromDbSecondCase, Matchers.hasSize(1));
        assertThat(fromDbSecondCase.get(0).getId(), is(1));
    }

    @Data
    @NoArgsConstructor
    @AllArgsConstructor
    @Document(collection = "test_entities")
    public static class TestEntity {
        @Id
        private int id;
        private String name;

        @DBRef
        private Status status;
    }

    @Data
    @NoArgsConstructor
    @AllArgsConstructor
    @Document(collection = "statuses")
    public static class Status {
        @Id
        private int id;
        private String name;
    }
}
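
For reference, Spring Data stores the @DBRef field as a sub-document with "$ref" and "$id" keys, so the second query is just an $exists check on a nested field. A minimal raw-driver sketch (assuming the default DBRef mapping; org.bson.Document is fully qualified to avoid the clash with the Spring @Document annotation imported above):

// Raw shape of the entity that has a status, assuming the default DBRef mapping:
//   { "_id": 2, "name": "Entity with status", "status": { "$ref": "statuses", "$id": 1 } }
org.bson.Document rawQuery = new org.bson.Document("status.$id", new org.bson.Document("$exists", false));
List<org.bson.Document> matches = client.getDatabase("dbRefDatabase")
        .getCollection("test_entities")
        .find(rawQuery)
        .into(new java.util.ArrayList<>());
// expected: exactly one match, the entity with _id 1 and no status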

Aggregation $lookup doesn't work as a left outer join

The aggregation $lookup stage only works if a result is found in both collections. If no document is found in the related collection, the aggregation returns null instead of the document with an empty array. I've added two tests below to show the problem.

@RunWith(SpringRunner.class)
@ContextConfiguration(classes = { IssueAggregate.TestConfiguration.class })
public class IssueAggregate {

	@Autowired
	private RelatedRepository relatedRepository;
	@Autowired
	private TestRepository repository;

	//	@Autowired
	private TestDatabase database;

	@Autowired
	MongoTemplate template;

	@Before
	public void setup() {
		this.database = new TestDatabase();
		this.database.template = this.template;
		this.database.repository = this.repository;
	}

	@Test
	public void testLookupRelatedFound_OK() throws Exception {
		repository.deleteAll();
		repository.insert(new TestEntity("ID_2", "B"));

		relatedRepository.deleteAll();
		relatedRepository.insert(new RelatedEntity("R_1", "ID_2", "plain"));
		relatedRepository.insert(new RelatedEntity("R_2", "ID_2", "Full"));

		List<RelatedEntity> related = relatedRepository.getRelated("ID_2");
		assertEquals(2, related.size());

		assertTrue(repository.findById("ID_2").isPresent());

		TestEntityUsage result = database.findAndJoinUsage("ID_2"); // joining 2 collections works if the joined element exists
		assertNotNull(result);
	}

	@Test
	public void testLookupLeftOuterJoin_fails() throws Exception {
		repository.deleteAll();
		repository.insert(new TestEntity("ID_1", "A"));

		relatedRepository.deleteAll();

		TestEntityUsage result = database.findAndJoinUsage("ID_1"); // left outer join doesn't work
		assertNotNull(result);
	}

	@Configuration
	@EnableMongoRepositories(basePackageClasses = { TestRepository.class, RelatedRepository.class, TestDatabase.class })
	protected static class TestConfiguration {
		@Bean
		public MongoTemplate mongoTemplate(MongoClient mongoClient) {
			return new MongoTemplate(mongoDbFactory(mongoClient));
		}

		@Bean
		public MongoDbFactory mongoDbFactory(MongoClient mongoClient) {
			return new SimpleMongoDbFactory(mongoClient, "test");
		}

		@Bean(destroyMethod = "shutdown")
		public MongoServer mongoServer() {
			MongoServer mongoServer = new MongoServer(new MemoryBackend());
			mongoServer.bind();
			return mongoServer;
		}

		@Bean(destroyMethod = "close")
		public MongoClient mongoClient(MongoServer mongoServer) {
			return new MongoClient(new ServerAddress(mongoServer.getLocalAddress()));
		}
	}
}

Documents:

@Document(collection = "test")
public class TestEntity {
	@Id
	private String id;
	@Indexed
	private String name;

	public TestEntity() {}

	public TestEntity(String id, String name) {
		this.id = id;
		this.name = name;
	}

	public String getId() {	return id; }
	public String getName() { return name; }
}

@Document(collection = "relatedTest")
public class RelatedEntity {
	@Id
	private String id;
	@Indexed
	private String testId;
	private String usage;
	public RelatedEntity() {
	}

	public RelatedEntity(String id, String testId, String usage) {
		this.id = id;
		this.testId = testId;
		this.usage = usage;
	}

	public String getId() { return id; }
	public String getTestId() {return testId;}
	public String getUsage() { return usage;	}
}

Expected joined document result

public class TestEntityUsage {
	private String id;
	private String name;
	private List<RelatedEntity> usageDocuments = new ArrayList<>();

	public TestEntityUsage() {	}

	public String getId() {	return id; }

	public String getName() { return name; }

	public List<RelatedEntity> getUsageDocuments() {
		return usageDocuments;
	}

}

Repositories and Db

public interface TestRepository extends MongoRepository<TestEntity, String> {

}

public interface RelatedRepository extends MongoRepository<RelatedEntity, String> {
	@Query("{ 'testId' : ?0 }")
	public List<RelatedEntity> getRelated(String id);
}

@Repository
public class TestDatabase {

	@Autowired
	TestRepository repository;
	@Autowired
	MongoTemplate template;

	public TestEntityUsage findAndJoinUsage(String id) {
		Aggregation aggregation = newAggregation(
				match(Criteria.where("_id").is(id)),
				lookup("relatedTest", "_id", "testId", "usageDocuments"));
		AggregationResults<TestEntityUsage> result = template.aggregate(aggregation, TestEntity.class, TestEntityUsage.class);
		return result.getUniqueMappedResult();
	}
}
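
For comparison, the same pipeline expressed directly against the driver; with a real MongoDB server the second test case yields a document with an empty usageDocuments array instead of no document at all. A minimal sketch (mongoDatabase is assumed to be the com.mongodb.client.MongoDatabase behind the template, Document is org.bson.Document, Aggregates and Filters come from com.mongodb.client.model):

Document joined = mongoDatabase.getCollection("test")
        .aggregate(Arrays.asList(
                Aggregates.match(Filters.eq("_id", "ID_1")),
                Aggregates.lookup("relatedTest", "_id", "testId", "usageDocuments")))
        .first();
// expected against a real server: { "_id": "ID_1", "name": "A", "usageDocuments": [] }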

deleteById() / deleteAll() not working with indexed properties

I have found another bug which prevents me from switching from fongo to mongo-java-server.

If you modify an indexed property of a document and then delete the document, no new document can be inserted with the value of the originally indexed property.

Here is a test case to reproduce the problem:

@RunWith(SpringRunner.class)
@SpringBootTest(classes={Issue.TestConfiguration.class})
public class Issue {

	@Autowired private TestRepository repository;

	@Test
	public void testDeleteWithUniqueIndexes() throws Exception {
		TestEntity document = repository.save(new TestEntity("DOC_1", "Text1"));

		// update value of indexed property
		document.setText("Text1 (updated)");
		repository.save(document);

		// delete document (deleteAll() does not work either)
		repository.deleteById("DOC_1");

		// duplicate key error
		repository.save(new TestEntity("DOC_1", "Text1"));
	}


	@Configuration
	@EnableMongoRepositories(basePackageClasses={TestRepository.class})
	protected static class TestConfiguration {
		@Bean
		public MongoTemplate mongoTemplate(MongoClient mongoClient) {
			return new MongoTemplate(mongoDbFactory(mongoClient));
		}

		@Bean
		public MongoDbFactory mongoDbFactory(MongoClient mongoClient) {
			return new SimpleMongoDbFactory(mongoClient, "test");
		}

		@Bean(destroyMethod="shutdown")
		public MongoServer mongoServer() {
			MongoServer mongoServer = new MongoServer(new MemoryBackend());
			mongoServer.bind();
			return mongoServer;
		}

		@Bean(destroyMethod="close")
		public MongoClient mongoClient(MongoServer mongoServer) {
			return new MongoClient(new ServerAddress(mongoServer.getLocalAddress()));
		}
	}
}
@Document(collection="test")
public class TestEntity {
	@Id private String id;
	@Indexed(unique=true) private String text;
	public TestEntity() {
	}
	public TestEntity(String id, String text) {
		this.id = id;
		this.text = text;
	}
	public String getId() {
		return id;
	}
	public String getText() {
		return text;
	}
	public void setText(String text) {
		this.text = text;
	}
}
public interface TestRepository extends MongoRepository<TestEntity, String> {
}
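
The same steps without Spring would look like the following sketch with the plain sync driver (Indexes, IndexOptions, Filters and Updates come from com.mongodb.client.model; whether the plain-driver sequence shows the same duplicate key error is an assumption based on the description above):

collection.createIndex(Indexes.ascending("text"), new IndexOptions().unique(true));

collection.insertOne(new Document("_id", "DOC_1").append("text", "Text1"));
collection.updateOne(Filters.eq("_id", "DOC_1"), Updates.set("text", "Text1 (updated)"));
collection.deleteOne(Filters.eq("_id", "DOC_1"));

// expected to succeed; with the behaviour described above it fails with a duplicate key error
collection.insertOne(new Document("_id", "DOC_1").append("text", "Text1"));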

Missing query operator '$or'

Greetings,

I'm using your nice mock library to test my classes. However, I realized that the mongo server throws an exception due to a missing operator. Maybe you would like to consider adding it in an upcoming release.
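
Purely as an illustration of the operator in question (the actual query is redacted in the log below, so this is not the exact query that failed), an $or query built with the driver's Filters helper looks like this:

// hypothetical example only, with made-up field names
collection.find(Filters.or(Filters.eq("status", "A"), Filters.gt("qty", 30))).first();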

2019-01-07 16:56:58.509 ERROR [d.b.mongo.wire.MongoWireProtocolHandler:103] - failed to handle query MongoQuery(header: MessageHeader(request: 26, responseTo: 0), collection: <myCollectionName>, query: <myMongoQuery>, returnFieldSelector: null)
de.bwaldvogel.mongo.exception.MongoServerError: [Error 2] unknown operator: $or
	at de.bwaldvogel.mongo.backend.QueryOperator.fromValue(QueryOperator.java:49)
	at de.bwaldvogel.mongo.backend.DefaultQueryMatcher.checkExpressionMatch(DefaultQueryMatcher.java:377)
	at de.bwaldvogel.mongo.backend.DefaultQueryMatcher.checkMatchesValue(DefaultQueryMatcher.java:310)
	at de.bwaldvogel.mongo.backend.DefaultQueryMatcher.checkMatchesElemValues(DefaultQueryMatcher.java:354)
	at de.bwaldvogel.mongo.backend.DefaultQueryMatcher.checkMatchesAnyValue(DefaultQueryMatcher.java:175)
	at de.bwaldvogel.mongo.backend.DefaultQueryMatcher.checkMatch(DefaultQueryMatcher.java:151)
	at de.bwaldvogel.mongo.backend.DefaultQueryMatcher.checkMatch(DefaultQueryMatcher.java:81)
	at de.bwaldvogel.mongo.backend.DefaultQueryMatcher.matches(DefaultQueryMatcher.java:32)
	at de.bwaldvogel.mongo.backend.AbstractMongoCollection.documentMatchesQuery(AbstractMongoCollection.java:46)
	at de.bwaldvogel.mongo.backend.memory.MemoryCollection.matchDocuments(MemoryCollection.java:96)
	at de.bwaldvogel.mongo.backend.AbstractMongoCollection.queryDocuments(AbstractMongoCollection.java:59)
	at de.bwaldvogel.mongo.backend.AbstractMongoCollection.handleQuery(AbstractMongoCollection.java:710)
	at de.bwaldvogel.mongo.backend.AbstractMongoDatabase.handleQuery(AbstractMongoDatabase.java:645)
	at de.bwaldvogel.mongo.backend.AbstractMongoBackend.handleQuery(AbstractMongoBackend.java:206)
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.handleQuery(MongoDatabaseHandler.java:93)
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.channelRead0(MongoDatabaseHandler.java:71)
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.channelRead0(MongoDatabaseHandler.java:37)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1434)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:965)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:648)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:583)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:500)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:462)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:897)
	at java.base/java.lang.Thread.run(Thread.java:834)

Query with $nin doesn't work in my case

I have a use case where I don't get the same behavior as with real MongoDB.
My real case is more complex but I reduced it for this bug report.

Base preparation

> use test
switched to db test
> db.test.find()
> db.test.insert({ "code" : "c1", "map" : { "key1" : ["value 1.1"], "key2" : ["value 2.1"] } })
Cannot use commands write mode, degrading to compatability mode
WriteResult({ "nInserted" : 1 })
> db.test.insert({ "code" : "c1", "map" : { "key1" : ["value 1.2"], "key2" : ["value 2.2"] } })
Cannot use commands write mode, degrading to compatability mode
WriteResult({ "nInserted" : 1 })
> db.test.insert({ "code" : "c1", "map" : { "key1" : ["value 1.3"], "key2" : ["value 2.3"] } })
Cannot use commands write mode, degrading to compatability mode
WriteResult({ "nInserted" : 1 })
> db.test.find()
{ "_id" : ObjectId("536a35a58b25d8ad95dc2792"), "code" : "c1", "map" : { "key1" : [ "value 1.1" ], "key2" : [ "value 2.1" ] } }
{ "_id" : ObjectId("536a36078b25d8ad95dc2793"), "code" : "c1", "map" : { "key1" : [ "value 1.2" ], "key2" : [ "value 2.2" ] } }
{ "_id" : ObjectId("536a365d8b25d8ad95dc2794"), "code" : "c1", "map" : { "key1" : [ "value 1.2" ], "key2" : [ "value 2.2" ] } }

Queries that don't work

> db.test.find({ "map.key2" : { "$nin" : ["value 2.2"] }})
{ "_id" : ObjectId("536a35a58b25d8ad95dc2792"), "code" : "c1", "map" : { "key1" : [ "value 1.1" ], "key2" : [ "value 2.1" ] } }
{ "_id" : ObjectId("536a36078b25d8ad95dc2793"), "code" : "c1", "map" : { "key1" : [ "value 1.2" ], "key2" : [ "value 2.2" ] } }
{ "_id" : ObjectId("536a365d8b25d8ad95dc2794"), "code" : "c1", "map" : { "key1" : [ "value 1.2" ], "key2" : [ "value 2.2" ] } }
> db.test.find({ "map.key2" : { "$not" : {"$in" : ["value 2.2"] }}})
{ "_id" : ObjectId("536a35a58b25d8ad95dc2792"), "code" : "c1", "map" : { "key1" : [ "value 1.1" ], "key2" : [ "value 2.1" ] } }
{ "_id" : ObjectId("536a36078b25d8ad95dc2793"), "code" : "c1", "map" : { "key1" : [ "value 1.2" ], "key2" : [ "value 2.2" ] } }
{ "_id" : ObjectId("536a365d8b25d8ad95dc2794"), "code" : "c1", "map" : { "key1" : [ "value 1.2" ], "key2" : [ "value 2.2" ] } }

I expected to get only 2 results.
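
For reference, the same filter expressed with the Java driver (a minimal sketch; Filters comes from com.mongodb.client.model, Document is org.bson.Document):

// should return only the documents whose map.key2 array does not contain "value 2.2"
List<Document> result = collection.find(Filters.nin("map.key2", "value 2.2"))
        .into(new ArrayList<>());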

Thanks for this project

$in query causing ClassCastException: java.util.ArrayList cannot be cast to java.lang.Comparable

A query using $in on the _id field fails with a ClassCastException as shown in the logs below. Matching on other fields works as expected.
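
For reference, the raw document form of the failing query as it appears in the log below; note that the single element of the $in array is itself a list, which is what ends up in the TreeSet. A minimal sketch (Document is org.bson.Document):

Document query = new Document("_id",
        new Document("$in", Arrays.asList(Arrays.asList(1, 2, 3))));
collection.find(query).first();   // reportedly fails with the ClassCastException below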

15:27:40.187 [mongo-server-worker2] DEBUG d.b.m.w.MongoWireProtocolHandler - MongoQuery(header: MessageHeader(request: 112, responseTo: 0), collection: xxx.xxx, query: {"_id" : {"$in" : [[1, 2, 3]]}}, returnFieldSelector: null)
15:27:40.192 [mongo-server-worker2] ERROR d.b.m.w.MongoExceptionHandler - exception for client 5f418d5e
java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.Comparable
	at java.util.TreeMap.compare(TreeMap.java:1294)
	at java.util.TreeMap.put(TreeMap.java:538)
	at java.util.TreeSet.add(TreeSet.java:255)
	at java.util.AbstractCollection.addAll(AbstractCollection.java:344)
	at java.util.TreeSet.addAll(TreeSet.java:312)
	at java.util.TreeSet.<init>(TreeSet.java:160)
	at de.bwaldvogel.mongo.backend.AbstractUniqueIndex.getPositionsForExpression(AbstractUniqueIndex.java:170)
	at de.bwaldvogel.mongo.backend.AbstractUniqueIndex.getPositions(AbstractUniqueIndex.java:135)
	at de.bwaldvogel.mongo.backend.AbstractMongoCollection.queryDocuments(AbstractMongoCollection.java:50)
	at de.bwaldvogel.mongo.backend.AbstractMongoCollection.handleQuery(AbstractMongoCollection.java:718)
	at de.bwaldvogel.mongo.backend.AbstractMongoDatabase.handleQuery(AbstractMongoDatabase.java:650)
	at de.bwaldvogel.mongo.backend.AbstractMongoBackend.handleQuery(AbstractMongoBackend.java:206)
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.handleQuery(MongoDatabaseHandler.java:93)
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.channelRead0(MongoDatabaseHandler.java:71)
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.channelRead0(MongoDatabaseHandler.java:37)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1408)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:677)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:612)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:529)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:491)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
	at java.lang.Thread.run(Thread.java:748)

Memory backend doesn't support unordered bulk operations

MemoryDatabase throws an exception if it tries to execute unordered bulk operations - there's a check at the top of the various command* methods like so:

boolean isOrdered = Utils.isTrue(query.get("ordered"));
if (!isOrdered)
    throw new RuntimeException("unexpected update query: " + query);
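
A client-side call that sends ordered: false and would therefore hit this check is, for example, an unordered bulk write (a sketch; InsertOneModel and BulkWriteOptions come from com.mongodb.client.model):

collection.bulkWrite(
        Arrays.asList(
                new InsertOneModel<>(new Document("_id", 1)),
                new InsertOneModel<>(new Document("_id", 2))),
        new BulkWriteOptions().ordered(false));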

replica set support

I need to be able to test client sessions, but only replica sets support that. Is that possible with this?

$unwind cast as String

Hi Benedikt,

I found an issue when executing an aggregation with an extended "$unwind" stage like:

"$unwind" : {
    "path" : "$array_field", 
    "preserveNullAndEmptyArrays" : true
}
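
For reference, a driver-level call that produces this stage (a minimal sketch; "$array_field" is taken from the fragment above, Document is org.bson.Document):

collection.aggregate(Arrays.asList(
        new Document("$unwind",
                new Document("path", "$array_field")
                        .append("preserveNullAndEmptyArrays", true))))
        .into(new ArrayList<>());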

The error thrown is:

java.lang.ClassCastException: de.bwaldvogel.mongo.bson.Document cannot be cast to java.lang.String
	at de.bwaldvogel.mongo.backend.AbstractMongoDatabase.commandAggregate(AbstractMongoDatabase.java:615) ~[mongo-java-server-core-1.13.0.jar:na]
	at de.bwaldvogel.mongo.backend.AbstractMongoDatabase.handleCommand(AbstractMongoDatabase.java:140) ~[mongo-java-server-core-1.13.0.jar:na]
	at de.bwaldvogel.mongo.backend.AbstractMongoBackend.handleCommand(AbstractMongoBackend.java:193) ~[mongo-java-server-core-1.13.0.jar:na]
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.handleCommand(MongoDatabaseHandler.java:149) ~[mongo-java-server-core-1.13.0.jar:na]
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.handleQuery(MongoDatabaseHandler.java:91) ~[mongo-java-server-core-1.13.0.jar:na]
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.channelRead0(MongoDatabaseHandler.java:71) ~[mongo-java-server-core-1.13.0.jar:na]
	at de.bwaldvogel.mongo.wire.MongoDatabaseHandler.channelRead0(MongoDatabaseHandler.java:37) ~[mongo-java-server-core-1.13.0.jar:na]
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) ~[netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323) [netty-codec-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297) [netty-codec-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1434) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:965) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:648) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:583) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:500) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:462) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:897) [netty-common-4.1.31.Final.jar:4.1.31.Final]
	at java.lang.Thread.run(Thread.java:748) [na:1.8.0_141]

I've taken a look at the AbstractMongoDatabase class at line 615, and it seems the problem is that the "$unwind" value is treated as a String when it is really a Document.

If you like I can try to fix it.
