find() and findOne() methods in MongoDB showing different results

First of all, basic difference between findOne() and find():

  • findOne() - if query matches, first document is returned, otherwise null.

  • find() - no matter the number of documents matched, a cursor is returned, never null.

So when used in an if condition, findOne() converts to false when no document matches. Since find() returns a cursor object and never null, it always converts to true in an if condition.
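This is ordinary JavaScript truthiness, so it can be sketched without a database; the `fakeCursor` and `noMatch` variables below are stand-ins for what find() and findOne() return, not real shell objects:

```javascript
// Plain-JavaScript sketch of the truthiness behaviour described above.
const fakeCursor = {};   // stand-in for find()'s return: any object, even empty, is truthy
const noMatch = null;    // stand-in for findOne()'s return when nothing matches

if (fakeCursor) {
  console.log("find(): the cursor object is truthy even with zero results");
}

if (!noMatch) {
  console.log("findOne(): null is falsy, so this branch runs when nothing matches");
}
```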

find() and findOne() return the following for an empty collection:

(screenshot of mongo shell output)


The pitfall you have run into is the rather under-documented conversion from mongo shell objects to booleans in JavaScript:

findOne() returns a document, or null when nothing matches.

find() returns a cursor, which can be empty, but the object returned is always defined.
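Because the cursor object itself is always truthy, emptiness has to be checked through the cursor's own methods (the shell's cursors expose `hasNext()`). A minimal stand-in cursor, purely for illustration, shows the idiom:

```javascript
// Minimal stand-in for a mongo shell cursor; not the real implementation.
function makeCursor(docs) {
  let i = 0;
  return {
    hasNext: () => i < docs.length,  // mirrors the shell's cursor.hasNext()
    next: () => docs[i++],
  };
}

const empty = makeCursor([]);
console.log(Boolean(empty));   // true  - the object exists
console.log(empty.hasNext());  // false - but it holds no documents
```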


The find() method returns a cursor even if no documents match the search criteria. The cursor object still exists, so it acts as truthy.

findOne() will either return exactly one document, or null. Null, by definition, is a falsy value.

When you know you will be searching for just one document, use findOne() so that an empty result is falsy and your condition behaves as expected.
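The contrast can be modelled with a toy in-memory "collection"; the `find`/`findOne` helpers below are hypothetical simplifications, not the real driver API:

```javascript
// Toy model of find()/findOne() semantics over an in-memory array.
const users = [];  // empty "collection"

function find(coll, pred) {
  return coll.filter(pred);        // always a container, possibly empty
}
function findOne(coll, pred) {
  return coll.find(pred) ?? null;  // first match, or null like the shell
}

const many = find(users, u => u.name === "bob");
const one  = findOne(users, u => u.name === "bob");

console.log(Boolean(many)); // true  - an empty container is still truthy
console.log(Boolean(one));  // false - null is falsy
```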


The find() method returns a cursor, which is always truthy even if the query criteria match no documents.

On the other hand, findOne() returns the first document that matches your query criteria, or null (in JavaScript, or the equivalent in your language's driver) if no document matches.

> db.dropDatabase()
{ "dropped" : "test", "ok" : 1 }
> var cursor = db.collection.find();
> cursor;
> typeof cursor;
object
> !cursor;
false
> var document = db.collection.findOne();
> document;
null
> typeof document;
object
> !document;
true