MongoDB dot (.) in key name

As mentioned in other answers, MongoDB does not allow $ or . characters in map keys due to its restrictions on field names. However, as mentioned in Dollar Sign Operator Escaping, this restriction does not prevent you from inserting documents with such keys; it only prevents you from updating or querying them.

The problem with simply replacing . with [dot] or U+FF0E (as mentioned elsewhere on this page) is: what happens when a user legitimately wants to store the key [dot] or U+FF0E? After such a substitution, the keys "a.b" and "a[dot]b" map to the same stored value and can no longer be told apart.

The approach taken by Fantom's afMorphia driver is to use Unicode escape sequences similar to Java's, but to ensure the escape character is escaped first. In essence, the following string replacements are made (*):

\  -->  \\
$  -->  \u0024
.  -->  \u002e

A reverse replacement is made when map keys are subsequently read from MongoDB.

Or in Fantom code:

Str encodeKey(Str key) {
    // escape the escape character first, then the reserved $ and . characters
    return key.replace("\\", "\\\\").replace("\$", "\\u0024").replace(".", "\\u002e")
}

Str decodeKey(Str key) {
    // reverse the replacements, unescaping the escape character last
    return key.replace("\\u002e", ".").replace("\\u0024", "\$").replace("\\\\", "\\")
}

The only time a user needs to be aware of such conversions is when constructing queries for such keys.
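
For example, a minimal sketch using the encodeKey / decodeKey functions above (the key name is just illustrative):

encoded  := encodeKey("log.level")    // --> "log\u002elevel" - use this when querying
original := decodeKey(encoded)        // --> "log.level"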

Given it is common to store dotted.property.names in databases for configuration purposes, I believe this approach is preferable to simply banning all such map keys.

(*) afMorphia actually applies the full, proper Unicode escaping rules described in Unicode escape syntax in Java, but the replacement sequence described above works just as well.


The Mongo docs suggest replacing illegal characters such as $ and . with their Unicode full-width equivalents:

In these situations, keys will need to substitute the reserved $ and . characters. Any character is sufficient, but consider using the Unicode full width equivalents: U+FF04 (i.e. "＄") and U+FF0E (i.e. "．").
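
A minimal sketch of that substitution, again in Fantom for consistency with the code above (the helper names are hypothetical, not from the Mongo docs):

Str toFullWidth(Str key) {
    // swap the reserved characters for their full-width counterparts before storing
    return key.replace("\$", "\uff04").replace(".", "\uff0e")
}

Str fromFullWidth(Str key) {
    // restore the original key when reading it back
    return key.replace("\uff04", "\$").replace("\uff0e", ".")
}

Note that this shares the collision problem described above: a key that legitimately contains ＄ or ． will be silently converted on the way back out.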


MongoDB doesn't support keys with a dot in them, so you're going to have to preprocess your JSON file to remove or replace them before importing it, or you'll be setting yourself up for all sorts of problems.

There isn't a standard workaround for this issue; the best approach depends too heavily on the specifics of the situation. But I'd avoid any key encoder/decoder approach if possible, as you'll keep paying for that inconvenience in perpetuity, whereas a JSON restructure would presumably be a one-time cost.
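
If you do go the preprocessing route, a minimal sketch of such a one-off rewrite might look like the following (in Fantom, with an illustrative underscore replacement; neither the function name nor the replacement character comes from the answer above):

Str:Obj? sanitiseKeys(Str:Obj? doc) {
    // recursively rewrite offending keys before the document is imported
    // (lists of nested documents are not handled in this sketch)
    clean := Str:Obj?[:]
    doc.each |val, key| {
        newKey := key.replace(".", "_").replace("\$", "_")
        clean[newKey] = val is Map ? sanitiseKeys(([Str:Obj?])val) : val
    }
    return clean
}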