System.Collections.Generic.Dictionary = Ultimate performance?

I've found that the .NET Dictionary performs well, if not exceptionally well, in most situations. It's a good general-purpose implementation. The problem I most often run into is the 2-gigabyte single-object limit. Even on a 64-bit system, you can't add more than about 89.5 million items to a dictionary when the key is an integer or a reference and the value is a reference: overhead appears to be about 24 bytes per item, and 2^31 bytes ÷ 24 bytes per item ≈ 89.5 million items.

That limit makes itself known in a very odd way. The Dictionary grows by doubling: when it gets full, it increases capacity to the next prime number that's at least double the current size. Because of that, the dictionary grows to about 47 million items and then throws an exception, because when it tries to double (to roughly 94 million) the allocation exceeds the 2-gigabyte limit and fails. I get around the problem by pre-allocating the Dictionary (i.e., calling the constructor that lets you specify the capacity). That also speeds up populating the dictionary, because it never has to grow; growing entails allocating a new array and re-hashing everything.
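A minimal sketch of the pre-allocation fix (the item count here is illustrative; in practice it would be whatever final size you expect):

```csharp
using System;
using System.Collections.Generic;

// Pre-size the Dictionary so it never has to grow and re-hash.
// itemCount is an illustrative value; use your expected final count.
int itemCount = 1_000_000;
var dict = new Dictionary<int, string>(itemCount);

for (int i = 0; i < itemCount; i++)
    dict.Add(i, "value");

Console.WriteLine(dict.Count); // prints 1000000; no resize ever happened
```

With a capacity passed up front, the Dictionary allocates once (rounding up to the next prime) and all subsequent adds are resize-free.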

Actually, the claim that Dictionary uses a linked list for collision resolution is essentially right: it uses separate chaining, implemented with `next` indices threaded through a flat entries array rather than heap-allocated list nodes. It doesn't use open addressing, so there's no probing at all; colliding keys simply end up chained within the same bucket.

We wrote our own BigDictionary class to get past the 2-gigabyte limit and found that a straightforward open addressing scheme with linear probing gives reasonably good performance. It's not as fast as Dictionary, but it can handle hundreds of millions of items (billions if I had the memory).
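For what it's worth, the core of a linear-probing table is very little code. This is only a sketch in the spirit of our BigDictionary, not the actual implementation (fixed capacity, int keys, no deletion, no resizing):

```csharp
using System;

// Sketch of open addressing with linear probing. Illustrative only:
// fixed capacity, int keys, no deletion/resizing. The caller must keep
// the table strictly below capacity, or Probe can loop forever.
var table = new LinearProbeTable<string>(16);
table.Add(42, "answer");
Console.WriteLine(table.TryGetValue(42, out var v) ? v : "missing"); // prints "answer"

class LinearProbeTable<TValue>
{
    private readonly int[] _keys;
    private readonly TValue[] _values;
    private readonly bool[] _occupied;

    public LinearProbeTable(int capacity)
    {
        _keys = new int[capacity];
        _values = new TValue[capacity];
        _occupied = new bool[capacity];
    }

    // Start at the hash slot and walk forward until we find the key
    // or an empty slot (that's the "linear" in linear probing).
    private int Probe(int key)
    {
        int i = (key & 0x7FFFFFFF) % _keys.Length; // mask makes negative keys safe
        while (_occupied[i] && _keys[i] != key)
            i = (i + 1) % _keys.Length;
        return i;
    }

    public void Add(int key, TValue value)
    {
        int i = Probe(key);
        _keys[i] = key;
        _values[i] = value;
        _occupied[i] = true;
    }

    public bool TryGetValue(int key, out TValue value)
    {
        int i = Probe(key);
        if (_occupied[i]) { value = _values[i]; return true; }
        value = default(TValue);
        return false;
    }
}
```

A real version needs resizing (or pre-sizing), tombstones for deletion, and better hash mixing, but even this naive sketch shows why the scheme performs reasonably: a probe sequence is a contiguous array scan, which is very cache-friendly.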

That said, you should be able to write a task-specific hash table that outperforms the .NET Dictionary in some situations. But for a general-purpose hash table, I think you'll be hard pressed to do better than what the BCL provides.


There are many things to consider in designing a "better" hash table. One reason the custom approaches you tried were slower than, or no better than, the .NET Dictionary is that hash table performance depends heavily on:

  • The data being hashed
  • The performance of the hash function
  • The load factor of the table
  • The number of collisions vs non-collisions
  • The algorithm for collision resolution
  • The amount of data in the table and how it's stored (by pointer/reference or directly within the buckets)
  • The access patterns to the data
  • The number of insertions/deletions vs retrievals
  • The need for resizing in a closed hashing/open addressing implementation
  • and many other factors...
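Just to put numbers on the load-factor point: for linear probing, the classic analysis (Knuth) predicts expected probe counts of roughly

```latex
C_{\text{hit}} \approx \frac{1}{2}\left(1 + \frac{1}{1-\alpha}\right),
\qquad
C_{\text{miss}} \approx \frac{1}{2}\left(1 + \frac{1}{(1-\alpha)^{2}}\right)
```

where α is the load factor. At α = 0.5 a failed lookup costs about 2.5 probes on average; at α = 0.9 it jumps to about 50. A single tuning decision, such as when to resize, can swing performance by an order of magnitude.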

With so many things to tweak and tune, it is difficult to come up with a generally high-performance (in both time and space) hash table without a significant amount of effort. That is why, if you are going to create a custom hash table instead of using one built into a standard library (such as .NET's), you should be ready to spend countless hours, and be aware that your finely tuned implementation may end up tuned only for the specific type and amount of data you are hashing.

Therefore, no, the .NET Dictionary is not the ultimate hash table for every specific purpose. But given how frequently dictionaries are used, I am sure the Microsoft BCL (Base Class Library) team did a huge amount of profiling to choose the approach they did for the general case.