Large number of tables and Hibernate memory consumption

I would suggest profiling the application in production or staging with JavaMelody to find out where, and in which code paths, the most memory is being consumed, and then deciding what to change in the application based on the profiling results.

JavaMelody is easy to integrate and configure, and in production you can enable or disable it just by updating web.xml.
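
If you would rather not edit web.xml by hand, or want to see what those entries correspond to, the same filter can also be registered programmatically with the Servlet 3.0 API. This is only a minimal sketch; it assumes JavaMelody's standard net.bull.javamelody.MonitoringFilter class and a Servlet 3.0 container:

    import java.util.EnumSet;
    import javax.servlet.DispatcherType;
    import javax.servlet.FilterRegistration;
    import javax.servlet.ServletContextEvent;
    import javax.servlet.ServletContextListener;
    import javax.servlet.annotation.WebListener;

    // Registers the JavaMelody monitoring filter at startup (Servlet 3.0),
    // equivalent to declaring the filter and its /* mapping in web.xml.
    @WebListener
    public class MonitoringSetup implements ServletContextListener {

        @Override
        public void contextInitialized(ServletContextEvent sce) {
            FilterRegistration.Dynamic monitoring = sce.getServletContext()
                    .addFilter("javamelody", "net.bull.javamelody.MonitoringFilter");
            monitoring.addMappingForUrlPatterns(
                    EnumSet.of(DispatcherType.REQUEST, DispatcherType.ASYNC), true, "/*");
        }

        @Override
        public void contextDestroyed(ServletContextEvent sce) {
            // Nothing to clean up.
        }
    }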


An open Hibernate session will tend to accrue objects as it is used. This is not a memory leak; a Hibernate session is designed to be used for a single unit of work (typically one request), and it caches persistent objects (i.e. objects that live within the session), as well as queries and other data. If you call session.toString(), you will see a laundry list of the objects the session is holding.

If you work with a very large number of objects, consider handling them in batches. After each batch you can flush any pending changes and call session.clear() to evict cached data and persistent objects from the session, which reduces the session's memory footprint (sometimes dramatically).
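
As a rough sketch of that pattern (the Customer entity and the batch size are placeholders for illustration), flushing and clearing every N objects keeps the session from growing without bound:

    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;

    // Hypothetical entity, used only to illustrate the batching pattern.
    @Entity
    class Customer {
        @Id
        @GeneratedValue
        private Long id;
        private String name;

        Customer() { }
        Customer(String name) { this.name = name; }
    }

    public class BatchInsertExample {

        // Ideally matches hibernate.jdbc.batch_size if you have configured it.
        private static final int BATCH_SIZE = 50;

        public static void insertCustomers(SessionFactory sessionFactory, int count) {
            Session session = sessionFactory.openSession();
            Transaction tx = session.beginTransaction();
            try {
                for (int i = 1; i <= count; i++) {
                    session.save(new Customer("customer-" + i));

                    if (i % BATCH_SIZE == 0) {
                        // Push the pending inserts to the database, then evict
                        // the now-persistent objects so the session stays small.
                        session.flush();
                        session.clear();
                    }
                }
                tx.commit();
            } catch (RuntimeException e) {
                tx.rollback();
                throw e;
            } finally {
                session.close();
            }
        }
    }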

After calling session.clear(), be aware that objects loaded before the call become detached: they are no longer managed by the current session, and changes made to them will not be persisted unless you reattach them.
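
If you need to keep working with one of those detached objects, you can reattach it. A minimal sketch, reusing the hypothetical Customer entity from above:

    import org.hibernate.Session;

    public class ReattachExample {

        // 'customer' was loaded before session.clear() and is therefore detached.
        static Customer reattach(Session session, Customer customer) {
            // merge() copies the detached state onto a managed instance and
            // returns that managed copy; the original argument stays detached.
            return (Customer) session.merge(customer);
        }
    }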

You can also use lazy fetching to limit the amount of data Hibernate has to load for a given operation; the Hibernate documentation covers the fetching strategies in detail. I would also recommend enabling Hibernate's SQL logging and checking whether Hibernate is pulling back data that it doesn't need.
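
Here is a minimal sketch of turning SQL logging on when building the SessionFactory programmatically (these are standard Hibernate property names; in most projects they would go in hibernate.cfg.xml or persistence.xml instead):

    import org.hibernate.SessionFactory;
    import org.hibernate.cfg.Configuration;

    public class SqlLoggingExample {

        static SessionFactory buildSessionFactory() {
            return new Configuration()
                    .configure() // reads hibernate.cfg.xml from the classpath
                    .setProperty("hibernate.show_sql", "true")          // echo every SQL statement
                    .setProperty("hibernate.format_sql", "true")        // pretty-print the SQL
                    .setProperty("hibernate.use_sql_comments", "true")  // note which HQL/criteria produced it
                    .buildSessionFactory();
        }
    }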

You can also configure Hibernate to gather statistics that can help you:

    sessionFactory.getStatistics().setStatisticsEnabled(true);
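
Once statistics are enabled, you can read them back or dump them to the log. A small sketch of what that might look like:

    import org.hibernate.SessionFactory;
    import org.hibernate.stat.Statistics;

    public class StatisticsExample {

        // Print a few of the gathered numbers; call this periodically or
        // after a suspicious operation.
        static void dumpStatistics(SessionFactory sessionFactory) {
            Statistics stats = sessionFactory.getStatistics();

            System.out.println("Entities loaded:      " + stats.getEntityLoadCount());
            System.out.println("Queries executed:     " + stats.getQueryExecutionCount());
            System.out.println("Slowest query (ms):   " + stats.getQueryExecutionMaxTime());
            System.out.println("2nd-level cache hits: " + stats.getSecondLevelCacheHitCount());

            // Or simply log everything Hibernate has collected so far.
            stats.logSummary();
        }
    }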