Fast way to get size of very large folders

Since the directory sizes are usually stable within a day, and you need the size information quickly at some point during the day, the best approach is an automated job that gathers the information ahead of time. If your user has the required read access, it can be as simple as adding this one line to your own crontab (crontab -e) to run nightly at 2:30 am:

30 2 * * * ( date ; du --summarize path1 path2 ; date ) >> $HOME/du_out.txt

As you gain confidence in the results, you can extend this into a script, perhaps run from the system-wide crontab so other administrators can maintain it. The script could also record creation or access dates and automatically delete the oldest and largest files when available space runs low (presumably you already have a mechanism by which important directories that should be kept longer are "promoted" to another location).
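As a starting point for such a script, here is a minimal sketch of the "find cleanup candidates" step. The function name, paths, and thresholds are all placeholders, and it deliberately only lists candidates rather than deleting anything; it also assumes GNU find (for -printf):

```shell
#!/bin/sh
# Hypothetical helper: print the largest files under a directory that
# have not been modified in a given number of days -- candidates for
# cleanup when free space runs low. It only lists; it never deletes.
list_stale_candidates() {
    target=$1          # directory to scan
    max_age_days=$2    # only consider files older than this many days
    top_n=$3           # how many candidates to print
    # GNU find emits "size-in-bytes path"; sort largest first.
    find "$target" -type f -mtime +"$max_age_days" -printf '%s %p\n' \
        | sort -rn | head -n "$top_n"
}
```

You could call it from the nightly cron job, e.g. `list_stale_candidates /srv/builds 30 10 >> "$HOME/cleanup_candidates.txt"`, review the output for a while, and only wire in actual deletion once you trust the results.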

Of course, if you already have a comprehensive build system such as Jenkins, forget everything I said about cron and create this as a job there.