SQL Server - Best Practices for Growing Database Files

You should be aiming to auto-grow as little as possible. Seven times a day is excruciating, even with instant file initialization.
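
As an aside, it's worth confirming that instant file initialization is actually in effect before you lean on it. On newer versions (2016 SP1 and later, if memory serves) sys.dm_server_services exposes it directly; a minimal check, assuming you're on a version that has the column:

-- Reports 'Y' when the engine service account holds "Perform volume
-- maintenance tasks" and data file growths can skip zero-initialization.
SELECT servicename,
       instant_file_initialization_enabled
FROM sys.dm_server_services;

Keep in mind instant file initialization only helps data files; log files always have to be zero-initialized, which is part of why log auto-grows hurt.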

Don't do a Shrink Database. Ever. Shrinkfile, maybe, but only after an extraordinary event. Shrinking it just to grow again is an exercise in futility and should actually be called auto-fragment.

If the recovery model is simple, there is no way on earth you should need to grow your log file by 250 GB. The used space in the file will clear itself out automatically over time, unless you started a transaction a month ago and have no intention of ever committing it or rolling it back.
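
If the log is growing under simple recovery anyway, check what is preventing it from clearing. A quick sketch:

-- Shows why each database's log cannot be truncated;
-- ACTIVE_TRANSACTION points at the kind of forgotten transaction above.
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases;

-- Reports the oldest active transaction in the current database, if any.
DBCC OPENTRAN;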

So my advice would be:

Grow the data file manually during a quiet period to a size that will accommodate several months of growth. What are you saving the space for in the meantime?
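
A one-off manual grow during a maintenance window might look like this (the database and logical file names are placeholders; check yours in sys.database_files first):

-- Find the logical file names and current sizes.
SELECT name, type_desc, size * 8 / 1024 AS size_mb
FROM YourDb.sys.database_files;

-- Grow the data file in one planned operation during a quiet period.
ALTER DATABASE YourDb
MODIFY FILE (NAME = N'YourDb_Data', SIZE = 200GB);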

Set the auto-growth increment for the data file to something relatively small (so that it doesn't interrupt users when it does happen), and alert on this event (you can catch it in the default trace, for example, or through Extended Events). This tells you that you are hitting the high point you estimated and that it is time to grow manually again. At that point you will want to handle the growth manually anyway, in case you want to add a new file/filegroup on a different drive to accommodate the space, since eventually you will fill the current drive.
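
A sketch of both pieces, assuming a database called YourDb: set a small fixed growth increment, then poll the default trace for auto-grow events (EventClass 92 and 93) so you can alert on them:

-- Small, fixed growth increment so an unexpected auto-grow is quick.
ALTER DATABASE YourDb
MODIFY FILE (NAME = N'YourDb_Data', FILEGROWTH = 256MB);

-- Read auto-grow events out of the default trace.
DECLARE @tracefile nvarchar(260);
SELECT @tracefile = path FROM sys.traces WHERE is_default = 1;

SELECT te.name                   AS event_name,   -- Data/Log File Auto Grow
       t.DatabaseName,
       t.FileName,
       t.StartTime,
       t.Duration / 1000         AS duration_ms,  -- Duration is in microseconds
       t.IntegerData * 8 / 1024  AS growth_mb     -- IntegerData is 8 KB pages
FROM sys.fn_trace_gettable(@tracefile, DEFAULT) AS t
JOIN sys.trace_events AS te
  ON te.trace_event_id = t.EventClass
WHERE t.EventClass IN (92, 93)                    -- 92 = data file, 93 = log file
ORDER BY t.StartTime DESC;

Newer versions also have a database_file_size_change Extended Event that captures the same information, if you prefer that route.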

Grow the log file manually to, say, twice the largest size it has ever reached. It shouldn't auto-grow further unless there is some abnormal transaction holding things up. You should monitor for these events as well, so that you know about them.
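
To know what "the largest it's ever been" actually is, you have to look; a minimal check of current log size and usage:

-- Log file size and percentage used for every database.
DBCC SQLPERF (LOGSPACE);

-- Or, for the current database only (SQL Server 2012 and later):
SELECT total_log_size_in_bytes / 1048576.0 AS log_size_mb,
       used_log_space_in_bytes / 1048576.0 AS used_mb,
       used_log_space_in_percent
FROM sys.dm_db_log_space_usage;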


Auto-growth is something that you should try to avoid if possible. The issue is that you have no control over when the growth happens, and your system can take a serious hit while it does.

Set your file size to something sensible for a month or so and monitor your growth rate from there. Work out how much space you estimate needing for X amount of time, and set your size to that plus a margin of error.
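
One way to monitor the growth rate is to snapshot file sizes on a schedule and compare over time; a minimal sketch (dbo.FileSizeHistory is a made-up table name):

-- Hypothetical history table; create once.
IF OBJECT_ID('dbo.FileSizeHistory') IS NULL
    CREATE TABLE dbo.FileSizeHistory
    (
        capture_date  datetime NOT NULL,
        database_name sysname  NOT NULL,
        file_name     sysname  NOT NULL,
        size_mb       int      NOT NULL
    );

-- Run this from a daily job, then diff the rows week over week.
INSERT INTO dbo.FileSizeHistory (capture_date, database_name, file_name, size_mb)
SELECT GETDATE(),
       DB_NAME(database_id),
       name,
       size * 8 / 1024           -- size is stored in 8 KB pages
FROM sys.master_files;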

I've set up a simple monitoring job that alerts me when the file size hits a predefined maximum, before auto-growth kicks in. You could use something like this:

-- Snapshot the size counters from sys.dm_os_performance_counters into a
-- global temp table, pivoted so each counter becomes a column.
IF OBJECT_ID('tempdb..##Logsize') IS NOT NULL
    DROP TABLE ##Logsize;

SELECT instance_name,
       [Data File(s) Size (KB)],
       [Log File(s) Size (KB)],
       [Log File(s) Used Size (KB)],
       [Percent Log Used]
INTO ##Logsize
FROM
(
    SELECT instance_name, counter_name, cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name IN
    (
        'Data File(s) Size (KB)',
        'Log File(s) Size (KB)',
        'Log File(s) Used Size (KB)',
        'Percent Log Used'
    )
      AND instance_name = 'the database you are interested in'  -- replace with your database name
) AS Src
PIVOT
(
    MAX(cntr_value)
    FOR counter_name IN
    (
        [Data File(s) Size (KB)],
        [Log File(s) Size (KB)],
        [Log File(s) Used Size (KB)],
        [Percent Log Used]
    )
) AS pvt;
GO

DECLARE @logsize int;
SELECT @logsize = [Percent Log Used] FROM ##Logsize;

IF @logsize > 90   -- the maximum percentage you want the log to reach
BEGIN
    -- Do your thing here: send an alert, grow the file, etc.
    PRINT 'Log usage threshold exceeded';
END;

DROP TABLE ##Logsize;
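
The placeholder inside the BEGIN...END could, for instance, send an alert through Database Mail, assuming you have a mail profile configured (the profile name and recipient here are made up):

EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'DBA Alerts',           -- placeholder profile name
     @recipients   = 'dba-team@example.com',
     @subject      = 'Log file nearing capacity',
     @body         = 'Percent Log Used has crossed the alert threshold.';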

This could, of course, be scheduled as a SQL Server Agent job.