How do I run a large script with many inserts without running out of memory?

The maximum batch size for SQL Server 2005 is 65,536 * Network Packet Size (NPS), where NPS is usually 4 KB. That works out to 256 MB. Dividing 256 MB by 45,000 statements means your insert statements would have to average about 5.8 KB each to hit that limit. That doesn't seem right, but maybe there are extraneous spaces or something unusual in there.

My first suggestion would be to put a "GO" statement after every INSERT statement. This breaks your single batch of 45,000 INSERT statements into 45,000 separate one-statement batches, which should be easier to digest. Be careful: if one of those inserts fails, you may have a hard time finding the culprit, so you might want to protect yourself with a transaction. You can add the GO statements quickly if your editor has a good search-and-replace (one that lets you match and replace newline characters like \r\n) or a macro facility.
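
For example, a sketch of what the script would look like afterwards (dbo.MyTable and its columns here are placeholders for your actual table):

-- Each INSERT is now its own batch, separated by GO.
INSERT dbo.MyTable (id, name) VALUES (1, 'first row');
GO
INSERT dbo.MyTable (id, name) VALUES (2, 'second row');
GO
-- ... and so on for the remaining rows ...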

The second suggestion is to use the SQL Server Import and Export Wizard to import the data straight from Excel. The wizard builds a small SSIS package for you behind the scenes and then runs it, so it won't have this problem.


BULK INSERT or bcp seem like more appropriate options than 45,000 individual insert statements.
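
For instance, if you can dump the spreadsheet to a CSV file, a minimal BULK INSERT might look like this (the table name, file path, and delimiters are assumptions you would adjust to match your data):

BULK INSERT dbo.MyTable
FROM 'C:\import\data.csv'
WITH (
    FIELDTERMINATOR = ',',  -- column delimiter in the file
    ROWTERMINATOR = '\n',   -- row delimiter
    FIRSTROW = 2            -- skip a header row, if the file has one
);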

If you need to stick with the insert statements, I would consider a few options:

A: Use transactions and wrap batches of 100, 500, or 1,000 statements in each one to minimize the impact on the transaction log and keep each batch small, e.g.

BEGIN TRANSACTION;
INSERT dbo.table(a, ...) SELECT 1, ...
INSERT dbo.table(a, ...) SELECT 2, ...
...
INSERT dbo.table(a, ...) SELECT 500, ...
COMMIT TRANSACTION;
GO

BEGIN TRANSACTION;
INSERT dbo.table(a, ...) SELECT 501, ...
INSERT dbo.table(a, ...) SELECT 502, ...
...
INSERT dbo.table(a, ...) SELECT 1000, ...
COMMIT TRANSACTION;
GO

B: Instead of individual insert statements, combine 100 or 500 rows at a time using UNION ALL, e.g.

INSERT dbo.table(a, ...)
SELECT 1, ...
UNION ALL SELECT 2, ...
...
UNION ALL SELECT 500, ...
GO

INSERT dbo.table(a, ...)
SELECT 501, ...
UNION ALL SELECT 502, ...
...
UNION ALL SELECT 1000, ...
GO

I've left error handling out for brevity, but the point is that I would never try to send a single batch of 45,000 individual statements to SQL Server.
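
If you do want that error handling back, one sketch (again using a hypothetical dbo.MyTable) is to wrap each batch in TRY/CATCH so a failed batch rolls back and tells you why it died:

BEGIN TRY
    BEGIN TRANSACTION;

    INSERT dbo.MyTable (id, name) VALUES (1, 'first row');
    INSERT dbo.MyTable (id, name) VALUES (2, 'second row');
    -- ... rest of this batch ...

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
    -- Report the failure so you can find the offending batch.
    SELECT ERROR_MESSAGE() AS FailureReason;
END CATCH;
GO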


I am not sure why you are getting the out-of-memory error, but there is an easier approach.

If you can export the data from the spreadsheet to a delimited format (e.g. CSV), you can use the Import Data wizard in SSMS to insert the data for you:

(Screenshot: the SSMS Import Data task, reached by right-clicking the database and choosing Tasks > Import Data.)