Optimization: Moving variable declarations to the top of your procedure

No.

This either used to be true a long time ago (and no longer is, at least since SQL Server 2000), or it never was true and your DBA just confused his recommendation with the following one:

It is important to group together all DDL statements (like creating indexes) for temporary tables at the start of a stored procedure. By placing these DDL statements together, unnecessary compilations due to schema change can be avoided.

You can find another explanation of the reasoning behind this recommendation on this page.
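To make that recommendation concrete, here is a minimal sketch of the grouped-DDL pattern; the procedure, table, and column names are made up for illustration:

```sql
-- Hypothetical procedure illustrating the grouped-DDL pattern; all names are made up.
CREATE PROCEDURE dbo.usp_GetOrderSummary
    @CustomerId int
AS
BEGIN
    -- All temp-table DDL is placed up front, before any statement that uses the tables ...
    CREATE TABLE #orders (OrderId int NOT NULL, Total money NOT NULL);
    CREATE INDEX IX_orders_OrderId ON #orders (OrderId);

    -- ... followed by the DML. Interleaving CREATE TABLE / CREATE INDEX between these
    -- statements would cause extra recompilations partway through ("Schema changed").
    INSERT INTO #orders (OrderId, Total)
    SELECT OrderId, TotalDue
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;

    SELECT COUNT(*) AS OrderCount, SUM(Total) AS GrandTotal
    FROM #orders;
END;
```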

If we take a look at this Microsoft KB, we see that the cause of a stored procedure recompile can be one of the following (SQL Server 2005+):

  1. Schema changed.
  2. Statistics changed.
  3. Recompile DNR.
  4. Set option changed.
  5. Temp table changed.
  6. Remote rowset changed.
  7. For browse perms changed.
  8. Query notification environment changed.
  9. MPI view changed.
  10. Cursor options changed.
  11. With recompile option.
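If you want to see which of these reasons is actually firing on your system, one lightweight way is an Extended Events session on the statement-recompile event. A minimal sketch, assuming SQL Server 2012 or later and permission to create event sessions; the session name is made up:

```sql
-- Hypothetical monitoring session; the name TrackRecompiles is made up.
CREATE EVENT SESSION TrackRecompiles ON SERVER
ADD EVENT sqlserver.sql_statement_recompile
(
    ACTION (sqlserver.sql_text, sqlserver.database_name)
)
ADD TARGET package0.ring_buffer;
GO

ALTER EVENT SESSION TrackRecompiles ON SERVER STATE = START;
-- The event's recompile_cause field reports which of the reasons listed above fired.
```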

Declaring a variable -- even a table variable (e.g. @table_variable) -- cannot trigger any of these events, because declaring a variable is not DDL. A variable (even a table variable) is a temporary object used exclusively by your T-SQL code; that is why table variables get no statistics and are not bound by transactions. Declaring a variable, table or not, cannot trigger a proc recompile.

Creating a temp table (e.g. #temp_table) or an index, however, is DDL that affects the physical definition of the database. Temp tables and indexes are "real" objects with statistics and transactional control, so creating them can fire events 1, 2, or 5 from the list above and thus trigger a proc recompile.
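To make the distinction concrete, here is a short, hedged sketch; the names are purely illustrative:

```sql
-- A table variable: a declaration, not DDL. It gets no statistics, so none of the
-- schema/statistics/temp-table recompile reasons can fire because of it.
DECLARE @ids TABLE (Id int NOT NULL PRIMARY KEY);

-- A temp table: real DDL against tempdb. It gets statistics and transactional control,
-- and creating it (or adding an index to it) can fire reasons 1, 2, or 5 above.
CREATE TABLE #ids (Id int NOT NULL PRIMARY KEY);
CREATE INDEX IX_ids ON #ids (Id);
```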


Declaring a variable at the top of the procedure rather than halfway down should make no difference: it does not reduce compile locks and it does not cause fewer recompilations. I happen to declare mine at the top for readability more often than not.

To get at the "what is my DBA thinking" part of the question, the only thing I can come up with (other than Nick's point that they may be thinking of how something used to behave) is that perhaps they were talking about parameter sniffing (see Option 2 at this link on Simple Talk).
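For what it's worth, the variable-related workaround most often associated with parameter sniffing is copying the parameter into a local variable. I'm not certain that is exactly what the linked article calls Option 2, and the names below are made up:

```sql
-- Hypothetical sketch of the local-variable workaround for parameter sniffing.
CREATE PROCEDURE dbo.usp_OrdersByCustomer
    @CustomerId int
AS
BEGIN
    -- The optimizer cannot "sniff" a local variable's value at compile time, so the
    -- plan is built from average density rather than from one possibly atypical value.
    DECLARE @LocalCustomerId int = @CustomerId;

    SELECT OrderId, TotalDue
    FROM dbo.Orders
    WHERE CustomerId = @LocalCustomerId;
END;
```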

About your blocking: if you are seeing true blocking, it is most likely not the kind of compile-lock contention your DBA is talking about. It is true that certain things contribute to compile locks (not schema-qualifying tables, or not schema-qualifying your stored procedure calls, for instance), but they are certainly not the cause of your high reads and probably not the cause of your blocking either.

You should still do what you can to avoid compile locks, but tuning and optimizing the rest of the stored procedure code is a more important task than worrying about where the variables are declared. You can also read How to identify and resolve compile locks if you want to verify you aren't experiencing issues here.
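Two small illustrations of those points; the object names are made up, and the second query assumes compile locks surface in sys.dm_tran_locks with a resource_subtype of COMPILE, which is how I've seen them reported:

```sql
-- 1) Schema-qualify procedure calls so the cached plan can be matched directly,
--    instead of taking a compile lock while the name is resolved per caller.
EXEC dbo.usp_GetOrderSummary @CustomerId = 42;   -- qualified: preferred
EXEC usp_GetOrderSummary @CustomerId = 42;       -- unqualified: can contribute to compile locks

-- 2) Check whether any sessions are currently waiting on compile locks.
SELECT request_session_id, resource_type, resource_subtype, request_mode, request_status
FROM sys.dm_tran_locks
WHERE resource_subtype = N'COMPILE';
```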

Post those before/after examples and we'll see what the DBA is driving at here.