Is there any difference between a predicate using a literal value or a parameter value when that value will always be the same?

Alright, assuming you are talking about a local variable in an ad hoc query run from SSMS, since it hasn't been specified otherwise: even if you use the same value in AND UserStatus = @userStatus that you would use in the literal AND UserStatus = 1, you will see a difference in your execution plan because of how the cardinality estimate is generated.
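For concreteness, here is a minimal sketch of the two forms being compared (the dbo.Users table and column are hypothetical):

```sql
-- Literal: the value is known at compile time, so the optimizer
-- can use the statistics histogram on UserStatus.
SELECT *
FROM dbo.Users          -- hypothetical table
WHERE UserStatus = 1;

-- Local variable: the value is not known when the plan is compiled,
-- so the optimizer falls back to the density vector instead.
DECLARE @userStatus int = 1;

SELECT *
FROM dbo.Users
WHERE UserStatus = @userStatus;
```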

When you use a literal value, SQL Server will go out to the histogram for the statistics on that column and see where that value fits among the histogram steps. The estimate gathered from that will result in one of two scenarios.

HISTOGRAM DIRECT HIT Essentially this means there is a RANGE_HI_KEY (the upper column value for a step in the histogram) equal to the specific literal value in your query, and therefore the estimate will match the EQ_ROWS (the number of rows whose value equals the RANGE_HI_KEY) for that step. This means your estimate will be the number of rows that matched that value as of the last time you updated statistics.
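You can inspect the histogram yourself with DBCC SHOW_STATISTICS; the table and statistics names below are assumptions for illustration:

```sql
-- Show the histogram steps for a statistics object on UserStatus
-- (the name IX_Users_UserStatus is hypothetical).
DBCC SHOW_STATISTICS ('dbo.Users', 'IX_Users_UserStatus') WITH HISTOGRAM;

-- If a step comes back with RANGE_HI_KEY = 1, then a query with the
-- literal UserStatus = 1 is a direct hit, and the estimate is that
-- step's EQ_ROWS value.
```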

HISTOGRAM INTRA-STEP HIT This is when the value falls between two RANGE_HI_KEY values. When your literal value is inside that range, the estimate is calculated from the RANGE_ROWS (the number of rows between two histogram steps) and DISTINCT_RANGE_ROWS (the number of distinct values within that step), which together give AVG_RANGE_ROWS (RANGE_ROWS / DISTINCT_RANGE_ROWS), and that average is your estimate.
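A quick worked example with hypothetical histogram numbers makes the intra-step arithmetic concrete:

```sql
-- Hypothetical step: the histogram has steps at RANGE_HI_KEY = 5 and
-- RANGE_HI_KEY = 10, with RANGE_ROWS = 1000 rows strictly between them
-- spread across DISTINCT_RANGE_ROWS = 4 distinct values.
--
-- AVG_RANGE_ROWS = RANGE_ROWS / DISTINCT_RANGE_ROWS = 1000 / 4 = 250
--
-- A literal such as UserStatus = 7 falls inside that step, so the
-- cardinality estimate is AVG_RANGE_ROWS:
SELECT 1000.0 / 4 AS EstimatedRows;  -- 250
```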

However, when you run with a local variable you will no longer go to the histogram for these values, since the value of @Variable isn't known at compile time, when the plan is built.

For more information on this topic I recommend reading this white paper by Joe Sack.

DENSITY VECTOR When there is no specific value to work with, SQL Server instead uses density to determine the estimated number of rows returned for that predicate. Density is 1 divided by the number of distinct values in that column, so your cardinality estimate will be density * the number of rows in the table.
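As a sketch with hypothetical numbers, suppose dbo.Users has 1,000,000 rows and UserStatus has 4 distinct values:

```sql
-- Density = 1 / 4 = 0.25
-- Estimate = Density * row count = 0.25 * 1,000,000 = 250,000
SELECT (1.0 / 4) * 1000000 AS EstimatedRows;  -- 250000

-- The density SQL Server actually stores ("All density") is visible via:
DBCC SHOW_STATISTICS ('dbo.Users', 'IX_Users_UserStatus') WITH DENSITY_VECTOR;
```

Note that this estimate is the same no matter which value the variable holds, which is exactly why the variable and literal plans differ.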

So, long story short: no. Even if you run the same value over and over with a local variable, you will not get the same estimate you would get with the literal, for reasons explained further in the link Eric provided from Kendra Little.


A literal value is known to the optimizer (so it can estimate selectivity based on that value).

A parameter value (stored procedure, function) is sniffed when the plan is compiled, on the first execution. A subsequent execution might pass some other value, for which the previously compiled plan might not be optimal.
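A minimal sketch of that sniffing behavior, assuming a hypothetical dbo.Users table and procedure name:

```sql
-- The plan for this procedure is compiled using the @userStatus value
-- sniffed on the first execution; later calls reuse that plan.
CREATE OR ALTER PROCEDURE dbo.GetUsersByStatus
    @userStatus int
AS
BEGIN
    SELECT *
    FROM dbo.Users
    WHERE UserStatus = @userStatus;
END;
GO

EXEC dbo.GetUsersByStatus @userStatus = 1;  -- plan compiled for value 1
EXEC dbo.GetUsersByStatus @userStatus = 2;  -- reuses the plan built for 1
```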

For a variable, the value is not known to the optimizer. It might be able to look at density (on average we have this many rows per value), or it uses hard-wired estimates (like, for instance, > resulting in 30% selectivity, or whatever those hard-wired percentages might be).