T-SQL: If parameter1 is null, use parameter2

The simplest answer to your question involves telling SQL Server exactly what to look for by placing parentheses around grouped conditions. So, for instance, if we only want to check Field2 when @parameter1 IS NULL, we could do this:

WHERE (@parameter1 IS NULL AND Field2 = @parameter2)
    OR (Field1 = @parameter1)

SQL Server will only evaluate Field2 = @parameter2 when @parameter1 IS NULL. When @parameter1 is not null, SQL Server will only return rows where Field1 matches it.
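
As a minimal, self-contained sketch of that pattern (dbo.YourTable is a hypothetical stand-in for your actual table; Field1, Field2, and the parameter names follow the question):

DECLARE @parameter1 NVARCHAR(30) = NULL;          -- try swapping which parameter is NULL
DECLARE @parameter2 NVARCHAR(30) = N'some value';

SELECT Field1, Field2
FROM dbo.YourTable                                -- hypothetical table name
WHERE (@parameter1 IS NULL AND Field2 = @parameter2)
    OR (Field1 = @parameter1);

Swap which DECLARE gets the NULL to exercise each branch of the WHERE clause.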

If you have a lot of rows in the source table, you may want to use a stored procedure to help SQL Server optimize each of the possible search paths. The following example creates a table, populates a couple of rows, and creates a procedure to query the table.

I'm doing this in tempdb since this is just an example:

USE tempdb;

Create the table, and populate it with a couple of rows:

IF OBJECT_ID('dbo.SomeTable') IS NOT NULL
    DROP TABLE dbo.SomeTable;
GO
CREATE TABLE dbo.SomeTable
(
    SomeID INT NOT NULL
        CONSTRAINT PK_SomeTable
        PRIMARY KEY CLUSTERED
        IDENTITY(1,1)
    , Column1 NVARCHAR(30) NULL
    , Column2 NVARCHAR(30) NULL
    , Column3 NVARCHAR(200) NOT NULL
);
INSERT INTO dbo.SomeTable (Column1, Column2, Column3)
VALUES (NULL, 'col2_value', 'some data for row 1')
    , ('col1_value', NULL, 'some data for row 2');

Since I know the precise patterns my queries against this table will take, I can create a couple of simple covering indexes that will allow SQL Server to very quickly find the relevant rows:

CREATE INDEX IX_SomeTable_c1 ON dbo.SomeTable(Column1) INCLUDE (Column2, Column3);
CREATE INDEX IX_SomeTable_c2 ON dbo.SomeTable(Column2) INCLUDE (Column1, Column3);
GO

Next, add an extra 10,000 rows to the table, just so there is enough data for the query optimizer to consider using the indexes:

INSERT INTO dbo.SomeTable (Column1, Column2, Column3)
SELECT TOP(10000) LEFT(o1.name, 30), RIGHT(o2.name, 30), o3.name
FROM sys.objects o1
    CROSS JOIN sys.objects o2
    CROSS JOIN sys.objects o3;

Here, I'm creating a procedure that will search our table on either Column1 or Column2, depending on which parameter is passed in.

IF OBJECT_ID('dbo.MySearch') IS NOT NULL
    DROP PROCEDURE dbo.MySearch;
GO
CREATE PROCEDURE dbo.MySearch
(
    @parameter1 NVARCHAR(30) = NULL
    , @parameter2 NVARCHAR(30) = NULL
)
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql NVARCHAR(MAX);
    DECLARE @where NVARCHAR(100);
    DECLARE @params NVARCHAR(100);
    SET @sql = 'SELECT t.Column1
    , t.Column2
    , t.Column3
FROM dbo.SomeTable t
';
    IF @parameter1 IS NULL
    BEGIN
        SET @params = '@p2 NVARCHAR(30)';
        SET @where = 'WHERE t.Column2 = @p2';
        SET @sql = @sql + @where;
        EXEC sys.sp_executesql @sql, @params, @p2 = @parameter2;
    END
    IF @parameter2 IS NULL
    BEGIN
        SET @params = '@p1 NVARCHAR(30)';
        SET @where = 'WHERE t.Column1 = @p1';
        SET @sql = @sql + @where;
        EXEC sys.sp_executesql @sql, @params, @p1 = @parameter1;
    END
END;
GO

If you pass in non-null values for both @parameter1 and @parameter2, neither IF block runs and the stored procedure returns nothing. You may want to add a third IF statement to the stored procedure to handle that situation, something like:

    IF @parameter1 IS NOT NULL AND @parameter2 IS NOT NULL
    BEGIN
        SET @params = '@p1 NVARCHAR(30), @p2 NVARCHAR(30)';
        SET @where = 'WHERE t.Column1 = @p1 AND t.Column2 = @p2';
        SET @sql = @sql + @where;
        EXEC sys.sp_executesql @sql, @params, @p1 = @parameter1, @p2 = @parameter2;
    END

Without knowing your exact requirements, it's difficult to say if you need that.

Here are two samples showing the usage of the stored procedure, along with their execution plans:

EXEC dbo.MySearch @parameter1 = 'col1_value';

(execution plan screenshot)

EXEC dbo.MySearch @parameter2 = 'col2_value';

(execution plan screenshot)

Using a stored procedure in combination with dynamic SQL allows the query designer (us) to help SQL Server consistently choose the best plan for the job by providing two independent, cacheable plans, each optimized for its own scenario. Depending on the cardinality of each column involved, the complexity of the actual query, and the presence of suitable indexes, SQL Server may decide to create vastly different plans for the two queries, which can be a Very Good Thing™.
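
If you want to confirm that the two code paths really do get their own cached plans, one rough way to check (the LIKE filter is just a convenient way to pick out our statements) is to peek at the plan cache after running both samples above:

SELECT cp.usecounts
    , cp.objtype
    , st.text
FROM sys.dm_exec_cached_plans cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) st
WHERE st.text LIKE N'%FROM dbo.SomeTable t%'
    AND st.text NOT LIKE N'%dm_exec_cached_plans%';  -- exclude this query itself

You should see a separate cached entry for each WHERE clause.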

Just for fun, here's what happens with the "simple" WHERE clause I show at the top of my answer:

DECLARE @parameter1 NVARCHAR(30) = 'col1_value';
DECLARE @parameter2 NVARCHAR(30);
SELECT t.Column1
    , t.Column2
    , t.Column3
FROM dbo.SomeTable t
WHERE (@parameter1 IS NULL AND t.Column2 = @parameter2)
    OR (t.Column1 = @parameter1);

Running that query results in a plan like this:

(execution plan screenshot)

This clearly shows SQL Server searching both the Column1 and Column2 indexes. That's rather obviously not optimal, and would certainly not be the preferred route for larger sets of data.


If I understood you correctly, this should do the job:

SELECT ...
 WHERE 
    (Field2 = @Parameter2 AND @Parameter1 IS NULL) OR 
    (Field1 = @Parameter1 AND @Parameter2 IS NULL) 

However, I'd rather not do such things because of potential performance issues. The execution plan for this query depends to a great extent on the parameter values. For instance, passing NULL as @Parameter1 should incline the optimizer towards using an index on Field2, while passing NULL as @Parameter2 should incline it towards an index on Field1.

To overcome this, you'd need to force recompilation every time the query runs (for example, with OPTION (RECOMPILE)). Even if later releases of SQL Server gain something similar to Oracle's bind awareness, it is much easier to have two separate queries than to rely on optimizer magic.
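
For reference, a rough sketch of the recompile approach, reusing the Field1/Field2 and parameter names from above (dbo.YourTable is a placeholder for your actual table):

DECLARE @Parameter1 NVARCHAR(30) = N'value1';  -- sample values; swap the NULL to test each path
DECLARE @Parameter2 NVARCHAR(30) = NULL;

SELECT Field1, Field2
FROM dbo.YourTable                             -- placeholder table name
WHERE (Field2 = @Parameter2 AND @Parameter1 IS NULL)
    OR (Field1 = @Parameter1 AND @Parameter2 IS NULL)
OPTION (RECOMPILE);                            -- compile a fresh plan for the actual values each run

OPTION (RECOMPILE) lets the optimizer see the actual parameter values on every execution, at the cost of compiling the statement each time.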

For instance, you can modify your procedure or function and put the branching logic there:

IF  @Parameter1 IS NULL 
 BEGIN
    SELECT ... WHERE Field2 = @Parameter2;
    ...
 END
ELSE 
 BEGIN
    SELECT ... WHERE Field1 = @Parameter1;
    ...
 END;
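
As a concrete (hypothetical) sketch of that idea, again reusing the Field1/Field2 names from the question and a placeholder table name:

CREATE PROCEDURE dbo.SearchByEitherField
(
    @Parameter1 NVARCHAR(30) = NULL
    , @Parameter2 NVARCHAR(30) = NULL
)
AS
BEGIN
    SET NOCOUNT ON;
    IF @Parameter1 IS NULL
    BEGIN
        -- Only Field2 is filtered, so the optimizer is free to use an index on Field2.
        SELECT Field1, Field2
        FROM dbo.YourTable      -- placeholder table name
        WHERE Field2 = @Parameter2;
    END
    ELSE
    BEGIN
        -- Only Field1 is filtered, so the optimizer is free to use an index on Field1.
        SELECT Field1, Field2
        FROM dbo.YourTable
        WHERE Field1 = @Parameter1;
    END
END;
GO

Each branch gets its own plan, much like the dynamic SQL version above, but without building the statement as a string.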

Further reading:

Dynamic Search Conditions in T-SQL by Erland Sommarskog