Is there a "best-practices" type process for developers to follow for database changes?

In a Visual Studio environment, I've always used database projects to implement the update scripts. I tend to use unimaginative names like "DatabaseUpdate17.sql" or "PriceUpdateFebruary2010.sql" for my scripts. Having them as database projects lets me tie them to Team Foundation Server tasks and bugs (and, if we did code reviews, to those as well). I also include in each database (that I have authority over) a table specifically for recording changes to the schema.

CREATE TABLE [dbo].[AuditDDL](
    [EventID] [int] IDENTITY(1,1) PRIMARY KEY NOT NULL,
    [EventData] [xml] NULL,                     -- what did they do
    [EventUser] [varchar](100) NOT NULL,        -- who did it
    [EventTime] [datetime] DEFAULT (getdate())  -- when did they do it
)
GO

Well, that takes care of three of the six Ws (what, who, and when).

CREATE TRIGGER [trgAuditDDL]
ON DATABASE 
FOR DDL_DATABASE_LEVEL_EVENTS 
AS
INSERT INTO AuditDDL(EventData, EventUser)
SELECT EVENTDATA(), original_login()
GO

I include an insert statement to log the beginning of each patch as well as the end of each patch. Events that happen outside of a patch are things to look into.

For example, a "begin patch" insert for "patch 17" would look like:

INSERT INTO [dbo].[AuditDDL]
           ([EventData]
           ,[EventUser])
     VALUES
           ('<EVENT_INSTANCE><EventType>BEGIN PATCH 17</EventType></EVENT_INSTANCE>'
           ,ORIGINAL_LOGIN())
GO
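The matching "end patch" insert follows the same pattern; the EventType text is just a convention of mine, since nothing parses it:

INSERT INTO [dbo].[AuditDDL]
           ([EventData]
           ,[EventUser])
     VALUES
           ('<EVENT_INSTANCE><EventType>END PATCH 17</EventType></EVENT_INSTANCE>'
           ,ORIGINAL_LOGIN())
GO

Any DDL event logged between an END marker and the next BEGIN marker happened outside a patch and deserves a closer look.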

Since the trigger also catches index rebuilds, you'll need to run the following every month or so to clear out those events:

DELETE FROM AuditDDL
WHERE [EventData].exist('/EVENT_INSTANCE/EventType/text()[fn:contains(.,"ALTER_INDEX")]') = 1
GO

DELETE FROM AuditDDL
WHERE [EventData].exist('/EVENT_INSTANCE/EventType/text()[fn:contains(.,"UPDATE_STATISTICS")]') = 1
GO
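Once the noise is cleared out, a quick query like this (a sketch; widen the nvarchar if your event names run long) shows what's left, so you can eyeball anything that landed outside a patch:

SELECT [EventTime],
       [EventUser],
       [EventData].value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(128)') AS [EventType]
FROM [dbo].[AuditDDL]
ORDER BY [EventTime]
GO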

An earlier version of this answer was posted on Server Fault.

In a SOX- and PCI-DSS-compliant environment, you will never have access to the production servers, so the scripts need to be clear and exercised beforehand. The comments at the top of each update script list the new tables, stored procedures, functions, etc., as well as the modified ones. If data gets modified, explain what is being modified and why.
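For example, a script header might look like this (the object names here are made up purely for illustration):

-- DatabaseUpdate17.sql  (patch 17)
-- New:      dbo.PriceTier (table), dbo.usp_GetPriceTier (proc)  -- example names
-- Modified: dbo.Product (added TierID column)
-- Data:     backfill dbo.Product.TierID = 1 for existing rows,
--           so existing products land in the default price tier

Whoever runs the script in production can then verify what it should touch without having to reverse-engineer the SQL.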

A secondary problem is that you sometimes end up manually coordinating changes when two separate tasks modify the same database object. That may just be the way it is, but it still seems like there should be some automated way of flagging these collisions.

I've never come across a tool that tracks this automatically. Previous employers used the principle of a "database owner": one and only one person who is personally in charge of each database. That person won't be the only developer working against the database, but all changes have to go through them. This has worked reasonably well at keeping changes from colliding with and damaging each other.


Have you looked at SQL Source Control? You can use it to connect your SQL Server databases to TFS, SVN, Vault, or VSS: http://www.red-gate.com/products/sql-development/sql-source-control/


Another solution is to use a modeling tool such as PowerDesigner or ERWin to design and manage changes to your database.

We're starting to transition to a policy where databases are modeled in PowerDesigner. All changes to the database structure and code are made in the model, which is checked into source control; change scripts are then generated from the model to implement the changes in the database. These change scripts are also checked into source control. Large changes are peer reviewed, which PowerDesigner makes very easy with its built-in features.

PowerDesigner is a generic modeling tool supporting more than just databases, so we're starting to use it to manage requirements and to create conceptual, physical, and architecture diagrams (object-oriented models too). Basically, we're using it to provide the backbone of our software engineering process.

(I'm in no way affiliated with Sybase, who developed PowerDesigner - just thought I'd throw that in there).