Posts Tagged 'SQL Server'

Database Code Deployments

One of the responsibilities of my current position as a SQL Developer is to prepare the scripts for deployment. For quality assurance reasons we store all of the database objects in a source control system, so each object is stored in a separate file. This can make deployments very tedious, depending on the volume of files.

Disclaimer

These processes are my processes for an internally managed application. These scripts and processes are not distributed in any way, and quite frankly are not suited for that purpose.

This does not cover any T-SQL best practices. There are far better articles written by far more knowledgeable people covering these topics. Click here for a list of best practice articles.

The Development

Here are my general guidelines for deployment, in no particular order:

1. Each script must be able to be executed multiple times without error.

There is nothing worse than running a script to test a deployment and then getting an error because an object already exists, or a key violation because the data already exists. All scripts must check for the existence of the final state before doing anything else. For stored procedures, this may mean a conditional drop and then a create; for tables, this may mean checking for column existence before running an alter statement.
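
For example, a table change might be guarded like this rough sketch (the table and column names are made up):

IF NOT EXISTS ( SELECT
                    1
                FROM
                    sys.columns
                WHERE
                    object_id = OBJECT_ID(N'dbo.Customer') AND
                    name = N'MiddleName' )
    ALTER TABLE dbo.Customer ADD MiddleName varchar(50) NULL ;
GO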

2. Object permissions must be part of the object script.

Your mileage may vary with this one, but I am a fan of explicit permission granted to a database role for each database object that is being deployed. For newer architectures your roles may have blanket permissions to a schema, but I have not seen many instances of security set up this way.

Having the permission set at the end of the script (just in case you drop and re-create it :-)) will save a lot of headache if you ever need to set up the application in a new environment. NOTE: Please do not forget the batch separator (i.e. “GO”) before the grant. I hate seeing errors where a stored procedure appears to grant execute permissions to itself because the GRANT was absorbed into the procedure body.
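
Putting guidelines #1 and #2 together, a stored procedure script might look something like this sketch (the procedure, table and role names are placeholders):

IF EXISTS ( SELECT
                1
            FROM
                sys.objects
            WHERE
                object_id = OBJECT_ID(N'dbo.usp_GetCustomer') AND
                type = N'P' )
    DROP PROCEDURE dbo.usp_GetCustomer ;
GO
CREATE PROCEDURE dbo.usp_GetCustomer
    @CustomerID int
AS
    SELECT
        CustomerID,
        MiddleName
    FROM
        dbo.Customer
    WHERE
        CustomerID = @CustomerID ;
GO
-- The GO above keeps the grant out of the procedure definition.
GRANT EXECUTE ON dbo.usp_GetCustomer TO ApplicationRole ;
GO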

3. Group the data scripts by table.

If you follow guideline #1 this shouldn’t be much of an issue, but if you have the same, or worse, similar updates scattered across multiple scripts you could be in a world of hurt.

4. Try to group like objects.

I try to keep like objects together. With some exceptions I will deploy objects in the following order:

  1. Tables – With the exception of using a function in a default value or check constraint, it is a pretty safe bet that you can deploy all table changes first without error. The most tedious part of deploying tables is their relationships. Since they are dependent on one another, order is important. I have not found a perfect way to deal with this dependency.
  2. Views – Again, it is possible to use functions as part of a view, but I wouldn’t recommend it. It is also possible to reference another view as part of a view definition, but I would not recommend that either. Pay attention to the order, just in case.
  3. Functions – In SQL Server you cannot call a stored procedure as part of a scalar or table-valued function, so deploying these before the stored procedures is advisable.
  4. Stored Procedures – These are next to last because they can use any of the previous three objects. They may, however, be dependent on one another, so watch the order here as well.
  5. Triggers – Triggers can be added to tables or views and they can use stored procedures or functions, so I save these for last.

5. Take as much care with the rollback scripts as you do with the future scripts.

What can be worse than beginning a deployment and then realizing that you have to roll back? After the rollback completes, you realize that you dropped a primary table in your database that you only meant to alter. Oops.

For this reason I hand write the table rollback scripts. I use a script generator like ScriptDB to generate all of the other rollback scripts a week or so prior to deployment, to capture any fixes that were deployed during the development process.

I maintain a 1:1 relationship between deployment and rollback scripts and execute the rollback scripts in the reverse order of the deployment scripts. No muss, no fuss and no additional lists of scripts to maintain.

6. Don’t forget the jobs!

If your update requires job changes, add those to source control for deployment as well. Every object that is part of the deployment should be scripted.

7. Store the deployment order some place safe.

The largest deployment that I have worked on to date was over 600 individual scripts, with many exceptions to the steps listed above. Rebuilding that list would have taken a large amount of effort and testing, which I really did not have the time to do.

I prefer to store the list in a database table so that it is available to other developers to make the necessary changes. I never want to be the one left holding the ball, let alone the one to drop it, so let’s keep this in a safe place.

The Deployment

Take the list that you have so tediously maintained throughout the development process and script it into a file. I use sqlcmd to deploy all of the database objects, so this is a relatively simple process. Be sure that error handling is written into the script. I prefer that the batch ends when an error is encountered. If you followed #1 above, you can fix any errors and start the batch from the beginning.

The table that I use to store the scripts has the definition below. I have a template parameter for the database in the script; use Ctrl+Shift+M or click the “Specify Values for Template Parameters” button to populate it.

USE <Database,,> ;
GO
IF EXISTS ( SELECT
                *
            FROM
                sys.objects
            WHERE
                object_id = OBJECT_ID(N'[dbo].[SQLScripts]') AND
                type IN ( N'U' ) )
    DROP TABLE [dbo].[SQLScripts]
GO
SET ANSI_NULLS ON ;
GO
SET QUOTED_IDENTIFIER ON ;
GO
SET ANSI_PADDING ON ;
GO
CREATE TABLE dbo.SQLScripts
    (
      SQLScriptOrder SMALLINT NOT NULL,
      SQLScriptDatabase VARCHAR(255) NOT NULL,
      SQLScriptName VARCHAR(255) NOT NULL,
      SQLScriptRelativePath VARCHAR(255) NOT NULL,
      CreateTimeStamp DATETIME NOT NULL
                               CONSTRAINT DF_SQLScripts_CreatedTimeStamp DEFAULT ( GETDATE() ),
      CreateUserID VARCHAR(50) NOT NULL
                               CONSTRAINT DF_SQLScripts_CreateUserID DEFAULT ( SUSER_SNAME() ),
      UpdateTimeStamp DATETIME NOT NULL
                               CONSTRAINT DF_SQLScripts_UpdatedTimeStamp DEFAULT ( GETDATE() ),
      UpdateUserID VARCHAR(50) NOT NULL
                               CONSTRAINT DF_SQLScripts_UpdateUserID DEFAULT ( SUSER_SNAME() ),
      CONSTRAINT PK_SQLScripts PRIMARY KEY CLUSTERED ( SQLScriptOrder ASC ),
      CONSTRAINT AK_SQLScripts UNIQUE ( SQLScriptDatabase ASC, SQLScriptName ASC )
    ) ;
GO
CREATE TRIGGER dbo.SQLScriptUpdate ON dbo.SQLScripts
    AFTER UPDATE
AS
UPDATE
    ss
SET
    UpdateTimeStamp = GETDATE(),
    UpdateUserID = SUSER_NAME()
FROM
    dbo.SQLScripts ss
WHERE
    EXISTS ( SELECT
                1
             FROM
                INSERTED i
             WHERE
                i.SQLScriptDatabase = ss.SQLScriptDatabase AND
                i.SQLScriptName = ss.SQLScriptName ) ;
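
Populating the list is then just a matter of inserting one row per script, in deployment order; for example (the database, path and script names are made up):

INSERT INTO dbo.SQLScripts
    (
      SQLScriptOrder,
      SQLScriptDatabase,
      SQLScriptName,
      SQLScriptRelativePath
    )
VALUES
    (
      1,
      'MyDatabase',
      'dbo.usp_GetCustomer.sql',
      'MyDatabase\StoredProcedures'
    ) ;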

To create the batch file used for deployment I use the SQL outlined below. I find this option to be very flexible. Run the query with results to text. Like everyone else, I hate using cursors, but this one ensures that I get the proper format out.

SET NOCOUNT ON ;
DECLARE
    @Server VARCHAR(50),
    @LocalPath VARCHAR(255),
    @SQLScriptRelativePath VARCHAR(255),
    @SQLScriptName VARCHAR(255),
    @Command VARCHAR(1000) ;

SET @Server = '(local)' ;
SET @LocalPath = 'C:\Temp' ;

PRINT 'REM Deployment Script Generator'
PRINT 'REM Generated On: ' + CONVERT(VARCHAR(10), GETDATE(), 101)
PRINT 'REM Server Name: ' + @Server ;
PRINT '' ;

DECLARE ScriptCursor CURSOR LOCAL FORWARD_ONLY READ_ONLY
    FOR SELECT
            SQLScriptRelativePath,
            SQLScriptName
        FROM
            dbo.SQLScripts
        ORDER BY
            SQLScriptOrder ;

OPEN ScriptCursor ;

FETCH NEXT FROM ScriptCursor INTO @SQLScriptRelativePath,@SQLScriptName ;
WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT 'REM ' + @SQLScriptRelativePath + '\' + @SQLScriptName ;
        PRINT 'sqlcmd -S ' + @Server + ' -b -i "' + @LocalPath + '\' + @SQLScriptRelativePath + '\' + @SQLScriptName + '"' ; -- -b makes sqlcmd set a non-zero ERRORLEVEL when a SQL error occurs
        PRINT 'if not %errorlevel%==0 goto :error' ;

        FETCH NEXT FROM ScriptCursor INTO @SQLScriptRelativePath,@SQLScriptName ;
    END ;

CLOSE ScriptCursor ;
DEALLOCATE ScriptCursor ;

PRINT '' ;
PRINT 'pause' ;
PRINT 'exit' ;
PRINT '' ;
PRINT ':error' ;
PRINT 'echo ERROR ENCOUNTERED' ;
PRINT 'pause' ;

Take the output of this script, save it as a batch file and voilà! Change the local path and reverse the script order to generate the rollback batch file.
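
The only lines that change in the generator above are the local path and the sort direction; something along these lines (the rollback path is made up):

SET @LocalPath = 'C:\Temp\Rollback' ;   -- the staged rollback scripts

-- ...and in the cursor definition, flip the sort:
        ORDER BY
            SQLScriptOrder DESC ;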

Now you have a quick, dependable and free SQL deployment method.

Happy querying,

~Ron

Refreshing Views

On a cluster recently upgraded to SQL Server 2005, we were noticing performance issues with one stored procedure in particular. We initially blamed the issue on optimizer changes, but we were looking for a fix that did not involve an object change.

Someone came up with the idea (not me) to refresh the view. I am aware that a view that contains the dreaded “SELECT *” may not contain new table columns until the view is refreshed, but I have never actually run into this issue in real life. In our situation the view definition had not changed, nor had the underlying objects, but refreshing the view resolved the performance issues.

So on that note I wrote a quick little cursor to refresh all views in a database. I hope someone finds this useful.

/*******************************************************************************
Name:			Refresh Views
Description:	Declare our variables and cursor. The cursor excludes
                schema-bound views since they are not able to be refreshed.
Dependencies:	
Testing:		
********************************************************************************
Author - Date - Change Description
--------------------------------------------------------------------------------
Ron Carpenter - 07/01/2010 - Initial Version
*******************************************************************************/
DECLARE @View varchar(255) ;

DECLARE ViewCursor CURSOR LOCAL READ_ONLY FORWARD_ONLY
    FOR SELECT
            -- Schema-qualify the name so sp_refreshview resolves views outside the default schema
            QUOTENAME(SCHEMA_NAME(o.schema_id)) + '.' + QUOTENAME(o.[Name])
        FROM
            sys.objects o
        WHERE
            Type_Desc = 'VIEW' AND
            NOT EXISTS (SELECT
                            1
                        FROM
                            sys.sql_dependencies d
                        WHERE
                            o.object_id = d.object_id AND
                            d.class = 1) ;

OPEN ViewCursor ;

FETCH NEXT FROM ViewCursor INTO @View ;
WHILE @@FETCH_STATUS = 0 
    BEGIN
        EXECUTE sp_refreshview @View ;

        FETCH NEXT FROM ViewCursor INTO @View ;
    END ;

CLOSE ViewCursor ;
DEALLOCATE ViewCursor ;

Display Job History

We have a service that polls a myriad of things several times an hour and sends pages and alerts to the appropriate people based on a specific set of criteria. One of the things that it polls is the SQL job history, which helps us proactively catch long-running jobs or notice when a job has been disabled.

The time that a job runs and its run duration are saved as integers in hhmmss format, so converting them to the datetime datatype requires some manipulation. You can find a lot of ways to do this using string functions, which is the way we had completed the conversion … until it started erroring.

The statement below converts the integer value to the datetime datatype using mathematical functions instead of string functions, which is less prone to error.

The first thing we do is extract the hours, minutes and seconds into their separate parts using the modulo operator. Once we have those, we convert each value into seconds (the smallest interval we are working with) and add them together. With the offset from midnight expressed in seconds, we use the DATEADD function to add the seconds to the run date. Voilà! And it’s even a little faster. A worked example of the arithmetic follows the query below.

SELECT
    DATEADD(ss,(((Run_Time - Run_Time % 10000) / 1e4) * 36e2 + 
                ((Run_Time % 10000 - Run_Time % 100) / 1e2) * 6e1 + 
                  Run_Time % 100),CAST(CAST(Run_Date AS char(8)) AS datetime)) StartTime,
    DATEADD(ss,(((Run_Time - Run_Time % 10000) / 1e4) * 36e2 + 
                ((Run_Time % 10000 - Run_Time % 100) / 1e2) * 6e1 + 
                  Run_Time % 100 + 
                ((Run_Duration - Run_Duration % 10000) / 1e4) * 36e2 + 
                ((Run_Duration % 10000 - Run_Duration % 100) / 1e2) * 6e1 + 
                  Run_Duration % 100),CAST(CAST(Run_Date AS char(8)) AS datetime)) EndTime,
    j.name JobName,
    jh.step_name StepName
FROM
    msdb.dbo.sysjobhistory jh
INNER JOIN msdb.dbo.sysjobs j
    ON jh.job_id = j.job_id
WHERE
    jh.step_id > 0
ORDER BY
    j.job_id,
    jh.step_id ;
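
For instance, with a made-up run time of 143052 (i.e. 14:30:52), the pieces work out like this:

SELECT
    (143052 - 143052 % 10000) / 1e4 [Hours],            -- 14
    (143052 % 10000 - 143052 % 100) / 1e2 [Minutes],    -- 30
    143052 % 100 [Seconds],                              -- 52
    (143052 - 143052 % 10000) / 1e4 * 36e2 +
    (143052 % 10000 - 143052 % 100) / 1e2 * 6e1 +
    143052 % 100 [SecondsAfterMidnight] ;                -- 52252

DATEADD then adds those 52,252 seconds to midnight of the run date to get the start time.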

Snippets for SQL Server Management Studio (SSMS)

How many times have you had to open an existing SQL object and copy the header information to add to a new procedure, view or function? How many times have you done this and forgot to change the information in the header? Do you like standards? If you answered “Yes!” to any of these questions then templates are for you! Microsoft SQL Server Management Studio comes with a plethora of built-in templates, but this little gem of a feature also allows you to create your own!

To open Template Explorer, select View -> Template Explorer or press Ctrl+Alt+T. This opens a side bar (by default, but you can change it) containing the templates installed with SSMS. To add your own templates, right-click on “SQL Server Templates” and select “New” from the context menu. This gives you the option to create a new folder or template. I prefer to keep my user-created templates in their own folder, since that makes it easier to transfer templates between machines.

Let’s create a folder called “CodeSnippets” as shown in the screen shot below.

Template Context Menu (Folder)

Now that we have our folder created, let’s create some templates. We will create a basic header template and a stored procedure template. To create our templates, right-click on the newly created folder and select New -> Template from the context menu.

Template Context Menu (Template)

Now that we have the blank templates created, let’s add some content. Right-click on the template and select “Edit” from the context menu.

Template Context Menu (Edit)

From here you can write or paste your content into the template. After creating your templates, usage is as simple as dragging and dropping a template into the proper location in the query window.
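
As a rough illustration (the parameter names and default values are placeholders I made up), a basic header template could look like this, using the <Name, Type, Value> template parameter syntax that Ctrl+Shift+M picks up:

/*******************************************************************************
Name:           <ObjectName, sysname, usp_MyProcedure>
Description:    <Description, varchar(255), >
Dependencies:
Testing:
********************************************************************************
Author - Date - Change Description
--------------------------------------------------------------------------------
<Author, varchar(50), Ron Carpenter> - <CreateDate, varchar(10), MM/DD/YYYY> - Initial Version
*******************************************************************************/

Drag the template into a query window, press Ctrl+Shift+M, fill in the values, and the header is done.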

I mentioned transferring templates between machines earlier. To move your templates, browse to

C:\Documents and Settings\[User Profile]\Application Data\Microsoft\Microsoft SQL Server\[90 for SQL 2005 or 100 for SQL 2008]\Tools\Shell\Templates\Sql\[Your Snippet Folder Name]

for Windows XP and Windows Server 2003 or

C:\Users\[User Profile]\AppData\Roaming\Microsoft\Microsoft SQL Server\[90 for SQL 2005 or 100 for SQL 2008]\Tools\Shell\Templates\Sql\[Your Snippet Folder Name]

for Windows Vista, Windows 7 and Windows Server 2008 (replace the items in [] with the specific values for your environment), and copy the contents to the corresponding path on the destination machine.

I hope you find this as useful as I do! Happy coding!

~Ron

Updated: Attach Database Scripts

I updated my previous version of this script, which was written for SQL 2005, to handle filestream data. And I added a header for Ignas.

Please post in the comments if you run into any issues.

~Ron

/******************************************************************************
    Name: Create Each Database for Attach
    Author: Ron Carpenter
    Date: 08/18/2009
    Revision: 1.0
    Purpose: This script will generate scripts to create each database on a
             server for attach. It handles multiple files, multiple filegroups,
             multiple files on a filegroup and multiple log files. There is a
             parameter to attach the logs or create the databases for attach
             with rebuild log.

    Author: Ron Carpenter
    Date: 01/20/2010
    Revision: 1.1
    Purpose: Updated the script to handle filestream data.
******************************************************************************/
USE [master] ;
GO
CREATE TABLE #_dba_migrate_db
    (
     FileID int IDENTITY(1,1)
                NOT NULL,
     DBName varchar(255) NOT NULL,
     DBGroupName varchar(255) NOT NULL,
     DBType tinyint NOT NULL,
     DBFileLogicalName varchar(255) NOT NULL,
     DBFileName varchar(1000) NOT NULL,
     DBSize varchar(15) NOT NULL,
     DBMaxSize varchar(15) NOT NULL,
     DBFileGrowth varchar(15) NOT NULL,
     CONSTRAINT PK_#_dba_migrate_db PRIMARY KEY CLUSTERED (DBType ASC,FileID ASC)
    ) ;
GO
EXEC sp_msforeachdb
'
USE [?] ;

IF DB_NAME() NOT IN (''master'',''model'',''msdb'',''tempdb'')
    INSERT INTO
        #_dba_migrate_db
        (
         DBName,
         DBGroupName,
         DBType,
         DBFileLogicalName,
         DBFileName,
         DBSize,
         DBMaxSize,
         DBFileGrowth
        )
        SELECT
            DB_NAME(),
            COALESCE(fg.name,''LOG''),
            f.type,
            LTRIM(RTRIM(f.name)),
            LTRIM(RTRIM(f.physical_name)),
            CAST(CEILING(((CONVERT(decimal(25,2),f.size) / 1024) * 8)) AS varchar(15)) + ''MB'',
            CASE f.max_size
              WHEN -1 THEN ''UNLIMITED''
              ELSE CAST(CEILING(((CONVERT(decimal(25,2),f.max_size) / 1024) * 8)) AS varchar(15)) + ''MB''
            END,
            CASE f.is_percent_growth
              WHEN 1 THEN CAST(f.growth AS varchar(15)) + ''%''
              ELSE CAST(CEILING(((CONVERT(decimal(25,2),f.growth) / 1024) * 8)) AS varchar(15)) + ''MB''
            END
        FROM
            sys.database_files f
        LEFT OUTER JOIN sys.filegroups fg
            ON f.data_space_id = fg.data_space_id ;
'
GO

SET NOCOUNT ON ;

SELECT
    'USE [master] ;' + CHAR(10) + 'GO' + CHAR(10) ;

DECLARE
    @DBName varchar(255),
    @DBGroupName varchar(255),
    @DBType int,
    @FileID int,
    @MaxData int,
    @MaxLog int,
     @MaxFileStream int,
    @IncludeLog bit ;

SET @IncludeLog = 1 ;

DECLARE db_cursor CURSOR
    FOR SELECT DISTINCT
            DBName
        FROM
            #_dba_migrate_db

OPEN db_cursor ;

FETCH NEXT FROM db_cursor INTO @DBName ;
WHILE @@FETCH_STATUS = 0
    BEGIN
        SELECT
            'CREATE DATABASE ' + @DBName + ' ON'

        DECLARE group_cursor CURSOR
            FOR SELECT
                    MIN(FileID),
                    CASE DBType
                      WHEN 2 THEN 0
                      ELSE DBType
                    END,
                    DBGroupName
                FROM
                    #_dba_migrate_db
                WHERE
                    DBName = @DBName AND
                    (DBType <> 1 OR @IncludeLog = 1)
                GROUP BY
                    DBGroupName,
                    DBType
                ORDER BY
                    CASE DBType
                      WHEN 2 THEN 0
                      ELSE DBType
                    END,
                    MIN(FileID) ;

        OPEN group_cursor ;

        FETCH NEXT FROM group_cursor INTO @FileID,@DBType,@DBGroupName ;
        WHILE @@FETCH_STATUS = 0
            BEGIN
                SELECT
                    @MaxData = MAX(FileID)
                FROM
                    #_dba_migrate_db
                WHERE
                    DBName = @DBName AND
                    DBType IN (0,2) ;

                SELECT
                    @MaxLog = MAX(FileID)
                FROM
                    #_dba_migrate_db
                WHERE
                    DBName = @DBName AND
                    DBType = 1 ;

                SELECT
                    CASE WHEN DBType = 0 AND DBGroupName = 'PRIMARY' THEN DBGroupName
                         WHEN DBType = 1 THEN DBGroupName + ' ON'
                         WHEN DBType = 2 THEN 'FILEGROUP ' + DBGroupName + ' CONTAINS FILESTREAM'
                         ELSE 'FILEGROUP ' + DBGroupName
                    END
                FROM
                    #_dba_migrate_db
                WHERE
                    FileID = @FileID ;

                SELECT
                    '(' + CHAR(10) + '	NAME = ' + DBFileLogicalName + ', ' + CHAR(10) + '	FILENAME = ''' + DBFileName + CASE WHEN DBType <> 2 THEN ''',' + CHAR(10) + '	SIZE = ' + DBSize + ',' + CHAR(10) + '	MAXSIZE = ' + DBMaxSize + ',' + CHAR(10) + '	FILEGROWTH = ' + DBFileGrowth
                                                                                                                             ELSE ''''
                                                                                                                        END + CHAR(10) + ')' + CASE WHEN FileID IN (@MaxData,@MaxLog) THEN ''
                                                                                                                                                    ELSE ','
                                                                                                                                               END
                FROM
                    #_dba_migrate_db
                WHERE
                    DBName = @DBName AND
                    DBGroupName = @DBGroupName ;

                FETCH NEXT FROM group_cursor INTO @FileID,@DBType,@DBGroupName ;
            END

        CLOSE group_cursor ;

        DEALLOCATE group_cursor ;

        SELECT
            CASE WHEN @IncludeLog = 1 THEN 'FOR ATTACH;'
                 ELSE 'FOR ATTACH_REBUILD_LOG;'
            END + CHAR(10) + 'GO' + CHAR(10) ;

        FETCH NEXT FROM db_cursor INTO @DBName ;
    END

CLOSE db_cursor ;

DEALLOCATE db_cursor ;

DROP TABLE #_dba_migrate_db ;

Scripting Role and User Permissions

I recently had to copy a large number of databases to a development environment. There were several hundred gigabytes of files, so I went with the stop SQL services, copy the files, restart the services and attach the databases route.

Before copying the files, I migrated the users. After attaching the database files to the new instance, I ran the orphaned user scripts. Everything seemed to have gone swimmingly.
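
For anyone who has not run them, the standard orphaned user fix is sp_change_users_login; a minimal sketch (the user name is made up):

-- List database users whose SIDs no longer match a server login
EXEC sp_change_users_login 'Report' ;

-- Re-link an orphaned database user to the login with the same name
EXEC sp_change_users_login 'Auto_Fix', 'SomeAppUser' ;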

A couple of hours later, a developer called me up and said that they were running into permission errors executing stored procedures. There were a lot of scripts out there for user permissions, but none of them seemed to fit my specific needs, especially since this was SQL 2000. So, on that note, here is my addition to that myriad of scripts for SQL Server permissions.

/* SQL 2005/2008 */
DECLARE
 @Login varchar(50),
 @Role varchar(50);

SET @Login = NULL;
SET @Role = NULL;

SELECT
 perm.state_desc
 + SPACE(1)
 + perm.permission_name
 + ' on ['
 + s.name
 + '].['
 + o.name
 + '] to ['
 + prin.name
 + '];'
 + CHAR(10)
 + 'GO'
 + CHAR(10)
 COLLATE LATIN1_General_CI_AS
FROM
 sys.database_permissions perm
INNER JOIN
 sys.objects o ON
 perm.major_id = o.OBJECT_ID
INNER JOIN
 sys.schemas s ON
 o.SCHEMA_ID = s.SCHEMA_ID
INNER JOIN sys.database_principals prin ON
 perm.grantee_principal_id = prin.principal_id
WHERE
 ((prin.type = 'U' AND prin.name = @Login) OR @Login IS NULL) AND
 ((prin.type = 'R' AND prin.name = @Role) OR @Role IS NULL);

/* SQL 2000 */
DECLARE
 @Login varchar(50),
 @Role varchar(50),
 @ObjectName varchar(50);

SET @Login = NULL;
SET @Role = NULL;
SET @ObjectName = 'admin_getStoreOrderTimes'; -- no brackets, so it matches OBJECT_NAME(o.id) in the WHERE clause

SELECT
 CASE ProtectType
 WHEN 204 THEN 'GRANT_W_GRANT'
 WHEN 205 THEN 'GRANT'
 WHEN 206 THEN 'DENY' END
 + SPACE(1)
 + CASE Action
 WHEN 26 THEN 'REFERENCES'
 WHEN 178 THEN 'CREATE FUNCTION'
 WHEN 193 THEN 'SELECT'
 WHEN 195 THEN 'INSERT'
 WHEN 196 THEN 'DELETE'
 WHEN 197 THEN 'UPDATE'
 WHEN 198 THEN 'CREATE TABLE'
 WHEN 203 THEN 'CREATE DATABASE'
 WHEN 207 THEN 'CREATE VIEW'
 WHEN 222 THEN 'CREATE PROCEDURE'
 WHEN 224 THEN 'EXECUTE'
 WHEN 228 THEN 'BACKUP DATABASE'
 WHEN 233 THEN 'CREATE DEFAULT'
 WHEN 235 THEN 'BACKUP LOG'
 WHEN 236 THEN 'CREATE RULE' END
 + SPACE(1)
 + 'ON'
 + SPACE(1)
 + '['
 + USER_NAME(o.uid)
 + '].['
 + OBJECT_NAME(o.id)
 + ']'
 + SPACE(1)
 + 'TO'
 + SPACE(1)
 + '['
 + u.name
 + '];'
 + CHAR(10)
 + 'GO'
 + CHAR(10)
FROM
 sysusers u
INNER JOIN
 sysprotects p ON
 u.uid = p.uid
INNER JOIN
 sysobjects o ON
 p.id = o.id
WHERE
 ((IsLogin = 1 AND u.name = @Login) OR @Login IS NULL) AND
 (((u.IsSQLRole = 1 OR u.IsAppRole = 1) AND u.name = @Role) OR @Role IS NULL) AND
 (OBJECT_NAME(o.id) = @ObjectName OR @ObjectName IS NULL) AND
 u.name NOT IN ('public','db_owner','db_accessadmin','db_securityadmin','db_ddladmin',
 'db_backupoperator','db_datareader','db_datawriter','db_denydatareader',
 'db_denydatawriter','guest');

~Ron

Script Column Default Values

Another database developer had copied several databases from our model office back to our development environment. When we started to wire up our applications I noticed that there were no indexes, defaults, keys, or constraints of any kind. Just heaps.

Adding these constraints is fairly straightforward, and there are many tools out there to do it. But I could not find a tool that would script just the default constraints without also dropping and/or creating the table. It took a couple of days to get the databases moved over in the first place, and there were over a thousand default constraints, which could also take several days if done manually.

I was sure that there was a way to script this, but after over an hour of searching I decided that I had burned enough time and would write one myself. I don’t know how often anyone else will come across this issue, but here it is anyway.

SELECT
    'ALTER TABLE dbo.' + OBJECT_NAME(s.id) + ' ADD CONSTRAINT DF_' + OBJECT_NAME(s.id) + '_' + sc.name + ' DEFAULT ' + c.text + ' FOR ' + sc.name
FROM
    dbo.sysconstraints s
INNER JOIN syscolumns sc
	ON s.id = sc.id AND
	   s.colid = sc.colid
INNER JOIN dbo.syscomments c
    ON s.constid = c.id
WHERE
    s.status & 5 = 5 ;
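
Run with results to text, each row of output is a ready-to-run statement along these lines (the table and column names are made up):

ALTER TABLE dbo.Customer ADD CONSTRAINT DF_Customer_CreateDate DEFAULT (getdate()) FOR CreateDate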