Add column to table and then update it inside transaction

I am creating a script that will be run on a MS SQL server. The script will run multiple statements and needs to be transactional: if one of the statements fails, overall execution stops and any changes are rolled back.

I am having trouble creating this transactional model when issuing ALTER TABLE statements to add columns to a table and then updating the newly added column. In order to access the newly added column right away, I use a GO command to execute the ALTER TABLE statement, and then call my UPDATE statement. The problem I am facing is that I cannot issue a GO command inside an IF statement, and the IF statement is important within my transactional model. Below is a sample of the script I am trying to run. Also notice that issuing a GO command discards the @errorCode variable, so it would need to be declared again further down in the code before being used (this is not shown in the code below).

BEGIN TRANSACTION

DECLARE @errorCode INT
SET @errorCode = @@ERROR

-- **********************************
-- * Settings
-- **********************************
IF @errorCode = 0
BEGIN
 BEGIN TRY
  ALTER TABLE Color ADD [CodeID] [uniqueidentifier] NOT NULL DEFAULT ('{00000000-0000-0000-0000-000000000000}')
  GO
 END TRY
 BEGIN CATCH
  SET @errorCode = @@ERROR
 END CATCH
END

IF @errorCode = 0
BEGIN
 BEGIN TRY
  UPDATE Color
  SET CodeID= 'B6D266DC-B305-4153-A7AB-9109962255FC'
  WHERE [Name] = 'Red'
 END TRY
 BEGIN CATCH
  SET @errorCode = @@ERROR
 END CATCH
END

-- **********************************
-- * Check @errorCode to issue a COMMIT or a ROLLBACK
-- **********************************
IF @errorCode = 0
BEGIN
 COMMIT
 PRINT 'Success'
END
ELSE 
BEGIN
 ROLLBACK
 PRINT 'Failure'
END

So what I would like to know is how to go around this problem, issuing ALTER TABLE statements to add a column and then updating that column, all within a script executing as a transactional unit.


GO is not a T-SQL command. It is a batch delimiter. The client tool (SSMS, sqlcmd, osql etc.) uses it to effectively cut the file at each GO and send the individual batches to the server. So obviously you cannot use GO inside an IF, nor can you expect variables to span scope across batches.
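
A minimal sketch of the scoping problem (nothing here is specific to the question's tables):

DECLARE @errorCode INT;
SET @errorCode = 0;
GO
-- New batch: the variable declared above no longer exists.
-- This line fails with "Must declare the scalar variable @errorCode".
PRINT @errorCode;
GO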

Also, you cannot catch exceptions without checking XACT_STATE() to ensure the transaction is not doomed.
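
A rough sketch of where that check belongs, reusing the question's second block (this is only to show the XACT_STATE() test, not a drop-in replacement for the whole script):

BEGIN TRY
    UPDATE Color
    SET CodeID = 'B6D266DC-B305-4153-A7AB-9109962255FC'
    WHERE [Name] = 'Red';
END TRY
BEGIN CATCH
    SET @errorCode = @@ERROR;
    -- XACT_STATE() = -1 means the transaction is doomed and can only be rolled back.
    IF XACT_STATE() = -1
        ROLLBACK;
END CATCH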

Using GUIDs for IDs is always at least suspicious.

Using NOT NULL constraints and providing a default 'guid' like '{00000000-0000-0000-0000-000000000000}' also cannot be correct.

Updated:

  • Separate the ALTER and UPDATE into two batches.
  • Use sqlcmd extensions to break the script on error. This is supported by SSMS when sqlcmd mode is on, by sqlcmd, and it is trivial to support in client libraries too: dbutilsqlcmd.
  • Use XACT_ABORT to force an error to interrupt the batch. This is frequently used in maintenance scripts (schema changes). Stored procedures and application logic scripts generally use TRY-CATCH blocks instead, but with proper care: Exception handling and nested transactions.

example script:

:on error exit

set xact_abort on;
go

begin transaction;
go

if columnproperty(object_id('Code'), 'ColorId', 'AllowsNull') is null
begin
    alter table Code add ColorId uniqueidentifier null;
end
go

update Code 
  set ColorId = '...'
  where ...
go

commit;
go

Only a successful run of the script will reach the COMMIT. Any error will abort the script and roll the transaction back.

I used COLUMNPROPERTY to check for column existence; you could use any method you like instead (e.g. look up sys.columns).
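
For example, the sys.columns version of the same check (same table and column names as the script above) would be something like:

if not exists (select 1 from sys.columns
               where object_id = object_id('Code') and name = 'ColorId')
begin
    alter table Code add ColorId uniqueidentifier null;
end
go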


Orthogonal to Remus's comments, what you can do is execute the UPDATE via sp_executesql.

ALTER TABLE [Table] ADD [Xyz] NVARCHAR(256);

DECLARE @sql NVARCHAR(2048) = 'UPDATE [Table] SET [Xyz] = ''abcd'';';
EXEC sys.sp_executesql @stmt = @sql;

We've needed to do this when creating upgrade scripts. Usually we just use GO, but it has been necessary to do things conditionally.
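
For instance, a sketch of the conditional case (the COLUMNPROPERTY existence check here is illustrative, not part of our original script):

IF COLUMNPROPERTY(OBJECT_ID('[Table]'), 'Xyz', 'AllowsNull') IS NULL
BEGIN
    ALTER TABLE [Table] ADD [Xyz] NVARCHAR(256);
END

-- The UPDATE is only compiled when EXEC runs, after the ALTER above has executed.
DECLARE @sql NVARCHAR(2048) = 'UPDATE [Table] SET [Xyz] = ''abcd'';';
EXEC sys.sp_executesql @stmt = @sql;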