
Logging database updates with database triggers

I have recently been involved in a project where an application I have written receives scheduled updates of data into its Microsoft SQL Server 2008 instance. The database system that originally houses this data is very complex, so SQL replication would have been an overly complicated option, but essentially these updates attempt to mimic that behaviour. When my database received a new batch of data, I asked the data provider to set a flag in a status table to alert my application that the upload had completed successfully. My application would then monitor this flag and do the necessary database operations to start utilizing the new data.

I wanted a way of knowing when these updates occurred, where they came from and whether my application had noticed them (i.e. had my application correctly noticed the new data before another batch came in). I also wanted to do all this without asking the provider of the data to make any changes.

This is a perfect case for a very simple trigger.

Firstly I needed to create a table to store the log messages I was going to create each time I received a data upload. Below we have a simple table containing a 'type' column, which allows me to specify what type of data I received (I have many different sets of data which can be updated). The columns 'before_status' and 'after_status' store the upload status flag before and after the update. These two values are quite important, as they are what tell me whether my application spotted the new data before the next update arrived. The column 'create_time' is the time this event occurred, and the columns 'clientaddress', 'transport' and 'protocol' store information about where the data upload came from.

Here is the table:

CREATE TABLE update_log
(
    id int IDENTITY(1,1) PRIMARY KEY NOT NULL,
    type nvarchar(50) NOT NULL,
    before_status bit NOT NULL,
    after_status bit NOT NULL,
    create_time datetime NOT NULL,
    clientaddress nvarchar(48) NULL,
    transport nvarchar(80) NULL,
    protocol nvarchar(80) NULL
);

You can see why I used these column definitions for clientaddress, transport and protocol by visiting the MSDN documentation for sys.dm_exec_connections. Here is the trigger I applied to the status table; it fires every time the flag is changed.
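If you want to see what these values look like for your own connection before wiring up the trigger, you can query the DMV directly (a standard query against sys.dm_exec_connections; the column sizes above match its documented definitions):

-- Inspect the connection details for the current session.
SELECT client_net_address, net_transport, protocol_type
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;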

CREATE TRIGGER tr_upload_status_change ON upload_status
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @before bit,
            @after bit,
            @clientaddress nvarchar(48),
            @transport nvarchar(80),
            @protocol nvarchar(80)

    -- Note: this assumes upload_status only ever holds a single row;
    -- these single-value assignments would fail on a multi-row UPDATE.
    SET @before = (SELECT upload_status FROM deleted)
    SET @after = (SELECT upload_status FROM inserted)

    SELECT @clientaddress = client_net_address,
           @transport = net_transport,
           @protocol = protocol_type
    FROM sys.dm_exec_connections
    WHERE session_id = @@SPID;

    -- The column list is required here: the id column is an IDENTITY,
    -- so an INSERT without a column list would fail.
    INSERT INTO update_log (type, before_status, after_status, create_time, clientaddress, transport, protocol)
    VALUES ('STATUS_TYPE', @before, @after, GETDATE(), @clientaddress, @transport, @protocol)
END
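If the status table could ever hold more than one row (say, one row per data type), the single-value assignments in the trigger would break on a multi-row UPDATE. A set-based insert handles that case; this sketch assumes a hypothetical 'type' key column on upload_status to pair the deleted and inserted rows:

-- Hypothetical set-based version: one log row per updated status row.
-- Assumes upload_status has a 'type' key column (not shown in the original schema).
INSERT INTO update_log (type, before_status, after_status, create_time, clientaddress, transport, protocol)
SELECT d.type, d.upload_status, i.upload_status, GETDATE(), @clientaddress, @transport, @protocol
FROM deleted d
JOIN inserted i ON i.type = d.type;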

So let's test this out...

UPDATE upload_status SET upload_status = 1;

Now if we look in the 'update_log' table we will see:

SELECT * FROM update_log;

id   type         before_status after_status create_time             clientaddress  transport      protocol
---- ------------ ------------- ------------ ----------------------- -------------- -------------- --------
104  STATUS_TYPE  0             1            2012-11-26 22:28:13.497 local machine  Shared memory  TSQL

So you can see a nice log message showing the important information.
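The before/after pair is what makes the log genuinely useful: if an upload arrives while the flag is still set, my application never consumed the previous batch. A query along these lines (using the column names from the table above) finds those missed updates:

-- Uploads that arrived while the previous flag was still set,
-- i.e. the application had not yet processed the prior batch.
SELECT id, type, create_time, clientaddress
FROM update_log
WHERE before_status = 1
ORDER BY create_time;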

You can extend this log table and the trigger to capture more information, such as the number of rows imported, should you need it. Very powerful.
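As one hypothetical extension (the column name here is my own, not part of the original schema), you could record which machine issued the update by adding a column and capturing the built-in HOST_NAME() function in the trigger:

-- Hypothetical extension: record the client host name with each log row.
ALTER TABLE update_log ADD hostname nvarchar(128) NULL;

Inside the trigger, HOST_NAME() would then supply the value for the new column alongside the existing ones.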