
Auto Backup Logs at 70% utilization

I recently had reason to reconsider my policy toward transaction log file allocation. Traditionally I set transaction log files to auto-grow unlimited and perform daily or intraday transaction log backups, not wanting to risk running out of log space arbitrarily. Recently, however, I set up a new SQL Server status and reporting system to consolidate that functionality into a single process. One of its alerts fires on log file growth, i.e., whenever a log file gets a new space allocation. All of a sudden I was inundated with alerts for log file extensions, which of course prompted me to check into why this was happening so frequently. I found several misconfigured transaction log file settings that I had inherited but overlooked, and that made me pause and rethink my strategy.

My thinking now is to set the transaction log to a fixed size based on historical precedent (or pick an arbitrarily high number for a new database), then put in place a SQL performance alert that triggers a job (or simply a scheduled job at a fixed interval) to run a transaction log backup at >= 70% utilization. Obviously there are some considerations here: transactional volatility, the size of the database, and the size of the transaction log, i.e., how long will the transaction log backup take, and will it adversely impact the production environment?
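As an aside, on SQL Server 2012 and later the current log utilization for a database can also be checked with the sys.dm_db_log_space_usage DMV instead of DBCC SQLPERF. A minimal sketch (the DMV reports on the current database only):

```sql
-- Current log utilization for the database you are connected to (SQL Server 2012+)
SELECT DB_NAME(database_id)                 AS databaseName,
       total_log_size_in_bytes / 1048576.0  AS logSizeMB,
       used_log_space_in_percent            AS logUsedPct
FROM sys.dm_db_log_space_usage;
```

The DBCC SQLPERF(logspace) approach used below has the advantage of returning one row per database in a single call, which is why I stick with it here.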

Outlined below is the code necessary to set up a job that checks the current log file utilization for all databases and issues a transaction log backup for any database above the specified % utilization threshold. I have a database "dba" that I use for all my monitoring, special stored procedures, etc.

1) The first stored procedure simply wraps the DBCC SQLPERF(logspace) command:

USE dba
go

IF EXISTS (SELECT * FROM sys.objects WHERE type = 'P' AND name = 'usp_DBA_SQLPerf')
DROP PROCEDURE dbo.usp_DBA_SQLPerf
GO

CREATE PROC dbo.usp_DBA_SQLPerf
AS
DBCC SQLPERF(logspace)
GO

2) The next step is to create a table to hold the results of the dbcc command:

IF OBJECT_ID('logSpaceStats') IS NULL
BEGIN
CREATE TABLE dbo.logSpaceStats
(
id INT IDENTITY (1,1),
logDate datetime DEFAULT GETDATE(),
databaseName sysname,
logSize decimal(18,5),
logUsed decimal(18,5)
)
END
GO

3) The next stored procedure puts the first two pieces together and pumps the results into the "logSpaceStats" table:

IF EXISTS (SELECT * FROM sys.objects WHERE type = 'P' AND name = 'usp_DBA_GetSQLPerfStats')
DROP PROCEDURE dbo.usp_DBA_GetSQLPerfStats
GO

CREATE PROC dbo.usp_DBA_GetSQLPerfStats
AS
SET NOCOUNT ON

CREATE TABLE #tFileList
(
databaseName sysname,
logSize decimal(18,5),
logUsed decimal(18,5),
status INT
)

INSERT INTO #tFileList
       EXEC usp_DBA_SQLPerf

INSERT INTO logSpaceStats (databaseName, logSize, logUsed)
SELECT databasename, logSize, logUsed
FROM #tFileList

DROP TABLE #tFileList
GO
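To verify the plumbing so far, the collection procedure can be run by hand and the captured results inspected:

```sql
-- Take a sample, then look at the most recent rows per database
EXEC dbo.usp_DBA_GetSQLPerfStats

SELECT databaseName, logSize, logUsed, logDate
FROM dbo.logSpaceStats
ORDER BY logDate DESC, databaseName
```

The logUsed column is the percentage of the log currently in use, which is what the threshold check in the next step keys on.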

4) The next step creates the stored procedure that selects the user databases where log file utilization is above the threshold and dynamically builds the log backup statement. Substitute your own backup code based on what your organization uses:

IF EXISTS (SELECT * FROM sys.objects WHERE type = 'P' AND name = 'usp_DBA_AutoBackupLog')
DROP PROCEDURE dbo.usp_DBA_AutoBackupLog
GO

CREATE PROC dbo.usp_DBA_AutoBackupLog
AS
SET NOCOUNT ON

exec usp_DBA_GetSQLPerfStats

DECLARE csr CURSOR FOR
SELECT databaseName
FROM (SELECT id
,logDate
,databaseName
,logSize
,logUsed
,row_number() over (partition by databaseName order by logDate desc) Ranking
FROM dbo.logSpaceStats) XX
WHERE logUsed >= 70.00 AND databaseName NOT IN ('master','tempdb','msdb','model') AND Ranking = 1

OPEN csr
DECLARE @databaseName varchar(100)

FETCH NEXT FROM csr INTO @databaseName
WHILE @@FETCH_STATUS = 0
BEGIN
    DECLARE @sqlcmd varchar(max)

/* Add your backup code here - I use sqlcmd to execute my backup stored procedure */
    SET @sqlcmd = 'sqlcmd -E -S yourServerName -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Compress=''Y'', @Databases = ''' + @databaseName + ''', @Directory = N''Y:\Backup'', @BackupType = ''LOG'', @Verify = ''N'', @CleanupTime = 1, @CheckSum = ''Y'', @LogToTable = ''Y''" -b'

    EXEC xp_cmdshell @sqlcmd
    FETCH NEXT FROM csr INTO @databaseName;
END;
CLOSE csr;
DEALLOCATE csr;
GO
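Note that this approach shells out via xp_cmdshell, which is disabled by default. If it is not already enabled in your environment, it can be turned on with sp_configure (weigh the security implications for your organization first):

```sql
-- Enable xp_cmdshell (disabled by default); review your security policy before doing this
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```

Alternatively, the backup call could be rewritten as pure T-SQL so the shell-out is not needed at all.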

5) The final step is to create a job that executes usp_DBA_AutoBackupLog, and either set up a SQL Server performance alert on log file % utilization or simply schedule the job to run at a given interval.
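For reference, the scheduled-interval flavor of the job can be scripted rather than created through the GUI. A minimal sketch using msdb's job stored procedures, assuming a 10-minute polling schedule (the job and schedule names here are placeholders; adjust them, the owner, and the interval to your standards):

```sql
USE msdb
GO

-- Hypothetical job that polls log utilization every 10 minutes
EXEC dbo.sp_add_job
     @job_name = N'DBA - Auto Backup Log at 70 Percent'

EXEC dbo.sp_add_jobstep
     @job_name = N'DBA - Auto Backup Log at 70 Percent',
     @step_name = N'Run usp_DBA_AutoBackupLog',
     @subsystem = N'TSQL',
     @database_name = N'dba',
     @command = N'EXEC dbo.usp_DBA_AutoBackupLog'

EXEC dbo.sp_add_jobschedule
     @job_name = N'DBA - Auto Backup Log at 70 Percent',
     @name = N'Every 10 minutes',
     @freq_type = 4,              -- daily
     @freq_interval = 1,
     @freq_subday_type = 4,       -- units of minutes
     @freq_subday_interval = 10

EXEC dbo.sp_add_jobserver
     @job_name = N'DBA - Auto Backup Log at 70 Percent',
     @server_name = N'(local)'
GO
```

The performance-alert flavor works the same way, except the job has no schedule and is instead named as the response in a SQL Server Agent alert on the "Percent Log Used" counter.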


