Wednesday, December 9, 2020

SQL Server Errorlog Processing and Visualization

Reviewing the SQL Server Errorlog is a common and important task that SQL Server database professionals perform on a regular basis to identify and troubleshoot database server related issues. The simple process described in this post can be an effective method for managing a large number of SQL Server instances.

The process parses the Errorlog for known keywords and common phrases, categorizes each event, and then writes the events to a SQL Server table. The open source tool Grafana (http://www.grafana.com) is used for visualization, and SQL Server Full-Text Search is used for wildcard searches.
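
The categorization step is, at its core, a first-match lookup of keyword patterns against each Errorlog line. As a language-neutral illustration, here is a minimal Python sketch of that idea (the patterns mirror a few of the categories used in the scripts below; the function name is illustrative):

```python
import re

# Ordered (pattern, category) pairs; the first match wins, mirroring the
# if/elseif chain used in the PowerShell scripts below.
RULES = [
    (re.compile(r'DBCC CHECKDB', re.I), 'CHECKDB'),
    (re.compile(r'Login failed', re.I), 'LoginFailed'),
    (re.compile(r'Database backed up', re.I), 'BackupFull'),
    (re.compile(r'deadlocked', re.I), 'Deadlock'),
]

def categorize(text: str) -> str:
    """Return the category of the first matching rule, else 'Other'."""
    for pattern, category in RULES:
        if pattern.search(text):
            return category
    return 'Other'
```

Events that match no rule fall into the catch-all 'Other' bucket, which the scripts typically filter out before inserting rows.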

Dashboard Analytics for SQL Server Errorlog

The following PowerShell Script parses the Errorlog from multiple servers and inserts the categorized data into a SQL Server Table.

Scripts and dashboard download link:


File: PowerShell - ProcessErrorlogMain.zip
Link: http://bit.ly/1cZNScb

Direct link: https://www.dropbox.com/scl/fo/t4uok05mplkbzt5n5d8hs/h?rlkey=r4wfnyuddbj1l6sakaw055fiv&dl=0

Common Variables used in the PowerShell scripts:

  1. $nInitialDays = initial number of days to scan
  2. $sqlInstances = SQL Server instances
  3. $LastCheckpoint = starting date and time
  4. $NextCheckpointTime = end date and time
  5. “Servers.txt” = list of SQL Server instances; this file must be created manually
  6. “Checkpoint.txt” = holds the last scan date and time; created automatically

SQL Server table for parsed data preservation:

USE [yourdbname]
GO

-- table for SQL Server event log
CREATE TABLE [dbo].[EventTbl] (
  [eventID] [bigint] IDENTITY (1, 1) NOT NULL
 ,[eventCreated] [datetime] NULL
 ,[eventServer] [varchar](50) NULL
 ,[eventSource] [varchar](50) NULL
 ,[eventType] [varchar](50) NULL
 ,[eventText] [varchar](8000) NULL
 ,CONSTRAINT [PK_EventTbl] PRIMARY KEY CLUSTERED
  ( [eventID] ASC ) WITH (PAD_INDEX = OFF, 
          STATISTICS_NORECOMPUTE = OFF, 
          IGNORE_DUP_KEY = OFF, 
          ALLOW_ROW_LOCKS = ON, 
          ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

-- Create a fulltext catalog
CREATE FULLTEXT CATALOG [ErrorlogCatalog] WITH ACCENT_SENSITIVITY = OFF
AUTHORIZATION [db_owner]

-- Create fulltext index
CREATE FULLTEXT INDEX ON [dbo].[EventTbl] 
	KEY INDEX [PK_EventTbl] ON ([ErrorlogCatalog], 
	FILEGROUP [PRIMARY]) WITH (CHANGE_TRACKING AUTO)

ALTER FULLTEXT INDEX ON [dbo].[EventTbl] ADD ([eventText] LANGUAGE [English])
ALTER FULLTEXT INDEX ON [dbo].[EventTbl] ENABLE

ProcessErrorlogReview.ps1: Quick review of Errorlog from multiple servers

This PowerShell script examines all Errorlogs within a given time interval and displays the output. One or more server names can be passed, either by listing them in a “Servers.txt” file or by assigning them directly to the $sqlInstances variable in the script.

 

Clear-Host
Import-Module sqlserver

$nInitialDays = 15
$match = '(Login|shutdown)'

[DateTime]$InitialStart = $(Get-Date).AddDays(-$nInitialDays)

# Load the list of servers from a file
# $PSScriptRoot = Split-Path -Parent -Path $MyInvocation.MyCommand.Definition
# $path = "$PSScriptRoot"
# $sqlInstanaces = Get-Content ($path + "\Servers.txt")

# SQL Server instances to collect errorlog event
# Supply server name 
[String[]]$sqlInstanaces = "SHBMAIN","SHBMAIN\SQL2019","SHBMAIN\SQL2017"

# Next check point date and time
$CheckTime = Get-Date

$result = @()
foreach ($sqlInstanace in $sqlInstanaces)
{
	$obj = "" | select eventCreated, eventServer,eventSource,eventType,eventText
	$obj = Get-SqlErrorLog -ServerInstance $sqlInstanace -After $InitialStart -Before $CheckTime -ErrorAction SilentlyContinue | `
	select @{ n = "eventCreated"; e = { $_.Date } }, `
		   @{ n = "eventYear"; expression = { $_.Date.ToString("yyyy") } }, `
		   @{ n = "eventMonth"; expression = { $_.Date.ToString("MMMM") } }, `
		   @{ n = "eventDate"; expression = { $_.Date.ToString('MM/dd/yyyy') } }, `
		   @{ n = "eventTime"; expression = { $_.Date.ToString('hh:mm:ss tt') } }, `
		   @{ n = "eventServer"; e = { $_.ServerInstance } }, `
		   @{ n = "eventSource"; e = { $_.Source } }, `
           @{ n = "eventType"; expression = { if ($_.'Text' -match 'DBCC CHECKDB' ) {'CHECKDB'}  
                                                    elseif ($_.'Text' -match 'cachestore' ) {'CacheFlush'}           
                                                    elseif ($_.'Text' -match 'Job Failed' ) {'JobFailed'} 
                                                    elseif ($_.'Text' -match '(specified in network packet|packet used to open)' ) {'Connection'} 
                                                    elseif ($_.'Text' -match 'Configuration' ) {'Configuration'} 
                                                    elseif ($_.'Text' -match 'memory' ) {'Memory'} 
                                                    elseif ($_.'Text' -match 'Killed' ) {'Killed'} 
                                                    elseif ($_.'Text' -match 'requests taking longer' ) {'I/O'} 
                                                    elseif ($_.'Text' -match 'FlushCache' ) {'I/O'} 
                                                    elseif ($_.'Text' -match 'Login failed' ) {'LoginFailed'} 
                                                    elseif ($_.'Text' -match 'Login succeeded' ) {'LoginSucceeded'} 
                                                    elseif ($_.'Text' -match 'deadlocked' ) {'Deadlock'} 
                                                    elseif ($_.'Text' -match 'SQL Trace ID' ) {'TraceStart'} 
                                                    elseif ($_.'Text' -match 'SQL Trace stopped' ) {'TraceStop'} 
                                                    elseif ($_.'Text' -match 'Database backed up' ) {'BackupFull'} 
                                                    elseif ($_.'Text' -match 'Database differential changes were backed up' ) {'BackupDifferential'} 
                                                    elseif ($_.'Text' -match 'Log was backed' ) {'BackupLog'}                                                       
                                                    elseif ($_.'Text' -match 'system shutdown' ) {'Shutdown'} 
                                                    elseif ($_.'Text' -match 'RESOLVING_NORMAL' ) {'AlwaysOn'}                                                       
                                                    elseif ($_.'Text' -match 'RECOVERY' ) {'DatabaseRecoveryModel'} 
                                                    elseif ($_.'Text' -match 'RESTORING' ) {'DatabaseRestoring'} 
                                                    elseif ($_.'Text' -match 'Fatal' ) {'Error'}
                                                    else   {'Other'}} }, `
          @{ n = "eventText"; e = { $_.Text.Replace("'", "''") } } 
 
        # filter out whatever is needed
        # $obj = $obj | Where-Object { $_.eventType -notmatch "Other|LogBackup" }
        # $obj = $obj | Where-Object { $_.eventSource -match "Logon" -and $_.eventType -notmatch "Other"}
 
        $obj = $obj | Where-Object { $_.eventType -match $match }	
	if ($obj.Count -ge 1 )
	{
	    Write-Host "Number of records for $sqlInstanace : $($obj.count)"
	    $result += $obj
	}
}

$result | ft

Sample output of parsed errorlog:

ProcessErrorlogMain.ps1: For automation - parsing, categorizing, preserving data and visualization

This script collects events from one or more SQL Servers at a regular interval through a scheduled task and inserts the parsed data into a SQL Server table.

How does it work?

When it runs for the first time, whether through a scheduled task or a SQL Server Agent job, the script parses data for the last 15 days, i.e., from 15 days ago to the current date and time. It records the last scan date and time by creating a checkpoint.txt file. On subsequent runs, the script retrieves the last scan date and time from checkpoint.txt and uses it as the starting time.
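
The checkpoint mechanism itself is simple enough to sketch in a few lines. The following Python sketch illustrates the idea under the same assumptions (the file holds a single timestamp; the names and timestamp format are illustrative):

```python
from datetime import datetime, timedelta
from pathlib import Path

TIME_FMT = "%Y-%m-%d %H:%M:%S"

def get_scan_window(checkpoint: Path, initial_days: int = 15, now: datetime = None):
    """Return the (start, end) window for this scan and persist the new checkpoint.

    If the checkpoint file exists, start from the timestamp it holds;
    otherwise fall back to `initial_days` before `now`. Either way, the
    current time is written back so the next run starts where this one ends.
    """
    now = now or datetime.now()
    if checkpoint.exists():
        start = datetime.strptime(checkpoint.read_text().strip(), TIME_FMT)
    else:
        start = now - timedelta(days=initial_days)
    checkpoint.write_text(now.strftime(TIME_FMT))
    return start, now
```

Because the end of one window becomes the start of the next, no Errorlog entry is scanned twice across runs.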

The parsed data will be inserted into a SQL Server Table (EventTbl) on a SQL Server database.
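
The insert itself is a plain row-per-event write. As an illustration of the same idea with parameterized values (which also sidesteps the manual quote-escaping the PowerShell script performs), here is a Python sketch against an equivalent SQLite table; the schema mirrors EventTbl and is illustrative:

```python
import sqlite3

# In-memory stand-in for the EventTbl table defined earlier (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE EventTbl (
        eventID      INTEGER PRIMARY KEY AUTOINCREMENT,
        eventCreated TEXT,
        eventServer  TEXT,
        eventSource  TEXT,
        eventType    TEXT,
        eventText    TEXT
    )""")

def save_events(rows):
    """Insert parsed errorlog rows using parameterized values (no manual quoting)."""
    conn.executemany(
        "INSERT INTO EventTbl (eventCreated, eventServer, eventSource, eventType, eventText) "
        "VALUES (?, ?, ?, ?, ?)", rows)
    conn.commit()
```

Parameterized inserts avoid breaking on event text that contains single quotes, which is common in Errorlog messages.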

Visualization of parsed Errorlog data:

Since Grafana supports SQL Server connectivity, it will be used to visualize the data. The data source name in the dashboard is “SQLLog”.


Clear-Host
Import-Module sqlserver

# Database server to export results
$database = 'TestDB'
$server = 'SHBMAIN\SQL2019'
$table = 'dbo.EventTbl'

$nInitialDays = 15
[DateTime]$InitialStart = $(Get-Date).AddDays(-$nInitialDays)

# Load the checkpoint file
$PSScriptRoot = Split-Path -Parent -Path $MyInvocation.MyCommand.Definition
$path = "$PSScriptRoot"
$CheckPointFile = $path + "\Checkpoint.txt"

# SQL Server instances to collect errorlog event
# Supply server name in the script
# [String[]]$sqlInstanaces = "SA0481V", "SA0997V", "SA0270V","SA0435"

# Supply server name from a text file

# List of all servers from 'Servers.txt'
if (!( Test-Path ($path + "\Servers.txt") )) 
{ 
    Write-Host "Servers.txt was not found. An empty file has been created; add SQL Server instances to it...." -ForegroundColor Yellow
    Write-Host ""
    $null | Add-Content ($path + "\Servers.txt") 
    return
}
else
{
    $sqlInstanaces = Get-Content ($path + "\Servers.txt") -Force
}

# Retrieve last scan date/time from Checkpoint file
if (! (Test-Path $CheckPointFile)) # Checkpoint file not found
{
	Write-Host 'Checkpoint file does NOT exist'
    # first and last log scan date and time
	$LastCheckpoint = $InitialStart
    $NextCheckpointTime = Get-Date
    $NextCheckpointTime | Set-Content -Force $CheckPointFile
}
else # the Checkpoint file does exist 
{
	Write-Host 'Checkpoint file does exist'
	[DateTime]$LastCheckpoint = Get-Content -Force $CheckPointFile
    $NextCheckpointTime = Get-Date
    $NextCheckpointTime | Set-Content -Force $CheckPointFile
}

Write-Host "Starting from : $LastCheckpoint"
Write-Host "Next Start: $NextCheckpointTime"
Write-Host " "

#$match = 'Login failed|requests taking longer|is full'
#$notmatch = 'without errors|found 0 errors'

# $result = @()
foreach ($sqlInstanace in $sqlInstanaces)
{
	$obj = "" | select eventCreated, eventServer,eventSource,eventType,eventText
	$obj = Get-SqlErrorLog -ServerInstance $sqlInstanace -After $LastCheckpoint -Before $NextCheckpointTime -ErrorAction SilentlyContinue | `
	select @{ n = "eventCreated"; e = { $_.Date } }, `
		   @{ n = "eventYear"; expression = { $_.Date.ToString("yyyy") } }, `
		   @{ n = "eventMonth"; expression = { $_.Date.ToString("MMMM") } }, `
		   @{ n = "eventDate"; expression = { $_.Date.ToString('MM/dd/yyyy') } }, `
		   @{ n = "eventTime"; expression = { $_.Date.ToString('hh:mm:ss tt') } }, `
		   @{ n = "eventServer"; e = { $_.ServerInstance } }, `
		   @{ n = "eventSource"; e = { $_.Source } }, `
           @{ n = "eventType"; expression = { if ($_.'Text' -match 'DBCC CHECKDB' ) {'CHECKDB'}  
                                                    elseif ($_.'Text' -match 'cachestore' ) {'CacheFlush'}           
                                                    elseif ($_.'Text' -match 'Job Failed' ) {'JobFailed'} 
                                                    elseif ($_.'Text' -match '(specified in network packet|packet used to open)' ) {'Connection'} 
                                                    elseif ($_.'Text' -match 'Configuration' ) {'Configuration'} 
                                                    elseif ($_.'Text' -match 'memory' ) {'Memory'} 
                                                    elseif ($_.'Text' -match 'Killed' ) {'Killed'} 
                                                    elseif ($_.'Text' -match 'requests taking longer' ) {'I/O'} 
                                                    elseif ($_.'Text' -match 'FlushCache' ) {'I/O'} 
                                                    elseif ($_.'Text' -match 'Login failed' ) {'LoginFailed'} 
                                                    elseif ($_.'Text' -match 'Login succeeded' ) {'LoginSucceeded'} 
                                                    elseif ($_.'Text' -match 'deadlocked' ) {'Deadlock'} 
                                                    elseif ($_.'Text' -match 'SQL Trace ID' ) {'TraceStart'} 
                                                    elseif ($_.'Text' -match 'SQL Trace stopped' ) {'TraceStop'} 
                                                    elseif ($_.'Text' -match 'Database backed up' ) {'BackupFull'} 
                                                    elseif ($_.'Text' -match 'Database differential changes were backed up' ) {'BackupDifferential'} 
                                                    elseif ($_.'Text' -match 'Log was backed' ) {'BackupLog'}                                                       
                                                    elseif ($_.'Text' -match 'system shutdown' ) {'Shutdown'} 
                                                    elseif ($_.'Text' -match 'RESOLVING_NORMAL' ) {'AlwaysOn'}                                                       
                                                    elseif ($_.'Text' -match 'RECOVERY' ) {'DatabaseRecoveryModel'} 
                                                    elseif ($_.'Text' -match 'RESTORING' ) {'DatabaseRestoring'} 
                                                    elseif ($_.'Text' -match 'Fatal' ) {'Error'}
                                                    else   {'Other'}} }, `
          @{ n = "eventText"; e = { $_.Text.Replace("'", "''") } } 
 
        # filter out whatever is needed
        $obj = $obj | Where-Object { $_.eventType -notmatch "Other"}	
	if ($obj.Count -ge 1 )
	{
	    Write-Host "$sqlInstanace - total records $($obj.count) and exporting log to SQL Server $server" -ForegroundColor Yellow
            Write-Host ""
            $obj | ForEach-Object {
		   Invoke-Sqlcmd `
		    -Database $database -ServerInstance $server `
                    -Query "insert into $table VALUES ( '$($_.eventCreated)', '$($_.eventServer)',`
                        '$($_.eventSource)','$($_.eventType)','$($_.eventText)' ) " }
		# $result += $obj
	}
}

# $result | ft

ProcessErrorlogImport.ps1: Manually parsing and Importing of Errorlog data

This script can be used to manually import data from servers that are not listed in the “Servers.txt” file. Suppose “SRV1010” is not in the “Servers.txt” file and we want to include this server in our scan of the last 15 days. In the script, we can assign one or more SQL Server instances to the variable $sqlInstances, as follows:


$sqlInstances = “SRV1010”
$sqlInstances = “SRV1010”,”QRD0450\SQL2019”,”DEV4568”


Clear-Host
Import-Module sqlserver

#########################################################################
# Additional log import
#########################################################################

# SQL Server instance to collect errorlog event
[String[]]$sqlInstanaces = "SHBMAIN\SQL2012","SHBMAIN\SQL2014" 

# Database server to export results
$database = 'TestDB'
$server = 'SHBMAIN\SQL2019'
$table = 'dbo.EventTbl'

$nInitialDays = 15
[DateTime]$InitialStart = $(Get-Date).AddDays(-$nInitialDays)

# Load the checkpoint file
$PSScriptRoot = Split-Path -Parent -Path $MyInvocation.MyCommand.Definition
$path = "$PSScriptRoot"
$CheckPointFile = $path + "\Checkpoint.txt"

# Retrieve last scan date/time from Checkpoint file
if (! (Test-Path $CheckPointFile)) # Checkpoint file not found
{
	Write-Host 'Checkpoint file does NOT exist'
    # first and last log scan date and time
	[DateTime]$LastCheckpoint = $InitialStart
    [DateTime]$NextCheckpointTime = Get-Date
    $NextCheckpointTime | Set-Content -Force $CheckPointFile
}
else # the Checkpoint file does exist 
{
	Write-Host 'Checkpoint file does exist'
	[DateTime]$LastCheckpoint = $InitialStart
    [DateTime]$NextCheckpointTime = Get-Content -Force $CheckPointFile
}

# list previously added servers
$Servers = Get-Content ($path + "\Servers.txt") -Force

Write-Host "Starting from : $LastCheckpoint"
Write-Host "Next Start: $NextCheckpointTime"
Write-Host " "

# $result = @()
foreach ($sqlInstanace in $sqlInstanaces)
{
	$Process = $false
	
	if ($Servers -match $sqlInstanace.Replace("\", "\\"))
	{
        $Process = $false
		Write-Host "$sqlInstanace - already exists in Servers.txt" -ForegroundColor Cyan
	}
	else
	{
		$Process = $true
		Write-Host "$sqlInstanace - is new; adding to Servers.txt" -ForegroundColor Yellow
		$sqlInstanace | Add-Content ($path + "\Servers.txt") 
	}
	
	if ($Process)
	{
		Write-Host "$sqlInstanace - processing errorlog" -ForegroundColor Yellow
		$obj = "" | select eventCreated, eventServer, eventSource, eventType, eventText
		$obj = Get-SqlErrorLog -ServerInstance $sqlInstanace -After $LastCheckpoint -Before $NextCheckpointTime -ErrorAction SilentlyContinue | `
	    select @{ n = "eventCreated"; e = { $_.Date } }, `
		       @{ n = "eventYear"; expression = { $_.Date.ToString("yyyy") } }, `
		       @{ n = "eventMonth"; expression = { $_.Date.ToString("MMMM") } }, `
		       @{ n = "eventDate"; expression = { $_.Date.ToString('MM/dd/yyyy') } }, `
		       @{ n = "eventTime"; expression = { $_.Date.ToString('hh:mm:ss tt') } }, `
		       @{ n = "eventServer"; e = { $_.ServerInstance } }, `
		       @{ n = "eventSource"; e = { $_.Source } }, `
               @{ n = "eventType"; expression = { if ($_.'Text' -match 'DBCC CHECKDB' ) {'CHECKDB'}  
                                                      elseif ($_.'Text' -match 'cachestore' ) {'CacheFlush'}           
                                                      elseif ($_.'Text' -match 'Job Failed' ) {'JobFailed'} 
                                                      elseif ($_.'Text' -match '(specified in network packet|packet used to open)' ) {'Connection'} 
                                                      elseif ($_.'Text' -match 'Configuration' ) {'Configuration'} 
                                                      elseif ($_.'Text' -match 'memory' ) {'Memory'} 
                                                      elseif ($_.'Text' -match 'Killed' ) {'Killed'} 
                                                      elseif ($_.'Text' -match 'requests taking longer' ) {'I/O'} 
                                                      elseif ($_.'Text' -match 'FlushCache' ) {'I/O'} 
                                                      elseif ($_.'Text' -match 'Login failed' ) {'LoginFailed'} 
                                                      elseif ($_.'Text' -match 'Login succeeded' ) {'LoginSucceeded'} 
                                                      elseif ($_.'Text' -match 'deadlocked' ) {'Deadlock'} 
                                                      elseif ($_.'Text' -match 'SQL Trace ID' ) {'TraceStart'} 
                                                      elseif ($_.'Text' -match 'SQL Trace stopped' ) {'TraceStop'} 
                                                      elseif ($_.'Text' -match 'Database backed up' ) {'BackupFull'} 
                                                      elseif ($_.'Text' -match 'Database differential changes were backed up' ) {'BackupDifferential'} 
                                                      elseif ($_.'Text' -match 'Log was backed' ) {'BackupLog'}                                                       
                                                      elseif ($_.'Text' -match 'system shutdown' ) {'Shutdown'} 
                                                      elseif ($_.'Text' -match 'RESOLVING_NORMAL' ) {'AlwaysOn'}                                                       
                                                      elseif ($_.'Text' -match 'RECOVERY' ) {'DatabaseRecoveryModel'} 
                                                      elseif ($_.'Text' -match 'RESTORING' ) {'DatabaseRestoring'} 
                                                      elseif ($_.'Text' -match 'Fatal' ) {'Error'}
                                                      else   {'Other'}} }, `
	           @{ n = "eventText"; e = { $_.Text.Replace("'", "''") } } 
 
                # filter out whatever is needed
		# $obj = $obj | Where-Object { $_.eventType -notmatch "Other|LogBackup" }
		# $obj = $obj | Where-Object { $_.eventSource -match "Logon" -and $_.eventType -notmatch "Other"}
 
                $obj = $obj | Where-Object { $_.eventType -notmatch "Other" }				
		if ($obj.Count -ge 1)
		{
			Write-Host "$sqlInstanace - total records $($obj.count) and exporting to SQL Server $server"
                        Write-Host " "

			$obj | ForEach-Object {
				Invoke-Sqlcmd `
				  -Database $database -ServerInstance $server `
				  -Query "insert into $table VALUES ( '$($_.eventCreated)','$($_.eventServer)',`
                        '$($_.eventSource)','$($_.eventType)','$($_.eventText)' ) "
			}
			# $result += $obj			
		}
	}
}
Write-Host " "
# $result | ft

Conclusion:

This script can be fine-tuned to fit your specific requirements. You can add other filters to categorize SQL events, such as Service Broker events, full data files, insufficient disk space, and so on. I hope this post helps you get a good start on processing SQL Server events from the Errorlog.


Friday, November 27, 2020

“Login Failed” and using PowerShell to collect events from multiple SQL Servers’ Errorlogs

Capturing failed login attempts to a database server is a common requirement. A failed login simply means that an attempt to connect to the database server was unsuccessful. Common reasons include a mismatched password, a login that does not exist, or a default login database that is offline or has been removed.

Although capturing “Login failed” events can give a strong sense of security, it is equally important to capture successful logins in order to understand the location and activities of those who have logged in. A failed login will never cause a data breach, but a successful one can.

There are a few different ways successful or unsuccessful login events can be captured, such as the following:

  1. SQL Server Audit
  2. Extended Event
  3. SQL Profiler Trace
  4. SQL Server Errorlog

Using SQL Errorlog:

This is one of the most commonly used and easiest to implement mechanisms for capturing login events from a SQL Server instance. The SQL Server setting “Failed Login Only” can be configured for two separate reasons: to troubleshoot login failures and to determine whether any unauthorized login attempts were made to the SQL Server instance.

Although this feature exists in all SQL Server editions, newer versions always include “Login failed” messages in the Errorlog. This can aid quicker audits and more efficient troubleshooting.

To find a failed login, use the system stored procedure sp_readerrorlog:

sp_readerrorlog 0, 1, 'Login failed' -- Querying current errorlog
sp_readerrorlog 1, 1, 'Login failed' -- Querying previous errorlog

Login failure message:

Some examples of login failure messages that can be found in the SQL Server Errorlog:

  • Login failed for user 'AppUser'. Reason: Password did not match that for the login provided. [CLIENT: 15.201.210.170]
  • Login failed for user 'MYDOMAIN\SalesSvc'. Reason: Could not find a login matching the name provided. [CLIENT: 101.20.104.45]
  • Login failed for user ''. Reason: An attempt to login using SQL authentication failed. Server is configured for Integrated authentication only. [CLIENT: 110.110.211.110]
  • Login failed for user 'ReportSvc'. Reason: Failed to open the explicitly specified database 'TestDB'. [CLIENT: 110.221.74.120]
  • Login failed for user 'MYDOMAIN\HL7Svc'. Reason: Failed to open the explicitly specified database 'HL7ProdDB'. [CLIENT: <local machine>]
  • Login failed for user 'TestUser'. Reason: Could not find a login matching the name provided. [CLIENT: <local machine>]
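
For reporting purposes, the interesting pieces of these messages (login name, reason, and client address) can be extracted with a single regular expression. Here is an illustrative Python sketch; the pattern is an assumption derived from the message shapes above, not an official format specification:

```python
import re

# Pattern derived from the example messages above (assumed shape, not a spec).
LOGIN_FAILED = re.compile(
    r"Login failed for user '(?P<user>[^']*)'\. "
    r"Reason: (?P<reason>.+?) "
    r"\[CLIENT: (?P<client>[^\]]+)\]"
)

def parse_login_failure(message: str):
    """Return a dict with user, reason, and client, or None for other messages."""
    m = LOGIN_FAILED.search(message)
    return m.groupdict() if m else None
```

Grouping the extracted client addresses and login names makes it easy to spot repeated attempts from a single source.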

DEFAULT_DATABASE Login Parameter:

When we create a login in SQL Server, a default database is always required. Up to SQL 2005, the default database could not be specified, so the login database ultimately became the master database. In SQL 2008 and up, the default database name can be specified with the “DEFAULT_DATABASE” clause. While this option enhances security, using it unwisely or unexpectedly may cause application login failures.

An Example:

Let’s say that an application needs a database. A SQL or Windows login is authorized to access the database, and the login has been granted certain permissions for DML operations. In this case, instead of the master database, the user database is used as the default database for the login.

Later on, suppose a few more databases have been created on the same instance of the SQL Server and the same login is used with a different set of permissions for each database.

Technically, there won’t be any issues until the default database is taken offline or removed from the server. Once this happens, all applications using that login will start failing to connect to the SQL Server with the following message:

Login failed for user '<Login Name>’. Reason: Failed to open the explicitly specified database '<User database>'. [CLIENT: <IP> or <local machine>]

Unavailability of default login database:

A SQL Server login (SQL login, Windows domain account, or Windows domain group) needs a default database. If the default login database is removed, renamed, taken offline, or put into single-user mode, the login will fail. A user or application that is already connected will remain connected even if the database is taken offline during the session. However, once the user disconnects and attempts to log in again, the connection will fail.

To resolve the login issue, the following command will change the default login database to master:

-- Use any one of the following
ALTER LOGIN <Login Name> WITH DEFAULT_DATABASE = master
EXEC sp_defaultdb '<Login Name>', 'master'

Usages of default login:

When multiple applications or users need to connect to a SQL Server with the same login, it might be a good idea to use the master database as the default login database. When a specific login needs to connect to a specific database, the default database should be that user database.

----- Find all logins and default database
SELECT p.[name] login_name,
       p.type_desc,
       p.default_database_name,      
	   DATABASEPROPERTYEX(p.default_database_name, 'Status') AS database_status
FROM   sys.server_principals p
WHERE p.type_desc IN ('WINDOWS_LOGIN','SQL_LOGIN','WINDOWS_GROUP')  

-- Change default login database to master
-- Use any one of the following
ALTER LOGIN <Login Name> WITH DEFAULT_DATABASE = master
EXEC sp_defaultdb '<Login Name>', 'master'

Collecting login failure event:

It is often necessary to collect login failure events and preserve them for analysis. There are several methods available to do this. The following PowerShell script collects the “Login Failed” events from one or more SQL Server instances and preserves them in a SQL Server database.

The script uses a checkpoint method so that no event is collected twice.

PowerShell Script:

<##########################################################
-- Table to export login failure events from PowerShell
CREATE TABLE [dbo].[EventTbl](
	[EventID] [bigint] IDENTITY(1,1) NOT NULL,
	[EventDate] [datetime] NULL,
	[EventType] [varchar](50) NULL,
	[EventSource] [varchar](100) NULL,
	[EventText] [varchar](8000) NULL
) ON [PRIMARY]
GO
##########################################################>

#Clear-Host

# Database server to export login failure events 
$database = 'TestDB'
$server = 'SHBMAIN\SQL2019'
$table = 'dbo.EventTbl'

[String[]]$sqlInstanaces = "SHBMAIN\SQL2008","SHBMAIN\SQL2019","SHBMAIN\SQL2017"
[Int]$nInitialDays = 15
[DateTime]$InitialStart = $(Get-Date).AddDays(-$nInitialDays)

# Load the checkpoint file
$PSScriptRoot = Split-Path -Parent -Path $MyInvocation.MyCommand.Definition
$path = "$PSScriptRoot"
$CheckPointFile = $path + "\Checkpoint.txt"

# Retrieve last scan date/time from Checkpoint file
if (! (Test-Path $CheckPointFile)) # Checkpoint file not found
{
	$InitialStart | Set-Content -Force $CheckPointFile
	$LastCheckpoint = $InitialStart
	Write-Host 'Checkpoint file does NOT exist'
	Write-Host "Last checkpoint: $LastCheckpoint"
}
else # the Checkpoint file does exist 
{
	[DateTime]$LastCheckpoint = Get-Content -Force $CheckPointFile
	Write-Host 'Checkpoint file does exist'
	Write-Host "Last checkpoint : $LastCheckpoint"
}

# Next check point date and time
$NextCheckpointTime = Get-Date
$NextCheckpointTime | Set-Content -Force $CheckPointFile

Write-Host "Next Checkpoint: $NextCheckpointTime"

$match    =  'Login failed'
$notmatch  =  'without errors|found 0 errors'

# multiple filter for other events
# $match    =  'Login failed|requests taking longer|is full'
# $notmatch  =  'without errors|found 0 errors|backup'

# $result =@()
foreach ($sqlInstanace in $sqlInstanaces)
{
    # create an empty object
    $obj = "" | select Date, Source, ServerInstance, Text

    # Collect errorlog and prepare for insert process
    $obj = Get-SqlErrorLog -ServerInstance $sqlInstanace `
        -After $LastCheckpoint -Before $NextCheckpointTime `
        -ErrorAction SilentlyContinue `
        | ? { ($_.Text -match $match) -and ($_.Text -notmatch $notmatch) } | `
        select Date, Source, ServerInstance, @{ n = "Text"; e = { $_.Text.Replace("'", "''") } }

    # if object is not empty
    if ($obj)
    {
        Write-Host "Number of records: $($obj.count). Exporting log for $sqlInstanace to SQL Server $server"
        $obj | ForEach-Object {
            Invoke-Sqlcmd `
                -Database $database -ServerInstance $server `
                -Query "insert into $table VALUES ('$($_.Date)','$($_.Source)','$($_.ServerInstance)','$($_.Text)') "
        }
    }
    # $result += $obj
}
# $result |ft

Executing PowerShell Script:



Querying table:



Thursday, November 19, 2020

PowerShell: Calculating folder size, file/folder count and size of each folder and subfolder

We often need to find the total number of files, folders, or subfolders, along with the total size, for a given folder or drive. The PowerShell cmdlet “Get-ChildItem” can help us accomplish this. The custom function “Get-FolderSize” described below is one potential solution:

Custom function Get-FolderSize: 

################################################
# To get number of files and folder size 
# Version: 1.0
# Date: 2020-11-19
################################################
function Get-FolderSize
{
	[CmdletBinding()]
	param (
		[Parameter(Mandatory = $true)]
		[string]$FolderPath = "D:\temp",
		
		[Parameter(Mandatory = $false)]
		[Switch]$SubFolder = $false		
	)

    $result = @()

    # Calculate parent folder grand total of all folder and file size, count
    $fileCount   = (Get-ChildItem $FolderPath -recurse | where-object {-not ($_.PSIsContainer)}).Count
    $folderCount = (Get-ChildItem $FolderPath -recurse | where-object { ($_.PSIsContainer)}).Count
    $mainFolder = Get-ChildItem -Path $FolderPath -Recurse | Measure-Object -Sum Length	
    $obj ="" | select Folder, FolderCount, FileCount, Size_kb, Size_mb, Size_gb
    
    $obj.Folder       = '.. Parent Folder ['+ $FolderPath +'] '+'Summary.. '
    $obj.FolderCount  = $FolderCount;
    $obj.FileCount    = $FileCount;
    $obj.Size_kb = $([Math]::Round($mainFolder.Sum / 1kb, 2));
    $obj.Size_mb = $([Math]::Round($mainFolder.Sum / 1mb, 2));
    $obj.Size_gb = $([Math]::Round($mainFolder.Sum / 1gb, 2));
    
    $result += $obj

	if ($SubFolder)
	{
		## size of each folder and each subfolder separately
		$folders = Get-ChildItem -ErrorAction SilentlyContinue $FolderPath -Recurse -force | `
    		Where-Object { $_.PSIsContainer -eq $true } | Sort-Object
	}
	else
	{
		# Size of each folder only
		$folders = Get-ChildItem -ErrorAction SilentlyContinue $FolderPath -force | `
		    Where-Object { $_.PSIsContainer -eq $true } | Sort-Object
	}

	foreach ($i in $folders)
	{        
        # sum of files and folder
		$subFolders = Get-ChildItem -ErrorAction SilentlyContinue $i.FullName -Recurse -force | `
            where-object {-not ($_.PSIsContainer)} |`
		    Measure-Object -property Length -sum | Select-Object Sum		
        # file counts
		$fileCount = (Get-ChildItem -ErrorAction SilentlyContinue $i.FullName -Recurse -force | `
		    where-object {-not ($_.PSIsContainer)}).Count
        # folder counts
		$folderCount = (Get-ChildItem -ErrorAction SilentlyContinue $i.FullName -Recurse -force | `
		    where-object { ($_.PSIsContainer)}).Count
            
        # empty object creation
        $obj ="" | select Folder, FolderCount, FileCount, Size_kb, Size_mb, Size_gb

        # assign value to objects
        $obj.Folder = $i.FullName;
        $obj.FolderCount  = $FolderCount;
        $obj.FileCount    = $FileCount;
        $obj.Size_kb = $([Math]::Round($subFolders.Sum / 1kb, 2));
        $obj.Size_mb = $([Math]::Round($subFolders.Sum / 1mb, 2));
        $obj.Size_gb = $([Math]::Round($subFolders.Sum / 1gb, 2));
    
        $result += $obj
	}	
    # return the objects
    return $result

}
# end of function creation

The function has two parameters:

  • $FolderPath: A mandatory parameter; the folder or drive name to evaluate.
  • $SubFolder: An optional switch; when set, each subfolder and its files inside the original folder are calculated and printed separately.
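
For comparison, the core of the calculation (a recursive walk that counts files and folders and sums file sizes) can be sketched outside PowerShell as well. The following Python sketch is illustrative and not part of the downloadable scripts:

```python
import os

def folder_summary(folder_path: str):
    """Walk folder_path recursively; return (folder_count, file_count, total_bytes)."""
    folder_count = file_count = total_bytes = 0
    for root, dirs, files in os.walk(folder_path):
        folder_count += len(dirs)
        file_count += len(files)
        for name in files:
            try:
                total_bytes += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # skip files that disappear or deny access mid-walk
    return folder_count, file_count, total_bytes
```

Dividing the byte total by 1024, 1024**2, or 1024**3 gives the KB, MB, and GB figures that Get-FolderSize reports.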

Example of using Get-FolderSize:

Suppose we have a folder “d:\temp3” and this folder has multiple files and subfolders. Each subfolder may have more subfolders and files.

1. To get only the total size of each folder and the file counts from “d:\temp3”:

$Folders = "D:\Temp3"
Get-FolderSize -FolderPath $Folders 

2. To get the total size of each folder and the file counts from “d:\temp3” and then export the result to a CSV file: 

$Folders = "D:\Temp3"
Get-FolderSize -FolderPath $Folders|`
Export-Csv "D:\Export\foldersize.csv" -NoTypeInformation 

3. To get the total size of each folder, the file counts from “d:\temp3” and each subfolder: 

$Folders = "D:\Temp3"
Get-FolderSize -FolderPath $Folders -SubFolder 

4. To get the total size of each folder, the file counts from “d:\temp3” and each subfolder sorted by name: 

$Folders = "D:\Temp3"
Get-FolderSize -FolderPath $Folders -SubFolder | sort Folder 

5. To get the total size of each folder, the file counts from “d:\temp3” and each subfolder sorted by name, using a custom display format: 

$fields = "Folder", `
    @{ N = "Folder (#)"; E = { '{0:N0}' -f $_.FolderCount }; Align = "right" }, `
    @{ N = "File (#)";   E = { '{0:N0}' -f $_.FileCount };   Align = "right" }, `
    @{ N = "Size (kb)";  E = { '{0:N2}' -f $_.Size_kb };     Align = "right" }, `
    @{ N = "Size (mb)";  E = { '{0:N2}' -f $_.Size_mb };     Align = "right" }, `
    @{ N = "Size (gb)";  E = { '{0:N2}' -f $_.Size_gb };     Align = "right" }

$Folders = "D:\Temp3"

# Get folder and subfolder size and format the output
Get-FolderSize -FolderPath $Folders -SubFolder | `
    sort Folder |`
     Format-Table $fields -Auto

Screenshot of the formatted output: 

Conclusion: I hope you find this custom function useful. You can enhance it with additional parameters to accommodate custom requirements. If you implement your own parameters, please send me a copy.