SQL Server: Run A Very Large Script By Splitting It Into Pieces Using PowerShell

Every day is an interesting day in the life of a DBA. Today, I received a request to run in a HUGE script. The script had 125k INSERT statements, each on a separate line. Just opening it was a problem, let alone running it against a database. In SSMS, merely trying to open the file produces this error: “The operation could not be completed. Not enough storage is available to complete this operation.”

[Screenshot: SSMS “Not enough storage is available” error dialog]

This is how I handled it

  1. I split the file into manageable 1,000-line files (125 files in total)
  2. I looped through each split file and ran it in!

Yes, it was that simple. Thanks to PowerShell!

Splitting the file:

I used this PowerShell script to do the splitting. Be careful not to split so that half of a SQL statement lands in one file and the other half in the next! You may want to split on batch boundaries instead, for example wherever there is a GO separator. Review your files to make sure the split was clean.
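If your script does use batch separators, a safer approach is to split on GO boundaries rather than on a fixed line count. Here is a minimal sketch (assuming GO appears on its own line between batches; the paths and the batches-per-file count are illustrative):

```powershell
# Split a script on lines that contain only GO (case-insensitive),
# writing each group of complete batches to its own numbered file.
$batchesPerFile = 100        # how many GO-terminated batches per output file
$batchCount     = 0
$fileNumber     = 1
$buffer         = New-Object System.Collections.Generic.List[string]

Get-Content 'D:\scripts\huge.sql' | ForEach-Object {
    $buffer.Add($_)
    if ($_ -match '^\s*GO\s*$') {
        $batchCount++
        if ($batchCount -ge $batchesPerFile) {
            $buffer | Set-Content "D:\scripts\split\batch$fileNumber.sql"
            $buffer.Clear()
            $batchCount = 0
            $fileNumber++
        }
    }
}
# Flush any remaining lines to a final file
if ($buffer.Count -gt 0) {
    $buffer | Set-Content "D:\scripts\split\batch$fileNumber.sql"
}
```

Because every output file ends on a GO line, no statement is ever cut in half.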

A huge Thanks to Eric Wright for the below script he has shared on GitHub!


#############################################
# Split a log/text file into smaller chunks #
#############################################
#
# WARNING: This will take a long while with extremely large files and uses lots of memory to stage the file
#
# Set the baseline counters
#
# Set the line counter to 0
$linecount = 0
# Set the file counter to 1. This is used for the naming of the log files
$filenumber = 1
# Prompt user for the path
$sourcefilename = Read-Host "What is the full path and name of the log file to split? (e.g. D:\mylogfiles\mylog.txt)"
# Prompt user for the destination folder to create the chunk files
$destinationfolderpath = Read-Host "What is the path where you want to extract the content? (e.g. d:\yourpath\)"
Write-Host "Please wait while the line count is calculated. This may take a while. No really, it could take a long time."
# Find the current line count to present to the user before asking the new line count for chunk files
Get-Content $sourcefilename | Measure-Object | ForEach-Object { $sourcelinecount = $_.Count }
#Tell the user how large the current file is
Write-Host "Your current file size is $sourcelinecount lines long"
# Prompt user for the size of the new chunk files
$destinationfilesize = Read-Host "How many lines will be in each new split file?"
# The chunk size comes in as a string, so convert it to an integer
# Set the upper boundary (maximum line count to write to each file)
$maxsize = [int]$destinationfilesize
Write-Host "File is $sourcefilename, destination is $destinationfolderpath, new file line count will be $destinationfilesize"
# Read each line of the source file, write it to the current chunk file, and increment the line counter. When the counter reaches the chosen chunk size, start a new numbered file.
Get-Content $sourcefilename | ForEach-Object {
    Add-Content "$destinationfolderpath\splitlog$filenumber.log" $_
    $linecount++
    if ($linecount -eq $maxsize) {
        $filenumber++
        $linecount = 0
    }
}
# Clean up after your pet
[gc]::collect()
[gc]::WaitForPendingFinalizers()


It is very straightforward: you supply the name of the file to split and the path where the split files should go. It creates files named splitlog[n].log, where [n] is the numeric split file counter.

You can simply rename the files to have a .sql extension by running

rename \\myunc\splitfiles\*.log *.sql
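Or, if you prefer to stay in PowerShell, the equivalent rename looks like this (the share path is illustrative):

```powershell
# Rename every .log split file in the share to .sql
Get-ChildItem '\\myunc\splitfiles\*.log' |
    Rename-Item -NewName { $_.Name -replace '\.log$', '.sql' }
```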

Install-MSSQLScript – The script runner!

I have created a script called Install-MSSQLScript. It simply calls SQLCMD to run in the supplied script and logs the output to a file. You can also use the return value to collect additional details about the execution. This script is part of a larger library of scripts we use for fully automated deployments!


#######################

function Global:Install-MSSQLScript
{
    [CmdletBinding()]
    param( 

        [Parameter(Mandatory=$true)]
        [string] $InstanceName,

        [Parameter(Mandatory=$true)]
        [string] $DatabaseName,

        [ValidateScript({ (Test-Path $_ -PathType Leaf) -and ($_ -like '*.sql') })]
        [Parameter(Mandatory=$true)]
        [string] $InputFile,

        [Parameter(Mandatory=$true)]
        [string] $LogFile

    )

    [string] $fn = $MyInvocation.MyCommand
    [string] $stepName = "Begin [$fn]"
    [string] $errorMessage = ''
    [string] $command = ''

    [HashTable]$params = @{
            'InstanceName' = $InstanceName;
            'DatabaseName' = $DatabaseName;
            'InputFile' = $InputFile;
            'LogFile' = $LogFile
            }

    try
    {        

        $stepName = "[$fn]: Validate parameters"
        #--------------------------------------------
        Write-Verbose $stepName  

        $stepName = "[$fn]: Put the command together"
        #--------------------------------------------
        Write-Verbose $stepName          

        $command = "SQLCMD -E -S $InstanceName -d $DatabaseName -i `"$InputFile`" -o `"$LogFile`" -b -I"

        $stepName = "[$fn]: Run the command and collect output"
        #--------------------------------------------
        Write-Verbose $stepName          

        #Create an empty log file
        New-Item -Path  $LogFile -Type 'File' -Force >$null

        #Run the command
        Invoke-Expression $command > $null
        $exitCode = $LASTEXITCODE

        #Read-in command output
        $commandOutput = (Get-Content -LiteralPath $LogFile | Out-String)

        $stepName = "[$fn]: Create return object with output"
        #--------------------------------------------
        Write-Verbose $stepName          

        $psObjectOutput = New-Object PSObject        

        $psObjectOutput | Add-Member -NotePropertyName 'TargetType' -NotePropertyValue 'MSSQL'
        $psObjectOutput | Add-Member -NotePropertyName 'InputFile' -NotePropertyValue $InputFile
        $psObjectOutput | Add-Member -NotePropertyName 'LogFile' -NotePropertyValue $LogFile
        $psObjectOutput | Add-Member -NotePropertyName 'Command' -NotePropertyValue $command
        $psObjectOutput | Add-Member -NotePropertyName 'CommandType' -NotePropertyValue 'Commandline'
        $psObjectOutput | Add-Member -NotePropertyName 'CommandOutput' -NotePropertyValue $commandOutput
        $psObjectOutput | Add-Member -NotePropertyName 'ExitCode' -NotePropertyValue $exitCode        

        $stepName = "[$fn]: Return the output"
        #--------------------------------------------
        Write-Verbose $stepName  

        $psObjectOutput

    }
    catch
    {
        [Exception]$ex = $_.Exception
        Throw ("Unable to install MSSQL script. Error in step: `"[{0}]`" `n{1}" -f `
                        $stepName, $ex.Message)
    }
    finally
    {
        #Return value if any

    }

}

Running a large number of scripts in a folder:

At this point I have to run in the split files and write out a log for each file's execution (to review potential errors later).

$files = Get-ChildItem -LiteralPath '\\myuncpath\splitfiles' -Filter 'split*.sql'

foreach($file in $files)
{
    "Running $($file.FullName)"

    Install-MSSQLScript `
        -InstanceName 'MYSQLINSTANCE' `
        -DatabaseName 'MYDATABASE' `
        -InputFile $file.FullName `
        -LogFile "$($file.FullName).log"
}
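Since Install-MSSQLScript returns an object with an ExitCode property, you can also collect the results and flag failures afterwards (a sketch; the instance, database, and paths are illustrative):

```powershell
# Run every split file, keep the result objects, then list any failures
$results = foreach ($file in Get-ChildItem '\\myuncpath\splitfiles\split*.sql')
{
    Install-MSSQLScript `
        -InstanceName 'MYSQLINSTANCE' `
        -DatabaseName 'MYDATABASE' `
        -InputFile $file.FullName `
        -LogFile "$($file.FullName).log"
}

# SQLCMD returns a non-zero exit code on error because the command uses -b
$failures = $results | Where-Object { $_.ExitCode -ne 0 }
$failures | Select-Object InputFile, ExitCode, LogFile | Format-Table -AutoSize
```

This gives you a quick summary of which split files need their logs reviewed.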

Hope you find this useful!
