The Release Pipeline
The release pipeline is only triggered when a successful build has completed on the master branch. It has five stages:
1. Create a schema-only staging database and apply our migrations to it. This staging database acts as a kind of launchpad from which to script out any new or updated database objects and commit them to source control using git.
2. Apply the migrations to our QA environment.
3. Once the QA stage has been approved, apply the migrations to the live production environment.
4. Apply the changes to source control.
5. Clean up by removing the staging database created in the first stage.
To make use of variables, I created them at the release pipeline level. These variables are called:
- DB_Build_Password
- DB_Build_User
- DB_ServerInstance
- DataFixMigrationsFolder
- MigrationsFolder
Stage 1 – Apply To Staging
This stage imports the DacPac that we created during our build pipeline and creates a staging DB called DB-Staging-123, where 123 is the release id:
```
sqlpackage.exe /action:Publish ^
    /SourceFile:%RELEASE_PRIMARYARTIFACTSOURCEALIAS%/DBSourceControl/BuildDB-$(Build.BuildId).dacpac ^
    /TargetUser:%DB_Build_User% ^
    /TargetPassword:%DB_Build_Password% ^
    /TargetDatabaseName:DB-Staging-%RELEASE_RELEASEID% ^
    /TargetServerName:%DB_ServerInstance%
```
After the DacPac has been imported, we insert the migration tables' data back so that we know where to start the migration process from.
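The post doesn't show the replay step itself; as a rough sketch, it amounts to running the saved migration-state script against the freshly published staging DB. The `sqlcmd` invocation, server name, and script name below are my assumptions, not the exact pipeline task:

```shell
# Sketch only: build the sqlcmd call that would replay the saved migration-state
# rows into the freshly published staging DB. All names here are assumptions.
build_replay_cmd() {
  local server="$1" release_id="$2" script="$3"
  echo "sqlcmd -S $server -d DB-Staging-$release_id -i $script"
}

build_replay_cmd "devsql" "123" "dbo._migrations-data.sql"
# → sqlcmd -S devsql -d DB-Staging-123 -i dbo._migrations-data.sql
```

Building the command as a string keeps the release-id substitution easy to check; the real task would execute it directly.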
Stage 2 – Apply To QA
We then apply the migrations to our QA environment – you can read more about this process in the following post.
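At its core, this stage walks the `MigrationsFolder`, skips anything already recorded in the migration-state table, and applies the rest in filename order. A hedged sketch of that loop, where `applied.txt` stands in for the state table (the folder layout and file-based bookkeeping are assumptions for illustration):

```shell
# Sketch: apply pending migrations in filename order, skipping ones already applied.
# "applied.txt" stands in for the _migrations state table; real code would query SQL Server.
apply_pending() {
  local folder="$1" applied="$2"
  for f in $(ls "$folder"/*.sql | sort); do
    if ! grep -qx "$(basename "$f")" "$applied"; then
      echo "applying $f"          # the real pipeline would run sqlcmd -i "$f" here
      basename "$f" >> "$applied" # record it as applied
    fi
  done
}
```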
Stage 3 – Apply To Production
This process is exactly the same as Stage 2.
Stage 4 – Stage Changes To Git Repo
This stage is the main workhorse of the release pipeline. As the name suggests, it deals with pushing any changes made to the database into the schema source control repo.
It starts by cloning the repo with an inline PowerShell script:
```powershell
# Redirect git's stderr to stdout so Azure DevOps does not flag progress messages as errors
$env:GIT_REDIRECT_STDERR = '2>&1'
# Make sure git is on the path for this task
$env:Path += ";C:\Program Files\Git\cmd"
git.exe clone https://azureorgname@dev.azure.com/azureorgname/DBSourceControl/_git/DB-Schema $env:SYSTEM_DEFAULTWORKINGDIRECTORY/StagingSchema
```
The next task is to script out all the database objects:
```powershell
function Export-DatabaseObject {
    param (
        [Parameter(Mandatory = $true, Position = 1)]
        [ValidateNotNullOrEmpty()][string]$inst = $null,
        [Parameter(Mandatory = $false, Position = 2)]
        [string]$port = '1433',
        [Parameter(Mandatory = $true, Position = 3)]
        [ValidateNotNullOrEmpty()][string]$path = $null,
        [Parameter(Mandatory = $false, Position = 4)]
        [string]$dblist = 'DB-Staging',
        [Parameter(Mandatory = $false, Position = 5)]
        [Alias("timestamp")]
        [bool]$includetimestamp = $true,
        [Parameter(Mandatory = $false, Position = 6)]
        [Alias("inct")]
        [bool]$includetables = $true,
        [Parameter(Mandatory = $false, Position = 7)]
        [Alias("incv")]
        [bool]$includeviews = $true,
        [Parameter(Mandatory = $false, Position = 8)]
        [Alias("incsp")]
        [bool]$includesp = $true,
        [Parameter(Mandatory = $false, Position = 9)]
        [Alias("incu")]
        [bool]$includeudf = $true
    )

    # Load the SMO assembly
    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null

    try {
        # Make a connection to the database
        $server = New-Object ('Microsoft.SqlServer.Management.Smo.Server') "devsql,1433"
        $depWalker = New-Object ('Microsoft.SqlServer.Management.Smo.DependencyWalker') $server
        $server.ConnectionContext.LoginSecure = $false
        $server.ConnectionContext.Login = $env:DB_Build_User
        $server.ConnectionContext.Password = $env:DB_Build_Password

        # Variables
        $DBScaffolding = "$env:SYSTEM_DEFAULTWORKINGDIRECTORY/DBScaffoldingArea"

        # Set the destination
        Write-Host `n "The path is: $path"
        $destination = "$path\"
        if ((Test-Path $destination) -eq $false) {
            # Create the directory
            New-Item -ItemType Directory -Path "$destination" | Out-Null
        }

        $databases = @()
        # Clean up the data
        $dblist = $dblist.Replace(' ', '')
        # Split the string
        $values = $dblist.Split(',')
        foreach ($value in $values) {
            $db = New-Object psobject
            $db | Add-Member -MemberType NoteProperty -Name "Name" -Value $value
            $databases += $db
        }

        # Check if there are any databases
        if ($databases.Count -ge 1) {
            # Loop through
            foreach ($database in $databases) {
                Write-Host "##[section] Starting Database Export: " $database.Name

                # Check if a timestamp is needed
                if ($includetimestamp) {
                    # Create a timestamp
                    $timestamp = Get-Date -Format yyyyMMddHHmmss
                    # Set the destination
                    $dbDestination = "$destination"
                }
                else {
                    # Set the destination
                    $dbDestination = "$destination"
                }

                # Create the variables for holding all the database objects
                $objects = $null
                $tables = $null

                # Check if the tables need to be included
                if ($includetables) {
                    Write-Host "##[section] Retrieving Tables"
                    # Get the tables
                    $objects += $server.Databases[$database.Name].Tables | Where-Object { !($_.IsSystemObject) }
                    $tables += $server.Databases[$database.Name].Tables | Where-Object { !($_.IsSystemObject) }
                }

                # Check if the views need to be included
                if ($includeviews) {
                    Write-Host "##[section] Retrieving Views"
                    # Get the views
                    $objects += $server.Databases[$database.Name].Views | Where-Object { !($_.IsSystemObject) }
                    $views += $server.Databases[$database.Name].Views | Where-Object { !($_.IsSystemObject) }
                }

                # Check if the stored procedures need to be included
                if ($includesp) {
                    Write-Host "##[section] Retrieving Stored Procedures"
                    # Get the stored procedures
                    $objects += $server.Databases[$database.Name].StoredProcedures | Where-Object { !($_.IsSystemObject) }
                }

                # Check if the user defined functions need to be included
                if ($includeudf) {
                    Write-Host "##[section] Retrieving User Defined Functions"
                    # Get the UDFs
                    $objects += $server.Databases[$database.Name].UserDefinedFunctions | Where-Object { !($_.IsSystemObject) }
                }

                # Build the dependency tree that is needed later on.
                # This checks the objects and orders them based on their dependencies.
                Write-Host "The objects before the dependency walk are: " $objects
                $depTree = $depWalker.DiscoverDependencies($objects, $True)
                $orderedUrns = $depWalker.WalkDependencies($depTree)

                Write-Host "##[section] Please wait whilst we check and script out any database objects that have been created and/or updated"
                Write-Host "##[section] This process can take up to 3 minutes to complete"

                # Check if there are any objects to export
                if ($objects.Length -ge 1) {
                    # Create the scripter object
                    $scripter = New-Object ("Microsoft.SqlServer.Management.Smo.Scripter") $server

                    # Set general options
                    $scripter.Options.AppendToFile = $false
                    $scripter.Options.ScriptSchema = $true
                    $scripter.Options.AllowSystemObjects = $false
                    $scripter.Options.Default = $true
                    $scripter.Options.DriChecks = $false
                    $scripter.Options.DriClustered = $true
                    $scripter.Options.DriDefaults = $true
                    $scripter.Options.DriPrimaryKey = $true
                    $scripter.Options.DriUniqueKeys = $true
                    $scripter.Options.DriWithNoCheck = $true
                    $scripter.Options.DriNonClustered = $true
                    $scripter.Options.ScriptDrops = $false
                    $scripter.Options.IncludeHeaders = $false
                    $scripter.Options.ToFileOnly = $true
                    $scripter.Options.Indexes = $true
                    $scripter.Options.Triggers = $true
                    $scripter.Options.ClusteredIndexes = $true
                    $scripter.Options.WithDependencies = $false

                    foreach ($table in $tables) {
                        foreach ($fk in $table.ForeignKeys) {
                            $typeDir = $fk.GetType().Name
                            # Script out the foreign key
                            ScriptoutsqlFile -objectType $fk -dbDestination $dbDestination -typeDir $typeDir
                        }
                        foreach ($check in $table.Checks) {
                            $typeDir = $check.GetType().Name
                            # Script out the check constraint
                            ScriptoutsqlFile -objectType $check -dbDestination $dbDestination -typeDir $typeDir
                        }
                    }

                    # Script out the database objects
                    foreach ($item in $objects) {
                        # Get the type of object
                        $typeDir = $item.GetType().Name
                        # Script out the object
                        ScriptoutsqlFile -objectType $item -dbDestination $dbDestination -typeDir $typeDir
                        # Compare the scripted file with the object in the local git repo
                        CompareSQLObjectWithSourceControl -objectType $item -dbDestination $dbDestination -typeDir $typeDir
                    }

                    # Output the database objects in dependency order so that we can re-create the staging DB.
                    # Each object name gets written line by line to a text file, which is needed when re-creating the DB.
                    # So that we always have an up-to-date dependency file, remove it and re-create it
                    # before we start writing to it.
                    if ((Test-Path "$DBScaffolding\Objects.txt") -eq $true) {
                        Remove-Item -Path "$DBScaffolding\Objects.txt" -Force
                    }

                    foreach ($urn in $orderedUrns) {
                        try {
                            $smoObject = $server.GetSmoObject($urn.Urn)
                            $typeDir = $smoObject.GetType().Name
                            Write-Host "##[section] Adding to dependency tree file: $smoObject"
                            $filename = $smoObject -replace "\[|\]"
                            Add-Content $DBScaffolding\Objects.txt "$filename.sql"
                        }
                        catch {
                            Write-Host "##[warning] A problem occurred whilst trying to find the dependency tree relationship for one of the database objects; this will not prevent the operation of this script or re-creating the staging DB"
                        }
                    }
                }
            }
        } # End if ($databases.Count -ge 1)
        else {
            Write-Host "##[error] No databases found."
        }
    }
    catch [Exception] {
        $errorMessage = $_.Exception.Message
        $line = $_.InvocationInfo.ScriptLineNumber
        $script_name = $_.InvocationInfo.ScriptName
        Write-Host "##[error] Error: Occurred on line $line in script $script_name."
        Write-Host "##[error] Error: $errorMessage"
    }
}

function ScriptoutsqlFile {
    param([object]$objectType, [string]$dbDestination, [string]$typeDir)

    if ((Test-Path "$dbDestination\$typeDir") -eq $false) {
        New-Item -ItemType Directory -Name "$typeDir" -Path "$dbDestination" | Out-Null
    }

    # Since we have already initially populated the DB-Schema repo, we only update it when a new DB object is present
    $filename = $objectType -replace "\[|\]"

    # Set up the output file for the item
    $scripter.Options.FileName = "$dbDestination\$typeDir\$filename.sql"

    # Script out the object
    Write-Host "##[command] Scripting out $typeDir $objectType"
    $scripter.Script($objectType)
}

function CompareSQLObjectWithSourceControl {
    param([object]$objectType, [string]$dbDestination, [string]$typeDir)

    $DBScaffolding = "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\DBScaffoldingArea"
    $filename = $objectType -replace "\[|\]"

    # So that we can compare the contents of the file (for example, a column
    # was added), we script the object into a temporary holding directory
    if ((Test-Path $DBScaffolding) -eq $true) {
        $scripter.Options.FileName = "$DBScaffolding\$filename.sql"
        # Script out the object
        $scripter.Script($objectType)
    }
    else {
        Write-Host "Create the scaffolding directory"
        $folder = $env:SYSTEM_DEFAULTWORKINGDIRECTORY
        [System.IO.Directory]::CreateDirectory("$folder\DBScaffoldingArea")
        $scripter.Options.FileName = "$DBScaffolding\$filename.sql"
        # Script out the object
        $scripter.Script($objectType)
    }

    $file1 = "$dbDestination$typeDir\$filename.sql"   # $dbDestination already ends with a backslash
    $file2 = "$DBScaffolding\$filename.sql"

    if (Compare-Object -ReferenceObject $(Get-Content $file1) -DifferenceObject $(Get-Content $file2)) {
        Write-Host "##[command] The following file: $file2 has been updated and is different from the local git repo file: $file1; it will be replaced."
        # If the files are different, copy the temp file into the local repo directory
        Copy-Item -Path $file2 -Destination $file1 -Force
    }

    # To keep the temp working directory clean, remove the file after use
    Remove-Item -Path "$DBScaffolding\$filename.sql" -Force
}

Export-DatabaseObject -inst 'devsql' -dblist "DB-Staging-$env:RELEASE_RELEASEID" -path "$env:SYSTEM_DEFAULTWORKINGDIRECTORY/StagingSchema"
```
This script looks complicated, but essentially it scripts out the database objects from the staging DB that was created, then writes the object names in the correct order to an Objects.txt file, so that when re-creating a database from source control the objects are created in dependency order. We then script out the foreign keys and check constraints separately so that we can apply them back later on. The reason for doing this is that I was getting conflict errors when re-creating the database, so separating them out worked for me.
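To see why the ordering matters, here is a minimal sketch of how Objects.txt would drive a rebuild: concatenate each object's script in the recorded order, then apply the separately scripted foreign keys and checks afterwards. The flat file layout here is a simplification of my own; the real repo groups files into type directories:

```shell
# Sketch: build one creation script from Objects.txt, which lists a .sql file
# per line in dependency order. Flat layout assumed for brevity.
build_rebuild_script() {
  local repo="$1" out="$2"
  : > "$out"                       # start with an empty output script
  while IFS= read -r obj; do
    cat "$repo/$obj" >> "$out"     # append each object script in dependency order
  done < "$repo/Objects.txt"
  # foreign keys and check constraints would be appended here, after all objects exist
}
```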
The next task is to copy this Objects.txt into our local git repo:
```powershell
Write-Host "We need to copy the Objects.txt file into the local git repo; this file is required to re-build database objects in the correct order"
$sourceLocation = "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\DBScaffoldingArea\Objects.txt"
$targetLocation = "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\StagingSchema\Objects.txt"
Copy-Item -Path $sourceLocation -Destination $targetLocation -Force
```
Next we script out the data in our migration-state tables and copy it to our git repo; the script for copying it is shown below:
```powershell
Write-Host "Copying migrations table data to the git repo"
$sourceLocation = "$env:RELEASE_PRIMARYARTIFACTSOURCEALIAS\DBSourceControl\dbo._migrations-data.sql"
$targetLocation = "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\StagingSchema\dbo._migrations-data.sql"
Copy-Item -Path $sourceLocation -Destination $targetLocation -Force
```
Finally, we publish our DB-Schema repo back to Git:
```powershell
$env:Path += ";C:\Program Files\Git\cmd"
$env:GIT_REDIRECT_STDERR = '2>&1'

# Read the first line from the CommitDetails.txt file so that we can use it as the
# commit message when we push the files to source control
if ((Test-Path "$env:RELEASE_PRIMARYARTIFACTSOURCEALIAS\DBSourceControl\CommitDetails.txt") -eq $true) {
    $gitCommitMessageFilename = "$env:RELEASE_PRIMARYARTIFACTSOURCEALIAS\DBSourceControl\CommitDetails.txt"
    $file = New-Object System.IO.StreamReader($gitCommitMessageFilename)
    $commitMessage = $file.ReadLine()
    $file.Close()

    # Now that we have the commit message we can change to the git folder location
    Set-Location -Path "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\StagingSchema" -PassThru
    git.exe config --global user.email $env:BUILD_REQUESTEDFOREMAIL
    git.exe config --global user.name $env:BUILD_REQUESTEDFOR

    if (git.exe status --porcelain | Where-Object { $_ -match '^\?\?' }) {
        # Untracked files exist
        Write-Host "##[section] Untracked files"
        Write-Host "##[section] Git Add"
        git.exe add .
        Write-Host "##[section] Committing changes"
        git.exe commit -m $commitMessage
        git.exe push origin
    }
    elseif (git.exe status --porcelain | Where-Object { $_ -notmatch '^\?\?' }) {
        # Uncommitted changes
        Write-Host "##[section] Uncommitted changes"
        Write-Host "##[section] Committing changes"
        git.exe add .
        git.exe commit -m $commitMessage
        git.exe push origin
    }
    else {
        # Tree is clean
        Write-Host "##[section] Tree is clean"
    }
}
else {
    Write-Host "Git commit details file not located, so no git operations will happen"
}
```
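As an aside, the two branches above can be collapsed: `git status --porcelain` prints nothing at all on a clean tree, so a single emptiness check covers both the untracked ("??") and modified cases. A simplified sketch of the same decision:

```shell
# Sketch: commit and push only when the working tree is dirty.
# Any porcelain output (untracked "??" or modified " M" lines) means there is work to do.
publish_if_dirty() {
  if [ -n "$(git status --porcelain)" ]; then
    git add .
    git commit -m "$1"
    git push origin
  else
    echo "Tree is clean"
  fi
}
```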
That’s all there is to it. If you need help moving your database systems over to Azure, then get in contact with me.