Azure Data Factory (3): Integrating Azure DevOps to Implement CI/CD

I. Introduction

  Because the previous article was already quite long, it did not get to continuous integration and continuous delivery for Azure Data Factory. Today's post focuses on using an Azure DevOps pipeline to automate the release process and deploy Azure Data Factory to multiple environments.

  There is nothing mysterious about this deployment. After we publish from the master branch in ADF, all of the factory's configuration is packaged and generated into the adf_publish branch. If you look closely at the published code, you will quickly notice that it is nothing more than a set of ARM template resources. Take the demo I created as an example.


At this point some readers may ask: what is an ARM template? Here is a brief overview:

  For deploying infrastructure resources on Azure, you can use infrastructure as code to automate deployments: the infrastructure to be deployed is defined in code, and that code becomes part of the project, managed with the same tooling as application code, which makes it easy to create, update, and delete resources in a repeatable way. An ARM template is one form of infrastructure as code (Terraform is another). The template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. It uses declarative syntax, meaning you state what you intend to deploy without writing the sequence of programming commands to create it; in the template you specify the resources to deploy and the properties of those resources.

In plain language, an ARM template describes the cloud resources we want to deploy and the parameters those resources need. For example, if we use an ARM template to deploy a VM, the template describes the VM resource together with the parameters required to create it.
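To make that concrete, here is a minimal sketch of infrastructure as code with an ARM template: a template declaring a single storage account, held in a PowerShell here-string and deployed with New-AzResourceGroupDeployment. The resource group "Demo-RG" and account name "demostorage001" are hypothetical examples, not resources from this series.

# A minimal ARM template (declarative JSON) kept in a here-string for illustration.
$template = @'
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2021-04-01",
      "name": "[parameters('storageAccountName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "StorageV2"
    }
  ]
}
'@
Set-Content -Path .\storage.json -Value $template

# The template states WHAT to create; the deployment works out HOW.
# "Demo-RG" and "demostorage001" are placeholder names.
New-AzResourceGroupDeployment -ResourceGroupName "Demo-RG" `
    -TemplateFile .\storage.json `
    -storageAccountName "demostorage001"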

Coming back to ADF: the code that ends up in the adf_publish branch after publishing from master is simply a set of ARM templates describing the ADF resources, their configuration, and their parameters. Today's goal is to use those ARM templates to deploy ADF to the UAT and PROD environments.

--------------------------------------------------

1. Azure Data Factory (1): Getting Started

2. Azure Data Factory (2): Copying Data

3. Azure Data Factory (3): Integrating Azure DevOps to Implement CI/CD

II. Main Content

1. Create the UAT environment for ADF

We simulate the test environment by suffixing resource names with UAT: a new Azure Data Factory named "ADF-CnBateBlogWeb-UAT" and two UAT Blob Storage accounts are created.


"cnbateblogwebaccount1uat" serves as the data source for the UAT environment; for it we also create a container named "resformfolder".


"cnbateblogwebaccount2uat" serves as the destination for the UAT environment; for it we also create a container named "restofolder".
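For readers who prefer scripting over the portal, the UAT resources above could also be provisioned with Azure PowerShell along these lines. This is a rough sketch: the names come from this article and the location mirrors the UAT resource group used later, so adjust everything to your own environment.

# Sketch: provision the UAT resource group, data factory, storage accounts and containers.
$rg = "Web_Test_DF_RG_UAT"
New-AzResourceGroup -Name $rg -Location "East Asia" -Force

# The UAT data factory
Set-AzDataFactoryV2 -ResourceGroupName $rg -Name "ADF-CnBateBlogWeb-UAT" -Location "East Asia"

# Source / destination storage accounts for UAT, plus their containers
$src = New-AzStorageAccount -ResourceGroupName $rg -Name "cnbateblogwebaccount1uat" -Location "East Asia" -SkuName Standard_LRS -Kind StorageV2
$dst = New-AzStorageAccount -ResourceGroupName $rg -Name "cnbateblogwebaccount2uat" -Location "East Asia" -SkuName Standard_LRS -Kind StorageV2
New-AzStorageContainer -Name "resformfolder" -Context $src.Context
New-AzStorageContainer -Name "restofolder" -Context $dst.Context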


2. Prepare the pre- and post-deployment script

  In practice, when deploying ADF (or any other project), we need to stop the ADF triggers before deployment and restart them once the deployment finishes. Microsoft provides a script that can be used before and after deployment for exactly this purpose:

param
(
    [parameter(Mandatory = $false)] [String] $armTemplate,
    [parameter(Mandatory = $false)] [String] $ResourceGroupName,
    [parameter(Mandatory = $false)] [String] $DataFactoryName,
    [parameter(Mandatory = $false)] [Bool] $predeployment=$true,
    [parameter(Mandatory = $false)] [Bool] $deleteDeployment=$false
)

function getPipelineDependencies {
    param([System.Object] $activity)
    if ($activity.Pipeline) {
        return @($activity.Pipeline.ReferenceName)
    } elseif ($activity.Activities) {
        $result = @()
        $activity.Activities | ForEach-Object{ $result += getPipelineDependencies -activity $_ }
        return $result
    } elseif ($activity.ifFalseActivities -or $activity.ifTrueActivities) {
        $result = @()
        $activity.ifFalseActivities | Where-Object {$_ -ne $null} | ForEach-Object{ $result += getPipelineDependencies -activity $_ }
        $activity.ifTrueActivities | Where-Object {$_ -ne $null} | ForEach-Object{ $result += getPipelineDependencies -activity $_ }
        return $result
    } elseif ($activity.defaultActivities) {
        $result = @()
        $activity.defaultActivities | ForEach-Object{ $result += getPipelineDependencies -activity $_ }
        if ($activity.cases) {
            $activity.cases | ForEach-Object{ $_.activities } | ForEach-Object{ $result += getPipelineDependencies -activity $_ }
        }
        return $result
    } else {
        return @()
    }
}

function pipelineSortUtil {
    param([Microsoft.Azure.Commands.DataFactoryV2.Models.PSPipeline]$pipeline,
    [Hashtable] $pipelineNameResourceDict,
    [Hashtable] $visited,
    [System.Collections.Stack] $sortedList)
    if ($visited[$pipeline.Name] -eq $true) {
        return;
    }
    $visited[$pipeline.Name] = $true;
    $pipeline.Activities | ForEach-Object{ getPipelineDependencies -activity $_ -pipelineNameResourceDict $pipelineNameResourceDict} | ForEach-Object{
        pipelineSortUtil -pipeline $pipelineNameResourceDict[$_] -pipelineNameResourceDict $pipelineNameResourceDict -visited $visited -sortedList $sortedList
    }
    $sortedList.Push($pipeline)

}

function Get-SortedPipelines {
    param(
        [string] $DataFactoryName,
        [string] $ResourceGroupName
    )
    $pipelines = Get-AzDataFactoryV2Pipeline -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $ppDict = @{}
    $visited = @{}
    $stack = new-object System.Collections.Stack
    $pipelines | ForEach-Object{ $ppDict[$_.Name] = $_ }
    $pipelines | ForEach-Object{ pipelineSortUtil -pipeline $_ -pipelineNameResourceDict $ppDict -visited $visited -sortedList $stack }
    $sortedList = new-object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSPipeline]

    while ($stack.Count -gt 0) {
        $sortedList.Add($stack.Pop())
    }
    $sortedList
}

function triggerSortUtil {
    param([Microsoft.Azure.Commands.DataFactoryV2.Models.PSTrigger]$trigger,
    [Hashtable] $triggerNameResourceDict,
    [Hashtable] $visited,
    [System.Collections.Stack] $sortedList)
    if ($visited[$trigger.Name] -eq $true) {
        return;
    }
    $visited[$trigger.Name] = $true;
    if ($trigger.Properties.DependsOn) {
        $trigger.Properties.DependsOn | Where-Object {$_ -and $_.ReferenceTrigger} | ForEach-Object{
            triggerSortUtil -trigger $triggerNameResourceDict[$_.ReferenceTrigger.ReferenceName] -triggerNameResourceDict $triggerNameResourceDict -visited $visited -sortedList $sortedList
        }
    }
    $sortedList.Push($trigger)
}

function Get-SortedTriggers {
    param(
        [string] $DataFactoryName,
        [string] $ResourceGroupName
    )
    $triggers = Get-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName
    $triggerDict = @{}
    $visited = @{}
    $stack = new-object System.Collections.Stack
    $triggers | ForEach-Object{ $triggerDict[$_.Name] = $_ }
    $triggers | ForEach-Object{ triggerSortUtil -trigger $_ -triggerNameResourceDict $triggerDict -visited $visited -sortedList $stack }
    $sortedList = new-object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSTrigger]

    while ($stack.Count -gt 0) {
        $sortedList.Add($stack.Pop())
    }
    $sortedList
}

function Get-SortedLinkedServices {
    param(
        [string] $DataFactoryName,
        [string] $ResourceGroupName
    )
    $linkedServices = Get-AzDataFactoryV2LinkedService -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName
    $LinkedServiceHasDependencies = @('HDInsightLinkedService', 'HDInsightOnDemandLinkedService', 'AzureBatchLinkedService')
    $Akv = 'AzureKeyVaultLinkedService'
    $HighOrderList = New-Object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSLinkedService]
    $RegularList = New-Object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSLinkedService]
    $AkvList = New-Object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSLinkedService]

    $linkedServices | ForEach-Object {
        if ($_.Properties.GetType().Name -in $LinkedServiceHasDependencies) {
            $HighOrderList.Add($_)
        }
        elseif ($_.Properties.GetType().Name -eq $Akv) {
            $AkvList.Add($_)
        }
        else {
            $RegularList.Add($_)
        }
    }

    $SortedList = New-Object Collections.Generic.List[Microsoft.Azure.Commands.DataFactoryV2.Models.PSLinkedService]($HighOrderList.Count + $RegularList.Count + $AkvList.Count)
    $SortedList.AddRange($HighOrderList)
    $SortedList.AddRange($RegularList)
    $SortedList.AddRange($AkvList)
    $SortedList
}

$templateJson = Get-Content $armTemplate | ConvertFrom-Json
$resources = $templateJson.resources

#Triggers
Write-Host "Getting triggers"
$triggersInTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/triggers" }
$triggerNamesInTemplate = $triggersInTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40)}

$triggersDeployed = Get-SortedTriggers -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName

$triggersToStop = $triggersDeployed | Where-Object { $triggerNamesInTemplate -contains $_.Name } | ForEach-Object {
    New-Object PSObject -Property @{
        Name = $_.Name
        TriggerType = $_.Properties.GetType().Name
    }
}
$triggersToDelete = $triggersDeployed | Where-Object { $triggerNamesInTemplate -notcontains $_.Name } | ForEach-Object {
    New-Object PSObject -Property @{
        Name = $_.Name
        TriggerType = $_.Properties.GetType().Name
    }
}
$triggersToStart = $triggersInTemplate | Where-Object { $_.properties.runtimeState -eq "Started" -and ($_.properties.pipelines.Count -gt 0 -or $_.properties.pipeline.pipelineReference -ne $null)} | ForEach-Object {
    New-Object PSObject -Property @{
        Name = $_.name.Substring(37, $_.name.Length-40)
        TriggerType = $_.Properties.type
    }
}

if ($predeployment -eq $true) {
    #Stop all triggers
    Write-Host "Stopping deployed triggers`n"
    $triggersToStop | ForEach-Object {
        if ($_.TriggerType -eq "BlobEventsTrigger") {
            Write-Host "Unsubscribing" $_.Name "from events"
            $status = Remove-AzDataFactoryV2TriggerSubscription -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
            while ($status.Status -ne "Disabled"){
                Start-Sleep -s 15
                $status = Get-AzDataFactoryV2TriggerSubscriptionStatus -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
            }
        }
        Write-Host "Stopping trigger" $_.Name
        Stop-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name -Force
    }
}
else {
    #Deleted resources
    #pipelines
    Write-Host "Getting pipelines"
    $pipelinesADF = Get-SortedPipelines -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $pipelinesTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/pipelines" }
    $pipelinesNames = $pipelinesTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40)}
    $deletedpipelines = $pipelinesADF | Where-Object { $pipelinesNames -notcontains $_.Name }
    #dataflows
    $dataflowsADF = Get-AzDataFactoryV2DataFlow -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $dataflowsTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/dataflows" }
    $dataflowsNames = $dataflowsTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40) }
    $deleteddataflow = $dataflowsADF | Where-Object { $dataflowsNames -notcontains $_.Name }
    #datasets
    Write-Host "Getting datasets"
    $datasetsADF = Get-AzDataFactoryV2Dataset -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $datasetsTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/datasets" }
    $datasetsNames = $datasetsTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40) }
    $deleteddataset = $datasetsADF | Where-Object { $datasetsNames -notcontains $_.Name }
    #linkedservices
    Write-Host "Getting linked services"
    $linkedservicesADF = Get-SortedLinkedServices -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $linkedservicesTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/linkedservices" }
    $linkedservicesNames = $linkedservicesTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40)}
    $deletedlinkedservices = $linkedservicesADF | Where-Object { $linkedservicesNames -notcontains $_.Name }
    #Integrationruntimes
    Write-Host "Getting integration runtimes"
    $integrationruntimesADF = Get-AzDataFactoryV2IntegrationRuntime -DataFactoryName $DataFactoryName -ResourceGroupName $ResourceGroupName
    $integrationruntimesTemplate = $resources | Where-Object { $_.type -eq "Microsoft.DataFactory/factories/integrationruntimes" }
    $integrationruntimesNames = $integrationruntimesTemplate | ForEach-Object {$_.name.Substring(37, $_.name.Length-40)}
    $deletedintegrationruntimes = $integrationruntimesADF | Where-Object { $integrationruntimesNames -notcontains $_.Name }

    #Delete resources
    Write-Host "Deleting triggers"
    $triggersToDelete | ForEach-Object {
        Write-Host "Deleting trigger " $_.Name
        $trig = Get-AzDataFactoryV2Trigger -name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName
        if ($trig.RuntimeState -eq "Started") {
            if ($_.TriggerType -eq "BlobEventsTrigger") {
                Write-Host "Unsubscribing trigger" $_.Name "from events"
                $status = Remove-AzDataFactoryV2TriggerSubscription -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
                while ($status.Status -ne "Disabled"){
                    Start-Sleep -s 15
                    $status = Get-AzDataFactoryV2TriggerSubscriptionStatus -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
                }
            }
            Stop-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name -Force
        }
        Remove-AzDataFactoryV2Trigger -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }
    Write-Host "Deleting pipelines"
    $deletedpipelines | ForEach-Object {
        Write-Host "Deleting pipeline " $_.Name
        Remove-AzDataFactoryV2Pipeline -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }
    Write-Host "Deleting dataflows"
    $deleteddataflow | ForEach-Object {
        Write-Host "Deleting dataflow " $_.Name
        Remove-AzDataFactoryV2DataFlow -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }
    Write-Host "Deleting datasets"
    $deleteddataset | ForEach-Object {
        Write-Host "Deleting dataset " $_.Name
        Remove-AzDataFactoryV2Dataset -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }
    Write-Host "Deleting linked services"
    $deletedlinkedservices | ForEach-Object {
        Write-Host "Deleting Linked Service " $_.Name
        Remove-AzDataFactoryV2LinkedService -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }
    Write-Host "Deleting integration runtimes"
    $deletedintegrationruntimes | ForEach-Object {
        Write-Host "Deleting integration runtime " $_.Name
        Remove-AzDataFactoryV2IntegrationRuntime -Name $_.Name -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Force
    }

    if ($deleteDeployment -eq $true) {
        Write-Host "Deleting ARM deployment ... under resource group: " $ResourceGroupName
        $deployments = Get-AzResourceGroupDeployment -ResourceGroupName $ResourceGroupName
        $deploymentsToConsider = $deployments | Where { $_.DeploymentName -like "ArmTemplate_master*" -or $_.DeploymentName -like "ArmTemplateForFactory*" } | Sort-Object -Property Timestamp -Descending
        $deploymentName = $deploymentsToConsider[0].DeploymentName

        Write-Host "Deployment to be deleted: " $deploymentName
        $deploymentOperations = Get-AzResourceGroupDeploymentOperation -DeploymentName $deploymentName -ResourceGroupName $ResourceGroupName
        $deploymentsToDelete = $deploymentOperations | Where { $_.properties.targetResource.id -like "*Microsoft.Resources/deployments*" }

        $deploymentsToDelete | ForEach-Object {
            Write-host "Deleting inner deployment: " $_.properties.targetResource.id
            Remove-AzResourceGroupDeployment -Id $_.properties.targetResource.id
        }
        Write-Host "Deleting deployment: " $deploymentName
        Remove-AzResourceGroupDeployment -ResourceGroupName $ResourceGroupName -Name $deploymentName
    }

    #Start active triggers - after cleanup efforts
    Write-Host "Starting active triggers"
    $triggersToStart | ForEach-Object {
        if ($_.TriggerType -eq "BlobEventsTrigger") {
            Write-Host "Subscribing" $_.Name "to events"
            $status = Add-AzDataFactoryV2TriggerSubscription -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
            while ($status.Status -ne "Enabled"){
                Start-Sleep -s 15
                $status = Get-AzDataFactoryV2TriggerSubscriptionStatus -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name
            }
        }
        Write-Host "Starting trigger" $_.Name
        Start-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName -Name $_.Name -Force
    }
}

adf_ci_cd.sh

We add this PowerShell script to the repository as a file named "adf_ci_cd.sh".


Paste the contents of the script above into the file, click "Commit", and save it to the "master" branch.


3. Configure the Azure DevOps pipeline

Back in Azure DevOps, go to "Pipelines => Releases" and click "New pipeline".


We are then asked to choose a template; click "Empty job" to start from an empty job.

Rename the stage to "UAT".


Next, add an artifact. First add the adf_publish branch as a release artifact source, with the following settings:

Source type: "Azure Repos Git"

Project: "CnBateBlogWeb_Proj"

Source (repository): "CnBateBlogWeb_Proj"

Default branch: "adf_publish"

Source alias: "_CnBateBlogWeb_Proj_Publish"

Click "Add" to add it.


Then add another artifact, this time using the ADF "master" branch as the source; note the parameters:

Default branch: "master"

Source alias: "_CnBateBlogWeb_Proj_Master"

Click "OK" to add it.


Finally, let's add tasks to the "UAT" stage by clicking the job/task link on the stage.


Click "+" to add an "Azure PowerShell" task; this task runs the PowerShell script that stops all currently running triggers.


For this stop-triggers task, a few properties deserve attention:

1. Display name: change it to "Stop Triggers" (so the task's purpose is clear at a glance)

2. Azure Subscription: select the subscription that hosts the UAT environment

3. Script Path

4. Script Arguments


A note on the script file mentioned above: it is the adf_ci_cd.sh file we just committed to the "master" branch, which we can browse to and select here.


Before deployment, all triggers must be stopped. The script arguments for that (stop triggers) are:

-armTemplate "$(System.DefaultWorkingDirectory)/<your-arm-template-location>" -ResourceGroupName <your-resource-group-name> -DataFactoryName <your-data-factory-name> -predeployment $true -deleteDeployment $false

After the deployment completes, all triggers must be restarted. The script arguments for that (start triggers) are:

-armTemplate "$(System.DefaultWorkingDirectory)/<your-arm-template-location>" -ResourceGroupName <your-resource-group-name> -DataFactoryName <your-data-factory-name> -predeployment $false -deleteDeployment $true

Replace the placeholders in the arguments above with your own UAT resource group, data factory name, and the path to your ARM template.

Here are the argument values I am using, for reference:

-armTemplate "$(System.DefaultWorkingDirectory)/_CnBateBlogWeb_Proj_Publish/ADF-CnBateBlogWeb-Dev/ARMTemplateForFactory.json" -ResourceGroupName "Web_Test_DF_RG_UAT" -DataFactoryName "ADF-CnBateBlogWeb-AUT" -predeployment $true -deleteDeployment $false
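If you want to sanity-check the script before wiring it into the release, it can also be run locally. The sketch below assumes you keep a local copy saved as adf_ci_cd.ps1 (so PowerShell will execute it), have the Az.Accounts and Az.DataFactory modules installed, and have exported the ARM template next to it; the subscription placeholder, paths, and names are only examples.

# Hypothetical local run of the pre-deployment ("stop triggers") step.
Connect-AzAccount
Set-AzContext -Subscription "<your-uat-subscription>"
.\adf_ci_cd.ps1 -armTemplate ".\ARMTemplateForFactory.json" `
                -ResourceGroupName "Web_Test_DF_RG_UAT" `
                -DataFactoryName "ADF-CnBateBlogWeb-AUT" `
                -predeployment $true `
                -deleteDeployment $false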

Next comes the ARM template deployment. Search for "ARM template deployment", select the task, and click "Add".


Then adjust the task's properties:

Display name: "ARM Template deployment: ADF Deploy"

Azure subscription: the subscription that hosts the UAT environment

Resource group: "Web_Test_DF_RG_UAT" (the resource group that contains the UAT ADF)

Location: "East Asia"


For Template and Template parameters, pick "ARMTemplateForFactory.json" and "ARMTemplateParametersForFactory.json" respectively from the "_CnBateBlogWeb_Proj_Publish" artifact.

Override template parameters is where we override some of the values in the parameters file, such as the name of the ADF instance to deploy and the connection strings of the source and destination Blob Storage accounts.


Below is the override string I use in this demo (replace the UAT ADF name with your own, look up the two connection-string parameter names in your own parameters file, and paste in the connection strings of your two UAT Blob Storage accounts):

-factoryName "ADF-CnBateBlogWeb-AUT" -CnBateBlogFromBlobStorage_connectionString "DefaultEndpointsProtocol=https;AccountName=cnbateblogwebaccount1uat;AccountKey=XF7XkFsnEdIGZT8CCUJs7At1E/Ni9k/XJBtwWaKvwAuBnp3cCN6e2CYiV2uzsq2iI0vI2eZVHS8Kh9CvcuoNcg==;EndpointSuffix=core.windows.net" -CnBateBlobToBlobStorage_connectionString "DefaultEndpointsProtocol=https;AccountName=cnbateblogwebaccount2uat;AccountKey=E5z+q7XL7+8xTlqHsyaIGr0Eje/0DhT9+/E+oro4D57tsSuEchmnjLiK8zftTtyvQKLQUvTGJEsAOFCBGdioHw==;EndpointSuffix=core.windows.net"
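For intuition, the sketch below shows roughly what the ARM template deployment task ends up doing, expressed in Azure PowerShell. It is a simplification, not the task itself: the parameters file is omitted and only the three values overridden above are passed explicitly; the local path and connection-string placeholders are examples, and the connection strings are converted because the generated template declares them as secureString parameters.

# Rough Azure PowerShell equivalent of the "ARM template deployment" task (simplified sketch).
$srcConn = ConvertTo-SecureString "<uat-source-storage-connection-string>" -AsPlainText -Force
$dstConn = ConvertTo-SecureString "<uat-destination-storage-connection-string>" -AsPlainText -Force

New-AzResourceGroupDeployment `
    -ResourceGroupName "Web_Test_DF_RG_UAT" `
    -TemplateFile ".\ADF-CnBateBlogWeb-Dev\ARMTemplateForFactory.json" `
    -factoryName "ADF-CnBateBlogWeb-AUT" `
    -CnBateBlogFromBlobStorage_connectionString $srcConn `
    -CnBateBlobToBlobStorage_connectionString $dstConn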

Finally, add the Azure PowerShell task that restarts all the triggers; its settings are as follows:

Script Path: the "adf_ci_cd.sh" file under the "_CnBateBlogWeb_Proj_Master" artifact

Script Arguments (the arguments that clean up stale resources and restart all triggers):

-armTemplate "$(System.DefaultWorkingDirectory)/<your-arm-template-location>" -ResourceGroupName <your-resource-group-name> -DataFactoryName <your-data-factory-name> -predeployment $false -deleteDeployment $true

Replace the placeholders above with your own UAT resource group, data factory name, and ARM template path.

Here are the values I am using, for reference:

-armTemplate "$(System.DefaultWorkingDirectory)/_CnBateBlogWeb_Proj_Publish/ADF-CnBateBlogWeb-Dev/ARMTemplateForFactory.json" -ResourceGroupName "Web_Test_DF_RG_UAT" -DataFactoryName "ADF-CnBateBlogWeb-AUT" -predeployment $false -deleteDeployment $true


Rename the pipeline and click "Save".


Next, configure the pipeline's trigger.

Enable the continuous deployment trigger so that the pipeline runs every time a Git push happens in the selected repository, then add a branch filter:

Type: Include, Branch: "adf_publish" — in other words, the pipeline is triggered whenever a Git push lands on "adf_publish".

Once configured, click "Save".


We can manually create a release to test the pipeline; it runs and completes successfully.


Back in the Azure UAT environment, we can see that the pipelines, datasets, and other resources have been deployed and configured in the UAT data factory.
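If you prefer to verify from the command line instead of the portal, a quick check with Azure PowerShell could look like this (resource group and factory names as used in the release configuration above):

# List what the release deployed into the UAT factory.
$rg  = "Web_Test_DF_RG_UAT"
$adf = "ADF-CnBateBlogWeb-AUT"
Get-AzDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $adf | Select-Object Name
Get-AzDataFactoryV2Dataset  -ResourceGroupName $rg -DataFactoryName $adf | Select-Object Name
Get-AzDataFactoryV2Trigger  -ResourceGroupName $rg -DataFactoryName $adf | Select-Object Name, RuntimeState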


4. Test the automated ADF deployment

Back in the Dev ADF environment, we create a new feature branch and try changing the name of the copy activity in the pipeline.


We rename the copy activity from "Copy data1" to "Copy data2", save, and validate.


Once validation passes, we can merge the changes on the "allen/Feature_1" branch into "master"

by submitting a pull request ("Create pull request").


I will not walk through the remaining steps here; if anything is unclear, please refer to the previous article. Once the merge completes, we go back to the Dev ADF, switch to the "master" branch, and publish. When the publish finishes, the latest changes are pushed as updated ARM templates and parameters to the "adf_publish" branch, which in turn triggers the Azure DevOps pipeline we set up earlier and automatically deploys those changes to the UAT environment.

III. Conclusion

  Today's post focused on integrating Azure Data Factory with Azure Pipelines for automated deployment. It also leaves one open problem: when deploying the ARM template, we pasted the Blob Storage connection strings directly into the override template parameters field. If the factory has many data sources, that approach does not scale well. How to solve this is the topic of the next article. *★,°*:.☆( ̄▽ ̄)/$:*.°★* 。

Author: Allen

Copyright: when reposting, please credit the author and source prominently. Corrections and feedback are welcome.
