Often, as part of build and release pipelines, one needs to interact with the data lake in order to create items such as files and folders. The goal of this post is to examine how to do this with Azure DevOps.
One really cool feature of Azure DevOps is service connections. Service connections let you securely manage access to resources, such as the Azure management plane, and pass that access to your Azure DevOps tasks. This is particularly useful when using the Azure PowerShell task. In the not-too-distant past there were no Azure PowerShell cmdlets for Azure Data Lake Storage Gen2, so you had to make use of the REST API. This is no longer the case: the Az.Storage module now includes Gen2 cmdlets.
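As a quick sanity check that the Gen2 cmdlets are available, you can list them from the Az.Storage module (this is a sketch; it assumes the module is installed, and the exact set of cmdlets may vary by module version):

```powershell
# List the ADLS Gen2 cmdlets shipped in Az.Storage
Get-Command -Module Az.Storage -Name *DataLakeGen2*

# Typical results include the cmdlets used later in this post:
#   Get-AzDataLakeGen2Item
#   New-AzDataLakeGen2Item
#   Remove-AzDataLakeGen2Item
```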
To create a pipeline that interacts with ADLS Gen2, you add an Azure PowerShell task that references your service connection. Here is an example Azure DevOps task:
- task: AzurePowerShell@4
  inputs:
    azureSubscription: $(connectionName)
    scriptType: 'filePath'
    scriptPath: DeployFile.ps1
    scriptArguments: -filePath $(filePath) -destinationStorageAccountName $(storageAccountName) -destinationStorageAccountFolder $(storageAccountFolder) -destinationFileSystemName $(fileSystemName)
    azurePowerShellVersion: latestVersion
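For completeness, the pipeline variables referenced above ($(connectionName), $(filePath), and so on) need to be defined, either in the pipeline UI or in YAML. A minimal sketch, where the names match the task above but every value is a hypothetical placeholder:

```yaml
variables:
  connectionName: 'my-azure-service-connection'          # name of the Azure DevOps service connection (hypothetical)
  filePath: '$(Build.SourcesDirectory)/data/sample.csv'  # local file to upload (hypothetical)
  storageAccountName: 'mydatalakeaccount'                # ADLS Gen2 storage account (hypothetical)
  storageAccountFolder: 'raw/landing'                    # destination folder in the file system (hypothetical)
  fileSystemName: 'datalake'                             # container / file system name (hypothetical)
```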
And here is the corresponding PowerShell:
param(
    [Parameter(Mandatory=$true)]
    [string]$filePath,
    [Parameter(Mandatory=$true)]
    [string]$destinationStorageAccountName,
    [Parameter(Mandatory=$true)]
    [string]$destinationStorageAccountFolder,
    [Parameter(Mandatory=$true)]
    [string]$destinationFileSystemName
)
Write-Host ("##[command]Installing modules")
if (-not (Get-Module -ListAvailable -Name Az.Storage)){
    Install-Module Az.Storage -Repository PSGallery -Force -Scope CurrentUser
}

# Ensure the destination folder path ends with a trailing slash
if (-not ($destinationStorageAccountFolder.EndsWith("/"))){
    $destinationStorageAccountFolder = $destinationStorageAccountFolder + "/"
}

# Build a storage context using the identity of the service connection
$storageContext = New-AzStorageContext -StorageAccountName $destinationStorageAccountName -UseConnectedAccount

$file = Get-Item -Path $filePath
Write-Host ("##[command]Processing file {0}" -f $file.FullName)
$destinationFilePath = $destinationStorageAccountFolder + $file.Name
Write-Host ("##[command]Destination path {0}" -f $destinationFilePath)
$dataLakeFile = Get-AzDataLakeGen2Item -FileSystem $destinationFileSystemName `
    -Path $destinationFilePath `
    -Context $storageContext `
    -ErrorAction SilentlyContinue
if ($dataLakeFile){
    Remove-AzDataLakeGen2Item -FileSystem $destinationFileSystemName `
        -Path $destinationFilePath `
        -Context $storageContext `
        -Force
}
New-AzDataLakeGen2Item -Context $storageContext `
    -Path $destinationFilePath `
    -FileSystem $destinationFileSystemName `
    -Source $file.FullName `
    -Force
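If you want to try the script outside the pipeline, you can run it from a local PowerShell session after signing in. A sketch, where every parameter value is hypothetical:

```powershell
# Sign in interactively; in the pipeline, the Azure PowerShell task
# authenticates automatically via the service connection
Connect-AzAccount

# Invoke the deployment script with hypothetical placeholder values
./DeployFile.ps1 -filePath ./data/sample.csv `
    -destinationStorageAccountName mydatalakeaccount `
    -destinationStorageAccountFolder raw/landing `
    -destinationFileSystemName datalake
```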
You’ll notice in the code above that I remove the file if it already exists before uploading the new version. At the time of writing, overwriting a file doesn’t work properly with these cmdlets and fails with an “Overwrite permission is not allowed” error.
With the approach above, you can easily add ADLS Gen2 environment setup steps to your build and release pipelines. You can find all the cmdlet references for Gen2 here.
 
Shamir Charania, a seasoned cloud expert, possesses in-depth expertise in Amazon Web Services (AWS) and Microsoft Azure, complemented by his six-year tenure as a Microsoft MVP in Azure. At Keep Secure, Shamir provides strategic cloud guidance, pairing senior architecture-level decision-making with the technical chops to back it all up. With a strong emphasis on cybersecurity, he develops robust global cloud strategies prioritizing data protection and resilience. Leveraging complexity theory, Shamir delivers innovative and elegant solutions to address complex requirements while driving business growth, positioning himself as a driving force in cloud transformation for organizations in the digital age.