
[BUG]: AzureFileCopyV5: Incorrectly returns "Storage account not found" error for recently created Storage account. #19566


Closed
4 of 7 tasks
mm2709 opened this issue Feb 21, 2024 · 29 comments


mm2709 commented Feb 21, 2024

New issue checklist

Task name

AzureFileCopy

Task version

V5

Issue Description

In my pipeline, I'm creating a new storage account. Then, in the immediately following step, I'm trying to copy files to this newly created storage account. However, I'm receiving the error "Storage account: not found. The selected service connection 'Service Principal' supports storage accounts of Azure Resource Manager type only."

[screenshot]

It seems the highlighted call to the Get-AzResource cmdlet returns nothing/null, which eventually causes the exception to be thrown.

[screenshot]

I confirmed the above behavior by running the same cmdlets from my dev machine. The Get-AzResource cmdlet returns nothing/null, but the Get-AzStorageAccount cmdlet returns the storage account details and confirms its existence.
[screenshot]
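The discrepancy can be reproduced with a short PowerShell snippet like the one below (a sketch: the account name is a placeholder, and the resource-type string mirrors the task's `Microsoft.Storage/storageAccounts` value):

```powershell
$storageAccountName = 'mynewstorageacct'  # placeholder for the freshly created account

# The task's lookup path: enumerate all ARM resources and filter by type and name.
# On a recently created account this returned nothing/null.
$byResource = Get-AzResource -ErrorAction Stop |
    Where-Object { ($_.ResourceType -eq 'Microsoft.Storage/storageAccounts') -and
                   ($_.Name -eq $storageAccountName) }

# Direct lookup via the storage cmdlet: this returned the account details.
$byStorage = Get-AzStorageAccount -ErrorAction Stop |
    Where-Object { $_.StorageAccountName -eq $storageAccountName }

"Get-AzResource found it: $($null -ne $byResource)"
"Get-AzStorageAccount found it: $($null -ne $byStorage)"
```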

Environment type (Please select at least one environment where you face this issue)

  • Self-Hosted
  • Microsoft Hosted
  • VMSS Pool
  • Container

Azure DevOps Server type

dev.azure.com (formerly visualstudio.com)

Azure DevOps Server Version (if applicable)

No response

Operating system

Microsoft Windows Server 2022

Relevant log output

2024-02-21T21:23:06.8331198Z ##[debug][Azure Call]Retrieved resource details successfully for azure storage account resource: ugbmriexfyge4 with resource type: Microsoft.Storage/storageAccounts
2024-02-21T21:23:06.8417318Z ##[debug](ARM)Storage account: ugbmriexfyge4 not found
2024-02-21T21:23:06.8799015Z ##[debug]Processed: ##vso[task.logissue type=error;code={"Task_Internal_Error":"RMStorageAccountNotFound"};]
2024-02-21T21:23:06.9549946Z ##[debug]System.Management.Automation.RuntimeException: Storage account: ugbmriexfyge4 not found. The selected service connection 'Service Principal' supports storage accounts of Azure Resource Manager type only.
2024-02-21T21:23:06.9571527Z ##[debug]Processed: ##vso[task.logissue type=error;code={"Task_Internal_Error":"TemporaryCopyingToBlobContainerFailed"};]
2024-02-21T21:23:06.9617576Z ##[debug]Trying to disconnect from Azure and clear context at process scope
2024-02-21T21:23:06.9703679Z ##[debug]Cannot verify the Microsoft .NET Framework version 4.7.2 because it is not included in the list of permitted versions.
2024-02-21T21:23:06.9732368Z ##[debug]Populating RepositorySourceLocation property for module Az.Accounts.
2024-02-21T21:23:06.9760483Z ##[debug]Loading module from path 'C:\Modules\az_9.3.0\Az.Accounts\2.15.1\Az.Accounts.psm1'.
2024-02-21T21:23:07.0482685Z ##[command]Disconnect-AzAccount -Scope Process -ErrorAction Stop
2024-02-21T21:23:07.0858552Z ##[command]Clear-AzContext -Scope Process -ErrorAction Stop
2024-02-21T21:23:07.1376557Z ##[debug]Caught exception from task script.
2024-02-21T21:23:07.1418622Z ##[debug]Error record:
2024-02-21T21:23:07.2000401Z ##[debug]Storage account: ugbmriexfyge4 not found. The selected service connection 'Service Principal' supports storage accounts of Azure Resource Manager type only.
2024-02-21T21:23:07.2015479Z ##[debug]At D:\a\_tasks\AzureFileCopy_eb72cb01-a7e5-427b-a8a1-1b31ccac8a43\5.234.0\AzureUtilityAz1.0.ps1:22 char:13
2024-02-21T21:23:07.2031483Z ##[debug]+             Throw (Get-VstsLocString -Key "AFC_StorageAccountNotFound ...
2024-02-21T21:23:07.2047633Z ##[debug]+             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2024-02-21T21:23:07.2062023Z ##[debug]    + CategoryInfo          : OperationStopped: (Storage account...ager type only.:String) [], RuntimeException
2024-02-21T21:23:07.2075945Z ##[debug]    + FullyQualifiedErrorId : Storage account: ugbmriexfyge4 not found. The selected service connection 'Service Principal' supports storage accounts of Azure Resource Manager type only.
2024-02-21T21:23:07.2089757Z ##[debug] 
2024-02-21T21:23:07.2110696Z ##[debug]Script stack trace:
2024-02-21T21:23:07.2150702Z ##[debug]at Get-AzureStorageAccountResourceGroupName, D:\a\_tasks\AzureFileCopy_eb72cb01-a7e5-427b-a8a1-1b31ccac8a43\5.234.0\AzureUtilityAz1.0.ps1: line 22
2024-02-21T21:23:07.2174345Z ##[debug]at Get-AzureStorageKeyFromARM, D:\a\_tasks\AzureFileCopy_eb72cb01-a7e5-427b-a8a1-1b31ccac8a43\5.234.0\AzureUtilityRest.ps1: line 13
2024-02-21T21:23:07.2188773Z ##[debug]at Get-StorageKey, D:\a\_tasks\AzureFileCopy_eb72cb01-a7e5-427b-a8a1-1b31ccac8a43\5.234.0\Utility.ps1: line 72
2024-02-21T21:23:07.2203930Z ##[debug]at <ScriptBlock>, D:\a\_tasks\AzureFileCopy_eb72cb01-a7e5-427b-a8a1-1b31ccac8a43\5.234.0\AzureFileCopy.ps1: line 125
2024-02-21T21:23:07.2218571Z ##[debug]at <ScriptBlock>, <No file>: line 1
2024-02-21T21:23:07.2232900Z ##[debug]at <ScriptBlock>, <No file>: line 22
2024-02-21T21:23:07.2246853Z ##[debug]at <ScriptBlock>, <No file>: line 18
2024-02-21T21:23:07.2262865Z ##[debug]at <ScriptBlock>, <No file>: line 1
2024-02-21T21:23:07.2287128Z ##[debug]Exception:
2024-02-21T21:23:07.2308201Z ##[debug]System.Management.Automation.RuntimeException: Storage account: ugbmriexfyge4 not found. The selected service connection 'Service Principal' supports storage accounts of Azure Resource Manager type only.
2024-02-21T21:23:07.2372367Z ##[error]Storage account: ugbmriexfyge4 not found. The selected service connection 'Service Principal' supports storage accounts of Azure Resource Manager type only.
2024-02-21T21:23:07.2373535Z ##[debug]Processed: ##vso[task.logissue type=error]Storage account: ugbmriexfyge4 not found. The selected service connection 'Service Principal' supports storage accounts of Azure Resource Manager type only.
2024-02-21T21:23:07.2388283Z ##[debug]Processed: ##vso[task.complete result=Failed]
2024-02-21T21:23:07.3037523Z ##[section]Finishing: Copy Templates v5

Full task logs with system.debug enabled

 [REPLACE THIS WITH YOUR INFORMATION] 

Repro steps

Step 1: Create a new storage account
Step 2: Use AzureFileCopyv5 task to copy files to the above storage account
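The repro steps above can be sketched in Azure PowerShell (resource group, account name, location, and SKU are placeholders, not from the original pipeline):

```powershell
# Step 1: create a new storage account (all names/values are illustrative).
New-AzStorageAccount -ResourceGroupName 'my-rg' `
    -Name 'mynewstorageacct' `
    -Location 'eastus' `
    -SkuName 'Standard_LRS'

# Step 2: immediately afterwards, run the AzureFileCopy@5 task against
# 'mynewstorageacct'; this is where the "not found" error appears.
```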
@mm2709 mm2709 added the bug label Feb 21, 2024
@mm2709 mm2709 changed the title [BUG]: AzureFileCopyV5: Incorrectly returns "Storage account: dkmu2hii3pmte not found" error for recently created Storage account. [BUG]: AzureFileCopyV5: Incorrectly returns "Storage account not found" error for recently created Storage account. Feb 21, 2024
v-mohithgc (Contributor) commented:

Hi @mm2709, is there a fix you would like to propose here?

@v-mohithgc v-mohithgc added Area:RM RM task team and removed Area: Release labels Feb 22, 2024

mm2709 commented Feb 22, 2024

> Hi @mm2709, is there a fix you would like to propose here?

Yes. I have the following options to fix the issue.

Option 1: As I already mentioned in the description, using the Get-AzStorageAccount cmdlet instead of Get-AzResource should fix the issue. All we have to do is replace

    $azureStorageAccountResourceDetails = (Get-AzResource -ErrorAction Stop) | Where-Object { ($_.ResourceType -eq $ARMStorageAccountResourceType) -and ($_.Name -eq $storageAccountName) }

with

    $azureStorageAccountResourceDetails = (Get-AzStorageAccount -ErrorAction Stop) | Where-Object { $_.StorageAccountName -eq $storageAccountName }

Option 2: We can supply the ResourceGroupName (kept optional for backward compatibility) as an input parameter from the pipeline task. Then we can skip the call to Get-AzureStorageAccountResourceGroupName (in the screenshot below) if the ResourceGroupName is not null or empty.

[screenshot]
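A rough sketch of Option 2 (the parameter wiring is hypothetical, not the task's actual code; Get-AzureStorageAccountResourceGroupName is the task's existing helper from AzureUtilityAz1.0.ps1):

```powershell
param(
    [Parameter(Mandatory = $true)]
    [string]$storageAccountName,

    # Hypothetical optional input, empty by default for backward compatibility.
    [string]$resourceGroupName = ''
)

if ([string]::IsNullOrEmpty($resourceGroupName)) {
    # Only fall back to the failing lookup when no resource group was supplied.
    $resourceGroupName = Get-AzureStorageAccountResourceGroupName -storageAccountName $storageAccountName
}

# From here the task can use $resourceGroupName directly, e.g. to fetch keys:
# Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName
```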


pdtit commented Feb 23, 2024

I've been having the exact same issue for a few days, but not consistently: about 1 out of 5 pipeline runs still gets through. I had the same problem with azurefilecopy@4 lately and started integrating a 5-minute wait task between creating the storage account and running azcopy, to no avail. The storage account and container are getting created fine, but the task doesn't see them...

ybadragon commented:

We recently started having this issue very frequently. It started for us a couple of days ago but would fix itself after about 5 minutes; today it is taking much longer to fix itself. This is occurring even on storage accounts that are brand new and have never been deleted/re-created.


wjdavis5 commented Feb 27, 2024

I'm also running into this on the windows-2022 agent in ADO. I create a storage account, and a few steps later in the pipeline I try to copy a file to it and it fails.

rkistinger-carvana commented:

> I'm also running into this on the windows-2022 agent in ADO. I create a storage account and a few steps later in the pipeline try to copy a file to it and it fails.

I am experiencing the same issue with this agent.


robdlee commented Feb 27, 2024

Also happening to me.


mm2709 commented Feb 27, 2024

> Yes. I have the following options to fix the issue. [...]

@v-mohithgc Option 1 is a straightforward fix. Can you please prioritize and assign this bug?


v-mohithgc commented Feb 28, 2024

Hi all, I have created the PR for the proposed changes: #19588
I will notify the relevant owners and try to get it approved.
Just to be sure the changes serve the purpose: if possible, can anyone follow the simple steps below to validate a real pipeline scenario?

Local task test:
Have the code changes ready (check out the branch in that PR) and navigate to the task root path, i.e. C:...\azure-pipelines-tasks
Note: tfx build works only on Node 8 and 10, so make sure to use Node 8/10 while testing the task.

Step 1: Install dependencies:

    npm i

Step 2: Build:

    node make.js build --task AzureFileCopyV5

Step 3: Install tfx:

    npm install -g tfx-cli

Step 4: Log in:

    tfx login
    Service URL : [your task test org url]
    PAT :

Step 5: Upload:

    tfx build tasks upload --task-path C:\AzurePipelineTask\azure-pipelines-tasks\_build\Tasks\AzureFileCopyV5

The new task version will be uploaded to your org; please validate the changes by running the pipeline.

Step 6: Delete (optional; can be done after all the testing is completed):

    tfx build tasks delete --task-id { }

v-mohithgc (Contributor) commented:

Looks like a breaking change was found while validating.
[screenshot]


mm2709 commented Feb 28, 2024

> Looks like a breaking change was found while validating.

Looking at your error logs, it seems the code is failing even before hitting the updated line of code. Maybe you are not supplying the mandatory input parameters.

[screenshot]

ybadragon commented:

This is still occurring very frequently on new storage account deployments; is there any update on the suggested fix, @v-mohithgc?

v-mohithgc (Contributor) commented:

> This is still occurring very frequently on new storage account deployments, is there any update on the fix suggested @v-mohithgc ?

I have notified the relevant task owners; the team is working on it.


v-mohithgc commented Mar 4, 2024

> Looking at your error logs, it seems the code is failing even before hitting the updated line of code. Maybe you are not supplying the mandatory input parameters.

Hi, the pre-configured PR checks are failing with the same error, and I don't have access to modify/control any input parameter related to the PR checks. In order to merge the PR, all those checks must succeed. PR: #19588
[screenshot]

Let me see what I can do here.
Thanks


v-mohithgc commented Mar 11, 2024

Hi, can anyone enable debug logs (system.debug = true) and share the recent failure logs related to this issue at "[email protected]"? Also, please let us know the type of service connection being used.
Thanks


v-mohithgc commented Mar 14, 2024

Hi all, can anyone please confirm whether there has been any recent occurrence of this issue? If so, please share the complete debug logs, details of the task configuration, and the type of service connection used; if possible, send all this info to [email protected] so the team can proceed with further analysis.
Thanks

ybadragon commented:

Yes, we had a pipeline fail this week due to this issue. Let me see if I can get the logs for that.

ybadragon commented:

I've sent an email with the issue number 19566 in the header. It has a screenshot of the task itself and debug log output from the pipeline, with our subscription and tenant information redacted. It's a very simple thing to set up and replicate.


garun-kumar commented Mar 19, 2024

> Looking at your error logs, it seems the code is failing even before hitting the updated line of code. Maybe you are not supplying the mandatory input parameters.

@mm2709, the error in the logs points to the line (line number 13) where we replaced Get-AzResource with Get-AzStorageAccount.
[screenshot]

ybadragon commented:

@v-mohithgc any update on this? We are still getting this error today, and it happens in pipelines where we are deploying an arbitrary number of storage accounts based on configurations in the pipeline, so run by run this fails a large portion of the time.

garun-kumar (Contributor) commented:

> @v-mohithgc any update on this? We are still getting this error today [...]

@ybadragon We tried to reproduce this issue by deploying a storage account and creating a release with the Azure File Copy task added, to see if it fails. But the release was successful without any error. So we suspect this is a replication delay due to the large number of storage accounts.
Could you please try creating the release a few hours after deploying the new storage accounts and confirm whether it still gives the same error?
Thanks

ybadragon commented:

Yes, we've already said this previously: after an arbitrary amount of time the task eventually works, but waiting for hours is not an acceptable resolution to this bug. Were you running the task in Azure DevOps? What agent were you using? As I said, this is very easy to replicate. In addition, looking at the previous comments, @mm2709 seems to have found where the issue is; has there been an attempt to revert that change, or to implement either of the fixes mentioned?

garun-kumar (Contributor) commented:

> Yes we've already said this previously, after an arbitrary amount of time the task eventually works [...]

I am using a Microsoft-hosted agent to run the task in Azure DevOps.
Yes, we tried to implement the changes suggested by @mm2709, but they were breaking, leading to another error.

garun-kumar (Contributor) commented:

> Yes we've already said this previously, after an arbitrary amount of time the task eventually works [...]

Is it feasible to introduce a delay task after you create the storage account, before running the AzureFileCopy task?
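Rather than a fixed delay, a polling step could wait until the same lookup the task performs (Get-AzResource) actually sees the account. A sketch of such a workaround, with a placeholder account name and an illustrative timeout:

```powershell
$storageAccountName = 'mynewstorageacct'  # placeholder
$deadline = (Get-Date).AddMinutes(10)     # illustrative timeout
$found = $null

while (-not $found -and (Get-Date) -lt $deadline) {
    # Mirror the task's own lookup so we wait for exactly what it needs.
    $found = Get-AzResource -ErrorAction SilentlyContinue |
        Where-Object { ($_.ResourceType -eq 'Microsoft.Storage/storageAccounts') -and
                       ($_.Name -eq $storageAccountName) }
    if (-not $found) { Start-Sleep -Seconds 30 }
}

if (-not $found) {
    throw "Timed out waiting for ARM to list storage account '$storageAccountName'"
}
```

This only papers over the underlying listing delay, but it fails fast with a clear message instead of the task's misleading "not found" error.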

ybadragon commented:

@garun-kumar not for hours; and even if we could, the delay is arbitrary, so it still would not succeed 100% of the time, and it is not a fix for the bug.

ybadragon commented:

FYI, even after the merge from 2 weeks ago, we are still seeing this error occur.


eriktack commented Apr 16, 2024

Seeing this with AzureFileCopyV6 as well. The task is unable to see the container and tries to create it (even though it already exists), but our service principal does not have create-container access (which it shouldn't even need, as the container is there and working just fine):

##[debug][Azure Call]Retrieved storage account type successfully for the storage account: XXXXXXX in resource group: YYYYYYY
##[debug]Obtained Storage Account type: Standard
##[debug][Azure Call]Getting container: CONTAINERNAME in storage account: XXXXXXX
##[debug]Container: CONTAINERNAME does not exist in storage account: XXXXXXX
##[debug]Creating container if the containerName provided does not exist
##[debug][Azure Call]Creating container: CONTAINERNAME in storage account: XXXXXXX
##[debug]Azure.RequestFailedException: This request is not authorized to perform this operation.
##[debug]RequestId:81753e3c-201e-0015-15ca-8fad6a000000
##[debug]Time:2024-04-16T06:50:04.2396633Z
##[debug]Status: 403 (This request is not authorized to perform this operation.)
##[debug]ErrorCode: AuthorizationFailure

I'm noticing that "sometimes" the operation works, but it is extremely inconsistent.

aya-bjoseph commented:

@v-schhabra can we have this issue re-opened, or copied to a new issue targeting AzureFileCopyV6? AzureFileCopyV6 currently has this exact issue, and there has been a pending PR for the last 4 months with no recent activity.

This bug is currently a blocker for anyone trying to migrate to "Workload Identity Federation", which MS is heavily pushing.

v-schhabra (Contributor) commented:

> @v-schhabra can we have this issue re-opened or copied to a new issue but targeting AzureFileCopyV6? [...]

Hi @aya-bjoseph,
Could you please let us know if you are still having issues?
