Which version of the AzCopy was used?
AzCopy version 10.19.0 - 10.26.0
Note: The version is visible when running AzCopy without any argument
Which platform are you using? (ex: Windows, Mac, Linux)
phusion/baseimage-docker:focal-1.2.0
What command did you run?
Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.
We are running the following command as a cron job from Kubernetes to upload a file to an Azure storage account.
I obscured the URL and tokens for security reasons (the "<?>" placeholders stand in for the actual values).
What problem was encountered?
We are getting an authorization failure in the logs and an AzCopy job status of Cancelled, but the AzCopy command itself exits successfully. This causes the pipeline to finish successfully even though the files were not uploaded to the destination folder. The expectation is that AzCopy should exit with an error and cause the whole pipeline to report a failure, instead of silently not copying the files.
How can we reproduce the problem in the simplest way?
I don't think there is a simple way to reproduce it. You need an Azure storage account with a container/folder and a valid SAS token; then either wait for the token to expire or remove its write permission before running the copy.
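A minimal sketch of the check we would expect to fail, assuming the destination URL with an expired or read-only SAS is available in a hypothetical environment variable DEST_SAS_URL:

```powershell
# Attempt an upload with an expired or read-only SAS.
# DEST_SAS_URL is a hypothetical placeholder for "https://<account>.blob.core.windows.net/<container>?<SAS>".
$output = azcopy copy ".\testfile.txt" $env:DEST_SAS_URL --output-type=json

# Per the report above: the log shows an authorization failure and the job summary
# reports Cancelled, yet the process exit code is still 0, so a wrapper script sees success.
Write-Host "azcopy exit code: $LASTEXITCODE"
```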
Have you found a mitigation/solution?
No
Hi @LacazeThomas, the job status is Cancelled. When you say that azcopy completed successfully, what do you mean?
Are you checking $LASTEXITCODE?
If yes, then as a mitigation, please do not build your logic on top of the last exit code. Instead, parse the JSON output of the azcopy command and read the value of the "JobStatus" field to check the status of your azcopy command.
```powershell
# Run AzCopy command and capture JSON output
$azcopyOutput = azcopy copy "source" "destination" --output-type=json

# Parse JSON output and extract job status
$jsonObject = $azcopyOutput | ConvertFrom-Json
$jobStatus = $jsonObject.jobStatus

# Perform actions based on job status
if ($jobStatus -eq "Completed") {
    Write-Host "Data transfer completed successfully."
}
elseif ($jobStatus -eq "Failed") {
    Write-Host "Data transfer failed. Please check logs for more information."
}
else {
    Write-Host "Data transfer is still in progress. Monitoring job status..."
}
```
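To make the surrounding script or pipeline step actually fail when the job did not complete, the same check can be extended to return a non-zero exit code. A minimal sketch, assuming the $jobStatus variable from the snippet above:

```powershell
# Fail the wrapper script (and therefore the pipeline step) when the job did not complete.
if ($jobStatus -ne "Completed") {
    Write-Error "AzCopy job finished with status '$jobStatus'."
    exit 1
}
```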