zockette edited a comment on issue #19409:
URL: https://github.com/apache/airflow/issues/19409#issuecomment-962650499


   Well, I can say that the `create_container` error is the same via the CLI:
   
   
![image](https://user-images.githubusercontent.com/58522841/140654255-beb06d1c-066d-47f1-9c8a-8b4e1e301108.png)
   
   Uploading works just fine:
   
![image](https://user-images.githubusercontent.com/58522841/140654387-3f513a10-bb81-45a1-a8ea-5db0c44ed41a.png)
   
   Your SAS token has been generated at the Storage level, not at the Container level, yes? I'm fairly sure of that, because Storage-level tokens start with `?sv` while Container-level tokens start with `?sp`.
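
   For what it's worth, the signed-resource fields in the token are a slightly more reliable tell than the first query parameter. A minimal sketch of that check (the function name and the fallback heuristic are mine, not anything from the provider or the SDK):

```python
# Rough sketch: guess whether a SAS token is scoped to the Storage account or to
# a single container. The sr=c / srt / ss checks follow the documented SAS query
# parameters; the prefix fallback mirrors the observation above.
from urllib.parse import parse_qs


def sas_scope_hint(sas_token: str) -> str:
    token = sas_token.lstrip("?")
    params = parse_qs(token)
    if params.get("sr") == ["c"]:
        return "container"        # service SAS scoped to a single container
    if "srt" in params or "ss" in params:
        return "storage-account"  # account SAS carries signed resource types/services
    # Fall back to the "starts with ?sv vs ?sp" observation.
    return "storage-account" if token.startswith("sv=") else "container"
```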
   
   After further digging I can say fairly confidently that `BlobServiceClient` does not behave the same with a Container SAS token; it only works "as intended" with a Storage SAS token.
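
   A minimal sketch of that difference through `azure-storage-blob` directly (account URL, container name and token below are placeholders, not my real values):

```python
# With a Container-level SAS, create_container raises an authorization error
# because the token is scoped to one existing container; uploading into that
# container still works.
from azure.core.exceptions import HttpResponseError
from azure.storage.blob import BlobServiceClient

ACCOUNT_URL = "https://myaccount.blob.core.windows.net"  # placeholder
CONTAINER_SAS = "?sp=racwdl&..."                         # Container-level SAS (placeholder)

client = BlobServiceClient(account_url=ACCOUNT_URL, credential=CONTAINER_SAS)

try:
    client.create_container("mycontainer")
except HttpResponseError as exc:
    print(f"create_container failed: {exc.status_code} {exc.reason}")

# Uploading into the container the SAS was issued for works fine.
blob = client.get_blob_client(container="mycontainer", blob="test.txt")
blob.upload_blob(b"hello", overwrite=True)
```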
   
   Works fine with a Storage-level token:
   
![image](https://user-images.githubusercontent.com/58522841/140655083-8e07ac93-5e49-4953-8ca6-c07cc6e48977.png)
   
   So the question now, I guess, is: should (and will) the microsoft-azure provider handle the error raised when `create_container` is run with a Container-level SAS token?
   
   Handling could be done in a few ways (see the sketch after this list for the first one):

   - confirming the behaviour with the maintainers of the `BlobServiceClient` code and adding it to the error mappings,
   - adding an operator-level flag to bypass the container existence check / container creation attempt, or
   - documenting that Container-level SAS tokens are not supported (pls no D:).
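
   For illustration only, a rough sketch of what the first option could look like: a small helper that tolerates the authorization failure a Container-level SAS produces on `create_container` (the helper name and the 403 check are my assumptions, not the provider's actual code):

```python
# Hypothetical helper, not the provider's actual code: try to create the
# container, but don't fail when the credential cannot create containers.
from azure.core.exceptions import HttpResponseError, ResourceExistsError
from azure.storage.blob import BlobServiceClient


def ensure_container(client: BlobServiceClient, name: str) -> None:
    try:
        client.create_container(name)
    except ResourceExistsError:
        pass  # container is already there, nothing to do
    except HttpResponseError as exc:
        if exc.status_code == 403:
            # e.g. a Container-level SAS: cannot create containers, assume it exists
            return
        raise
```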
   
   Also, one extra bit of weirdness: using a Container-level SAS token makes the operator's `container` parameter act as a path inside the container. Not sure if that is intended :-)
   

