That worked! You're amazing.
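For anyone landing on this thread later, a sketch of the full working task with Matt's suggestion applied (the `retries`/`delay` values are illustrative, not from the thread):

```yaml
# Canary check: keep retrying until the endpoint returns 200.
# result['status']|default(0) falls back to 0 when the connection
# fails and no 'status' key is registered, so the conditional
# evaluates to false instead of erroring out.
- name: run test
  uri:
    url: "https://0.0.0.0:3030/api/canary"
    validate_certs: no
  register: result
  until: result['status']|default(0) == 200
  retries: 30   # illustrative: ~30 attempts
  delay: 5      # illustrative: 5 seconds between attempts
```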

On Friday, May 20, 2016 at 2:59:13 PM UTC-5, Matt Martz wrote:
>
> Maybe use the |default filter like:
>
> until: result['status']|default(0) == 200
>
> On Fri, May 20, 2016 at 2:57 PM, Marcus Morris <[email protected]> wrote:
>
>> So I am running 'canary' tests post CI and pre deployment to test if the 
>> newest code runs without error in a real environment. 
>>
>> The tests are just hitting a url and checking that it returns 200. 
>> Sometimes it takes a little bit for the process I'm testing against to 
>> become available which means my tests will fail if I run them before the 
>> process is ready to serve requests. At first I was just having a pause in 
>> between starting the process and running the test to give it time to come 
>> up, but this is very brittle because that time can vary, and it is also 
>> very inefficient if I wait for longer than I need to. 
>>
>> I then thought I could solve these problems by using a do-until loop. The 
>> problem is, there is nothing I can check for in the registered var that 
>> exists on both failure and success.
>>
>> Example:
>>
>> - name: run test
>>   uri:
>>     url: "https://0.0.0.0:3030/api/canary"
>>     validate_certs: no
>>   register: result
>>   until: result['status'] == 200
>>
>> This doesn't help: when the test fails because the url isn't ready to 
>> serve requests, the registered variable only contains something like:
>>
>> {
>>         "changed": false,
>>         "failed": true,
>>         "msg": "Socket error: [Errno 104] Connection reset by peer to 
>> https://0.0.0.0:3030/api/canary"
>> }
>>
>>
>> Therefore, the until will fail with:
>>
>> {
>>  "failed": true,
>>  "msg": "ERROR! The conditional check 'result['status'] == 200' failed. 
>> The error was: ERROR! error while evaluating conditional (result['status'] 
>> == 200): ERROR! 'dict object' has no attribute 'status'"
>> }
>>
>> I thought that maybe "failed": false would exist in successful tests, but 
>> that is not the case.
>>
>> It's kind of a catch-22 because if the test succeeds the first time, I 
>> don't need the do-until, but if it fails, then there is nothing for the 
>> until to check against.
>>
>> Any ideas on how to handle this?
>>
>>
>> -- 
>> You received this message because you are subscribed to the Google Groups 
>> "Ansible Project" group.
>> To unsubscribe from this group and stop receiving emails from it, send an 
>> email to [email protected].
>> To post to this group, send email to [email protected].
>> To view this discussion on the web visit 
>> https://groups.google.com/d/msgid/ansible-project/f41735fb-ff5b-44a5-b898-ae645ff07e23%40googlegroups.com
>>  
>> <https://groups.google.com/d/msgid/ansible-project/f41735fb-ff5b-44a5-b898-ae645ff07e23%40googlegroups.com?utm_medium=email&utm_source=footer>
>> .
>> For more options, visit https://groups.google.com/d/optout.
>>
>
>
>
> -- 
> Matt Martz
> @sivel
> sivel.net
>
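Why the |default filter fixes this: in Jinja, a missing key yields an undefined value, and `|default(0)` substitutes a fallback before the comparison runs. A plain-Python sketch of the same semantics (names here are illustrative, not Ansible code):

```python
# Analogue of:  until: result['status']|default(0) == 200
# dict.get with a fallback plays the role of the |default filter:
# when 'status' is absent (connection failed), the comparison is
# simply False instead of raising a "no attribute 'status'" error.

def check(result):
    return result.get("status", 0) == 200

print(check({"failed": True, "msg": "Connection reset by peer"}))  # False
print(check({"status": 200, "changed": False}))                    # True
```

So on failed attempts the loop just keeps retrying, and it exits as soon as a real 200 comes back.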
