Hi all,
I've noticed some weird behavior with async_status.
Assume user1 and user2 exist on the remote host (here, localhost).
async.yml
---
- hosts: localhost
  become: true
  tasks:
    - name: simulate long running op, allow to run for 45 sec, fire and forget
      command: /bin/sleep 15
      become_user: user2
      async: 45
      poll: 0
      register: async_task

    - name: check async job in same play
      async_status: jid={{ async_task.ansible_job_id }}
      become_user: user2

- hosts: localhost
  tasks:
    - name: check async job in another play
      async_status: jid={{ async_task.ansible_job_id }}
      become_user: user2
*The result is:*
user1@localhost ~$ ansible-playbook test/async.yml -vv
ansible-playbook 2.4.0.0
config file = /home/user1/ansible.cfg
configured module search path = [u'/home/user1/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, Aug 4 2017, 00:39:18) [GCC 4.8.5 20150623 (Red Hat 4.8.5-16)]
Using /home/user1/ansible.cfg as config file
[WARNING]: Could not match supplied host pattern, ignoring: all
[WARNING]: provided hosts list is empty, only localhost is available
PLAYBOOK: async.yml
****************************************************************************************************
2 plays in test/async.yml
PLAY [localhost]
*******************************************************************************************************
TASK [Gathering Facts]
*************************************************************************************************
ok: [localhost]
META: ran handlers
TASK [simulate long running op, allow to run for 45 sec, fire and forget]
**********************************************
task path: /home/user1/test/async.yml:7
changed: [localhost] => {"ansible_job_id": "182826547249.21334", "changed": true, "failed": false, "finished": 0, "results_file": "/opt/user2/.ansible_async/182826547249.21334", "started": 1}
TASK [check async job in same play]
************************************************************************************
task path: /home/user1/test/async.yml:14
ok: [localhost] => {"ansible_job_id": "182826547249.21334", "changed": false, "failed": false, "finished": 0, "started": 1}
META: ran handlers
META: ran handlers
PLAY [localhost]
*******************************************************************************************************
TASK [Gathering Facts]
*************************************************************************************************
ok: [localhost]
META: ran handlers
TASK [check async job in another play]
*********************************************************************************
task path: /home/user1/test/async.yml:21
fatal: [localhost]: FAILED! => {"ansible_job_id": "182826547249.21334", "changed": false, "failed": true, "finished": 1, "msg": "could not find job", "started": 1}
to retry, use: --limit @/home/user1/test/async.retry
PLAY RECAP
*************************************************************************************************************
localhost : ok=4 changed=1 unreachable=0 failed=1
Here, /home/user1/ansible.cfg is just an empty file.
In the first play we connect as root (become: true) and then become user2; in the second play we connect directly as user1 and become user2, and async_status can no longer find the job.
Is this a bug regarding permissions, or am I missing something?
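One thing I noticed while narrowing this down: the results_file in the first play's output lives under /opt/user2/.ansible_async/, i.e. under the become user's home. My unverified guess is that become_user has no effect in the second play because become: true is not set there, so async_status runs as user1 and looks under user1's ~/.ansible_async instead. A minimal sketch of the second play with become: true added, just to test that theory:

- hosts: localhost
  become: true   # guess: without this, become_user is ignored
  tasks:
    - name: check async job in another play
      async_status: jid={{ async_task.ansible_job_id }}
      become_user: user2

I haven't confirmed that this is the actual cause, so corrections welcome.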
Regards,
Julien