Hi Matt,

Tried what you suggested. Here's what I get for azure_rm_publicipaddress, 
which works in both instances. Both tasks are in the same playbook, so 
they're executed one right after the other with the same args.
1) discrete tags dict: "tags": {"env": "dev", "service": "foo"}
2) templated tags: "tags": {"env": "dev", "foo": "dev"}

For azure_rm_virtualmachine, which breaks with the templated tags:
1) discrete tags dict: "tags": {"env": "dev", "service": "foo"}
2) templated tags: blew up; -vv does not show what it tried to send
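For context, the two tag styles look roughly like this. This is only a sketch; the task fields and variable names are hypothetical placeholders, not the actual playbook:

```yaml
# Hypothetical sketch of the two tag styles (not the real playbook).
# 1) discrete tags dict, written out literally in the task:
- name: Create VM with literal tags
  azure_rm_virtualmachine:
    resource_group: myrg          # placeholder value
    name: "vm{{ item }}"
    tags:
      env: dev
      service: foo

# 2) templated tags, where the dict is resolved from a variable at
#    runtime instead of being spelled out in the task:
- name: Create VM with templated tags
  azure_rm_virtualmachine:
    resource_group: myrg          # placeholder value
    name: "vm{{ item }}"
    tags: "{{ vm_tags }}"         # vm_tags assumed to be defined elsewhere
```

Both forms should render to the same kind of dict by the time the module runs, which is why the publicipaddress results above look equivalent.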

Running with -vvv, I get the following:
TASK [Create VM and attach NIC] ***************************************************************************************************************************************************************************************************************
task path: /Users/mkporwit/work/infra/ansible/azure/create_vm.yml:46
Using module file /usr/local/lib/python2.7/site-packages/ansible-2.4.0-py2.7.egg/ansible/modules/cloud/azure/azure_rm_virtualmachine.py
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: mkporwit
<127.0.0.1> EXEC /bin/sh -c 'echo ~ && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /Users/mkporwit/.ansible/tmp/ansible-tmp-1491610965.68-33490890086942 `" && echo ansible-tmp-1491610965.68-33490890086942="` echo /Users/mkporwit/.ansible/tmp/ansible-tmp-1491610965.68-33490890086942 `" ) && sleep 0'
<127.0.0.1> PUT /var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/tmpnQtd7v TO /Users/mkporwit/.ansible/tmp/ansible-tmp-1491610965.68-33490890086942/azure_rm_virtualmachine.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /Users/mkporwit/.ansible/tmp/ansible-tmp-1491610965.68-33490890086942/ /Users/mkporwit/.ansible/tmp/ansible-tmp-1491610965.68-33490890086942/azure_rm_virtualmachine.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/local/opt/python/bin/python2.7 /Users/mkporwit/.ansible/tmp/ansible-tmp-1491610965.68-33490890086942/azure_rm_virtualmachine.py; rm -rf "/Users/mkporwit/.ansible/tmp/ansible-tmp-1491610965.68-33490890086942/" > /dev/null 2>&1 && sleep 0'
The full traceback is:
Traceback (most recent call last):
  File "/var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/ansible_24PReY/ansible_module_azure_rm_virtualmachine.py", line 1310, in <module>
    main()
  File "/var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/ansible_24PReY/ansible_module_azure_rm_virtualmachine.py", line 1307, in main
    AzureRMVirtualMachine()
  File "/var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/ansible_24PReY/ansible_module_azure_rm_virtualmachine.py", line 554, in __init__
    supports_check_mode=True)
  File "/var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/ansible_24PReY/ansible_modlib.zip/ansible/module_utils/azure_rm_common.py", line 197, in __init__
  File "/var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/ansible_24PReY/ansible_module_azure_rm_virtualmachine.py", line 796, in exec_module
    vm_id=vm_dict['properties']['vmId'],
KeyError: 'vmId'
failed: [localhost] (item=01) => {
    "failed": true,
    "item": "01",
    "module_stderr": "Traceback (most recent call last):\n  File \"/var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/ansible_24PReY/ansible_module_azure_rm_virtualmachine.py\", line 1310, in <module>\n    main()\n  File \"/var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/ansible_24PReY/ansible_module_azure_rm_virtualmachine.py\", line 1307, in main\n    AzureRMVirtualMachine()\n  File \"/var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/ansible_24PReY/ansible_module_azure_rm_virtualmachine.py\", line 554, in __init__\n    supports_check_mode=True)\n  File \"/var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/ansible_24PReY/ansible_modlib.zip/ansible/module_utils/azure_rm_common.py\", line 197, in __init__\n  File \"/var/folders/06/zm5r71tj60b4_19wlz7h822m0000gn/T/ansible_24PReY/ansible_module_azure_rm_virtualmachine.py\", line 796, in exec_module\n    vm_id=vm_dict['properties']['vmId'],\nKeyError: 'vmId'\n",
    "module_stdout": "",
    "msg": "MODULE FAILURE",
    "rc": 0
}
to retry, use: --limit @/Users/mkporwit/work/infra/ansible/azure/create_vm.retry
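The crash itself is just an unguarded dict lookup: per the traceback, line 796 of the module indexes vm_dict['properties']['vmId'] directly, so any serialized VM response that happens to lack that key turns into a bare MODULE FAILURE instead of a useful error. A minimal sketch of the failure mode (the payload below is made up for illustration; the real dict comes from the Azure SDK):

```python
# Illustrative payload only -- the real vm_dict is serialized from the
# Azure SDK response. Here it is missing 'vmId', as in the traceback.
vm_dict = {"properties": {"provisioningState": "Creating"}}

try:
    # What the module does at line 796: a direct index with no guard.
    vm_id = vm_dict["properties"]["vmId"]
except KeyError:
    # Reproduces the KeyError: 'vmId' seen in the -vvv output.
    vm_id = None

# A guarded lookup would surface the missing key gracefully instead
# of crashing the whole module:
vm_id = vm_dict["properties"].get("vmId")  # None when Azure hasn't set it
```

This doesn't explain why the key is absent only when the tags are templated, but it does show why the failure surfaces as MODULE FAILURE with an empty module_stdout.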

I don't think this shows any additional information. What stumps me is why the same templating works fine for other resources but not for azure_rm_virtualmachine; that's what makes me think this might be a bug. I can create such a tag manually in the Azure portal, so I'm pretty sure the key=value pair I'm trying to set is acceptable to Azure.

-- 
You received this message because you are subscribed to the Google Groups 
"Ansible Project" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/ansible-project/c50140db-2fcf-444c-b207-b5aa2a4bed61%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
