Your hosts file might need to be passed explicitly with -i if it's in an
unexpected location.
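For example (the paths and credentials below are illustrative — substitute your actual inventory location):

```shell
# Pass the inventory path explicitly so ansible doesn't fall back to the
# default /etc/ansible/hosts; adjust the path to wherever your file lives.
ansible-playbook -i inventory/hosts setup.yml

# It can also help to take ansible out of the picture first and confirm
# that plain password SSH works with the same credentials the inventory uses:
sshpass -p password ssh deployment@172.20.0.36 'echo connected'
```

If the sshpass test fails too, the problem is in the SSH setup itself rather than in ansible.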

On Thu, Mar 27, 2014 at 6:25 AM, <[email protected]> wrote:

> Hi Michael,
>
> My hosts file is simply:
>
>
> [hosts]
>
> 172.20.0.36 ansible_connection=ssh ansible_ssh_user=deployment ansible_ssh_pass=password
>
>
> and in the playbook I use:
>
> - name: configure authorized_keys
>   hosts: hosts
>   user: deployment
>   sudo: yes
>
>
> So I'm not sure why there should be a problem in that instance.
>
> However, my main issue is getting ssh working.
>
> Thanks,
>
> Tim
>
>
>
>
> On Tuesday, March 25, 2014 3:05:23 PM UTC, [email protected] wrote:
>
>> Hi There,
>>
>> I'm attempting to configure ssh access to a user via ansible, as
>> described in the blog entry at:
>> http://www.hashbangcode.com/blog/ansible-ssh-setup-playbook
>>
>> I'm running this playbook using Ansible 1.4.5 on RHEL 6.3.
>>
>> My inventory hosts file looks like:
>>
>> [hosts]
>> 172.20.0.36 ansible_connection=ssh ansible_ssh_user=deployment ansible_ssh_pass=password
>>
>> I have sshpass installed:
>>
>>
>> [ansible@rwc-host1 inventory]$ sudo yum list | grep sshpass
>> sshpass.x86_64                             1.05-1.el6                  @epel
>>
>> My ansible.cfg file looks like this:
>>
>>
>> [ansible@rwc-host1 inventory]$ cat ansible.cfg
>> [defaults]
>> host_key_checking=False
>> [ansible@rwc-host1 inventory]$
>>
>> I already have the user created on the remote server with sudo access, so
>> all the playbook really needs to do is take the contents of id_rsa.pub and
>> add it to the authorized_keys file for the remote user.
>>
>>
>> The user I'm connecting as is the same as the user whose authorized_keys
>> file I want to create.
>>
>> However, the user I'm running the playbook as on the ansible control
>> machine is different.
>> For example, the control user is named 'ansible' and the remote user is
>> named 'deployment'
>>
>> The playbook file is:
>>
>> ---
>>
>> - name: configure authorized_keys
>>   hosts: hosts
>>   user: deployment
>>   sudo: yes
>>
>>   roles:
>>     - setup
>>
>>
>> The task in my playbook is simply:
>>
>>
>> - name: add create authorized_keys file
>>   authorized_key: user=deployment key="{{ lookup('file', '~/.ssh/id_rsa.pub') }}"
>>
>> But when I run the playbook I get the following error:
>>
>>
>> [ansible@rwc-host1 vm]$ ansible-playbook -i inventory/hosts setup.yml
>> PLAY [configure authorized_keys] **********************************************
>>
>> GATHERING FACTS ***************************************************************
>> previous known host file not found
>> fatal: [172.20.0.36] => using -c ssh on certain older ssh versions may
>> not support ControlPersist, set ANSIBLE_SSH_ARGS="" (or ansible_ssh_args
>> in the config file) before running again
>> TASK: [setup | add create authorized_keys file] *******************************
>> FATAL: no hosts matched or all hosts have already failed -- aborting
>> PLAY RECAP ********************************************************************
>>            to retry, use: --limit @/export/home/ansible/setup.retry
>> 172.20.0.36                : ok=0    changed=0    unreachable=1    failed=0
>> [ansible@rwc-host1 vm]$
>>
>> So then I tried adding the below to my ansible.cfg file:
>>
>>
>> [ssh_connection]
>> ssh_args = ""
>>
>> Rerunning the playbook resulted in the same error:
>>
>> [ansible@rwc-host1 vm]$ ansible-playbook -i inventory/hosts setup.yml
>> PLAY [configure authorized_keys] **********************************************
>>
>> GATHERING FACTS ***************************************************************
>> previous known host file not found
>> fatal: [172.20.0.36] => using -c ssh on certain older ssh versions may
>> not support ControlPersist, set ANSIBLE_SSH_ARGS="" (or ansible_ssh_args
>> in the config file) before running again
>> TASK: [setup | add create authorized_keys file] *******************************
>> FATAL: no hosts matched or all hosts have already failed -- aborting
>> PLAY RECAP ********************************************************************
>>            to retry, use: --limit @/export/home/ansible/setup.retry
>> 172.20.0.36                : ok=0    changed=0    unreachable=1    failed=0
>>
>> So then I thought, since Ansible uses paramiko instead of OpenSSH on RHEL
>> systems, I added the below to my ansible.cfg file:
>>
>> [paramiko_connection]
>> record_host_keys = False
>>
>> But that made no difference either.
>>
>> I then added the ANSIBLE_SSH_ARGS environment variable:
>>
>> export ANSIBLE_SSH_ARGS=""
>>
>> This resulted in a different error:
>>
>>
>> [ansible@rwc-host1 vm]$ ansible-playbook -i inventory/hosts setup.yml
>> PLAY [configure authorized_keys] **********************************************
>>
>> GATHERING FACTS ***************************************************************
>> previous known host file not found
>> fatal: [172.20.0.36] => Authentication or permission failure.  In some
>> cases, you may have been able to authenticate and did not have permissions
>> on the remote directory. Consider changing the remote temp path in
>> ansible.cfg to a path rooted in "/tmp". Failed command was: mkdir -p
>> $HOME/.ansible/tmp/ansible-1395740233.19-20098518683931 && chmod a+rx
>> $HOME/.ansible/tmp/ansible-1395740233.19-20098518683931 && echo
>> $HOME/.ansible/tmp/ansible-1395740233.19-20098518683931, exited with result 6
>> TASK: [setup | add create authorized_keys file] *******************************
>> FATAL: no hosts matched or all hosts have already failed -- aborting
>> PLAY RECAP ********************************************************************
>>            to retry, use: --limit @/export/home/ansible/setup.retry
>> 172.20.0.36                : ok=0    changed=0    unreachable=1    failed=0
>>
>> I then set the remote_tmp variable in the [defaults] section of my
>> ansible.cfg file, but rerunning the playbook resulted in the same error.
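>> Pulling those settings together, the full ansible.cfg described above
>> would look roughly like this (the remote_tmp value is only an
>> illustrative guess; I haven't said which path was actually set):
>>
>> [defaults]
>> host_key_checking=False
>> remote_tmp = /tmp/.ansible/tmp
>>
>> [ssh_connection]
>> ssh_args = ""
>>
>> [paramiko_connection]
>> record_host_keys = False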
>>
>> Since setting the ANSIBLE_SSH_ARGS environment variable seems to have
>> more effect than the settings in the ansible.cfg file, it makes me wonder
>> whether ansible is taking any notice of my ansible.cfg file at all.  I'm
>> not sure how that could happen, since it's in the same directory as my
>> hosts file and that is read correctly.
>>
>> Is this problem related to RHEL and the fact that it uses paramiko
>> instead of OpenSSH?
>>
>> Does anyone have any other thoughts as to why I can't seem to ssh to the
>> user in question using my current configuration?
>>
>> Many thanks,
>>
>> Tim
>>
>  --
> You received this message because you are subscribed to the Google Groups
> "Ansible Project" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To post to this group, send email to [email protected].
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/ansible-project/d9e6e6df-3b1e-4544-9e4b-42dcfaea49e3%40googlegroups.com.
>
> For more options, visit https://groups.google.com/d/optout.
>

To view this discussion on the web visit
https://groups.google.com/d/msgid/ansible-project/CAEVJ8QMMKndSRm_bRJ-eYk20PD%2B1gOcXwGSyRHWzYR64uo0DjA%40mail.gmail.com.
