I recently ran into this same situation!

The best way I got around it is to set up a single inventory file and 
create an "all" group with the variable *ansible_connection: "local"*. 
That way you can define whatever hosts you want, and Ansible won't attempt 
to actually SSH to any of them; it will know to connect locally (unless you 
want to override ansible_connection at the host level).

Example:
all:
  vars:
    ansible_connection: "local"
  children:
    dbservers:
      vars:
        foo: "bar"
      hosts:
        dbserver1:
          sys: "TEST"
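
Since your endpoints are all driven by uri calls, the trick is to give each 
endpoint its own logical host name instead of reusing 127.0.0.1 everywhere; 
then per-host vars like *sys* above never collide across groups. A rough 
playbook sketch of what that could look like (the URL and variable names 
here are just illustrative, not from your setup):

- hosts: dbservers
  gather_facts: false
  tasks:
    - name: Call the API endpoint for this logical host
      ansible.builtin.uri:
        # "sys" comes from the host vars in the inventory above;
        # the URL itself is a made-up placeholder.
        url: "https://example.com/api/{{ sys }}"

You can sanity-check how the vars resolve per host with 
"ansible-inventory -i inventory.yml --list" before running anything.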

Hope that helps!
- Dakota

On Tuesday, June 16, 2020 at 2:23:35 AM UTC-4, Nicholas Britton wrote:
>
> I am trying to see how I can set up an inventory file for an application 
> that is all managed by uri calls. 
>
> I have multiple end points that will end up with some different vars by 
> host. But I have the host as 127.0.0.1 since everything runs local I 
> thought that would be the way to do it. 
>
> I am finding that since I have the same host in different groups it's 
> taking the last var that gets set across all the groups, as designed from 
> what I can see. Is the best option a different inventory file for each end 
> point?
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Ansible Project" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/ansible-project/2b1e990f-6cd3-477d-8f26-fd15fa554b7bo%40googlegroups.com.