Specifying ssh key in ansible playbook file
Ansible playbook can specify the key used for ssh connection using --key-file
on the command line.
ansible-playbook -i hosts playbook.yml --key-file "~/.ssh/mykey.pem"
Is it possible to specify the location of this key in the playbook file instead of passing --key-file
on the command line?
I want to put the key's location in a vars.yml
file, which the playbook reads via vars_files:
.
The following are the relevant parts of my configuration:
vars.yml file:
key1: ~/.ssh/mykey1.pem
key2: ~/.ssh/mykey2.pem
playbook.yml file:
---
- hosts: myHost
  remote_user: ubuntu
  key_file: "{{ key1 }}"  # Not valid Ansible syntax. Is there a directive like this that lets me specify the ssh key used for the connection?
  vars_files:
    - vars.yml
  tasks:
    - name: Echo a hello message
      command: echo hello
I've tried adding ansible_ssh_private_key_file
under vars
, but it doesn't work on my machine:
  vars_files:
    - vars.yml
  vars:
    ansible_ssh_private_key_file: "{{ key1 }}"
  tasks:
    - name: Echo a hello message
      command: echo hello
If I run ansible-playbook
with the playbook.yml
above, I get the following error:
TASK [Gathering Facts] ******************************************************************************************************************************
Using module file /usr/local/lib/python2.7/site-packages/ansible/modules/system/setup.py
<192.168.5.100> ESTABLISH SSH CONNECTION FOR USER: ubuntu
<192.168.5.100> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ubuntu -o ConnectTimeout=10 -o ControlPath=/Users/myName/.ansible/cp/2d18691789 192.168.5.100 '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
<192.168.5.100> (255, '', 'Permission denied (publickey).\r\n')
fatal: [192.168.5.100]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Permission denied (publickey).\r\n",
"unreachable": true
}
to retry, use: --limit @/Users/myName/playbook.retry
Strangely, my key file does not appear anywhere in the ssh command.
The variable name you're looking for is ansible_ssh_private_key_file.
You should set it at the vars level:

- in the inventory file:

  myHost ansible_ssh_private_key_file=~/.ssh/mykey1.pem
  myOtherHost ansible_ssh_private_key_file=~/.ssh/mykey2.pem

- in host_vars:

  # host_vars/myHost.yml
  ansible_ssh_private_key_file: ~/.ssh/mykey1.pem

  # host_vars/myOtherHost.yml
  ansible_ssh_private_key_file: ~/.ssh/mykey2.pem

- in a group_vars file, if you use the same key for a group of hosts

- in the vars section of an entry in a play:

  - hosts: myHost
    remote_user: ubuntu
    vars_files:
      - vars.yml
    vars:
      ansible_ssh_private_key_file: "{{ key1 }}"
    tasks:
      - name: Echo a hello message
        command: echo hello

- by setting a fact in a task:

  - name: 'you name it'
    ansible.builtin.set_fact:
      ansible_ssh_private_key_file: "{{ key1 }}"
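Of these options, only the group_vars variant has no example above. A minimal sketch, assuming the inventory defines a group named myGroup whose hosts all share one key (the group name and key path here are illustrative, not from the question):

```yaml
# group_vars/myGroup.yml -- the file name must match the inventory group name
ansible_ssh_private_key_file: ~/.ssh/mykey1.pem
```

Any host that belongs to [myGroup] in the inventory then picks up this key automatically.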
Inventory documentation
You can use the ansible.cfg file; it should look like this (there are other parameters you might want to include):
[defaults]
inventory = <PATH TO INVENTORY FILE>
remote_user = <YOUR USER>
private_key_file = <PATH TO KEY_FILE>
Hope this saves you some typing.
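Filled in with the names from the question (the hosts inventory, the ubuntu remote user, and mykey1.pem; the exact paths are assumptions), the file might read:

```ini
[defaults]
inventory = ./hosts
remote_user = ubuntu
private_key_file = ~/.ssh/mykey1.pem
```

With this in place, a plain ansible-playbook playbook.yml run should pick up the key without --key-file.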
If you run your playbook with ansible-playbook -vvv
, you'll see the actual command being run, so you can check whether the key is actually included in the ssh command (and you might discover that the problem was a wrong username rather than a missing key).
I agree with Brian's comment above (and zigam's edit) that the vars section is too late. I also tested including the key in the on-the-fly definition of the host like this
# fails
- name: Add all instance public IPs to host group
  add_host: hostname={{ item.public_ip }} groups=ec2hosts ansible_ssh_private_key_file=~/.aws/dev_staging.pem
  loop: "{{ ec2.instances }}"
but that fails, too.
So this is not an answer. Just some debugging help and things not to try.