Locked out of my own server: getting "Too many authentication failures" right away when connecting via ssh

I have an AWS EC2 Ubuntu instance for pet projects. When I tried logging in one day, I got this error:

~$ ssh -i"/home/kona/.ssh/aws_kona_id" [email protected] -p22 
Enter passphrase for key '/home/kona/.ssh/aws_kona_id': 
Received disconnect from [IP address] port 22:2: Too many authentication failures
Disconnected from [IP address] port 22
~$

kona is the only account enabled on this server.

I've tried rebooting the server, changing my IP address, and waiting.

EDIT:

kona@arcticjieer:~$ ssh -o "IdentitiesOnly yes" -i"/home/kona/.ssh/aws_kona_id" -v [email protected] -p22 
OpenSSH_8.1p1 Debian-1, OpenSSL 1.1.1d  10 Sep 2019
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 19: Applying options for *
debug1: Connecting to ec2-3-17-146-113.us-east-2.compute.amazonaws.com [3.17.146.113] port 22.
debug1: Connection established.
debug1: identity file /home/kona/.ssh/aws_kona_id type -1
debug1: identity file /home/kona/.ssh/aws_kona_id-cert type -1
debug1: Local version string SSH-2.0-OpenSSH_8.1p1 Debian-1
debug1: Remote protocol version 2.0, remote software version OpenSSH_7.6p1 Ubuntu-4ubuntu0.3
debug1: match: OpenSSH_7.6p1 Ubuntu-4ubuntu0.3 pat OpenSSH_7.0*,OpenSSH_7.1*,OpenSSH_7.2*,OpenSSH_7.3*,OpenSSH_7.4*,OpenSSH_7.5*,OpenSSH_7.6*,OpenSSH_7.7* compat 0x04000002
debug1: Authenticating to ec2-3-17-146-113.us-east-2.compute.amazonaws.com:22 as 'kona'
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug1: kex: algorithm: curve25519-sha256
debug1: kex: host key algorithm: ecdsa-sha2-nistp256
debug1: kex: server->client cipher: [email protected] MAC: <implicit> compression: none
debug1: kex: client->server cipher: [email protected] MAC: <implicit> compression: none
debug1: expecting SSH2_MSG_KEX_ECDH_REPLY
debug1: Server host key: ecdsa-sha2-nistp256 SHA256:D3sIum9dMyyHNjtnL7Pr4u5DhmP5aQ1jaZ8Adsdma9E
debug1: Host 'ec2-3-17-146-113.us-east-2.compute.amazonaws.com' is known and matches the ECDSA host key.
debug1: Found key in /home/kona/.ssh/known_hosts:41
debug1: rekey out after 134217728 blocks
debug1: SSH2_MSG_NEWKEYS sent
debug1: expecting SSH2_MSG_NEWKEYS
debug1: SSH2_MSG_NEWKEYS received
debug1: rekey in after 134217728 blocks
debug1: Will attempt key: /home/kona/.ssh/aws_kona_id  explicit
debug1: SSH2_MSG_EXT_INFO received
debug1: kex_input_ext_info: server-sig-algs=<ssh-ed25519,ssh-rsa,rsa-sha2-256,rsa-sha2-512,ssh-dss,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521>
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug1: Authentications that can continue: publickey
debug1: Next authentication method: publickey
debug1: Trying private key: /home/kona/.ssh/aws_kona_id
Enter passphrase for key '/home/kona/.ssh/aws_kona_id': 
debug1: Authentications that can continue: publickey
debug1: No more authentication methods to try.
[email protected]: Permission denied (publickey).
kona@arcticjieer:~$ 

Solution 1:

This error usually means that you’ve got too many keys loaded in your ssh-agent.

Explanation: your SSH client offers every key held by ssh-agent, one by one, before it gets to the key specified with -i aws_kona_id. Yes, it's a bit counter-intuitive. Each offered key that the server rejects counts as an authentication failure, and by default the SSH server allows only six attempts (MaxAuthTries), so you get the error you see: Too many authentication failures.

You can view the identities (keys) attempted with ssh -v.
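You can also inspect the agent directly with ssh-add. A quick sketch (the key path is the one from the question; re-adding keys afterwards is up to you):

```shell
# List the identities currently loaded in the agent; each one is
# offered to the server before the -i key, and each rejected offer
# counts against the server's MaxAuthTries limit.
ssh-add -l

# Clear all loaded identities (the key files on disk are untouched;
# you can re-add one later with, e.g., ssh-add ~/.ssh/aws_kona_id):
ssh-add -D
```

Clearing the agent is a blunter alternative to `IdentitiesOnly yes`, but it confirms the diagnosis quickly.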

The solution is to tell ssh to only use the identities specified on the command line:

ssh -o "IdentitiesOnly yes" -i ~/.ssh/aws_kona_id -v [email protected]

If that doesn’t help, post the output of that command here.

Solution 2:

I think MLu's answer is probably correct in this case. The way to validate this is to run an ssh command from the command line, specifying the correct key for the server:

ssh -i "keyfile.pem" [email protected]

If that doesn't work, and in the general case of "I've been locked out of my server, help!", the generally recommended approach is to mount the volume on another instance as a data volume:

  1. Stop the EC2 server.
  2. Mount the volume onto a new instance as a data volume.
  3. Do any investigation or repairs required (look at logs, add keys, etc). This can include creating new users and new keys, changing files on the file system, etc.
  4. Re-attach the volume as the root volume on the original instance and start it.

Repeat until you have access. Even if you never regain login access, this at least gets you access to your data.
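The steps above can be sketched with the AWS CLI. All IDs, the device names, and the mount commands below are placeholders/assumptions; substitute your own instance and volume IDs:

```shell
# Sketch of the recovery procedure. i-0aaa... is the locked instance,
# i-0bbb... a rescue instance, vol-0ccc... the root volume (placeholders).

# 1. Stop the locked-out instance.
aws ec2 stop-instances --instance-ids i-0aaa11122233abcde
aws ec2 wait instance-stopped --instance-ids i-0aaa11122233abcde

# 2. Detach its root volume and attach it to the rescue instance
#    as a data volume.
aws ec2 detach-volume --volume-id vol-0ccc11122233abcde
aws ec2 attach-volume --volume-id vol-0ccc11122233abcde \
    --instance-id i-0bbb11122233abcde --device /dev/sdf

# 3. On the rescue instance: mount the volume, read logs, fix keys.
#    e.g.  sudo mount /dev/xvdf1 /mnt
#          sudo vi /mnt/home/kona/.ssh/authorized_keys

# 4. Detach it and re-attach it to the original instance as the
#    root device, then start the instance again.
aws ec2 detach-volume --volume-id vol-0ccc11122233abcde
aws ec2 attach-volume --volume-id vol-0ccc11122233abcde \
    --instance-id i-0aaa11122233abcde --device /dev/sda1
aws ec2 start-instances --instance-ids i-0aaa11122233abcde
```

The root device name (`/dev/sda1` vs `/dev/xvda`) varies by AMI; check the instance's root device setting before re-attaching.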

Solution 3:

SSH by default tries all available SSH keys. It does so in a "random" order. Specifying the -i option simply tells SSH to add that keyfile to the list of keys to try.

It does not:

  • limit SSH to use only that key
  • tell SSH to try that key first

What ends up happening (quite often if you use many keys) is that SSH tries a couple of random keys that don't work, and the server stops accepting authentication attempts from your client.

If you want to tell SSH to "use only this key" you must specify the IdentitiesOnly yes option:

ssh -o "IdentitiesOnly yes" -i"/home/kona/.ssh/aws_kona_id" [email protected] -p22 

IdentitiesOnly yes tells SSH only to use the explicitly specified keys (in this case only the key specified using -i).

This is why when I use custom keys for different hosts I always define the host configuration in .ssh/config. This allows me to use a simple alias and, more importantly, to specify IdentitiesOnly yes and which key to use to avoid this kind of mistake:

Host kona.server
    Hostname server.akona.me
    IdentityFile ~/.ssh/aws_kona_id
    IdentitiesOnly yes
    Port 22
    User kona

With the above in your .ssh/config you should be able to log in to your server with simply:

$ ssh kona.server

Solution 4:

The verbose output you've just added shows that you get Permission denied for ~/.ssh/aws_kona_id.

That's a completely different problem than Too many authentication failures.

Perhaps your aws_kona_id isn't the right key for the user (and that's why it kept trying all the other identities from the ssh-agent) or you should use the default EC2 user account, e.g. ec2-user or ubuntu or what have you.

Try those accounts, or try to find the right key for the kona user.
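A quick way to check the default-account possibility is to try the same key against the usual EC2 account names. The host name is the one from the question; the user list is a guess, so adjust it for your AMI:

```shell
# Try the same key against common EC2 default accounts, one at a time.
# IdentitiesOnly=yes keeps ssh-agent keys from burning attempts.
for user in ubuntu ec2-user admin; do
  echo "--- trying $user ---"
  ssh -o IdentitiesOnly=yes -o ConnectTimeout=5 \
      -i ~/.ssh/aws_kona_id \
      "$user@ec2-3-17-146-113.us-east-2.compute.amazonaws.com" 'echo ok'
done
```

Whichever user prints `ok` is the account your key is actually installed for.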

Solution 5:

Ubuntu EC2 instances come with an ubuntu user account that has your SSH key installed.

If you haven't removed this account, you can still connect with:

ssh -i "/home/kona/.ssh/aws_kona_id" [email protected]

Then fix your account problem after sudo -i by investigating /home/kona/.ssh/authorized_keys.
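Once in as ubuntu and root, a minimal repair might look like this. The key material below is a placeholder; paste in the real contents of aws_kona_id.pub:

```shell
# Run on the server after: ssh -i ... ubuntu@...  then  sudo -i
# PUBKEY is a placeholder for the contents of aws_kona_id.pub.
PUBKEY='ssh-ed25519 AAAA...placeholder... kona@arcticjieer'

mkdir -p /home/kona/.ssh
echo "$PUBKEY" >> /home/kona/.ssh/authorized_keys

# sshd refuses keys kept in group/world-accessible locations,
# so ownership and permissions matter as much as the key itself:
chown -R kona:kona /home/kona/.ssh
chmod 700 /home/kona/.ssh
chmod 600 /home/kona/.ssh/authorized_keys
```

If the key was already present, the permission fixes alone are often what restores access.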