How to execute a complex command line over ssh?
I sometimes want to get the env of a docker container running on a remote host. To do this I log into the host:
ssh user@123.456.789.10
and then I run this command:
sudo docker exec -it `sudo docker ps | grep mycontainername | awk '{print $1;}'` env
I now want to do this in one command (do it more than 3 times and I want to automate it.. :-) ).
This works:
ssh -t user@123.456.789.10 sudo docker ps | grep mycontainername

but when I do this

ssh -t user@123.456.789.10 sudo docker exec -it `sudo docker ps | grep mycontainername | awk '{print $1;}'` env
I get
"docker exec" requires at least 2 arguments.
See 'docker exec --help'.
Usage: docker exec [OPTIONS] CONTAINER COMMAND [ARG...]
Run a command in a running container
Connection to 123.456.789.10 closed.
Does anybody know how I can get this running?
Solution 1:
General observations
There are some keywords in Bash that affect parsing of what's after them, e.g. `[[`. But `ssh` is not one of them; it's a regular command. This means that:

- The whole `ssh …` line is normally parsed by your local shell; characters like `|`, `;`, `*`, `"`, `$` or space mean something to the shell, so they won't get to `ssh` unless you quote or escape them (with few exceptions, e.g. a sole `$` as a separate word is not special). This is the first level of parsing and interpreting.
- Whatever arguments get to `ssh` (or any other regular command) after the shell does its job, they are just arguments, strings. It's now the tool's job to interpret them. This is the second level.
In case of `ssh`, some (zero, one or more) of its command line arguments are interpreted as a command to be run on the server side. In general `ssh` is able to build a command from many arguments. The effect is as if you invoked something like this on the server:

"$SHELL" -c "$command_line_built_by_ssh"

(I'm not claiming it's exactly like this, but it's certainly close enough to understand what happens. I wrote it as if it was invoked in a shell, so it looks familiar; but in fact there is no shell yet. And there is no `$command_line…` as a variable, I'm just using this name to refer to some string for the purpose of this answer.)

Then `$SHELL` on the server parses `$command_line…` on its own. This is the third level.
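To see the difference between the first and the third level, compare these two invocations (a minimal illustration; `user@server` is a placeholder for your own target):

ssh user@server 'echo $(hostname)'   # $(hostname) is expanded on the server (third level)
ssh user@server "echo $(hostname)"   # $(hostname) is expanded locally (first level); the server only echoes the result

The first prints the server's hostname, the second prints your local one.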
Specific observations
The command that failed

ssh -t user@123.456.789.10 sudo docker exec -it `sudo docker ps | grep mycontainername | awk '{print $1;}'` env

failed because `123.456.789.10` is not a valid IP address.

OK, I understand `123.456.789.10` is a placeholder, still it's not valid. :)

The command failed because it executed `sudo docker ps | grep mycontainername | awk '{print $1;}'` locally. The output was probably empty. Then `$command_line_built_by_ssh` was far from what you wanted.
Note `ssh … | grep mycontainername` runs `grep` locally (you might or might not be aware of this).
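If the local substitution yields nothing (e.g. no matching container exists on your local machine), then the command that reaches the server is effectively:

sudo docker exec -it env

so the remote `docker exec` sees only one argument (`env`) where it needs a container *and* a command to run, which matches the error message you got.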
Discussion
To be in control of what the remote shell will get as `$command_line_built_by_ssh`, you need to understand, predict and mastermind the parsing and interpreting that happens before. You need to craft your local command so that after the local shell and `ssh` digest it, it becomes the exact `$command_line…` you want to execute on the remote side.

It may be quite complicated if you actually want your local shell to expand or substitute anything before the result gets to `ssh`. Your case is simpler because you already have the verbatim string you want as `$command_line_built_by_ssh`. The string is:

sudo docker exec -it $(sudo docker ps | grep mycontainername | awk '{print $1;}') env
Notes:

- I used command substitution in the form of `$()`, not backticks. There are reasons to prefer `$()`.
- I don't know `docker` at all, so I cannot tell whether your `$(…)` should be double-quoted. In general, not quoting is almost always bad. Ask yourself what happens when the substitution returns multiple words (i.e. when multiple lines enter `awk`). This is a different issue (if ever an issue in this case) and I won't address it in this answer; a defensively quoted variant is sketched right after this list.
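For completeness, a defensively quoted variant would look like this (my sketch; whether `docker exec` ever needs it depends on what your `grep` can match):

sudo docker exec -it "$(sudo docker ps | grep mycontainername | awk '{print $1;}')" env

With the quotes, a substitution that yields several IDs (or nothing) is passed as a single argument, so `docker` receives one (possibly invalid) container name and can complain clearly, instead of getting a reshuffled argument list.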
To protect everything from being expanded/interpreted by the local shell, you need to properly quote or escape (with `\`) all characters that can trigger expansion or can be interpreted. In this case quote `$`, `(`, `)`, `|`, `;`, `{`, `}`, `'` and (maybe or optionally) spaces.
I said "maybe or optionally quote or escape spaces" because of how ssh … some command
works. If it finds two or more arguments that it interprets as code to be run on the server, it will concatenate them, adding single spaces in between. This is how $command_line_built_by_ssh
is built. If you neither quote nor escape spaces in what looks like code for the remote shell, then the local shell will consume spaces (and tabs) while splitting words, then ssh
will add spaces. The result may not be exactly what you want if there are tabs or multiple consecutive spaces. For example:
ssh user@server echo a      b

`ssh` gets `user@server`, `echo`, `a`, `b`. The remote command will be `echo a b` and `echo` there will get `a`, `b`. It will print `a b`.

Then this:

ssh user@server 'echo a      b'

`ssh` gets `user@server`, `echo a      b`. The remote command will be `echo a      b` and `echo` there will get `a`, `b`. It will print `a b`.

And finally this:

ssh user@server 'echo "a      b"'

`ssh` gets `user@server`, `echo "a      b"`. The remote command will be `echo "a      b"` and `echo` there will get `a      b`. It will print `a      b`.
The conclusion is you should quote in the context of the local shell and separately in the context of the remote shell. Keep in mind that when it comes to expanding things by a shell, the outer quotes matter.
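A small example of the "outer quotes matter" point (placeholder host as before):

ssh user@server 'echo "$HOME"'   # outer single quotes are consumed locally; the remote shell sees the double quotes and expands $HOME there

This prints the remote home directory, because the outermost quotes (the single ones) decide that the local shell expands nothing.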
Solution
Putting all this information together (and still assuming you want to protect everything from being expanded/interpreted by the local shell), I advise as follows:
- Quote or escape.
- Prefer quoting over escaping, because a single pair of quotes can protect many characters, while a single `\` protects one character only. You will most likely need many backslashes to protect everything; often the same result can be achieved with just one pair of quotes.
- Prefer single-quotes (`'`); they can protect everything but `'`. On the other hand double-quotes (`"`) can protect `'` but not `$` nor `"` (nor `\` sometimes, nor `!` sometimes in Bash), unless `$` and such are escaped as well (i.e. escaped and quoted, i.e. escaped within double-quotes; except `!` which is troublesome).
- Prefer providing command(s) as a single argument to `ssh`.
This leads to the following procedure:
- Prepare a verbatim command you want to run on the remote side.
- Replace every `'` with `'"'"'` or with `'\''` (you can choose independently for each `'`).
- Embrace the whole resulting string with single-quotes.
- Add `ssh …` in front.
Your verbatim command is:
sudo docker exec -it $(sudo docker ps | grep mycontainername | awk '{print $1;}') env
The procedure results in:
ssh -t user@server 'sudo docker exec -it $(sudo docker ps | grep mycontainername | awk '\''{print $1;}'\'') env'
# single-quoted ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^ ^^^^^
# escaped ^ ^
Note the procedure may not lead to the shortest possible string. With some insight one can sometimes "optimize" the string. But the procedure is quite simple and totally reliable. As long as you know you want to protect everything from being expanded/interpreted by the local shell, the procedure itself requires no further insight at all.
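If you want to double-check the quoting before actually connecting anywhere, one option (a debugging sketch, not part of the procedure) is to hand the quoted string to `printf` instead of `ssh`:

printf '%s\n' 'sudo docker exec -it $(sudo docker ps | grep mycontainername | awk '\''{print $1;}'\'') env'

It should print exactly the verbatim command from above; if it prints something else, the quoting is off.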
Automation
In fact the procedure can be automated. There are tools that can add quotes to a string and properly preserve existing quotes. Bash itself is such a tool. This answer of mine provides a way to do this with a keystroke in Bash. A possible solution adjusted to your case is:

- In your local shell define the following custom function and key binding:

  _prepend_ssh() { READLINE_LINE="ssh ${READLINE_LINE@Q}"; READLINE_POINT=4; }
  bind -x '"\C-x\C-h":_prepend_ssh'

- Still in your local shell, type (or paste) the command you want to run in the remote shell; do not execute it. The command should be exactly as you want it to be in the remote shell.
- Hit Ctrl+x, Ctrl+h. The local shell will take care of quoting. It will also add `ssh ` in front and place the cursor just after it.
- Add (type) the missing arguments (e.g. `-t user@server`). For your convenience the cursor is already in the right position to do this.
- Hit Enter.
Alternatives
There is another way to pass verbatim commands to a remote shell. In some cases you can pipe them via `ssh`. Let's assume the remote command line should be:

echo "$PATH"; date

We could proceed like above, add single-quotes and run locally like this:

ssh user@server 'echo "$PATH"; date'

The example is simple, but in general adding quotes is not always that easy. Alternatively we can pipe the command like this (`echo` for simplicity; `printf` is better):

echo 'echo "$PATH"; date' | ssh user@server bash

which still requires these single-quotes. But if you have the command(s) in a file, then:

<file ssh user@server bash
Or even without any file (here document):
ssh user@server bash <<'EOF'
echo "$PATH"
date
EOF
(Note the quotes in `<<'EOF'` prevent `$PATH` from being expanded locally.)
Advantages:

- You can easily pass multi-line commands/snippets/scripts (I split `echo … ; date` into two lines just to show this).
- No additional layer of quoting is required.
- You can explicitly choose a remote interpreter which doesn't have to be a shell (e.g. `bash` or `zsh`, or `python`).
Disadvantages:

- You should explicitly specify a remote interpreter, otherwise the default login shell will be spawned, and maybe the message of the day printed. You can still use the default shell as a non-login shell by specifying a properly quoted `exec "$SHELL"` (the line will be like `ssh … 'exec "$SHELL"' <<'EOF'`; see the sketch after this list).
- Standard input of `ssh` is not a terminal, so you cannot use `-t` (this is why I didn't use your original command as an example).
- The commands get to the remote interpreter (`bash` in the example) via its standard input. Possible problems:
  - Child processes (or builtins) will use the same stdin. If any of them reads from its stdin, it will read the same stream and may consume the next command(s) destined for the interpreter. This behavior can be suppressed or even creatively (ab)used, but I won't elaborate here.
  - You cannot easily use this channel to pipe anything else.
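To make the first point concrete, a minimal sketch of the `exec "$SHELL"` variant mentioned above (the here-document content is just the earlier example):

ssh user@server 'exec "$SHELL"' <<'EOF'
echo "$PATH"
date
EOF

The remote login shell immediately replaces itself with `$SHELL` running as a non-login shell, which then reads the here-document content from its standard input.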