How can I create a persistent SSH connection to "stream" commands over a period of time?
Not sure if it can be used in production, but you can do something like this:
Create a file on #1:
1> touch /tmp/commands
Then run this command:
1> tail -f /tmp/commands | ssh user@x.x.x.x
That will open the file /tmp/commands and stream its contents to server x.x.x.x (#2), where each line is executed as it arrives.
Now, every time something happens on #1, do:
1> echo "ls -l" >> /tmp/commands
or
1> echo "reboot" >> /tmp/commands
Whatever you append to /tmp/commands will be sent to #2 and executed. Just make sure you do not run anything interactive, or handle such commands some other way.
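If commands on #1 are generated from a script, a small wrapper function keeps the appends in one place. This is a minimal sketch; the name run_remote is made up, and it assumes the tail | ssh pipe above is already running:
run_remote() {
    # append one command line to the queue; the tail|ssh pipe runs it on #2
    printf '%s\n' "$*" >> /tmp/commands
}
run_remote uptime
run_remote "ls -l /var/log"
printf is used rather than echo so that arguments starting with a dash are appended unmangled.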
Automatic Persistence Using OpenSSH
You can also use the ControlMaster feature of OpenSSH, which opens a Unix domain socket for the first connection and reuses that connection in all subsequent calls.
To enable the feature, either pass -M as a command-line switch or set the ControlMaster option in your ~/.ssh/ssh_config, e.g.:
ControlMaster auto
Additionally, you should set the ControlPath using the following lines in your ~/.ssh/ssh_config:
Host *
    ControlPath ~/.ssh/master-%r@%h:%p
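The %r, %h, and %p tokens expand to the remote user name, host, and port, so every destination gets its own socket file. Once a master is up, you can ask OpenSSH about it (remotehost is a placeholder):
ssh -O check remotehost
This prints something like "Master running (pid=...)" while the socket is live.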
To maintain a persistent connection to a host, e.g. if you want to run a script that needs to establish many ssh connections to the host, none of which persists over the whole lifetime of the script, you can start a silent connection in advance using:
ssh -MNf remotehost
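Put together, a script might look like this sketch; remotehost is a placeholder, and it assumes the ControlPath setting from above:
#!/bin/sh
ssh -MNf remotehost               # start the silent master in the background
ssh remotehost uptime             # reuses the master socket, no new handshake
scp remotehost:/etc/hostname .    # scp shares the same master connection
ssh -O exit remotehost            # shut the master down when finished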
Cheerio, nesono
In /etc/ssh/ssh_config, add:
# Send keep alive signal to remote sshd
ServerAliveInterval 60
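The same option also works per host in ~/.ssh/config, or per invocation on the command line (user@remotehost is a placeholder):
ssh -o ServerAliveInterval=60 user@remotehost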
If you run into this sort of thing a lot, try GNU Parallel. It is like dsh (distributed shell), but it has some neat features, such as counting semaphores, and it is actively maintained.
From the documentation:
EXAMPLE: GNU Parallel as queue system/batch manager
GNU Parallel can work as a simple job queue system or batch manager. The idea is to put the jobs into a file and have GNU Parallel read from that continuously. As GNU Parallel will stop at end of file we use tail to continue reading:
echo >jobqueue; tail -f jobqueue | parallel
To submit your jobs to the queue:
echo my_command my_arg >> jobqueue
You can of course use -S to distribute the jobs to remote computers:
echo >jobqueue; tail -f jobqueue | parallel -S ..
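For instance, with two hypothetical workers server1 and server2 (the trailing : also includes the local machine in the pool):
echo >jobqueue; tail -f jobqueue | parallel -S server1,server2,:
echo "gzip -9 /tmp/bigfile" >> jobqueue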
There are many great examples in the documentation, and those above just scratch the surface. Here is a cool one:
EXAMPLE: Distributing work to local and remote computers
Convert *.mp3 to *.ogg running one process per CPU core on local computer and server2:
parallel --trc {.}.ogg -j+0 -S server2,: \
'mpg321 -w - {} | oggenc -q0 - -o {.}.ogg' ::: *.mp3
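For reference: --trc {.}.ogg transfers each input file to the remote machine, returns the resulting {.}.ogg, and cleans up afterwards ({.} expands to the input file name without its extension), while -j+0 runs one job per CPU core.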