How do you remotely administer your Linux boxes? [closed]

Solution 1:

My toolset for these operations is painfully sparse (SSH into the box, edit files in VIM, WGET remote files that I need), and I suspect there is a much better way to do it. I'm curious to hear what other people in my position are doing.

Sparse? What on earth do you mean? Excuse me for ranting, but dismissing ssh, vim and wget as painful is almost insulting. From your question I deduce you are mainly a programmer for your daytime job, so I kinda understand the question. But honestly, I would not hire a Linux admin who is not comfortable with any of the three tools you mentioned.

Are you using some form of Windowing system and remote-desktop equivalent to access the box, or is it all command line? Managing remote Windows boxes is trivial, since you can simply remote desktop in and transfer files over the network. Is there an equivalent to this in the Linux world?

For administrator tasks I never, ever use an X environment. You do not need one; it only takes up system resources and, most of the time, it is a hindrance rather than a help. Most GUI configuration tools (well, practically all, really) only offer a subset of the configuration options you can set in a configuration file with vim.

Managing a Linux box is no less trivial than managing a Windows box. It just takes some time to gain a decent skill set.

And a network file transfer equivalent? Plenty. scp, sftp, ftp, nfs, cifs / smb (Windows file sharing protocols), and then some.
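For example, pulling a config file down for editing and pushing it back with scp might look like this (the host name and path are only placeholders for illustration):

scp admin@remote-box:/etc/nginx/nginx.conf ./nginx.conf
# edit locally, then copy the file back to the server
scp ./nginx.conf admin@remote-box:/etc/nginx/nginx.conf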

Are you doing your config file changes/script tweaks directly on the machine? Or do you have something set up on your local box to edit these files remotely? Or are you simply editing them remotely then transferring them at each save?

Depends on what I am doing. For development and testing boxes I do most things directly in the config files on the machine; for production boxes I push the file into a configuration channel on our Satellite server and then deploy it to all servers from there. Really, vim is a treasure. That is, once you find out how to use it properly.

How are you moving files back and forth between the server and your local environment? FTP? Some sort of Mapped Drive via VPN?

scp all the way and maybe some sftp, and I suggest you do the same. Never, ever use FTP to move sensitive files (e.g. config files) over a public network. I do not use a mapped network drive because, again, all I need is on the server. If you mean source files rather than configuration files here, I usually use something like svn or git and push my changes to the box from there.

I'd really need to get some best practices in place for administering these boxes. Any suggestions to remove some of the pain would be most welcome!

You are already using them: ssh, scp, wget and vim. Those are not the pain. There might be some teething pains while you figure out how powerful they are. But, to bring the Windows analogy back: I feel seriously hampered when I have to use a Windows box. For you it's the other way around. It's just what you are used to. So give it some time and it'll come to you.

Solution 2:

You already mentioned ssh, vim and wget, which are essential and perfect. Some additional tools that can make life easier:

1. GNU Screen / byobu

"GNU Screen is a free terminal multiplexer that allows a user to access multiple separate terminal sessions inside a single terminal window or remote terminal session. It is useful for dealing with multiple programs from the command line, and for separating programs from the shell that started the program." (From the GNU_Screen page on wikipedia)

A main advantage is that you can have one or several virtual terminals that are in exactly the same state you left them in when you come back (i.e. when you log in again via ssh). This also helps when your connection is broken for some reason.

Screen works independently of the software you use to connect to the box (it lives on the server), so it combines well with PuTTY or most other terminal software.

This article shows some nice things you can do with it: http://www.pastacode.de/extending-gnu-screen-adding-a-taskbar/en/

A good alternative is byobu, which comes nicely preconfigured on some distributions: http://byobu.co/
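A typical workflow, assuming screen is installed on the server, looks something like this:

screen -S maintenance      # start a named session on the server
# work, then detach with Ctrl-a d (or simply lose the connection)
screen -ls                 # after logging back in, list running sessions
screen -r maintenance      # reattach exactly where you left off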


2. Midnight Commander

A console-based, graphical-style tool for viewing and manipulating files and directories.

It can also do secure remote transfers: there are built-in FISH and FTP clients.

This means you have two text windows side by side in a command-line console: one shows your remote box and the other whatever you connect it to (which can also be your local system). You can then navigate both file systems side by side, mark or inspect individual files or whole file trees, and copy or move them between the two locations. FISH is secure, FTP isn't. Very powerful, yet simple for beginners.
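As a rough illustration (the exact VFS path syntax differs between mc versions; older releases use /#sh: instead of sh://, and the host and paths here are made up), you might open a local directory in one panel and a remote box in the other like this:

mc ~/work sh://admin@remote-box/etc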


3. rsync

For fast, secure and reliable file transfer and synchronisation between different locations
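For instance, mirroring a remote directory to a local backup folder over ssh might look like this (the paths are only placeholders):

# -a preserves permissions and timestamps, -v is verbose, -z compresses in transit
rsync -avz admin@remote-box:/var/www/ ./www-backup/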


4. VCS

Use of a distributed version control system like bazaar, mercurial or git to update code. GitHub or Bitbucket offer commercial code hosting, but that is not necessary; you can just as well host the repositories on your own machines.

Joseph Kern: can you elaborate on how exactly you use git for remote config organisation?
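Until he answers, here is a minimal sketch of one common approach: keep /etc (or parts of it) under git on the box and push to a central bare repository (the repository host and path below are invented):

cd /etc
git init
git add ssh/sshd_config
git commit -m "baseline sshd config"
git remote add origin ssh://admin@config-host/srv/git/etc-configs.git
git push -u origin master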


5. Terminal Clients

On Unix-like systems they are already on board; on Windows you can use PuTTY, Tera Term, MindTerm or Pandora. Or install Cygwin and ssh from the Cygwin terminal windows to the remote boxes (which has further advantages, but this is a question of what you prefer).


6. Tunneling and Port Forwarding

It can be helpful to forward certain ports securely to your local machine. For example, you could forward the MySQL port (TCP 3306) or the PostgreSQL port (TCP 5432) and install a database administration tool locally.
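For example, assuming the database listens on the remote box itself (the host name is just a placeholder), a local forward could look like this:

# make the remote MySQL server reachable on local port 3306
ssh -L 3306:localhost:3306 admin@remote-box
# or forward PostgreSQL to a different local port to avoid clashes
ssh -L 15432:localhost:5432 admin@remote-box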

You can build tunnels from Windows machines with PuTTY (or on the command line with its little brother plink); Cygwin and MindTerm can also do port forwarding. If you are locally on a Unix-like machine, you can use ssh or plink to create such tunnels.

To create more stable and permanent tunnels for various ports I recommend OpenVPN. The point-to-point "pre-shared key" mode is not hard to set up.


7. Have a local unix-like system

When your local machine is a Mac you have this already: you can simply open a local shell. When your workstation is Windows-based, it can be helpful to set up a local Unix-like server on the same local network. This can be a separate machine in another room connected to the same router or switch, or, if you want only one machine, you can install the free VMware Server and create a virtual machine, preferably running the same operating system as your remote box. Install a Samba server on it and you can "net use" the Samba shares from your desktop.

If you run an ssh server on that local machine and open port 22 on your router for it, you can ssh into your local system when you are outside.

You can build tunnels to remote machines or transfer and synchronise files and whole file trees with rsync. You can use it for testing, for VCS, for local development, as a local web server, or for training purposes.

You can pull backups from remote machines, and you can create local cron jobs that do the backups automatically (e.g. databases you want to save locally on a regular basis).
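As a sketch (the host, database name and paths are made up, and key-based ssh login is assumed), a nightly crontab entry on the local box could pull a compressed database dump like this:

# in "crontab -e" on the local server: dump a remote MySQL database every night at 02:30
30 2 * * * ssh admin@remote-box "mysqldump --single-transaction mydb | gzip" > /srv/backups/mydb-$(date +\%F).sql.gz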


8. X Remote GUI

If you are physically working on a Linux-like system, it is also possible to run GUI applications on your Linux servers that draw their interface on your local machine. This could be a graphical file-compare tool or almost anything else you want.

Although it is not very common, and in most cases not necessary, to use GUI software for Linux box administration, you might in some cases find it useful.

On the remote machine, make sure this line exists in /etc/ssh/sshd_config:

X11Forwarding yes

Restart the ssh server with

/etc/init.d/sshd restart

Then next time you login with

ssh -X me@remote-box

you will have an X tunnel. For testing purposes, install xclock on the remote server and execute it in the ssh session just mentioned; a simple X clock should appear on your local Linux GUI.

This is also possible on a Mac if you install a local X environment.


9. If you have a bunch of similar boxes or tasks: use a system configuration tool

If you have a server farm or do big cloud deployments with many redundant or otherwise equal or similar machines, you could use this.

It would probably not make sense if most boxes are individual or run different operating systems or different versions.

There are several tools:

  • chef http://www.getchef.com/chef/ See Evan Anderson's brilliant answer below: https://serverfault.com/a/28789/45819

  • puppet, the other big player: http://puppetlabs.com/

  • salt looks promising: http://www.saltstack.com/ (see the small example after this list)
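As a taste of what these tools feel like once they are set up (the minion name pattern and command are arbitrary examples), Salt lets you address every registered machine at once from the master:

salt '*' test.ping            # check which minions respond
salt 'web*' cmd.run 'uptime'  # run a shell command on all web minions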


10. Deploy application containers with docker

This goes even one step further. Docker is an open source project that automates the deployment of applications inside software containers: https://www.docker.io
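As a quick illustration of the idea (the image name is just an example), running a containerised web server is a single command once Docker is installed:

# download the nginx image and run it in the background, mapping port 80
docker run -d -p 80:80 --name webserver nginx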


11. Use Google Compute Engine with automatic deployment management

https://cloud.google.com/products/compute-engine/

Google offers Linux VMs with very exciting possibilities. You can quickly deploy large clusters of virtual machines with tools including a RESTful API, command-line interface and web-based Console. You can also use tools such as RightScale and Scalr to automatically manage your deployment.

Solution 3:

If you're looking for a nice GUI to work with file management via SSH from Windows boxes, have a look at WinSCP: http://winscp.net

I don't administer any EC2 instances, but in general, if I have more than a single machine performing a role, I'll try to write a script to perform the work on all the like boxes instead of making changes box by box.
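As a trivial sketch of that approach (the host names are placeholders), a plain loop over ssh already goes a long way:

# run the same command on every web front end
for host in web01 web02 web03; do
    ssh admin@"$host" "sudo apt-get update && sudo apt-get -y upgrade"
done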

I'd like to get started using Puppet (http://reductivelabs.com/products/puppet/), because it makes system administration more of a configuration management exercise. I haven't had the spare cycles to have a look at it in detail yet, but I've heard very good things.