How to make sshfs + VPN + git a tolerable working environment?

Currently the code base for the project I am working on lives on a remote company server, and it has to stay that way. Also, the remote git repository cannot be made public.

My current setup is:

  • Connect to the VPN
  • Run sshfs to mount a copy of the code
  • Start working on the code
  • When I am done: SSH into the remote server and run the git commands there (a rough sketch of this cycle follows below)
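
Something like this, with placeholder host names and paths rather than my real setup:

sudo openvpn --config ~/vpn/company.ovpn            # or whatever client the company VPN uses
sshfs user@devserver:/srv/project ~/mnt/project     # mount the remote code
# ... edit files under ~/mnt/project in the IDE ...
ssh user@devserver 'cd /srv/project && git status'  # git only ever runs on the server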

The problem with this is that the VPN drops from time to time, so my sshfs mount breaks and my IDE freezes. What I do is manually reconnect the VPN, run sshfs again, and get back to work.

But it gets annoying as the VPN drops more and more often.

So I wonder if there are any sshfs settings for some sort of cache that would allow me to keep working and only sync the changes once the VPN comes back.

That may make no sense, since if the remote drive is not available there is nothing to write to. So what about a different setup that uses some kind of file watcher plus rsync to move changes bidirectionally (either when I save a file, or when I do git pull)?
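
A one-way "push on save" loop is the kind of thing I have in mind; this sketch assumes inotify-tools, and the host and paths are placeholders:

while inotifywait -r -e modify,create,delete,move ~/work/project; do
    rsync -az --delete --exclude '.git' ~/work/project/ user@devserver:/srv/project/
done

For a truly bidirectional sync, a tool like unison or lsyncd would presumably be a better fit than a hand-rolled rsync loop.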

I can't just git clone, because I can't reproduce the entire environment to work 'locally' (DB and stuff).

The code has to be on their servers: in order to test/see my work I have to access a URL, which is my sandbox. I can't git push every time I want to see my changes.


Solution 1:

zecrazytux is right -- Why don't you use git the way you're supposed to: by cloning the repository, working on it locally, and pushing the changes back to the master?

I see no reason you "can't" git push your work each time you want to see your changes (ideally pushing to a development branch that then gets merged when it's tested and proven working) -- lots of people do this. You can even use a post-receive hook to deploy your changes into the environment if you want to automate that part of things.
(You obviously don't WANT to do this, but you haven't given any reason why so I reject the premise of your problem.)
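
For reference, a minimal post-receive hook along those lines could look like the following on the server side; the branch name and paths are only examples:

#!/bin/sh
# hooks/post-receive in the repository you push to
while read oldrev newrev ref; do
    if [ "$ref" = "refs/heads/dev" ]; then
        # check out whatever was pushed to "dev" into the sandbox directory
        git --work-tree=/var/www/sandbox --git-dir=/srv/git/project.git checkout -f dev
    fi
done

(The hook file has to be executable, and the work-tree directory has to exist.)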


Frankly there's nothing you can do to make an unreliable network connection "tolerable" (ESPECIALLY if you're trying to mount network filesystems) -- you can either work locally as outlined above, SSH into the system and work directly on it (screen is your friend here), or investigate and fix the underlying network instability.
Trying to do something else to "make it tolerable" is an exercise in futility (think "cocktail umbrella in a hurricane").
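
The screen part is simple enough to sketch; the session name and host here are arbitrary:

ssh user@devserver
screen -S work       # start a session named "work" on the server
# ... edit directly on the server; Ctrl-a d detaches cleanly ...
screen -d -r work    # after reconnecting, detach the dead attachment and reattach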

Solution 2:

I am using these sshfs options to minimize latency:

sshfs -o Ciphers=arcfour,compression=no,nonempty,auto_cache,reconnect,workaround=all [email protected]:/usr/local/gitdev/ ~/dev/code

It has the reconnect flag, enables all sshfs workarounds, and uses auto_cache and the arcfour cipher.

You can read about those options in the sshfs manual; I found these to be the fastest sshfs options, at least for my setup.

ETA: For more on sshfs performance, read here: sshfs performance