Synchronize a directory across Linux PCs

I need a distributed filesystem (or a synchronization tool) capable of keeping a directory synchronized across 4 PCs.

My requirements are:

  • offline access (data must be available offline on each PC)
  • preserve execute permissions: some files are marked executable on a Linux partition, and this flag should be replicated.
  • efficient sync strategy: some of my files are 20 GB and change quite often, but only in small parts (VirtualBox images). Delta transfers are welcome.
  • efficient use of space: no file history, and files shouldn't be copied to temp directories "just in case you break it".
  • it must propagate file deletions
  • modifications can happen on any of the 4 PCs and should be propagated when the other PCs are connected.
  • the solution must be fault tolerant: most of the time the 4 PCs are disconnected or unable to synchronize.

Other specs of my solution are:

  • Sync is over a LAN; the total amount of data to be synced is around 180 GB, in some ten thousand files. Changes are small, but can happen in large files.
  • At the moment I'm interested in a Linux-only solution.
  • Simple merge strategy: conflicts either don't happen or are resolved with "last one wins".

I haven't found any good solution yet. Here's what I've tried:

  • unison: the only one working at the moment, but I had to do a lot of custom configuration (cron scripts that chain executions). It hangs my PC for many minutes while detecting changes, disk light steadily on.
  • SparkleShare doesn't handle large files nicely. It keeps a history of all your changes that grows indefinitely. They promise it will be fixed in upcoming releases, but at the moment it still doesn't fit my needs.
  • ownCloud doesn't handle large files nicely, has poor copy performance, and keeps a history of each file I change (this can be disabled).
  • Coda? (Help! I couldn't set it up correctly!)
  • git-annex assistant turns all your files into symlinks and marks the originals read-only ("just in case you make a mistake while modifying them"!). Before editing a file you have to issue a special command, "git annex unlock", which creates a local copy of the file, and you have to remember to lock it again if you want it synchronized. I had to study the manual before I could get my 180 GB of files back! Never again on my PC!

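For reference, the cron-driven unison setup mentioned above can mostly be captured in a profile file rather than chained scripts. A minimal sketch, assuming one peer reachable over ssh as pc2 and a hypothetical shared path:

```
# ~/.unison/shared.prf -- hypothetical profile; one per peer pair.
# Host name and paths are placeholders.
root = /data/shared
root = ssh://pc2//data/shared
# Run without prompts so cron can drive it; "last one wins" by newest mtime.
batch = true
prefer = newer
# Replicate modification times, and scan by mtime+size
# instead of re-reading file contents on every run.
times = true
fastcheck = true
```

With batch = true, a plain cron entry running `unison shared` per peer can replace the chained scripts, and fastcheck = true should shorten the long change-detection scans on mostly-unchanged 20 GB files.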
What to try next?


Looks like I'm necro'ing an old thread, but hopefully this will help someone in the future.

BTSync is probably a perfect option for you. It uses a private swarm and tracker to create sync shares. It is very fault tolerant and transfers changed blocks of data instead of whole files.

BitTorrent Sync
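To run it headless on each box, btsync can read a JSON config file. A rough sketch; the device name, paths, and secret are placeholders, and the exact keys vary between releases, so check the sample config your btsync version prints with `--dump-sample-config`:

```json
{
  "device_name": "pc1",
  "storage_path": "/home/user/.sync",
  "shared_folders": [
    {
      "secret": "PASTE_THE_READ_WRITE_SECRET_HERE",
      "dir": "/data/shared",
      "search_lan": true,
      "use_relay_server": false,
      "use_tracker": false,
      "use_sync_trash": false
    }
  ]
}
```

Here search_lan lets peers discover each other on the LAN without an external tracker, and use_sync_trash = false should stop deleted files from being parked in an archive folder, matching the "no just-in-case copies" requirement.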