Keeping config files synced across multiple PCs

Keep the files under version control. This has several benefits: it makes it easy to keep the files synchronized (commit on one machine, update on the others), and it keeps a history of changes (so you can easily find out what broke a program that worked last month).

I use CVS and synchronize the repositories with Unison or sneakernet, but that's only because I've been doing this since before distributed version control was widely available. Anyone starting now should use a proper distributed version control tool, such as Bazaar, Darcs, Git or Mercurial.
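For instance, here is a minimal sketch with Git (the repository layout, remote URL and file names are just an illustration; the same commit-on-one, update-on-the-others pattern works with any of these tools):

    # first machine: put the config files in a repository and publish it
    # (assumes an empty bare repository already exists at the hypothetical remote)
    mkdir ~/dotfiles && cd ~/dotfiles
    cp ~/.bashrc ~/.vimrc ~/.gitconfig .
    git init && git add . && git commit -m "Import config files"
    git remote add origin ssh://myserver/srv/git/dotfiles.git
    git push -u origin master

    # other machines: clone the repository and symlink the files into place
    git clone ssh://myserver/srv/git/dotfiles.git ~/dotfiles
    ln -sf ~/dotfiles/.bashrc ~/.bashrc

    # day to day: commit and push on one machine, pull on the others
    git commit -am "Tweak prompt" && git push
    git pull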

Managing files that need to differ between machines is always a bit of a pain. If the configuration language allows conditionals, use them. Otherwise, if there is an include mechanism, use it to split the configuration file into a machine-dependent part and a shared part. Keep all the machine-dependent parts in a separate directory (something like ~/.local/NAME/) that is always referred to through a symbolic link (~/.here -> .local/NAME on each machine). I have a few files that are generated by a script in the shared part from parameters kept in the machine-specific part; the downside is that these files can no longer be modified indirectly through a GUI configuration interface. Avoid configuring things in /etc; it's harder to synchronize between machines.
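As a concrete sketch of the symlink-plus-include idea (using the hostname as NAME and a split .bashrc purely as an example):

    # one-time setup on each machine: point ~/.here at that machine's private part
    mkdir -p ~/.local/"$(hostname)"
    ln -s ".local/$(hostname)" ~/.here

    # in the shared ~/.bashrc: pull in the machine-dependent part, if any
    if [ -r ~/.here/bashrc ]; then
        . ~/.here/bashrc
    fi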


I agree with the version control answer, but another method I've been experimenting with recently is Dropbox. It's essentially a version control system that automatically syncs between all your machines, so if you edit a file on one computer you'll see the changes reflected on your other computers within a couple of seconds, without needing to commit on one and update on the others.

Their free basic plan offers 2 GB, so I use it to version my configuration files and chat logs.
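One way to use it for dotfiles (a sketch; the ~/Dropbox/config path and .vimrc are just examples) is to move the real file into the synced folder and leave a symlink behind:

    # first machine: move the file into Dropbox and symlink it back
    mkdir -p ~/Dropbox/config
    mv ~/.vimrc ~/Dropbox/config/vimrc
    ln -s ~/Dropbox/config/vimrc ~/.vimrc

    # other machines: the file arrives via Dropbox, so just create the symlink
    ln -sf ~/Dropbox/config/vimrc ~/.vimrc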


Puppet and Cfengine are two good tools for syncing files (and a lot more).