Are configuration management tools (Puppet, Chef) capable of keeping installed packages up to date?

Solution 1:

You can do this with Puppet. You either use:

ensure => latest,

or

ensure => "1.0.2",

to specify the latest or a pinned version, e.g.

package { 'apache2': ensure => '2.0.12-2' }
package { 'apache2': ensure => latest }

This at least means you can specify the same version across all systems, while preventing servers from (potentially dangerously) upgrading themselves automatically. I've used this method in production on a number of sites, and it works very well.
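As a minimal sketch of keeping that pinned version in one place (the class name and version string below are illustrative), you can assign the version to a variable and reference it from the package resource:

class webserver {
  # One pinned version, defined in a single place, so every node
  # installs exactly the same build.
  $apache_version = '2.0.12-2'

  package { 'apache2':
    ensure => $apache_version,
  }
}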

Running unattended upgrades scares me a bit, especially when they touch mission-critical packages: kernels, MySQL libraries, Apache and so on, and especially if the install script might want to restart the service!

Solution 2:

I think this is probably the wrong question. Certainly using configuration management tools like Puppet and Chef to maintain your infrastructure is a huge leap forward from trying to do it all manually. The issue of keeping your package versions up to date and in sync is not one that any of these tools solves directly. To automate this properly you need to bring the package repositories themselves under your control.

The way I do this is to maintain a dedicated Yum repository (for Red Hat/Fedora/CentOS) or APT repository (for Debian/Ubuntu) containing the packages I care about for a particular site. These will generally be the dependencies of the application itself (Ruby, PHP, Apache, Nginx, libraries and so on) plus any security-critical packages.

Once you have this set up (usually you can start by mirroring the required packages from the upstream repo), you can use Puppet's "ensure => latest" syntax to make sure that all your machines stay up to date with the repo.
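As a sketch, wiring a node to such a repo and tracking it might look like the following, assuming a hypothetical internal mirror at yum.example.com (in practice you would also want GPG checking enabled):

# Define the custom repo; the URL is an example only.
yumrepo { 'internal':
  descr    => 'Internal package mirror',
  baseurl  => 'http://yum.example.com/el6/$basearch',
  enabled  => 1,
  gpgcheck => 0,
}

# Track whatever version the internal repo currently publishes.
package { 'nginx':
  ensure  => latest,
  require => Yumrepo['internal'],
}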

It would be wise to use a 'staging' repo so that you can test updated packages before rolling them blithely out to production. Puppet makes this easy to do without any duplication of code, using repository templates.
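One way to sketch that, assuming a hypothetical $repo_env variable set per node, is a single repo definition whose URL interpolates the environment, so staging machines pull from the staging repo and production machines from production:

$repo_env = 'staging'  # set to 'production' on production nodes

yumrepo { 'internal':
  descr   => "Internal mirror (${repo_env})",
  baseurl => "http://yum.example.com/${repo_env}/el6/\$basearch",
  enabled => 1,
}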

Automating your package versioning strongly encourages you to bring all of your production systems into sync, since maintaining multiple repos and packages for different OS distributions, versions and machine architectures is very time-consuming and likely to lead to all sorts of obscure problems and incompatibilities.

All of this advice applies equally to Ruby gems, Python eggs and other package systems which you may use.
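Puppet's package type can manage some of these directly through alternative providers; for instance, the built-in gem provider (the gem name here is just an example):

package { 'rake':
  ensure   => latest,
  provider => gem,
}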

I've written a short Puppet tutorial which should help you get up and running quickly. You could deploy a custom repo definition to your machines using Puppet as the first step in bringing package versions under control.


Solution 3:

Puppet ties in with your apt-get/yum software repositories (and I'm pretty sure Chef does too). Since apt-get/yum do the heavy lifting of figuring out which packages are available, ensure => latest just works on Ubuntu, CentOS, Debian and the like, as long as the appropriate files (/etc/apt/sources.list, etc.) are set up correctly.
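Those files can themselves be managed by Puppet. A minimal sketch for APT, with an example mirror URL and distribution name (both assumptions):

# Drop in a sources entry and refresh the package index when it changes.
file { '/etc/apt/sources.list.d/internal.list':
  ensure  => file,
  content => "deb http://apt.example.com/debian squeeze main\n",
  notify  => Exec['apt-update'],
}

exec { 'apt-update':
  command     => '/usr/bin/apt-get update',
  refreshonly => true,
}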


Solution 4:

Whilst Puppet and Chef are possible contenders for this functionality, making them keep everything on the system up to date requires either custom types or listing every package (including underlying system libraries like libc6) as a resource with ensure => latest. For the specific case of automated package updates, you might also want to look into the cron-apt package, which does exactly this.
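To make that concrete, the per-package approach looks like this in Puppet (package names illustrative), and cron-apt itself can be installed the same way:

# Every package you want tracked has to be listed as a resource.
package { ['libc6', 'openssl', 'apache2']:
  ensure => latest,
}

package { 'cron-apt':
  ensure => installed,
}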


Solution 5:

This question is old, but I thought I'd answer it in an up-to-date way, since the tool I recommend below didn't exist back when the question was asked.

If you are using Puppet or Chef, look into MCollective. It is a very nice tool from the Puppet Labs folks that allows you to send commands to groups of servers. http://docs.puppetlabs.com/mcollective/

It also has an apt plugin, which can be used to run an apt update across any number of servers: http://projects.puppetlabs.com/projects/mcollective-plugins/wiki/AgentApt