How does Ubuntu manage so many daily images?

As these pages show, https://cloud-images.ubuntu.com/locator/daily/ and http://cloud-images.ubuntu.com/releases/16.04/beta-2/, Ubuntu provides daily images for different releases (14.04 to 16.04), platforms (AWS, Azure, KVM, Vagrant...) and architectures (i386, amd64...).

This must require a great deal of automation. I'm curious about the architecture of this kind of build system. Are there any documents describing it? Thanks.


Solution 1:

For Ubuntu it is probably done in a similar way to Debian. Here is some info about the Debian autobuilder network.

Here are some details on how to build a specific Ubuntu installer image.

For Debian, to decrease server load, there are many mirrors that, for example, serve the CD images built by the main Debian servers. Most mirrors are maintained by volunteers. Here is some documentation: Debian Mirrors. The same exists for Ubuntu: Ubuntu Mirrors.

It should be sufficient to have one or maybe two build servers per architecture. Each build server can periodically build the CD images for its own architecture, for all platforms and all versions. (With cross-compilation, even fewer build servers would be needed.)
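As a rough sketch of that idea (all names here are hypothetical; a real system would use a dedicated image builder such as live-build rather than echo), one build server's nightly driver might just iterate over every release and platform for its own architecture:

```shell
#!/bin/sh
# Hypothetical nightly driver for one build server (names are illustrative).
# The server builds images only for its native architecture, but for every
# release and platform, handing each job to a real image-building tool.

ARCH="amd64"                        # this server's native architecture
RELEASES="trusty xenial"            # 14.04 and 16.04
PLATFORMS="azure ec2 kvm vagrant"

plan_builds() {
  for release in $RELEASES; do
    for platform in $PLATFORMS; do
      # A real system would invoke e.g. live-build here instead of echo.
      echo "build ${release}-${platform}-${ARCH}"
    done
  done
}

plan_builds
```

With two releases and four platforms this single server schedules eight image builds per night; adding a release or platform is just another list entry, which is what makes the daily-image matrix manageable.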

Before uploading an installer CD image, many integration tests need to be run in order to validate that the newly built packages work with each other (see the dep8 specification, implemented by autopkgtest). And of course there are package-specific tests during the build of each package.
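dep8 tests are declared per package in a debian/tests/control file; autopkgtest reads that file and runs each test against the installed package in a clean testbed. A minimal example might look like this (the package name "mypackage" and test name "smoke" are hypothetical):

```
# debian/tests/control for a hypothetical package "mypackage":
# autopkgtest runs the executable debian/tests/smoke with the
# listed dependencies installed.
Tests: smoke
Depends: mypackage
```

Such a test could then be run with something like `autopkgtest mypackage -- null`, where the part after `--` selects the virtualization backend.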

However, I don't know whether the CD builds themselves are triggered by a script plus a cron job, whether they use some kind of continuous-integration software (e.g. Jenkins), or whether they use a tool like Automated Linux From Scratch.
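If it really is just a script plus cron, the daily trigger could be as simple as a single crontab entry (the script path and log path here are hypothetical):

```
# Hypothetical crontab entry: kick off the nightly image builds at 03:00 UTC
# and append all output to a log file.
0 3 * * * /srv/build/bin/build-daily-images >> /var/log/build-daily-images.log 2>&1
```

A CI server like Jenkins would add scheduling, per-job logs, and failure notifications on top of essentially the same periodic trigger.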