.. SPDX-License-Identifier: CC-BY-SA-2.0-UK

*******************************************
Understanding the Yocto Project Autobuilder
*******************************************

Execution Flow within the Autobuilder
=====================================

The "a-full" and "a-quick" targets are the usual entry points into the
Autobuilder and it makes sense to follow the process through the system
starting there. This is best visualized from the Autobuilder Console
view (:yocto_ab:`/typhoon/#/console`).

Each item along the top of that view represents a "target build" and
these targets are all run in parallel. The "full" build triggers the
majority of them, while the "quick" build triggers a subset of them.
The Autobuilder effectively runs whichever configuration is defined for
each of those targets on a separate buildbot worker. To understand the
configuration, you need to look at the entry in the ``config.json`` file
within the ``yocto-autobuilder-helper`` repository. The targets are
defined in the "overrides" section. A quick example is "qemux86-64",
which looks like::

   "qemux86-64" : {
       "MACHINE" : "qemux86-64",
       "TEMPLATE" : "arch-qemu",
       "step1" : {
           "extravars" : [
               "IMAGE_FSTYPES:append = ' wic wic.bmap'"
           ]
       }
   },

To expand that, you also need the "arch-qemu" entry from the
"templates" section, which looks like::

   "arch-qemu" : {
       "BUILDINFO" : true,
       "BUILDHISTORY" : true,
       "step1" : {
           "BBTARGETS" : "core-image-sato core-image-sato-dev core-image-sato-sdk core-image-minimal core-image-minimal-dev core-image-sato:do_populate_sdk",
           "SANITYTARGETS" : "core-image-minimal:do_testimage core-image-sato:do_testimage core-image-sato-sdk:do_testimage core-image-sato:do_testsdk"
       },
       "step2" : {
           "SDKMACHINE" : "x86_64",
           "BBTARGETS" : "core-image-sato:do_populate_sdk core-image-minimal:do_populate_sdk_ext core-image-sato:do_populate_sdk_ext",
           "SANITYTARGETS" : "core-image-sato:do_testsdk core-image-minimal:do_testsdkext core-image-sato:do_testsdkext"
       },
       "step3" : {
           "BUILDHISTORY" : false,
           "EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; DISPLAY=:1 oe-selftest ${HELPERSTMACHTARGS} -j 15"],
           "ADDLAYER" : ["${BUILDDIR}/../meta-selftest"]
       }
   },

Combining these two entries, you can see that "qemux86-64" is a three
step build where ``bitbake BBTARGETS`` is run, then ``bitbake
SANITYTARGETS``, for each step; all for ``MACHINE="qemux86-64"`` but
with differing ``SDKMACHINE`` settings. In step 1, an extra variable is
added to the ``auto.conf`` file to enable ``wic`` image generation.

While not every detail of this is covered here, you can see how the
template mechanism allows quite complex configurations to be built up
while keeping duplication and repetition to a minimum.
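As a rough illustration of how a target entry and its template combine
(a minimal sketch in Python, not the Helper's actual code), the template
provides defaults and the "overrides" entry refines them, with step
dictionaries merged key by key::

   import json

   def resolve_target(config, target_name):
       """Sketch only: merge an "overrides" entry with its template so
       that per-step values from the override win over the template."""
       entry = dict(config["overrides"][target_name])
       template = config["templates"].get(entry.pop("TEMPLATE", ""), {})

       merged = dict(template)
       for key, value in entry.items():
           if key.startswith("step") and isinstance(value, dict):
               # Merge step dictionaries key by key rather than replacing them
               step = dict(template.get(key, {}))
               step.update(value)
               merged[key] = step
           else:
               merged[key] = value
       return merged

   if __name__ == "__main__":
       with open("config.json") as f:
           config = json.load(f)
       print(json.dumps(resolve_target(config, "qemux86-64"), indent=4))

Run against the example entries above, this would yield a single
dictionary containing ``MACHINE``, ``BUILDINFO``, ``BUILDHISTORY`` and
the three merged "step" sections, which is roughly the shape of
configuration that drives the build for that target.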
The different build targets are designed to allow for parallelization:
different machines are usually built in parallel, while operations
using the same machine and metadata run sequentially, the aim being to
optimize build efficiency as much as possible.

The ``config.json`` file is processed by the scripts in the ``scripts``
directory of the Helper repository. The following section details how
this works.

Autobuilder Target Execution Overview
=====================================

For each given target in a build, the Autobuilder executes several
steps. These are configured in ``yocto-autobuilder2/builders.py`` and
roughly consist of:

#. *Run clobberdir*

   This cleans out any previous build. Old builds are left around to
   allow easier debugging of failed builds. For additional information,
   see :ref:`test-manual/understand-autobuilder:clobberdir`.

#. *Obtain yocto-autobuilder-helper*

   This step clones the ``yocto-autobuilder-helper`` git repository.
   This is necessary to avoid having to maintain all the release or
   project-specific code within Buildbot. The branch chosen matches the
   release being built so we can support older releases and still make
   changes in newer ones.

#. *Write layerinfo.json*

   This step transfers the data entered in the Buildbot UI when the
   build was configured over to the Helper.

#. *Call scripts/shared-repo-unpack*

   This is a call into the Helper scripts to set up a checkout of all
   the pieces this build might need. It might clone the BitBake
   repository and the OpenEmbedded-Core repository. It may clone the
   Poky repository, as well as additional layers. It will use the data
   from the ``layerinfo.json`` file to help understand the
   configuration. It will also use a local cache of repositories to
   speed up the clone checkouts. For additional information, see
   :ref:`test-manual/understand-autobuilder:Autobuilder Clone Cache`.

   This step has two possible modes of operation. If the build is part
   of a parent build, it's possible that all the repositories needed may
   already be available, ready in a pre-prepared directory. An "a-quick"
   or "a-full" build would prepare this before starting the other
   sub-target builds. This is done for two reasons:

   -  the upstream may change during a build, for example due to a
      forced push, and this ensures we have matching content for the
      whole build

   -  if 15 Workers all tried to pull the same data from the same
      repositories, we could hit resource limits on upstream servers,
      as they can think they are under some kind of network attack

   This pre-prepared directory is shared among the Workers over NFS. If
   the build is an individual build and there is no "shared" directory
   available, it clones from the cache and the upstreams as necessary.
   This is considered the fallback mode.

#. *Call scripts/run-config*

   This is another call into the Helper scripts where it's expected that
   the main functionality of this target will be executed.

Autobuilder Technology
======================

The Autobuilder has Yocto Project-specific functionality to allow builds
to operate with increased efficiency and speed.

clobberdir
----------

When deleting files, the Autobuilder uses ``clobberdir``, which is a
special script that moves files to a special location rather than
deleting them. Files in this location are deleted by an ``rm`` command,
which is run under ``ionice -c 3``. This means the deletion only happens
when there is idle I/O capacity on the Worker. The Autobuilder Worker
Janitor runs this deletion. See :ref:`test-manual/understand-autobuilder:Autobuilder Worker Janitor`.
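As a minimal sketch of this pattern (illustrative only, not the actual
``clobberdir`` script; the trash location is hypothetical), the build
directory is renamed aside immediately, which is cheap, and a later
low-priority pass performs the real deletion::

   import os
   import shutil
   import subprocess
   import uuid

   TRASH = "/home/pokybuild/trash"   # hypothetical holding area

   def clobber(path):
       """Move a build directory aside instead of deleting it in place.
       On the same filesystem this is just a rename, so it returns quickly."""
       if os.path.exists(path):
           os.makedirs(TRASH, exist_ok=True)
           shutil.move(path, os.path.join(TRASH, uuid.uuid4().hex))

   def purge_trash():
       """Delete trashed directories using the idle I/O scheduling class,
       in the way a janitor-style background task might."""
       for entry in os.listdir(TRASH):
           subprocess.run(["ionice", "-c", "3", "rm", "-rf",
                           os.path.join(TRASH, entry)], check=True)

Separating the cheap rename from the expensive recursive delete keeps
the build's critical path short, while the heavy I/O is deferred to
times when the Worker is otherwise idle.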
Autobuilder Clone Cache
-----------------------

Cloning repositories from scratch each time they are required was slow
on the Autobuilder. We therefore keep a stash of commonly used
repositories pre-cloned on the Workers. Data is fetched from these
during clones first, then "topped up" with later revisions from any
upstream when necessary. The cache is maintained by the Autobuilder
Worker Janitor. See :ref:`test-manual/understand-autobuilder:Autobuilder Worker Janitor`.

Autobuilder Worker Janitor
--------------------------

This is a process running on each Worker that performs two basic
operations: background file deletion at I/O idle (see "Run clobberdir"
in :ref:`test-manual/understand-autobuilder:Autobuilder Target Execution Overview`)
and maintenance of a cache of cloned repositories to improve the speed
at which the system can check out repositories.

Shared DL_DIR
-------------

The Workers are all connected over NFS, which allows ``DL_DIR`` to be
shared between them. This reduces the number of network accesses made
by the system and speeds up builds. Usage of the directory within the
build system is designed to be safe for sharing over NFS.

Shared SSTATE_DIR
-----------------

The Workers are all connected over NFS, which allows the ``sstate``
directory to be shared between them. This means that once a Worker has
built an artifact, all the others can benefit from it. Usage of the
directory within the build system is designed for sharing over NFS.

Resulttool
----------

All of the different tests run as part of the build generate output into
``testresults.json`` files. This allows us to determine which tests ran
in a given build and their status. Additional information, such as
failure logs or the time taken to run the tests, may also be included.

Resulttool is part of OpenEmbedded-Core and is used to manipulate these
JSON result files. It has the ability to merge files together, display
reports of the test results and compare different result files.

For details, see :yocto_wiki:`/Resulttool`.

run-config Target Execution
===========================

The ``scripts/run-config`` execution is where most of the work within
the Autobuilder happens. It runs through a number of steps; the first
are general setup steps that are run once and include:

#. Set up any ``buildtools-tarball`` if configured.

#. Call "buildhistory-init" if buildhistory is configured.

For each step that is configured in ``config.json``, it will perform the
following (a sketch of this loop follows the list):

#. Add any layers that are specified using the
   ``bitbake-layers add-layer`` command (logging as stepXa)

#. Call the ``scripts/setup-config`` script to generate the necessary
   ``auto.conf`` configuration file for the build

#. Run the ``bitbake BBTARGETS`` command (logging as stepXb)

#. Run the ``bitbake SANITYTARGETS`` command (logging as stepXc)

#. Run the ``EXTRACMDS`` commands, which are run within the BitBake
   build environment (logging as stepXd)

#. Run the ``EXTRAPLAINCMDS`` command(s), which are run outside the
   BitBake build environment (logging as stepXd)

#. Remove any layers added in step 1 using the
   ``bitbake-layers remove-layer`` command (logging as stepXa)
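The real logic lives in ``scripts/run-config`` in the Helper
repository. Purely as an illustration of the per-step flow above (a
minimal sketch with hypothetical log file names and a simplified
``setup-config`` invocation, not the real script), the loop is
conceptually::

   import subprocess

   def run(cmd, logname):
       """Run a shell command, capturing its output into a per-step log."""
       with open(logname, "w") as log:
           subprocess.run(cmd, shell=True, stdout=log,
                          stderr=subprocess.STDOUT, check=True)

   def run_steps(steps):
       """Conceptual sketch of the per-step loop, not the actual run-config."""
       for num, step in enumerate(steps, start=1):
           for layer in step.get("ADDLAYER", []):
               run("bitbake-layers add-layer %s" % layer, "step%da.log" % num)
           # Write the auto.conf for this step (arguments are illustrative)
           run("scripts/setup-config step%d" % num, "step%d-conf.log" % num)
           if step.get("BBTARGETS"):
               run("bitbake %s" % step["BBTARGETS"], "step%db.log" % num)
           if step.get("SANITYTARGETS"):
               run("bitbake %s" % step["SANITYTARGETS"], "step%dc.log" % num)
           for cmd in step.get("EXTRACMDS", []):
               run(cmd, "step%dd.log" % num)
           # EXTRAPLAINCMDS would also run here, outside the BitBake build
           # environment (the environment handling is omitted from this sketch)
           for layer in step.get("ADDLAYER", []):
               run("bitbake-layers remove-layer %s" % layer, "step%da.log" % num)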
Once the execution steps above complete, ``run-config`` executes a set
of post-build steps, including:

#. Call ``scripts/publish-artifacts`` to collect any output which is to
   be saved from the build.

#. Call ``scripts/collect-results`` to collect any test results to be
   saved from the build.

#. Call ``scripts/upload-error-reports`` to send any error reports
   generated to the remote server.

#. Clean up the build directory using
   :ref:`test-manual/understand-autobuilder:clobberdir` if the build was
   successful, else rename it to "build-renamed" for potential future
   debugging.

Deploying Yocto Autobuilder
===========================

The most up-to-date information about how to set up and deploy your own
Autobuilder can be found in the ``README.md`` file in the
``yocto-autobuilder2`` repository.

We hope that people can use the ``yocto-autobuilder2`` code directly,
but it is inevitable that users will end up needing to heavily customise
the ``yocto-autobuilder-helper`` repository, particularly the
``config.json`` file, as they will want to define their own test matrix.

The Autobuilder supports two customization options:

-  variable substitution

-  overlaying configuration files

The standard ``config.json`` minimally attempts to allow substitution of
the paths. The Helper script repository includes a
``local-example.json`` file to show how you could override these from a
separate configuration file. Pass the following into the environment of
the Autobuilder::

   $ ABHELPER_JSON="config.json local-example.json"

As another example, you could also pass the following into the
environment::

   $ ABHELPER_JSON="config.json /some/location/local.json"

One issue users often run into is validation of the ``config.json``
files. A tip for minimizing issues from invalid JSON files is to use a
Git ``pre-commit-hook.sh`` script to verify the JSON file before
committing it. Create a symbolic link as follows::

   $ ln -s ../../scripts/pre-commit-hook.sh .git/hooks/pre-commit
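The hook's job is essentially to refuse a commit when a file is not
well-formed JSON. A minimal Python equivalent of that check (an
illustrative sketch, not the project's actual hook script) would be::

   import json
   import sys

   def check_json(path):
       """Return True if the file parses as JSON; report the error if not."""
       try:
           with open(path) as f:
               json.load(f)
           return True
       except (OSError, ValueError) as exc:
           print("%s: %s" % (path, exc), file=sys.stderr)
           return False

   if __name__ == "__main__":
       # For example: python3 check-json.py config.json local-example.json
       sys.exit(0 if all(check_json(p) for p in sys.argv[1:]) else 1)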