Bug #17658
Allow building IUKs in parallel (on Jenkins)
0%
Description
This is similar and related to Feature #17657, except that the way things happen in Jenkins seems to warrant a slightly different approach.
At first, I wanted to just write an “RM-side” wrapper that would trigger as many Jenkins jobs as there are items in the IUK_SOURCE_VERSIONS list, but I seem to have failed at submitting jobs using either Perl’s LWP or Python’s Requests module. That approach would have meant polling until all jobs had finished, checking their status, and gathering all the results anyway.
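For reference, the abandoned RM-side wrapper would have had roughly the following shape. This is only a sketch under stated assumptions: the base URL and job name are illustrative, a real Jenkins instance would additionally need API-token authentication (and often a CSRF crumb) on the actual POST, and only the URL construction is shown here.

```python
from urllib.parse import urlencode

# Hypothetical Jenkins base URL, for illustration only.
JENKINS_URL = "https://jenkins.example.org"


def trigger_url(job, params):
    """Build the URL for Jenkins' documented buildWithParameters endpoint."""
    return "%s/job/%s/buildWithParameters?%s" % (
        JENKINS_URL, job, urlencode(params))


def trigger_one_build_per_version(source_versions):
    """Return one trigger URL per source version.

    An RM-side wrapper would POST each of these (authenticating with an
    API token), then poll each resulting build's api/json endpoint until
    its "result" field is set, and finally gather the artifacts.
    """
    return [trigger_url("build_IUKs", {"SOURCE_VERSION": v})
            for v in source_versions]
```

The polling and result-gathering is exactly the bookkeeping that the matrix approach avoids, since Jenkins then tracks the sub-builds itself.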
That’s why I switched to a different direction: I had some experience with build jobs that trigger sub-jobs for different configurations (e.g. building for amd64 or i386). This is achieved with a configuration matrix and a user-defined axis. I’ve demo’d this with the kibi_build_IUKs job, which takes the same parameters as the build_IUKs one, except that the SOURCE_VERSIONS parameter becomes a SOURCE_VERSION (singular) user-defined axis.
The biggest downside with this approach is that one has to first re-configure the job to update the list of source versions for that configuration matrix, before submitting the other parameters to trigger a build.
But that allows parallelism, as all sub-jobs are triggered on various machines (see the isobuilder1 || isobuilder2 || isobuilder3 || isobuilder4 value for label_exp) when they are available.
Since I don’t know any better, I’ve had to tweak a few things for that to work:
- Using a temporary directory to avoid overlong blablabla/main-job/blablabla/sub-job paths that could trigger some issues.
- Archiving the (single) generated IUK file.
- Setting up a kibi_collect_IUKs downstream job to gather all generated IUK files in one place, so that copy-iuks-to-rsync-server-and-verify can be pointed at it and do its job as usual.
At this point, I’d like to take a step back and ask anonym & intrigeri what they think about the idea as a whole, and whether they would have ideas to make that better.
A few things I have in mind right now:
- Keeping build_IUKs as it is would be a nice and cheap safeguard: if anything goes wrong, we can still run everything sequentially and tada.
- Naming: maybe parallel_build_IUKs? (Using kibi_* was a quick way to make sure no clashes would happen during the experimentation.)
- Collecting: is collecting all results in a separate job something entirely crazy that would need fixing? If that looks fine enough, maybe name it parallel_collect_IUKs?
- Usability: if we could keep the concept of a “user-defined axis” — for the parallelism/automated distribution of sub-jobs — but have it be a parameter rather than a hard-coded list that needs updating (through re-configuration), that would be much cleaner and a tad more practical (build submission in one go).
- Longterm: I think Jenkins jobs are defined somewhere so that one doesn’t have to rely on the configuration done through the web UI, but if that’s indeed the case, I didn’t find where that happens when I last looked.
I’ve used that successfully for 4.4.1, 4.5~rc1, and 4.5 (see job history in https://jenkins.tails.boum.org/job/kibi_collect_IUKs/); it didn’t feel too risky to try this proof-of-concept as the results have to match those we obtain locally, so any deviations (possibly due to this experimentation) were bound to be discovered.
Subtasks
Related issues
Related to Tails - Bug #17434: Building many IUKs (v2) takes a while on Jenkins (Confirmed)
Blocks Tails - Feature #16209: Core work: Foundations Team (Confirmed)
History
#1 Updated by CyrilBrulebois 2020-04-26 12:24:16
- Feature Branch set to feature/17658-allow-building-iuks-in-parallel-on-jenkins
I almost forgot: WIP for the updated release process documentation (and a trivial change to copy-iuks-to-rsync-server-and-verify) pushed as feature/17658-allow-building-iuks-in-parallel-on-jenkins
#2 Updated by intrigeri 2020-04-29 16:04:36
- related to Bug #17434: Building many IUKs (v2) takes a while on Jenkins added
#3 Updated by intrigeri 2020-04-29 16:09:19
Hi,
first things first: I’m super happy you’ve been working on this!
Disclaimer: I did not read the issue thoroughly. I’m focusing on sharing info so that progress can happen without blocking on me.
> * Longterm: I think Jenkins jobs are defined somewhere so that one doesn’t have to rely on the configuration done through the web UI, but if that’s indeed the case, I didn’t find where that happens when I last looked.
The job definitions live in https://git.tails.boum.org/jenkins-jobs/.
There’s code in https://git.tails.boum.org/puppet-tails/ to update these jobs.
IIRC that code uses stuff that’s in our Python library, which was migrated to tails.git a few months ago.
If the job definitions can be static and the RM only has to set some parameters, then indeed, it would be nice to migrate these jobs to jenkins-job-builder.
If the RM has to edit the job definitions, then I guess we shall live with configuration done via the web UI.
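For what it’s worth, a matrix job like the experimental one might be expressed in jenkins-job-builder roughly as follows. This is a hypothetical sketch, not the actual job definition: the job name, axis names, and version values are taken from the description above for illustration, and the exact schema should be checked against the jenkins-job-builder matrix-project documentation.

```yaml
# Hypothetical jenkins-job-builder sketch, not the real job definition.
- job:
    name: parallel_build_IUKs
    project-type: matrix
    axes:
      - axis:
          type: user-defined
          name: SOURCE_VERSION
          values:
            - '4.4.1'
            - '4.5~rc1'
            - '4.5'
      - axis:
          type: label-expression
          name: label_exp
          values:
            - isobuilder1 || isobuilder2 || isobuilder3 || isobuilder4
```

With a static definition like this, the downside remains that updating the version list still means editing the definition (and redeploying the jobs) before each release.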
#4 Updated by CyrilBrulebois 2020-05-04 23:58:20
- Assignee set to anonym
Thanks, intrigeri.
Looking around a little more, it seems the current doc for the Matrix Project (https://plugins.jenkins.io/matrix-project/) lists the Dynamic Axis plugin (https://plugins.jenkins.io/dynamic-axis/) as the first “Matrix Axis Extension”. Unfortunately, it hasn’t seen a release in 5+ years, which concerns me a little. On the other hand, if it’s listed there, and if it works, maybe it’s simple enough that it hasn’t needed any update in that time frame?
If I understand it correctly, that would make it possible to stick to the current/previous list of parameters, and use the SOURCE_VERSIONS one (filled with $IUK_SOURCE_VERSIONS) to inject the proper items into our axis?
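If the Dynamic Axis plugin works as advertised, the axis definition might then shrink to something like the following sketch (again hypothetical and untested, assuming jenkins-job-builder’s support for the plugin), where the axis values are read from the SOURCE_VERSIONS build parameter at trigger time instead of being hard-coded:

```yaml
# Hypothetical sketch assuming the Dynamic Axis plugin:
# the axis values come from the SOURCE_VERSIONS parameter
# when the build is triggered, instead of a hard-coded list.
- axis:
    type: dynamic
    name: SOURCE_VERSION
    values:
      - SOURCE_VERSIONS
```

That would restore one-shot build submission: the RM would only fill in the parameters, with no job re-configuration step.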
Assigning to @anonym for input from my fellow RM.
Until we decide what to do with the possible “Dynamic Axis” addition, I think we might go ahead with:
- renaming jobs (if only to remove “kibi_” as a prefix);
- documenting in this issue how the jobs are configured, in case we lose them in some way;
- updating the documentation to make it official how to trigger parallel building, keeping the serial case as an option if parallelism breaks.
And of course, all the better if we can get the job definition to be static and committed to git.
#5 Updated by CyrilBrulebois 2020-05-06 04:29:01
- Target version changed from Tails_4.6 to Tails_4.7
#6 Updated by anonym 2020-05-14 10:20:06
- Status changed from Confirmed to In Progress
- Assignee changed from anonym to CyrilBrulebois
Epic stuff! And again, like in Feature #17657#note-7, I think this is FT work, and will mark it as such.
CyrilBrulebois wrote:
> At this point, I’d like to take a step back and ask anonym & intrigeri what they think about the idea as a whole, and whether they would have ideas to make that better.
What you have done is beyond my knowledge of Jenkins, so I’m afraid I won’t be able to help you improve this. What I can say is that what you’ve done so far looks very usable to me!
> A few things I have in mind right now:
>
> * Keeping build_IUKs as it is would be a nice and cheap safeguard: if anything goes wrong, we can still run everything sequentially and tada.
Agreed!
> * Naming: maybe parallel_build_IUKs? (Using kibi_* was a quick way to make sure no clashes would happen during the experimentation.)
LGTM!
> * Collecting: is collecting all results in a separate job something entirely crazy that would need fixing? If that looks fine enough, maybe name it parallel_collect_IUKs?
No idea if this is crazy. Proposed name LGTM!
> * Usability: if we could keep the concept of a “user-defined axis” — for the parallelism/automated distribution of sub-jobs — but have it be a parameter rather than a hard-coded list that needs updating (through re-configuration), that would be much cleaner and a tad more practical (build submission in one go).
IMHO your current proposal is definitely good enough to not have to spend time on improving this (or introducing potentially unmaintained plugins!).
> I’ve used that successfully for 4.4.1, 4.5~rc1, and 4.5 (see job history in https://jenkins.tails.boum.org/job/kibi_collect_IUKs/); it didn’t feel too risky to try this proof-of-concept as the results have to match those we obtain locally, so any deviations (possibly due to this experimentation) were bound to be discovered.
That is reassuring! I agree that the local verification done indeed made these experiments-in-production safe. :)
#7 Updated by anonym 2020-05-14 10:20:23
- blocks Feature #16209: Core work: Foundations Team added