
Wrong public key copied to release buildcaches #473

Open · 1 of 6 tasks
alalazo opened this issue Apr 20, 2023 · 1 comment

alalazo commented Apr 20, 2023

When cloning the latest point release and adding the buildcache mirror associated with it, either no public keys can be retrieved or the wrong ones are.

Steps to reproduce

$ git clone --branch=v0.19.2 https://github.com/spack/spack
$ . spack/share/spack/setup-env.sh 

If I add the "top-level" mirror, I can't retrieve any public keys:

$ spack mirror add v0.19 https://binaries.spack.io/releases/v0.19
$ spack buildcache list --allarch
[ ... ] # Lot of output to verify we set up the mirror to the correct URL
$ spack buildcache keys -it
$ spack mirror remove v0.19
==> Removed mirror v0.19.

If I add a "pipeline specific" mirror, e.g. e4s, I get the wrong intermediate keys:

$ spack mirror add v0.19 https://binaries.spack.io/releases/v0.19/e4s
$ spack buildcache list --allarch
[ ... ] # Lot of output to verify we set up the mirror to the correct URL
$ spack buildcache keys -it
gpg: key F85815B32355CB19: public key "e4s-uo-spack-01" imported
gpg: Total number processed: 1
gpg:               imported: 1
gpg: inserting ownertrust of 6
gpg: key BC86F6FB94429164: public key "Spack CI Key <[email protected]>" imported
gpg: Total number processed: 1
gpg:               imported: 1
gpg: inserting ownertrust of 6
$ spack mirror remove v0.19
==> Removed mirror v0.19.

Expected result

I would expect to retrieve the correct public key, as happens for develop:

$ spack mirror add develop https://binaries.spack.io/develop
$ spack buildcache list --allarch
[ ... ] # Lot of output to verify we set up the mirror to the correct URL
$ spack buildcache keys -it
gpg: key A8E0CA3C1C2ADA2F: 7 signatures not checked due to missing keys
gpg: key A8E0CA3C1C2ADA2F: public key "Spack Project Official Binaries <[email protected]>" imported
gpg: Total number processed: 1
gpg:               imported: 1
gpg: marginals needed: 3  completes needed: 1  trust model: pgp
gpg: depth: 0  valid:   2  signed:   0  trust: 0-, 0q, 0n, 0m, 0f, 2u
gpg: inserting ownertrust of 6

Proposed solution

We should take the following steps to improve our release buildcache creation process:

  • Update the sign-pkgs job so it does not fail with gpg: can't open '/tmp/*' when it has no binaries to sign (a guard sketch follows this list). While this was a side effect of a larger problem (all jobs failing), we should still fix this particular issue so sign-pkgs can upload public keys to the stack-specific buildcache even when it has no binaries to sign.
  • Update the protected-publish job to run even if prior jobs failed, that way we can get a buildcache partially populated from the jobs that did succeed. Testing this will be tricky.
  • Update the protected-publish job to always copy public keys from the stack-specific buildcaches to the "root" level buildcache (addressed in this spack PR; see the key-copy sketch after this list).
  • When populating a new buildcache for a tagged release, we should prefer copying prebuilt binaries from the corresponding release branch buildcache rather than rebuilding everything from source again (addressed in this spack PR).
  • To support the above item, pipelines for release branches should always rebuild everything, to make sure that core changes which don't affect hashes haven't broken any builds (addressed in this spack PR).
  • Refine our process for backporting PRs. Perhaps spackbot could be taught to automatically add issues to the release project when a PR is labeled with "backport"?
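
As a rough illustration of the first item: the sign-pkgs job script (not shown in this issue) presumably globs downloaded files out of /tmp and passes them to gpg, which errors out when the glob matches nothing. A minimal sketch of a guard, assuming a hypothetical /tmp/*.spec.json pattern and a plain gpg detach-sign invocation:

#!/usr/bin/env bash
# Hypothetical guard for sign-pkgs: skip signing when there is nothing to sign,
# but let the rest of the job (e.g. uploading public keys) continue.
shopt -s nullglob                  # an unmatched glob expands to nothing instead of the literal '/tmp/*'
specs=(/tmp/*.spec.json)           # assumed location/pattern of the files to sign
if [ "${#specs[@]}" -eq 0 ]; then
    echo "No binaries to sign; continuing so public keys can still be uploaded"
else
    for spec in "${specs[@]}"; do
        gpg --batch --yes --armor --detach-sign "$spec"
    done
fi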
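
For the key-copy step in protected-publish, a sketch assuming the release buildcaches live in a single S3 bucket (the bucket name and the non-e4s stack names below are hypothetical) and that spack stores public keys under build_cache/_pgp/ inside each mirror:

#!/usr/bin/env bash
# Hypothetical protected-publish step: copy public keys from each stack-specific
# buildcache up to the "root" release buildcache so `spack buildcache keys` finds them.
BUCKET="s3://example-spack-binaries"     # hypothetical bucket backing binaries.spack.io
RELEASE="releases/v0.19"
for stack in e4s data-vis-sdk ml; do     # stack names other than e4s are placeholders
    aws s3 cp --recursive \
        "${BUCKET}/${RELEASE}/${stack}/build_cache/_pgp/" \
        "${BUCKET}/${RELEASE}/build_cache/_pgp/"
done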

scottwittenburg commented May 3, 2023

Regarding updating the protected-publish job to run even if prior jobs failed: I just saw a case where I'm not sure that behavior would be desirable. In this pipeline, all the signing jobs are failing, and if we had copied the binaries to the root anyway, we would have had to find and remove them all later (since they are still signed with the intermediate signing keys used in the build pipelines, rather than the reputational key).

🤔
