API improvement: Specify digest of artifact in certificate #475

Closed
haydentherapper opened this issue Mar 15, 2022 · 7 comments
Labels: enhancement (New feature or request)

@haydentherapper (Contributor)

Description

Consider the following attack:

  • Client generates keypair K
  • Client fetches a Fulcio certificate
  • Client uploads signature over artifact and Fulcio certificate to Rekor log
  • A malicious process (MP) steals K and certificate
  • MP uploads a signature over its own artifact and Fulcio certificate to Rekor log

Currently, the only defenses against this attack are the limited lifetime of the certificate and the ephemerality of the keypair.

If the Fulcio certificate were issued not just for an identity but also for an artifact, then this attack would not be possible. On verification, a client would detect that the artifact in the certificate does not match the signed artifact.

To implement this, we need to do the following (a sketch of the issuance side follows the list):

  • Add an OID for the digest of an artifact
  • Add an optional API field for the digest
  • Include the digest in the certificate
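
As a rough illustration of the issuance side, here is a minimal Go sketch. The OID value, field name, and function are hypothetical and not part of the actual Fulcio implementation; a real OID would need to be allocated under the Sigstore arc.

```go
// Sketch only: embed an artifact digest in the certificate template as a
// non-critical X.509 extension. The OID below is a placeholder, not an
// assigned Sigstore OID.
package artifactdigest

import (
	"crypto/x509"
	"crypto/x509/pkix"
	"encoding/asn1"
	"encoding/hex"
)

// Hypothetical OID for an "artifact digest" extension.
var oidArtifactDigest = asn1.ObjectIdentifier{1, 3, 6, 1, 4, 1, 57264, 1, 99}

// addArtifactDigest attaches the hex-encoded SHA-256 digest supplied in the
// (hypothetical) API field to the certificate template before signing.
func addArtifactDigest(tmpl *x509.Certificate, hexDigest string) error {
	raw, err := hex.DecodeString(hexDigest)
	if err != nil {
		return err
	}
	val, err := asn1.Marshal(raw) // encode as an ASN.1 OCTET STRING
	if err != nil {
		return err
	}
	tmpl.ExtraExtensions = append(tmpl.ExtraExtensions, pkix.Extension{
		Id:    oidArtifactDigest,
		Value: val,
	})
	return nil
}
```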

Cosign can check for the matching digest. Rekor could possibly do this check before uploading to the log, though I'm not sure if we're currently validating the fields of the certificate.
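
On the verification side (e.g., in cosign), a minimal sketch might look like the following, reusing the hypothetical oidArtifactDigest from the sketch above; this does not reflect cosign's actual code.

```go
package artifactdigest

import (
	"bytes"
	"crypto/sha256"
	"crypto/x509"
	"encoding/asn1"
	"errors"
)

// verifyArtifactDigest recomputes SHA-256 over the artifact and compares it
// to the digest carried in the (hypothetical) certificate extension.
func verifyArtifactDigest(cert *x509.Certificate, artifact []byte) error {
	want := sha256.Sum256(artifact)
	for _, ext := range cert.Extensions {
		if !ext.Id.Equal(oidArtifactDigest) {
			continue
		}
		var got []byte
		if _, err := asn1.Unmarshal(ext.Value, &got); err != nil {
			return err
		}
		if !bytes.Equal(got, want[:]) {
			return errors.New("artifact digest in certificate does not match the signed artifact")
		}
		return nil
	}
	return errors.New("certificate carries no artifact digest extension")
}
```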

cc @asraa @bobcallaway

haydentherapper added the enhancement label on Mar 15, 2022
@asraa (Contributor) commented Mar 15, 2022

Add an optional API field for the digest

Let's add some strict verification for the digest here. I would be a little worried about user-defined OID extensions (does Private CA fuzz? hahaha)
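
A sketch of the kind of strict server-side validation this suggests, rejecting anything that is not a well-formed SHA-256 digest before it reaches issuance (the function name and field are made up for illustration):

```go
package artifactdigest

import (
	"encoding/hex"
	"fmt"
)

// validateDigestField rejects anything that is not a well-formed hex-encoded
// SHA-256 digest, so arbitrary user-supplied bytes never end up in an extension.
func validateDigestField(d string) error {
	if len(d) != 64 {
		return fmt.Errorf("expected 64 hex characters for a SHA-256 digest, got %d", len(d))
	}
	if _, err := hex.DecodeString(d); err != nil {
		return fmt.Errorf("digest is not valid hex: %w", err)
	}
	return nil
}
```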

Rekor could possibly do this check before uploading to the log, though I'm not sure if we're currently validating the fields of the certificate.

I think this would be suitable for a monitor! I'd assume that an attacker going to these lengths wouldn't be using cosign and would be manually uploading to rekor :)

@asraa (Contributor) commented Jun 7, 2022

cc @laurentsimon

@asraa (Contributor) commented Jun 7, 2022

This also protects against

  1. Entry malleability
  2. The need to distribute a separate sig

However, this relies on trusting the CA (which we already do), but the delegation of trust is different. The CA is trusted to sign code-signing certs, not to attest to data or content.

Alternatives:

  • Add the signature as an optional extension. This would also protect against these risks/threats. It's effectively a binding between the public key and the signature. Better yet, put a hash of the sig if there's a concern that people would stop using out-of-band signatures (sketched below).
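
For the "hash of the sig" variant, a minimal sketch (again with a made-up OID) could bind the certificate to one specific signature while still requiring the signature itself to be distributed separately:

```go
package artifactdigest

import (
	"crypto/sha256"
	"crypto/x509"
	"crypto/x509/pkix"
	"encoding/asn1"
)

// Hypothetical OID for a "digest of signature" extension.
var oidSignatureDigest = asn1.ObjectIdentifier{1, 3, 6, 1, 4, 1, 57264, 1, 98}

// addSignatureDigest embeds SHA-256(signature) as a non-critical extension,
// binding the certificate to one particular signature value.
func addSignatureDigest(tmpl *x509.Certificate, signature []byte) error {
	sum := sha256.Sum256(signature)
	val, err := asn1.Marshal(sum[:]) // encode as an ASN.1 OCTET STRING
	if err != nil {
		return err
	}
	tmpl.ExtraExtensions = append(tmpl.ExtraExtensions, pkix.Extension{
		Id:    oidSignatureDigest,
		Value: val,
	})
	return nil
}
```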

@laurentsimon commented Jun 9, 2022

This also protects against

  1. Entry malleability
  2. The need to distribute a separate sig

However, this relies on trusting the CA (which we already do), but the delegation of trust is different. The CA is trusted to sign code-signing certs, not to attest to data or content.

Not sure this is fully true. The leaf cert is only valid for code signing, so verification should fail unless clients disregard the v3 extension. That's not directly related to the proposal in the issue. What I mean is that in the current implementation, clients may also disregard the extension. So I don't think this changes the trust model; it just removes one hop in the chain (and is non-standard).
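
For reference, the client-side check being described here (actually enforcing the code-signing extended key usage on the leaf) looks roughly like this in Go; this is illustrative, not cosign's actual verification path:

```go
package artifactdigest

import (
	"crypto/x509"
	"errors"
)

// requireCodeSigningEKU fails unless the leaf certificate explicitly carries
// the code-signing extended key usage. A client that skips this check is one
// that "disregards the v3 extension".
func requireCodeSigningEKU(leaf *x509.Certificate) error {
	for _, eku := range leaf.ExtKeyUsage {
		if eku == x509.ExtKeyUsageCodeSigning {
			return nil
		}
	}
	return errors.New("leaf certificate is not valid for code signing")
}
```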

An alternative would be to have 2 root certs: one that delivers leaf certs, and one that signs "data" (with the code-signing extension). My intuition is that 99% of OIDC use cases have a single hop chain, so optimizing for it seems legitimate.

I agree this is not a general solution to malleability, and you may still want to address it in the general case.

I don't know what the definition of code-signing is, but provenance data is not just "code". It's typically related to code one way or another, but not always code per se: scanning results, SBOMs, scorecard results, etc. I think this is fine, but just curious if you have more info/thoughts on this :-)

Happy to chat more

@znewman01 (Contributor) commented:

IMO, in any scenario where the ephemeral key leaks, the OIDC token could leak too, so this doesn't really solve the issue. "My laptop is owned" isn't really a threat model I think we're ever going to be able to handle.

A better solution here is to generate the keys inside a touch-gated enclave or similar and bind them to the OIDC token (DPoP). This doesn't solve the problem, of course, but nothing will.

@haydentherapper (Contributor, Author) commented:

Is this something there is still interest in? If not, I don't think we'll pursue this, and I'll close it out.

@haydentherapper (Contributor, Author) commented:

Going to close this as I don't think we'll be implementing this. Feel free to reopen if this comes up again!

@haydentherapper closed this as not planned on Feb 7, 2023