
Signing and verification traits #7

Merged

tarcieri merged 3 commits into master from sign-and-verify-traits on Mar 26, 2019
Conversation

tarcieri
Member

This PR contains a number of commits adding a set of traits for creating and verifying digital signatures.

I would suggest reviewing them commit-by-commit:

  • 1f7efa6: Sign trait
  • 9fd884c: Verify trait
  • c438c2c: SignDigest and VerifyDigest
  • f5e2db8: SignSha256, SignSha384, SignSha512, VerifySha256, VerifySha384, VerifySha512

All traits are bounded by Send + Sync to ensure signers and verifiers are thread safe. Libraries which provide access to HSMs will need to, e.g., guard access to the underlying device with a Mutex.

Trait for producing digital signatures
Support for signing and verifying prehashed message `Digest`s, for use
with signature algorithms that support Initialize-Update-Finalize usage.
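For orientation, here is a rough sketch of the trait shapes being proposed, inferred from the description above and the method signatures quoted later in this thread. The exact merged definitions may differ; Signature and Error below are placeholder stand-ins for the crate's actual types.

use digest::Digest;

/// Placeholder stand-ins for the crate's actual Signature and Error types
pub trait Signature {}
#[derive(Debug)]
pub struct Error;

/// Trait for producing digital signatures over a raw message
pub trait Sign<S: Signature>: Send + Sync {
    fn sign(&self, msg: &[u8]) -> Result<S, Error>;
}

/// Trait for verifying digital signatures over a raw message
pub trait Verify<S: Signature>: Send + Sync {
    fn verify(&self, msg: &[u8], signature: &S) -> Result<(), Error>;
}

/// Trait for signing a prehashed message `Digest`
pub trait SignDigest<D: Digest, S: Signature>: Send + Sync {
    fn sign_digest(&self, digest: D) -> Result<S, Error>;
}

/// Trait for verifying a signature over a prehashed message `Digest`
pub trait VerifyDigest<D: Digest, S: Signature>: Send + Sync {
    fn verify_digest(&self, digest: D, signature: &S) -> Result<(), Error>;
}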
@newpavlov
Member

So after the IRC discussion, the only change for now is to remove the SHA traits, right?

@dignifiedquire
I think it would be nice to support the traits from the signature crate in rsa as well, so your input would be quite valuable.

@tarcieri
Member Author

@newpavlov yes, we can try completely removing the SignSha* and VerifySha* traits for now. Instead, ECDSA signers which want a raw message they subsequently hash (as opposed to ones which take a raw Digest) can simply implement the Sign trait, and MUST hash the message with the SHA-2 function whose output size matches the curve modulus.

For posterity, the reason for having separate SignSha256, SignSha384, etc. is because ECDSA supports wacky mix-and-match combinations of curve moduli and hash functions, e.g. you can use SHA-384 with P-256, or SHA-256 with P-384. However, it's not clear to me it's actually worth considering such cases (e.g. ring labels both of these combinations as "Not recommended" despite supporting them).

So for now, I agree, let's try removing them and see if we can get by without them. Worst case, if someone does show up clamoring for the mix-and-match combinations, we can (potentially) add them back, but let's cross that bridge when we get there.
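To illustrate the approach above with a hedged sketch (all types here, including P256Signer and sign_prehashed, are hypothetical stand-ins rather than a real ECDSA implementation): a P-256 signer implements the plain Sign trait and prehashes internally with SHA-256, the SHA-2 function matching the curve size.

use sha2::{Digest, Sha256};

pub struct P256Signer { /* private scalar, RNG handle, etc. */ }
pub struct P256Signature(pub Vec<u8>);
#[derive(Debug)]
pub struct Error;

// Local stand-in for the proposed Sign trait
pub trait Sign<S>: Send + Sync {
    fn sign(&self, msg: &[u8]) -> Result<S, Error>;
}

impl Sign<P256Signature> for P256Signer {
    fn sign(&self, msg: &[u8]) -> Result<P256Signature, Error> {
        // Prehash with SHA-256 (matching the 256-bit curve), then sign the digest
        let digest = Sha256::digest(msg);
        self.sign_prehashed(&digest)
    }
}

impl P256Signer {
    fn sign_prehashed(&self, _digest: &[u8]) -> Result<P256Signature, Error> {
        unimplemented!("actual ECDSA signing elided from this sketch")
    }
}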

@dignifiedquire
Member

dignifiedquire commented Mar 25, 2019

Hmm, I am not entirely sure how this would fit the requirements for RSA. It needs to express the following things (a rough sketch follows the list):

  1. sign vs sign with blinding (with blinded signing, an Rng needs to be provided)
  2. specify a Padding Scheme
  3. getting the ASN.1 prefix for the chosen hash
  4. variable-length hash digests, as one needs to be able to choose the hash function at runtime in some scenarios (e.g. in my pgp implementation)
  5. knowing whether the value should be hashed or not (my current API assumes it always gets the hashed digest)
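As a purely illustrative sketch of these requirements (the names PaddingScheme, RsaSign, and sign_blinded are hypothetical and not the rsa crate's actual API):

use rand::RngCore;

#[derive(Debug)]
pub struct Error;

pub enum PaddingScheme {
    Pkcs1v15, // uses the ASN.1 DigestInfo prefix of the chosen hash (3)
    Pss,
}

pub trait RsaSign {
    /// (1)+(2): blinded signing takes an RNG and a padding scheme;
    /// (4)+(5): the hash is chosen at runtime and the caller passes the
    /// already-computed digest of the message.
    fn sign_blinded(
        &self,
        rng: &mut dyn RngCore,
        padding: PaddingScheme,
        hashed: &[u8],
    ) -> Result<Vec<u8>, Error>;
}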

@dignifiedquire
Member

(3) could be solved by extending Digest to provide ASN.1 prefixes.
(5) seems to be solved by the sign_digest handling.
I think (4) could be solved by doing a match on the selected hashing method.
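To make (3) and (4) concrete, a match on a runtime hash selection could look roughly like the sketch below. The Hash enum is hypothetical; the byte strings are the standard PKCS#1 v1.5 DigestInfo prefixes for SHA-256 and SHA-512 from RFC 8017.

pub enum Hash {
    Sha256,
    Sha512,
}

/// Return the ASN.1 DigestInfo prefix for the selected hash function
pub fn asn1_prefix(hash: &Hash) -> &'static [u8] {
    match hash {
        Hash::Sha256 => &[
            0x30, 0x31, 0x30, 0x0d, 0x06, 0x09, 0x60, 0x86, 0x48, 0x01,
            0x65, 0x03, 0x04, 0x02, 0x01, 0x05, 0x00, 0x04, 0x20,
        ],
        Hash::Sha512 => &[
            0x30, 0x51, 0x30, 0x0d, 0x06, 0x09, 0x60, 0x86, 0x48, 0x01,
            0x65, 0x03, 0x04, 0x02, 0x03, 0x05, 0x00, 0x04, 0x40,
        ],
    }
}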

@tarcieri
Member Author

tarcieri commented Mar 25, 2019

@dignifiedquire for RSA signatures I would suggest instantiating a signer type in your preferred way, then implementing the Sign trait, ensuring the signer is set up with the user's preferred configuration prior to signing, with any optional choices about how the signature is produced provided in advance.

This is more or less where we netted out around the complexities of selecting which hash function to use for ECDSA, which was previously solved by providing a trait for each with a differently named method. Instead of that, we are requiring the signer to decide that a priori.

Note that in such a scheme, it is still possible to allow the signer to select from different hash functions to perform on the input. You could either select things a priori at runtime via initializers (e.g. KeyPair::new(privkey) vs KeyPair::new_with_digest(privkey, Sha384)), or by encoding the default hash function as a generic parameter with a default, e.g. KeyPair::new(privkey) versus KeyPair::<Sha384>::new(privkey), which would be more type safe.
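As a hypothetical sketch of the second option (not tied to any particular crate), the digest can be fixed at the type level with a default parameter:

use sha2::Sha256;
use std::marker::PhantomData;

pub struct KeyPair<D = Sha256> {
    privkey: Vec<u8>,
    digest: PhantomData<D>,
}

impl<D> KeyPair<D> {
    pub fn new(privkey: Vec<u8>) -> Self {
        KeyPair { privkey, digest: PhantomData }
    }
}

// Usage:
//   let kp: KeyPair = KeyPair::new(privkey);         // defaults to SHA-256
//   let kp = KeyPair::<sha2::Sha384>::new(privkey);  // explicit SHA-384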

Either way, the point is the incidental complexity around each signature algorithm can be kept out-of-band from the core signature trait.

As it were, this is how ring's signing APIs work.

(5) seems to be solved by the sign_digest handling

Yep! And really I think "to prehash or not to prehash" is the only decision that actually needs to be handled by the signing traits, as many ECDSA libraries (as well as RSA) only accept a prehashed digest as input, and leave how to calculate it as an exercise for the user. So it's convenient for users of these libraries to be able to leverage Digest for that prehashing out-of-the-box.

I imagine we can do some blanket Sign impls for types which implement SignDigest, possibly using a marker trait a la the following (not sure these are the greatest names, but you get the idea):

trait SignUsingDigest {
    type Algorithm: Digest;
}

impl<D, S, T> Sign<S> for T
where
    D: Digest,
    S: Signature,
    T: SignDigest<D, S> + SignUsingDigest<Algorithm = D>,
{
    fn sign(&self, msg: &[u8]) -> Result<S, Error> {
        self.sign_digest(T::Algorithm::new().chain(msg))
    }
}

@dignifiedquire
Member

I like that, it makes the key pair a bit more complicated, but other than that this should work out. In all use cases I have seen so far a key pair is only used for a single combination of params anyway.

@dignifiedquire
Member

The only thing I am unsure about, then, is blinding, as I would like to preserve the ability to pass in an RNG every time the method is called, instead of once per key pair.

@tarcieri
Member Author

ECDSA has similar concerns around the secure RNG for nonces; however, I guess I have the opposite preference and like only having to configure the RNG once.

I'm not sure how it's possible to design a least-common-denominator API which supports passing in an RNG on a per-signature basis, as many signature algorithms are deterministic and don't require one (e.g. Ed25519 or RFC 6979 deterministic ECDSA).

What's the use case for doing so?

@dignifiedquire
Member

I honestly don't have a good use case; this was mostly an intuition I had, and the way I have seen this handled in other places. If I really want, I can always have a sign_blinded method which doesn't match the trait, so I don't think this is a blocker in any case.

@tarcieri force-pushed the sign-and-verify-traits branch from f5e2db8 to c438c2c on March 26, 2019 16:56
@tarcieri
Member Author

tarcieri commented Mar 26, 2019

I removed commit f5e2db8 (SignSha* and VerifySha*) from the PR. Will merge after the tests pass.

Will submit a followup PR for the blanket impl of Sign for SignDigest after this lands.

@tarcieri merged commit 502c507 into master on Mar 26, 2019
@tarcieri deleted the sign-and-verify-traits branch on March 26, 2019 17:00
@newpavlov
Member

newpavlov commented Mar 26, 2019

I wonder if we can do something like this (replace const generics with typenum for now):

trait CoreSigner {
    const DIGEST_SIZE: usize;
    type Sig: Signature;
    
    fn core_sign(&self, msg_digest: &[u8; DIGEST_SIZE]) -> Result<Self::Sig, Error>;
    
    fn digest_sign<D>(&self, msg: &[u8]) -> Result<Self::Sig, Error>
        where D: Digest<Output=Self::DIGEST_SIZE>
    {
        self.core_sign(&D::digest(msg))
    }
}

trait Signer<S: Signature>: Send + Sync {
    /// Sign the given message and return a digital signature
    fn sign(&self, msg: &[u8]) -> Result<S, Error>;
}

struct CoreWrapper<C: CoreSigner, D: Digest> { .. }

impl<C: CoreSigner, D: Digest> CoreSigner for CoreWrapper<C, D> { .. }

impl<C, S, D> Signer<S> for CoreWrapper<C, D>
    where C: CoreSigner,
          D: Digest,
          S: Signature + From<C::Sig>
{
    fn sign(&self, msg: &[u8]) -> Result<S, Error> {
        let sig = self.digest_sign::<D>(msg)?;
        Ok(sig.into())
    }
}

And same for verification.

BTW, are you sure about the Sign and Verify trait names? In this case Sign can be confused with the noun, and at least for me Signer/Verifier is easier to understand. Plus, agent nouns are already used for trait names in std and the ecosystem.

@tarcieri
Member Author

tarcieri commented Mar 26, 2019

@newpavlov I don't think that makes sense. It presupposes all signers will have the ability to sign both unhashed messages and digests, which is not the case (or at least, I feel very strongly about these traits being usable with all signers imaginable, be they HSMs/hardware tokens, cloud KMS services, or any existing Rust crate which may already do hashing internally).

It also adds a superfluous method in order to achieve object safety, one which is redundant with the method in the non-object-safe version. All in all it seems more complicated, and if it has any advantages, I'm failing to see them. I think to reach equivalence with what I have in #9 (as of 2bff7eb), you'd need to add an additional DigestSigner trait beyond all that to provide an object-safe digest signing API.

What are you trying to accomplish with this change?

BTW are you sure about Sign and Verify trait names?

Haha, as it were I used Signer and Verifier in Signatory, and would be fine with going back to them. I somewhat capriciously changed them to Sign and Verify as they are effectively single-method traits.

@newpavlov
Member

It presupposes all signers will have the ability to sign both unhashed messages and digests

No, HSMs which do the hashing themselves will implement only the Signer trait and not CoreSigner. In other words, users usually will not use CoreSigner directly, except when they need to sign/verify a pre-computed hash. In terms of RFC 8032, CoreSigner is for "PureEdDSA" and Signer is for HashEdDSA, while CoreWrapper is used for converting one into the other.

Which method is superfluous here, in your opinion? sign? It plays a distinctly different role from core_sign and digest_sign, as it does not allow choosing the Digest function or passing a pre-computed hash value.

I want a clear distinction between algorithms and their levels. Your approach with UseDigestToSign and SignDigest feels somewhat weird and non-idiomatic.

@tarcieri
Member Author

tarcieri commented Mar 26, 2019

Which method is superfluous here in your opinion?

core_sign and sign have nearly identical method signatures, aside from the use of an associated type versus a generic:

fn core_sign(&self, msg_digest: &[u8; DIGEST_SIZE]) -> Result<Self::Sig, Error>;
fn sign(&self, msg: &[u8]) -> Result<S, Error>;

I feel like what you're doing is accomplishing less than #9 with a whole lot of added complexity. Just looking at your code I have no idea what a CoreWrapper is or what it's supposed to do, but it just feels like a bunch of glue code which doesn't need to exist in a cleaner design.

I want a clear distinction between algorithms and their levels. Your approach with UseDigestToSign and SignDigest feels somewhat weird and non-idiomatic.

These are trying to communicate an important property of the underlying signature algorithm, namely that the message-based form of the algorithm is equivalent to computing the digest of the aforementioned message with the IUF API, a property that does not hold for Ed25519 vs Ed25519ph. I'd suggest reviewing the notes I left about this on #9.

They could perhaps use a better name, but I sure had trouble coming up with a proper one to convey that particular idea.
