Why are EC signatures being padded?

Running through the tutorial on how to sign data using security.framework, I was trying to understand the format Apple is using & wanting for signatures (as this isn't documented anywhere): https://developer.apple.com/documentation/security/certificate_key_and_trust_services/keys/signing_and_verifying?language=objc

I've learned that the signatures are just ASN.1 objects, with EC signatures being a SEQUENCE of the R and S values encoded as ASN.1 INTEGERs.

However, I'm noticing that when using SecKeyCreateSignature, either the R or the S value is prepended with an extra byte.

For example:

30 45 02 20 66 B7 4C FB FC A0 26 E9 42 50 E8 B4
E3 A2 99 F1 8B A6 93 31 33 E8 7B 6F 95 D7 28 77
52 41 CC 28 02 21 00 E2 01 CB A1 4C AD 42 20 A2
                  ^^ why is this here?
66 A5 94 F7 B2 2F 96 13 A8 C5 8B 35 C8 D5 72 A0
3D 41 81 90 3D 5A 91 

This is an ASN.1 SEQUENCE; the first element is a 32-byte INTEGER and the second is a 33-byte INTEGER. Why is that 00 byte being prepended to the integer? And why is it sometimes the R and sometimes the S?

Removing it causes SecKeyVerifySignature to fail, so it's obviously required, but I need to understand the logic here because I'm having to hand-craft these ASN.1 objects from the raw R and S values.

Replies

Looking at the source of the ecdsa package in Go's standard library, I can see the logic it uses for prepending the zero byte to either value:

		// If the most significant bit of the first byte is set,
		// prepend a zero byte so the INTEGER is read as positive.
		if bytes[0]&0x80 != 0 {
			c.AddUint8(0)
		}
		c.AddBytes(bytes)

So I'm assuming this is defined in some spec somewhere, but for now this is good enough for me to go on.

I was trying to understand the format Apple is using & wanting for signatures (as this isn't documented anywhere):

SecKeyCreateSignature takes a SecKeyAlgorithm, and that algorithm determines the signature format. For example, .ecdsaSignatureMessageX962SHA512 indicates an X9.62 encoding, which is a DER encoding of the R and S values. The signature you posted decodes as:

% dumpasn1 -p -a sig.der 
SEQUENCE {
  INTEGER
    66 B7 4C FB FC A0 26 E9 42 50 E8 B4 E3 A2 99 F1
    8B A6 93 31 33 E8 7B 6F 95 D7 28 77 52 41 CC 28
  INTEGER
    00 E2 01 CB A1 4C AD 42 20 A2 66 A5 94 F7 B2 2F
    96 13 A8 C5 8B 35 C8 D5 72 A0 3D 41 81 90 3D 5A
    91
  }

This is using the dumpasn1 tool.

https://www.cs.auckland.ac.nz/~pgut001/dumpasn1.c

why is this here?

Because R and S are encoded as ASN.1 INTEGER values, which DER treats as signed two's-complement numbers. If the first byte of a positive value has its high bit set, DER requires a leading zero byte so the value isn't read as negative. Whether that happens to R or to S simply depends on which value starts with a byte of 0x80 or above.

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"