I have an X.509 certificate that contains a set of data with the following IMPLICIT [0] tag:

A0 81 C6 (value)...

And I have this excerpt from a standards document:

The IMPLICIT [0] tag is not used for the DER encoding, rather an EXPLICIT SET OF tag is used. That is, the DER encoding of the EXPLICIT SET OF tag, rather than of the IMPLICIT [0] tag, MUST be included along with the length and content octets of the value.

I've done a lot of searching around, but I can't figure out exactly what the standard is calling for. I'm looking for a bit of clarification.

EDIT: Here is the standard I am following: http://tools.ietf.org/html/rfc3852

I am trying to verify the signature and I need to calculate the message digest to do this. This certificate includes the optional SignedAttributes in the SignerInfo type. I have hashed the signed content and verified that the messageDigest attribute in the SignedAttributes is correct. The standard says that when SignedAttributes is present, it is the SignedAttributes that must be hashed and signed to produce the signature value. The standard also says that the tag of the SignedAttributes should be changed as discussed in the original question.
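For reference, the messageDigest check I described can be sketched like this (SHA-256 is an assumption here; the real algorithm comes from the digestAlgorithm field, and extracting the attribute's OCTET STRING value is left out):

```python
import hashlib

def message_digest_matches(econtent: bytes, message_digest_value: bytes) -> bool:
    # econtent: the signed content octets
    # message_digest_value: the OCTET STRING value of the messageDigest attribute
    return hashlib.sha256(econtent).digest() == message_digest_value
```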

Here is the Asn.1 Grammar for the SignerInfo:

SignerInfo ::= SEQUENCE {
        version CMSVersion,
        sid SignerIdentifier,
        digestAlgorithm DigestAlgorithmIdentifier,
        signedAttrs [0] IMPLICIT SignedAttributes OPTIONAL,
        signatureAlgorithm SignatureAlgorithmIdentifier,
        signature SignatureValue,
        unsignedAttrs [1] IMPLICIT UnsignedAttributes OPTIONAL }

SignerIdentifier ::= CHOICE {
        issuerAndSerialNumber IssuerAndSerialNumber,
        subjectKeyIdentifier [0] SubjectKeyIdentifier }

SignedAttributes ::= SET SIZE (1..MAX) OF Attribute

UnsignedAttributes ::= SET SIZE (1..MAX) OF Attribute

Attribute ::= SEQUENCE {
        attrType OBJECT IDENTIFIER,
        attrValues SET OF AttributeValue }

AttributeValue ::= ANY

SignatureValue ::= OCTET STRING

I'm not sure how to interpret that comment. What standard are you reading? Do you have the ASN.1 grammar for the structure?

An explicit tag is like a wrapper around some underlying type. For example, the underlying type might be a SEQUENCE. It is encoded with the universal SEQUENCE tag, 0x30. But to avoid ambiguity in how the SEQUENCE should be interpreted in the enclosing structure, it is wrapped in an EXPLICIT structure with a context-specific tag. It's not clear from the snippet above what that tag is.

I'm guessing what they mean is a syntax like [0] EXPLICIT SET OF foo, which (using the example from the original question as a value) would be encoded as (hex) A0 81 C9 31 81 C6 (value) ...

Note that the original value that was tagged with a context-specific zero (A0) has been re-tagged with a universal SET OF (31).
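Under that guess, the outer [0] length has to grow to cover the new SET OF header. A rough sketch of the wrapping, with a minimal DER length encoder (this is only an illustration of my first guess, not something the standard spells out):

```python
def wrap_explicit(inner_content: bytes) -> bytes:
    # inner_content: the content octets of the original value
    # (0xC6 = 198 of them in the example above)
    def encode_len(n: int) -> bytes:
        if n < 0x80:
            return bytes([n])  # short form
        body = n.to_bytes((n.bit_length() + 7) // 8, "big")
        return bytes([0x80 | len(body)]) + body  # long form

    # Re-tag the value with the universal SET OF tag (0x31)...
    set_of = bytes([0x31]) + encode_len(len(inner_content)) + inner_content
    # ...then wrap it in the context-specific [0] tag (0xA0).
    return bytes([0xA0]) + encode_len(len(set_of)) + set_of
```

For 198 content octets this produces exactly the A0 81 C9 31 81 C6 prefix shown above.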


Update: Okay, in this case, I think what they mean is that when you sign the attributes, instead of using the implicit tag, the signature is computed over the SET OF tag. If that's what they mean, throwing in "EXPLICIT" really muddied the waters, but whatever. If that's the case, then the encoding would simply be 31 81 C6 (value) ... (replace the context-specific 0xA0 with the universal SET OF tag 0x31).
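If so, building the digest input is just a one-byte tag swap, since the length and content octets are reused unchanged. A sketch (SHA-256 is assumed here; use whatever the SignerInfo's digestAlgorithm names):

```python
import hashlib

def signed_attrs_digest(implicit_tagged: bytes) -> bytes:
    # implicit_tagged: the signedAttrs bytes exactly as they appear in the
    # SignerInfo, starting with the context-specific tag 0xA0.
    if implicit_tagged[0] != 0xA0:
        raise ValueError("expected implicit [0] tag")
    # Only the identifier octet changes; length and content octets are kept.
    retagged = bytes([0x31]) + implicit_tagged[1:]
    return hashlib.sha256(retagged).digest()
```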

erickson
Edited to answer some of your questions. I will try your suggestion, thanks.
Ben
@Ben - check out my update. I think my first guess was off-base.
erickson
@erickson - changing the A0 to 31 was my first thought. I tried that and was unable to get the verify to pass. I think changing A0 to 31 is correct, since it makes logical sense, so I might have a different problem.
Ben
Worked through a bunch of other problems, and found out that changing A0 to 31 is indeed correct.
Ben