I have a CMS signed message that contains a set of data carrying the following IMPLICIT [0] tag and length octets:
A0 81 C6 (value)...
And I have this excerpt from a standards document:
The IMPLICIT [0] tag is not used for the DER encoding, rather an EXPLICIT SET OF tag is used. That is, the DER encoding of the EXPLICIT SET OF tag, rather than of the IMPLICIT [0] tag, MUST be included along with the length and content octets of the value.
I've done a lot of searching around, but I can't figure out exactly what the standard is calling for. I'm looking for a bit of clarification.
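Here is what I am currently trying, as a minimal Python sketch of my reading of that excerpt. The assumption that only the tag byte changes (A0 to 31, with the length and content octets left alone) and the choice of SHA-256 are guesses on my part, which is exactly what I'd like confirmed:

import hashlib

def digest_signed_attrs(signed_attrs_tlv: bytes) -> bytes:
    # My guess at the excerpt above: keep the length and content octets
    # exactly as they appear in the message (81 C6 <value> in my case),
    # but hash them under the universal SET OF tag (0x31) instead of the
    # context-specific [0] tag (0xA0).
    retagged = bytes([0x31]) + signed_attrs_tlv[1:]
    # SHA-256 is a placeholder; the real digest algorithm comes from the
    # digestAlgorithm field of the SignerInfo.
    return hashlib.sha256(retagged).digest()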
EDIT: Here is the standard I am following: http://tools.ietf.org/html/rfc3852
I am trying to verify the signature, and I need to calculate the message digest to do this. The message I am verifying includes the optional SignedAttributes in its SignerInfo. I have hashed the signed content and verified that the message-digest attribute in the SignedAttributes matches. The standard says that when SignedAttributes is present, its DER encoding is what gets hashed and signed to produce the signature value in the SignerInfo. It also says that the tag of the SignedAttributes should be changed as discussed in the original question.
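For context, this is roughly how I am trying to check the signature value, using the cryptography package as an example. The RSA / PKCS#1 v1.5 / SHA-256 combination and the variable names are assumptions about my particular message, not something the RFC mandates:

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.x509 import load_der_x509_certificate

def verify_signer_info(signer_cert_der: bytes, signed_attrs_tlv: bytes, signature: bytes) -> bool:
    # Re-tag the SignedAttributes as an EXPLICIT SET OF (0x31) before hashing,
    # as described in the excerpt quoted above.
    retagged = bytes([0x31]) + signed_attrs_tlv[1:]
    public_key = load_der_x509_certificate(signer_cert_der).public_key()
    try:
        # Assumes an RSA signer with PKCS#1 v1.5 padding and SHA-256;
        # verify() hashes `retagged` internally and raises on mismatch.
        public_key.verify(signature, retagged, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False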
Here is the ASN.1 grammar for SignerInfo:
SignerInfo ::= SEQUENCE {
    version CMSVersion,
    sid SignerIdentifier,
    digestAlgorithm DigestAlgorithmIdentifier,
    signedAttrs [0] IMPLICIT SignedAttributes OPTIONAL,
    signatureAlgorithm SignatureAlgorithmIdentifier,
    signature SignatureValue,
    unsignedAttrs [1] IMPLICIT UnsignedAttributes OPTIONAL }

SignerIdentifier ::= CHOICE {
    issuerAndSerialNumber IssuerAndSerialNumber,
    subjectKeyIdentifier [0] SubjectKeyIdentifier }

SignedAttributes ::= SET SIZE (1..MAX) OF Attribute

UnsignedAttributes ::= SET SIZE (1..MAX) OF Attribute

Attribute ::= SEQUENCE {
    attrType OBJECT IDENTIFIER,
    attrValues SET OF AttributeValue }

AttributeValue ::= ANY

SignatureValue ::= OCTET STRING
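In case it matters, this is how I am slicing the [0]-tagged SignedAttributes TLV out of the raw SignerInfo bytes. It only handles the definite length forms I have actually seen (a single length octet, or 0x81/0x82 followed by one or two octets), so it is a sketch rather than a general DER parser:

def read_tlv(data: bytes, offset: int = 0):
    # Returns (tag, value, next_offset) for the DER TLV starting at `offset`.
    tag = data[offset]
    length_byte = data[offset + 1]
    if length_byte < 0x80:                        # short form
        length, header_len = length_byte, 2
    elif length_byte == 0x81:                     # long form, one length octet
        length, header_len = data[offset + 2], 3  # e.g. 81 C6 -> 0xC6 = 198 octets
    elif length_byte == 0x82:                     # long form, two length octets
        length, header_len = int.from_bytes(data[offset + 2:offset + 4], "big"), 4
    else:
        raise ValueError("length form not handled in this sketch")
    start = offset + header_len
    return tag, data[start:start + length], start + length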