### Critical bits and notations

David Shaw <dshaw <at> jabberwocky.com>

2005-05-11 03:43:19 GMT

Here's an odd corner case, one that I'd be grateful for some thoughts on: what does the critical bit mean in the context of a signature notation? Does the critical bit refer to support of the notation subpacket in general, or to the specific notation given in the critical notation subpacket?

For example, take an implementation that can read notations, and specifically understands and acts on the "foo" notation. Given that, it's very clear that this implementation should accept a critical notation "foo=1". Now try a critical notation of "bar=2". Should the implementation accept it because it knows what a notation is and implements notations, or should it reject it because it doesn't know what the specific "bar" notation is?

The draft has this to say on the subject of critical bits for signature subpackets:

> Bit 7 of the subpacket type is the "critical" bit. If set, it denotes that the subpacket is one that is critical for the evaluator of the signature to recognize. If a subpacket is encountered that is marked critical but is unknown to the evaluating software, the evaluator SHOULD consider the signature to be in error. An evaluator may "recognize" a subpacket, but not implement it. The purpose of the critical bit is to allow the signer to tell an evaluator that it would prefer a new, unknown feature to generate an error than be ignored.
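To make the two readings concrete, here is a minimal sketch of critical-bit handling for signature subpackets. It assumes the OpenPGP layout where bit 7 (0x80) of the subpacket type octet is the critical flag and type 20 is Notation Data; the `KNOWN_NOTATIONS` registry and the `strict_notations` switch are hypothetical, added only to contrast the two interpretations discussed above.

```python
# Sketch: the two possible readings of the critical bit on a notation
# subpacket. Subpacket type octet: bit 7 (0x80) = critical flag, low 7
# bits = subpacket type. Type 20 is Notation Data in the OpenPGP draft.

CRITICAL_BIT = 0x80
NOTATION_DATA = 20

# Subpacket types this (hypothetical) implementation recognizes.
KNOWN_SUBPACKET_TYPES = {2, 16, 20}   # creation time, issuer key ID, notation

# Specific notations this implementation understands and acts on.
KNOWN_NOTATIONS = {"foo"}


def subpacket_ok(type_octet, notation_name=None, strict_notations=False):
    """Return True if the signature may still be considered valid
    after encountering this subpacket."""
    critical = bool(type_octet & CRITICAL_BIT)
    subpacket_type = type_octet & 0x7F

    if subpacket_type not in KNOWN_SUBPACKET_TYPES:
        # Unknown subpacket marked critical: signature SHOULD be in error.
        return not critical

    if subpacket_type == NOTATION_DATA and critical and strict_notations:
        # Stricter reading: critical applies to the *specific* notation,
        # so an unknown notation name invalidates the signature.
        return notation_name in KNOWN_NOTATIONS

    # Lenient reading: we implement notations in general, so any
    # critical notation subpacket is acceptable.
    return True


# Critical notation "bar=2" under each interpretation:
print(subpacket_ok(CRITICAL_BIT | NOTATION_DATA, "bar"))                          # True
print(subpacket_ok(CRITICAL_BIT | NOTATION_DATA, "bar", strict_notations=True))   # False
```

The ambiguity the post raises is exactly the `strict_notations` flag: the draft text says "unknown to the evaluating software", but does not say whether an unknown notation *name* inside a known notation *subpacket* counts as unknown.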