Hi, I have a problem with i2d_ASN1_BIT_STRING(). If I set the data to a byte string that ends in 0x00, i2d_ASN1_BIT_STRING() trims off the trailing 0x00.
Here is a sample program:

--------------------------------------
#include <stdio.h>
#include "openssl/asn1.h"

int main(void)
{
    unsigned char *buf = NULL, *temp_buf = NULL, *p = NULL;
    int buflen = 128;
    ASN1_BIT_STRING *bs = NULL;
    int i = 0;
    int ret = 0;

    buf = OPENSSL_malloc(150);
    temp_buf = OPENSSL_malloc(150);
    p = temp_buf;

    /* 126 x 'A', then 'B', then a trailing 0x00 byte */
    for (i = 0; i < buflen - 2; i++)
        buf[i] = 'A';
    buf[buflen - 2] = 'B';
    buf[buflen - 1] = '\0';

    printf("buflen=%d\n", buflen);
    for (i = 0; i < buflen; i++)
        printf("%02X%c", buf[i], (i + 1) % 20 == 0 ? '\n' : ' ');
    printf("\n");

    bs = ASN1_BIT_STRING_new();
    ret = ASN1_BIT_STRING_set(bs, buf, buflen);
    printf("ASN1_BIT_STRING_set() = %d\n", ret);

    /* i2d advances temp_buf; p still points at the start */
    ret = i2d_ASN1_BIT_STRING(bs, &temp_buf);
    printf("i2d_ASN1_BIT_STRING() = %d\n", ret);
    for (i = 0; i < ret; i++)
        printf("%02X%c", p[i], (i + 1) % 20 == 0 ? '\n' : ' ');
    printf("\n");

    ASN1_BIT_STRING_free(bs);
    OPENSSL_free(buf);
    OPENSSL_free(p);
    return 0;
}
--------------------------------------

The result is:

=========================================
buflen=128
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 42 00
ASN1_BIT_STRING_set() = 1
i2d_ASN1_BIT_STRING() = 131
03 81 80 01 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41 41
41 41 41 41 41 41 41 41 41 41 42
=========================================

You can see that the trailing 0x00 has been trimmed off.
Some articles say a BIT STRING should not hold a null-terminated string, but in RFC 2510 and RFC 4210 the PKIProtection field is defined as:

PKIProtection ::= BIT STRING

PKIProtection carries arbitrary binary data, which may well end in 0x00 bytes. So I want to know: how can I use a BIT STRING to carry data that ends in 0x00? Thank you in advance.