Hi,
I want to query the implementation of d2i_SSL_SESSION() (in ssl_asn1.c),
which doesn't seem correct to me.
d2i_SSL_SESSION() decodes an ASN.1 encoding of an SSL_SESSION object
previously encoded by i2d_SSL_SESSION(). Various SSL_SESSION fields are
optional, and tags are used to identify which fields are present... so
far, so good. But in two cases, when a field is not present,
d2i_SSL_SESSION() actually sets a value that was not in the original.
Specifically, if 'time' is not present (which means it was 0 when
i2d_SSL_SESSION() looked at it), it is set to the current time(). And if
'timeout' is not present, it is set to 3.
Shouldn't d2i_SSL_SESSION() be returning exactly the session data that was
passed into i2d_SSL_SESSION()?
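For reference, this is roughly what the relevant part of the decode path
looks like. I'm paraphrasing ssl_asn1.c from memory rather than quoting it
verbatim, so take it as a sketch of the logic (the ai/aip/ret names follow
the style used there) and not the exact source:

/* Sketch of the decode logic (paraphrased, not verbatim): the optional
 * [1] 'time' and [2] 'timeout' fields are replaced with freshly-invented
 * defaults when they were absent from the encoding. */
ai.length = 0;
M_ASN1_D2I_get_EXP_opt(aip, d2i_ASN1_INTEGER, 1);
if (ai.data != NULL) {
    ret->time = ASN1_INTEGER_get(aip);
    OPENSSL_free(ai.data);
    ai.data = NULL;
    ai.length = 0;
} else
    ret->time = (unsigned long)time(NULL);  /* not what i2d saw */

ai.length = 0;
M_ASN1_D2I_get_EXP_opt(aip, d2i_ASN1_INTEGER, 2);
if (ai.data != NULL) {
    ret->timeout = ASN1_INTEGER_get(aip);
    OPENSSL_free(ai.data);
    ai.data = NULL;
    ai.length = 0;
} else
    ret->timeout = 3;                       /* likewise invented at decode time */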
This came to my attention because I am working on an embedded device where
OpenSSL is used before the device has had its real-time clock set, which
means time() is returning 0. As a result, ssl3_send_newsession_ticket()
got different ASN.1 sizes for a session encoded with i2d_SSL_SESSION() and
then decoded with d2i_SSL_SESSION(), and an error was returned because of
this check:
if (slen > slen_full) /* shouldn't ever happen */
(because the decoded session now had a 'time' field the original did not
have).
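For anyone who wants to reproduce this outside the server code, below is a
minimal sketch of the same round trip that ssl3_send_newsession_ticket()
performs. It is my own diagnostic helper (check_roundtrip_size() is not an
OpenSSL function), and 'sess' is assumed to come from a live connection,
e.g. via SSL_get_session():

#include <stdio.h>
#include <openssl/crypto.h>
#include <openssl/ssl.h>

/* Round-trip a session through i2d_SSL_SESSION()/d2i_SSL_SESSION() and
 * compare the encoded sizes -- essentially the comparison that
 * ssl3_send_newsession_ticket() makes before building a ticket. */
static int check_roundtrip_size(SSL_SESSION *sess)
{
    int slen_full, slen;
    unsigned char *senc, *p;
    const unsigned char *const_p;
    SSL_SESSION *copy;

    slen_full = i2d_SSL_SESSION(sess, NULL);
    if (slen_full <= 0)
        return -1;

    senc = OPENSSL_malloc(slen_full);
    if (senc == NULL)
        return -1;
    p = senc;
    i2d_SSL_SESSION(sess, &p);

    const_p = senc;
    copy = d2i_SSL_SESSION(NULL, &const_p, slen_full);
    OPENSSL_free(senc);
    if (copy == NULL)
        return -1;

    /* With time() == 0 at encode time, the decoded copy gains a 'time'
     * field the original did not have, so it can re-encode larger. */
    slen = i2d_SSL_SESSION(copy, NULL);
    SSL_SESSION_free(copy);

    if (slen > slen_full) {   /* the "shouldn't ever happen" case */
        fprintf(stderr, "re-encoded session grew: %d -> %d bytes\n",
                slen_full, slen);
        return 1;
    }
    return 0;
}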
While I know this won't affect mainstream Linux/Unix/BSD users, it may
affect other embedded device users. The inconsistency with the 'timeout'
field could affect other people too, though.
Thanks,
Jifl
--
eCosCentric Limited http://www.eCosCentric.com/ The eCos experts
Barnwell House, Barnwell Drive, Cambridge, UK. Tel: +44 1223 245571
Registered in England and Wales: Reg No 4422071.
------["Si fractum non sit, noli id reficere"]------ Opinions==mine