Has any thought been given to splitting off the OpenSSL non-SSL BIO (and
possibly ASN.1) code into a separate library?
The reason for doing this, obviously, is to make it easy to write
applications that can use the BIO routines without requiring the f
Le Saux, Eric wrote:
I am trying to understand why ZLIB is being used that way. Here is what
gives better results on a continuous reliable stream of data:
1) You create a z_stream for sending, and another z_stream for
receiving.
2) You call deflateInit() and inflateInit() on the
quire
an unencrypted private key or password kept elsewhere on the system.
Access to these functions would be restricted using the normal
database access control mechanism.
--
Bear Giles
bgiles (at) coyotesong (dot) com
__
A quick (and probably dumb) question - a lot of sites are using
DNS-style distinguished names for their LDAP and PKI infrastructure
now. Any reason why it's not in OpenSSL 0.9.6/CURRENT?
A DNS-style DN is
/CN=Bob Smith/DC=example/DC=com
("Bob Smith" at example.com, using LDAP order) instead of
Oops. The information *is* in , even if it's unused.
But again, shouldn't this be in crypto/asn1/a_strnid.c (and elsewhere)
so it's recognized by default?
__
OpenSSL Project http://www.openssl.org
t_default = com
2.domainComponent_min = 2
2.domainComponent_max = 4 # no longer 3 due to .info
This produces a DN that looks like
/CN=Bear Giles/OU=Persons/DC=example/DC=com
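For context, a fuller fragment of the req-style configuration the lines above appear to come from might look like this (the section name and defaults are my guesses at the surrounding context, not quoted from the thread):

```
[ req_distinguished_name ]
commonName                 = Common Name
organizationalUnitName     = Organizational Unit
0.domainComponent          = Second-level domain component
0.domainComponent_default  = example
1.domainComponent          = Top-level domain component
1.domainComponent_default  = com
1.domainComponent_min      = 2
1.domainComponent_max      = 4   # no longer 3 due to .info
```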
Before someone asks the obvious question :-), I didn't try
this first since I'm working on a databas
> bear> NID_domainComponent. So I'm still not sure that these tables
> bear> can be used to validate the input to these routines.
>
> Do I get it right, you're after having the string length limits and
> possibly the allowed string types for DC and more in that table?
What I'm ultimately trying
I came across a minor nit in the EVP_DecodeBlock() logic.
As I understand the RFC for base64 encoding, characters outside
of the specified character range (and whitespace characters in
particular) should be ignored. Unfortunately, after stripping
off the leading and trailing whitespace this routi
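A sketch of the behavior Bear is arguing for (an illustrative helper, not OpenSSL code): drop every byte outside the base64 alphabet wherever it occurs in the buffer, not just at the ends, before handing the result to the block decoder.

```c
#include <string.h>

/* Copy only base64-alphabet bytes (and '=' padding) from 'in' to 'out',
 * silently skipping whitespace and any other out-of-range characters.
 * Returns the number of bytes written; 'out' must hold at least inlen. */
static unsigned b64_strip(const char *in, unsigned inlen, char *out)
{
    static const char alpha[] =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"
        "0123456789+/=";
    unsigned n = 0;
    for (unsigned i = 0; i < inlen; i++) {
        /* guard against in[i] == '\0', which strchr would "find" */
        if (in[i] != '\0' && strchr(alpha, in[i]) != 0)
            out[n++] = in[i];
    }
    return n;
}
```

The filtered buffer is then a clean multiple-of-four block that the decoder can process without tripping over embedded newlines or spaces.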
parsed to the minute, not to the second,
and are presented as "abstime", not "datetime."
Future enhancements:
1) Make it possible to compare x509_name and asn1_integer objects
directly.
2) Add all arithmetic functions for asn1_integer.
Export stuff:
1) A copy of this noti
First, the serious stuff. Version 0.2 of my libpkixpq library is up at
http://www.dimensional.com/~bgiles. It mostly renames asn1_integer to
hugeint and x509_name to principal, and adds a slew of operators to each
type. This should make it possible to create indices on either type,
although
A request for some additional hashes I would submit some
patches myself, but this stuff is so simple it would probably
take longer to verify my patches than to code them directly. :-)
The hashes are mentioned in draft-ietf-pkix-certstore-http-00.txt,
available at http://www.imc.org/draft-ie
> I'll dig out the code. It was largely based around the PKCS#11
> functionality but with an OpenSSL flavour. That is you have a load of
> objects each of which is a set of attributes. You can then lookup based
> on exact matches of each attribute.
This is "query by example." It has some benefit
> > One classic approach is to have all lookup functions return a
> > list of unique keys. The caller then requests each object individually
> > via a lookup that guarantees uniqueness. Uniqueness is easy to guarantee
> > on any hashed or relational store - make it the primary key!
>
> An earli
> Not true. I've searched on the hash of the certificate when we are
> producing certificates that must maintain privacy and therefore have
> garbage in the Issuer and Subject fields.
Maybe I'm just dense, but I don't get this. If you simply mask the
issuer and subject fields, e.g., using a SN
> > Issuer and subject number should also be unique, and it's a common
> > search pattern. I don't think anyone searches on the hash of the
> > entire certificate.
>
> It should be unique but it might not be, either by accident or malicious
> intent.
This indirectly raises a question we've been
> To avoid duplication of code I'd say such concerns should be addressed
> either at the application level or on top of whatever OpenSSL plugin API
> is adopted.
I think that would be a serious mistake. I'm specifically thinking
of something like the CA cert repository/JSP code in my postgresql
> I can imagine that one might get the same certificate
> from several sources, but I'm pretty sure it could be resolved by
> applying a little bit of automagic intelligence and tossing all
> duplicates except for the copy that has the highest trust attached to
> it.
I was assuming this would be
> Nothing. The trust settings aren't part of the certificate encoding. The
> current trust handling stores these after the main encoding only if the
> *TRUST() functions are used.
As an aside my postgresql stuff currently uses the standard X509 routines
when converting from internal to external f
> Bear Giles wrote:
> >
> > Of course, this opens the whole can-o-worms of "what constitutes
> > a duplicate cert?" Is it an exact match, or matching I+SN, or
> > some other criteria?
>
> There are some cases where only an exact match is accepta
> Bear Giles wrote:
> >
> > > To avoid duplication of code I'd say such concerns should be addressed
> > > either at the application level or on top of whatever OpenSSL plugin API
> > > is adopted.
> >
> > I think that would be a serious mist
> Richard Levitte - VMS Whacker wrote:
> > From: Bear Giles <[EMAIL PROTECTED]>
> >
> > bear> Of course, this opens the whole can-o-worms of "what constitutes
> > bear> a duplicate cert?" Is it an exact match, or matching I+SN, or
> >
> So how would protection against such "bad data" be done at the database
> level?
If you know it's a CA cert repository, you can require that the issuer
already be in the database.
If this is a deferrable constraint, you can still add root certs (e.g., to
populate the database.) But if you remove
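The issuer-must-exist rule might be sketched in PostgreSQL like this (table and column names are invented for illustration): a self-referencing foreign key from issuer to subject, made deferrable so a transaction can load a chain in any order, with the check run at commit.

```sql
-- Illustrative CA-repository schema, not from the thread.
CREATE TABLE ca_certs (
    subject_dn  text  PRIMARY KEY,
    issuer_dn   text  NOT NULL,
    der         bytea NOT NULL,
    -- every issuer must already be present in the repository
    FOREIGN KEY (issuer_dn) REFERENCES ca_certs (subject_dn)
        DEFERRABLE INITIALLY DEFERRED
);

-- A self-signed root satisfies the constraint because
-- subject_dn = issuer_dn refers back to its own row.
```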
I've been looking at the Java KeyStore API (which reflects the work of
other bright people looking at the same problem) and have a revised
schema and proposed API.
SECOND PROPOSAL
---
The primary key is an opaque string henceforth known as the "alias".
The plugin may treat this as
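A hypothetical sketch of what such a plug-in interface might look like (all names invented, not from the proposal): searches return aliases rather than objects, and a second call fetches exactly one object by its unique alias. A toy in-memory backend is included so the shape can be exercised.

```c
#include <string.h>

/* Hypothetical plug-in vtable: aliases in, one object out per alias. */
typedef struct keystore_plugin_st {
    int iface_version;
    /* Fill 'aliases' with up to 'max' matches for 'pattern'; return count. */
    int (*find)(const char *pattern, const char **aliases, int max);
    /* Look up exactly one object by alias; NULL if absent. */
    const char *(*get)(const char *alias);
} KEYSTORE_PLUGIN;

/* Toy backend with a single hard-wired entry, purely for illustration. */
static const char *toy_get(const char *alias)
{
    if (strcmp(alias, "rootca") == 0)
        return "<DER bytes for rootca>";
    return 0;
}

static int toy_find(const char *pattern, const char **aliases, int max)
{
    (void)pattern;                 /* toy backend matches everything */
    if (max < 1) return 0;
    aliases[0] = "rootca";
    return 1;
}

static const KEYSTORE_PLUGIN toy_plugin = { 1, toy_find, toy_get };
```

The two-step find-then-get split is what keeps the "duplicate cert" question out of the lookup path: find() may return several aliases, but get() is always unambiguous.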
> As primary keys go, I'm certain that whole-cert hashes would come in
> quite handy.
It's a mindset thing. A key should lead you to more information.
Give me your name, I'll tell you your phone number and home address.
Read off the tag on the back of the car and I'll tell you the make
and mod
> In the full-blown PKI, you will also have things
> like "fetch me the cert corresponding to this name" and "fetch me the
> key (or a handle to the key) with this fingerprint".
Remember that there are actually two independent pieces of code here -
a "tab A" independent shared library and a "slot
> About the certificate and the chain: the chain will add a lot of
> redundant information, at least as I see it, since each component of
> the chain will most obviously be in separate certificate entries.
I tend to agree, but wanted to start close to any model API first.
Makes it easier to expla
Here's a quick prototype implementation of a typical backend.
The wrapper isn't shown - that's the code common to all plug-ins
that's located in the static part of the library, e.g., breaking a
cert chain into multiple certs, or reconstructing that cert chain.
The dynamic part is much more datab
> That, or have the plug-ins keep a very sharp eye on the version of the
> loading OpenSSL to avoid version clashes (something we do with engines
> today, sadly).
Other way around - the framework has to pay close attention to the
version of the plugin. I think the only dynamic library system tha
> bear> In contrast, with a plug-in the framework itself acts as the loader,
> bear> and it must detect missing symbols in older shared libraries. Depending
> bear> on the way it's hooked, it either has to resolve each symbol individually
> bear> or a single structure and then check the version nu
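The "single structure" variant might look like this (a sketch with invented names): the framework resolves one well-known descriptor symbol from the plug-in and checks its declared interface version before touching anything else in it.

```c
/* What this framework build understands; illustrative value. */
#define FRAMEWORK_IFACE_VERSION 2

struct plugin_desc {
    int iface_version;   /* set by the plug-in at build time */
    /* function pointers would follow; valid only if the check passes */
};

/* Accept versions the framework knows how to drive; reject anything newer,
 * since a newer plug-in may rely on members this framework never defined. */
static int plugin_usable(const struct plugin_desc *d)
{
    return d != 0
        && d->iface_version >= 1
        && d->iface_version <= FRAMEWORK_IFACE_VERSION;
}
```

This is the inversion Bear describes: the framework, acting as loader, does the compatibility check, rather than the plug-in checking the framework.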
> On Unix as well as on VMS and others, it's
> often possible to upgrade a shareable library to a newer version
> without breaking the programs using it.
Of course.
> This requires the API to stay
> the same except for added functions, and for types and structures to
> never change (except for a
> > Are you in the US BTW? If so, can you resend your patch with a CC: to
> > [EMAIL PROTECTED]
Is that the preferred address now, instead of [EMAIL PROTECTED]?
I've tried checking the bxa.doc.gov website, but it's aimed at
commercial exporters instead of OSS exporters.
I remember mentioning this a while back, but don't think anything
ever came from it.
Are there any plans to add convenience functions for the hashes
specified in draft-ietf-pkix-certstore-http? (This proposed
document provides some implementation details for RFC2585, and
basically maps a URL of
> I'm developing an OpenSSL-based SSL sniffer that monitors decrypted
> SSL traffic using the webserver's private keys on real site traffic
> (similar to ssldump). For some reasons, part of the SSL traffic is
> not being decrypted.
>
> I'm looking for possible reasons for this. The ones I am
Scott Harris wrote:
> I need some help to change the Certificate I generated using Microsoft
> Certificate Server in *DER* format to *DB* format for
> use with the Netscape API. Anybody know *how to convert a .DER certificate
> to .DB*? Any tools that can do that...
There's not
Is there a way to programmatically obtain a list of available ciphers,
digests and algorithms? I looked at the header files, but may have
overlooked something.
Bear
Dr. Stephen Henson wrote:
On Sun, Jul 30, 2006, Girish Venkatachalam wrote:
--- Bear Giles <[EMAIL PROTECTED]> wrote:
Is there a way to programmatically obtain a list of
available ciphers,
digests and algorithms? I looked at the header
files, but may have
overlooked som