Easy...

0. If possible, create a new instance, keeping the original intact until all records have been transferred.
1. Take the unique list of cnames, read it through a simple ldapsearch loop, and pull all those records into an LDIF (see the sketch below).
2. Import the LDIF into the new database.
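Something like this, as a rough sketch of steps 1 and 2 using Net::LDAP and Net::LDAP::LDIF; the host, bind credentials, base DN, file names, and the "cname" attribute name are placeholders, so adjust them for the real directory:

#!/usr/bin/perl
# Sketch of steps 1 and 2: for each unique cname, fetch one entry from the
# old directory and append it to an LDIF file for loading into the new one.
# Host, bind DN, password, base DN, file names and the "cname" attribute
# are placeholders; substitute the real values.
use strict;
use warnings;
use Net::LDAP;
use Net::LDAP::LDIF;
use Net::LDAP::Util qw(escape_filter_value);

my $ldap = Net::LDAP->new('ldap://ldap.example.com')
    or die "connect failed: $@";
my $mesg = $ldap->bind('cn=admin,dc=example,dc=com', password => 'secret');
die 'bind failed: ' . $mesg->error if $mesg->code;

# uniq-cnames.txt holds the sorted, uniqued cname values from the dump.
open my $names, '<', 'uniq-cnames.txt' or die "uniq-cnames.txt: $!";
my $ldif = Net::LDAP::LDIF->new('export.ldif', 'w', onerror => 'die');

while (my $cname = <$names>) {
    chomp $cname;
    next unless length $cname;

    my $result = $ldap->search(
        base   => 'dc=example,dc=com',
        scope  => 'sub',
        filter => '(cname=' . escape_filter_value($cname) . ')',
    );
    if ($result->code) {
        warn "search for $cname failed: " . $result->error;
        next;
    }

    # Keep only the first entry returned per cname; any further duplicates
    # are deliberately dropped.
    my $entry = $result->entry(0);
    $ldif->write_entry($entry) if $entry;
}

$ldif->done;
$ldap->unbind;

The resulting export.ldif can then be loaded into the new instance (step 2) with a standard tool such as ldapadd or slapadd, or by reading it back with Net::LDAP::LDIF and calling $ldap->add on each entry.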
Richard

-----Original Message-----
From: Stuart Cracraft [mailto:scracr...@apple.com]
Sent: Thursday, October 13, 2011 7:38 AM
To: perl-ldap@perl.org
Subject: ldap question

Hi,

I have an LDAP directory with a very large number of records (some possibly duplicated in their entirety, or partially as superset/subset) which I would like to condense, repair, and correct insofar as the individual subrecords/fields within each record are concerned.

The format of this LDAP directory, when dumped, is a set of millions of rows of data which, when sorted and uniqued on the cname, results in a small fraction of the original total (0.00746% to be exact), though whether the duplicates themselves have the same fields is another matter entirely. The records are in XML format and consist of key/value pairs.

My suspicion is that this directory has never been properly maintained, so I have some questions:

+ what are the accepted ways, via automation, to maintain this directory
+ what methods or code exist to condense and verify a hitherto-unmaintained LDAP directory
+ how to simplify the data down to the bare-bones number of records, discarding the others after making a general full backup

So what I am asking for is a general set of already-written Perl tools using Net::LDAP which deal with LDAP directories intelligently.

--Stuart
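For the "condense and verify" part, here is a minimal illustrative sketch (not an existing packaged tool) that tallies how many entries share each cname in an LDIF dump of the directory. The dump file name and the "cname" attribute are assumptions about the schema:

#!/usr/bin/perl
# Illustrative sketch: count how many entries share each cname in an LDIF
# dump, to see where the duplicates are before condensing the directory.
# The dump file name and the "cname" attribute are assumptions.
use strict;
use warnings;
use Net::LDAP::LDIF;

my $ldif = Net::LDAP::LDIF->new('dump.ldif', 'r', onerror => 'warn');
my %count;
my $entries = 0;

until ($ldif->eof) {
    my $entry = $ldif->read_entry or next;
    $entries++;
    my $cname = $entry->get_value('cname');
    $count{$cname}++ if defined $cname;
}
$ldif->done;

printf "%d entries read, %d unique cnames\n", $entries, scalar keys %count;

# List cnames that occur more than once, worst offenders first.
for my $cname (sort { $count{$b} <=> $count{$a} } keys %count) {
    last if $count{$cname} < 2;
    printf "%6d  %s\n", $count{$cname}, $cname;
}

Any cname that shows up more than once in the output is a candidate for review before doing the single-entry-per-cname export described above.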