Hello,
I am using parseLdifFile to parse my LDIF file. When the LDIF file is small, it works fine. But with a 500 MB LDIF file containing about 1 million entries, it runs out of memory, even when I assign 6 GB to the Java heap.

Can you please tell me whether I am simply not giving it enough memory (and if so, how much), or whether this library does not support LDIF files this large (and if so, can you point me to a library I should use instead)?
LdifReader reader = new LdifReader();
List<LdifEntry> entries = reader.parseLdifFile("/home/myname/export.ldif");
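In case it helps to know what I have considered: would iterating the reader directly, instead of collecting every entry into a List, avoid holding the whole file in memory? A minimal sketch of what I mean (assuming LdifReader is Iterable over LdifEntry and has a File-based constructor, which the stack trace seems to suggest; the path is just my local one):

```java
import java.io.File;
import org.apache.directory.api.ldap.model.ldif.LdifEntry;
import org.apache.directory.api.ldap.model.ldif.LdifReader;

public class StreamLdif {
    public static void main(String[] args) throws Exception {
        // Iterate entries one at a time instead of materializing all of them.
        try (LdifReader reader = new LdifReader(new File("/home/myname/export.ldif"))) {
            for (LdifEntry entry : reader) {
                // Process each entry here, then let it be garbage-collected.
                System.out.println(entry.getDn());
            }
        }
    }
}
```

If that is the intended way to handle large files, please confirm; otherwise I would appreciate a pointer to the recommended approach.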
It is a HashMap allocation that throws the out-of-memory error. The full stack trace is attached below:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.HashMap.<init>(Unknown Source)
    at java.util.LinkedHashMap.<init>(Unknown Source)
    at java.util.HashSet.<init>(Unknown Source)
    at java.util.LinkedHashSet.<init>(Unknown Source)
    at org.apache.directory.api.ldap.model.entry.DefaultAttribute.<init>(DefaultAttribute.java:57)
    at org.apache.directory.api.ldap.model.entry.DefaultEntry.add(DefaultEntry.java:921)
    at org.apache.directory.api.ldap.model.ldif.LdifEntry.addAttribute(LdifEntry.java:444)
    at org.apache.directory.api.ldap.model.ldif.LdifReader.parseAttributeValue(LdifReader.java:942)
    at org.apache.directory.api.ldap.model.ldif.LdifReader.parseEntry(LdifReader.java:1403)
    at org.apache.directory.api.ldap.model.ldif.LdifReader.nextInternal(LdifReader.java:1755)
    at org.apache.directory.api.ldap.model.ldif.LdifReader.access$100(LdifReader.java:168)
    at org.apache.directory.api.ldap.model.ldif.LdifReader$1.next(LdifReader.java:1859)
    at org.apache.directory.api.ldap.model.ldif.LdifReader$1.next(LdifReader.java:1850)
    at org.apache.directory.api.ldap.model.ldif.LdifReader.parseLdif(LdifReader.java:1911)
    at org.apache.directory.api.ldap.model.ldif.LdifReader.parseLdifFile(LdifReader.java:1652)
    at org.apache.directory.api.ldap.model.ldif.LdifReader.parseLdifFile(LdifReader.java:1616)
    at public_register_not_enrolled.main(public_register_not_enrolled.java:50)