[ https://issues.apache.org/jira/browse/OAK-8204?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16829176#comment-16829176 ]

Marcel Reutegger commented on OAK-8204:
---------------------------------------

I'm not able to reproduce with the following code:
{noformat}
    @Test
    public void addNodes() throws Exception {
        File testFile = new File("dummy.pdf");
        DocumentNodeStore ns = newMongoDocumentNodeStoreBuilder()
                .setMongoDB("mongodb://localhost:27017/oak", "oak", 0)
                .build();
        Repository repo = new Jcr(new Oak(ns)).createRepository();

        Session session = repo.login(new SimpleCredentials("admin", "admin".toCharArray()));
        Node root = session.getRootNode();
        for (int i = 1; i <= 10; i++) {
            String folderName = "J" + i;
            if (root.hasNode(folderName)) {
                throw new Exception(folderName + " node already exists!");
            }
            Node node = root.addNode(folderName, JcrConstants.NT_FOLDER);
            session.save();
            System.out.println("==== created folder " + folderName + " ====");

            for (int j = 1; j <= 200000; j++) {
                String docName = folderName + "_" + j + ".pdf";
                // try-with-resources ensures the file stream is closed on every iteration
                try (InputStream in = new FileInputStream(testFile)) {
                    Node docNode = JcrUtils.putFile(node, docName, "application/pdf", in);
                    session.save();
                    System.out.println("Created " + docNode.getPath());
                }
            }
        }
    }
{noformat}

Can you please check whether there is a memory leak in the code you omitted after 
{{// add content node}}? You may also want to create a heap dump when the JVM 
runs out of memory and analyze the dump.
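For reference, the JVM can write a heap dump automatically at the moment the {{OutOfMemoryError}} is thrown. The jar name and dump path below are placeholders, adjust them for your setup:

{noformat}
# produce an .hprof file when an OutOfMemoryError occurs
java -XX:+HeapDumpOnOutOfMemoryError \
     -XX:HeapDumpPath=/tmp/oak-oom.hprof \
     -jar your-app.jar

# alternatively, take a dump of live objects from a running JVM by pid
jmap -dump:live,format=b,file=/tmp/oak-oom.hprof <pid>
{noformat}

The resulting dump can then be opened with Eclipse Memory Analyzer (MAT) or VisualVM to see which objects retain most of the heap.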

> OutOfMemoryError: create nodes
> ------------------------------
>
>                 Key: OAK-8204
>                 URL: https://issues.apache.org/jira/browse/OAK-8204
>             Project: Jackrabbit Oak
>          Issue Type: Bug
>          Components: core
>    Affects Versions: 1.10.2
>            Reporter: zhouxu
>            Priority: Major
>
> Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
>  at java.lang.AbstractStringBuilder.<init>(AbstractStringBuilder.java:68)
>  at java.lang.StringBuilder.<init>(StringBuilder.java:101)
>  at org.apache.jackrabbit.oak.commons.PathUtils.concat(PathUtils.java:318)
>  at org.apache.jackrabbit.oak.plugins.document.DocumentNodeState.getChildNodeDoc(DocumentNodeState.java:507)
>  at org.apache.jackrabbit.oak.plugins.document.DocumentNodeState.hasChildNode(DocumentNodeState.java:256)
>  at org.apache.jackrabbit.oak.plugins.memory.MutableNodeState.setChildNode(MutableNodeState.java:111)
>  at org.apache.jackrabbit.oak.plugins.memory.MemoryNodeBuilder.setChildNode(MemoryNodeBuilder.java:343)
>  at org.apache.jackrabbit.oak.plugins.document.AbstractDocumentNodeBuilder.setChildNode(AbstractDocumentNodeBuilder.java:56)
>  at org.apache.jackrabbit.oak.spi.state.ApplyDiff.childNodeAdded(ApplyDiff.java:80)
>  at org.apache.jackrabbit.oak.plugins.memory.ModifiedNodeState.compareAgainstBaseState(ModifiedNodeState.java:412)
>  at org.apache.jackrabbit.oak.spi.state.ApplyDiff.childNodeChanged(ApplyDiff.java:87)
>  at org.apache.jackrabbit.oak.plugins.memory.ModifiedNodeState.compareAgainstBaseState(ModifiedNodeState.java:416)
>  at org.apache.jackrabbit.oak.spi.state.ApplyDiff.childNodeChanged(ApplyDiff.java:87)
>  at org.apache.jackrabbit.oak.plugins.memory.ModifiedNodeState.compareAgainstBaseState(ModifiedNodeState.java:416)
>  at org.apache.jackrabbit.oak.spi.state.ApplyDiff.childNodeChanged(ApplyDiff.java:87)
>  at org.apache.jackrabbit.oak.plugins.memory.ModifiedNodeState.compareAgainstBaseState(ModifiedNodeState.java:416)
>  at org.apache.jackrabbit.oak.plugins.document.ModifiedDocumentNodeState.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)