[ https://issues.apache.org/jira/browse/MAHOUT-1456?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13936259#comment-13936259 ]

mahmood commented on MAHOUT-1456:
---------------------------------

This is the stack trace from Hadoop 2.1.0-beta, which is the closest version to 
1.2.1. If you want 2.3.0, let me know.

$ mahout wikipediaXMLSplitter -d ../enwiki-latest-pages-articles.xml -o wikipedia/chunks -c 64
Running on hadoop, using /export/home/hadoop/hadoop-2.1.0-beta/bin/hadoop and HADOOP_CONF_DIR=
MAHOUT-JOB: /export/home/hadoop/mahout-distribution-0.9/mahout-examples-0.9-job.jar
14/03/15 16:52:06 WARN driver.MahoutDriver: No wikipediaXMLSplitter.props found on classpath, will use command-line arguments only
14/03/15 16:52:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:2367)
        at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130)
        at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:114)
        at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:415)
        at java.lang.StringBuilder.append(StringBuilder.java:132)
        at org.apache.mahout.text.wikipedia.WikipediaXmlSplitter.main(WikipediaXmlSplitter.java:208)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
        at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:152)
        at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
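
The trace ends in StringBuilder.append() invoked from WikipediaXmlSplitter.main() (WikipediaXmlSplitter.java:208), so the client-side JVM dies while growing an in-memory buffer, before any MapReduce job is submitted. Below is a rough sketch of the kind of pattern that would produce exactly this trace. It is not the actual Mahout code, and every name in it (ChunkBufferSketch, chunkBuffer, CHUNK_SIZE_BYTES, fakeDump) is made up for illustration:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

/**
 * Minimal sketch (NOT the actual WikipediaXmlSplitter code) of the pattern
 * the stack trace points at: the current chunk is collected in a
 * StringBuilder and only written out when a page boundary is seen.
 */
public class ChunkBufferSketch {

  // "-c 64" asks for 64 MB chunks.
  private static final int CHUNK_SIZE_BYTES = 64 * 1024 * 1024;

  public static void main(String[] args) throws IOException {
    // Stand-in for the decompressed enwiki dump stream.
    BufferedReader dump = new BufferedReader(new StringReader(fakeDump()));

    StringBuilder chunkBuffer = new StringBuilder();
    String line;
    while ((line = dump.readLine()) != null) {
      chunkBuffer.append(line).append('\n');

      // Flush only at a page boundary once the chunk target is reached.
      // If "</page>" is never matched (a huge page, truncated or unexpected
      // input), this branch never runs, chunkBuffer grows without bound,
      // and append() eventually throws the OutOfMemoryError shown above,
      // no matter how large -Xmx is.
      if (line.contains("</page>") && chunkBuffer.length() >= CHUNK_SIZE_BYTES) {
        flushChunk(chunkBuffer);
      }
    }
    if (chunkBuffer.length() > 0) {
      flushChunk(chunkBuffer); // write the trailing partial chunk
    }
  }

  private static void flushChunk(StringBuilder buffer) {
    // A real splitter would write the buffer to a chunk-NNNN.xml file here.
    System.out.println("flushed chunk of " + buffer.length() + " chars");
    buffer.setLength(0); // drop the accumulated text for the next chunk
  }

  private static String fakeDump() {
    // Tiny stand-in input so the sketch runs without the multi-GB dump.
    return "<page>\n  <title>Example</title>\n  <text>...</text>\n</page>\n";
  }
}

If the real splitter follows a similar shape, the interesting question is why chunk #571 never reaches its flush condition (an unusually large page, a marker that is never matched, or truncated input); that would explain both the 30-minute stall and why a larger -Xmx only delays the failure.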


> The wikipediaXMLSplitter example fails with "heap size" error
> -------------------------------------------------------------
>
>                 Key: MAHOUT-1456
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1456
>             Project: Mahout
>          Issue Type: Bug
>          Components: Examples
>    Affects Versions: 0.9
>         Environment: Solaris 11.1, Hadoop 2.3.0, Maven 3.2.1, JDK 1.7.0_07-b10
>            Reporter: mahmood
>              Labels: Heap, mahout, wikipediaXMLSplitter
>
> 1- The XML file is 
> http://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2
> 2- When I run "mahout wikipediaXMLSplitter -d 
> enwiki-latest-pages-articles.xml -o wikipedia/chunks -c 64", it gets stuck at 
> chunk #571 and, after 30 minutes, fails with the Java heap space error. 
> Previous chunks are created rapidly (10 chunks per second).
> 3- Increasing the heap size via the "-Xmx4096m" option doesn't help.
> 4- No matter what the configuration is, there seems to be a memory leak 
> that eats all available space.
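
Regarding point 3 above: since the example is launched through the hadoop wrapper ("Running on hadoop, using .../bin/hadoop"), it is worth confirming that -Xmx4096m ever reaches the client JVM at all. My understanding is that heap settings for that JVM normally have to go through the launcher environment (for example HADOOP_CLIENT_OPTS, or MAHOUT_HEAPSIZE for bin/mahout) rather than a bare -Xmx on the mahout command line, but I may be wrong about the exact variable. A trivial, Mahout-independent check of what the JVM actually got:

/**
 * Tiny check (not part of Mahout): prints the max heap the running JVM
 * actually received. Run it through the same launcher and options used
 * for the splitter.
 */
public class PrintMaxHeap {
  public static void main(String[] args) {
    long maxBytes = Runtime.getRuntime().maxMemory();
    System.out.println("Effective max heap: " + (maxBytes >> 20) + " MB");
  }
}

If this reports something other than roughly 4096 MB, the flag never took effect, which would explain why raising it made no difference.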



--
This message was sent by Atlassian JIRA
(v6.2#6252)
