It might need more memory in certain situations or when running certain
tests. If 3gb works for your relatively full build, then yes, you can open
a PR to change any occurrences of lower recommendations to 3gb.
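For reference, a sketch of the adjusted invocation (same profiles and flags as in Ben's original command below; the build command itself is shown commented out since it assumes a Spark source checkout):

```shell
# Give Maven a 3 GB heap before building, per the advice above.
# Note: -XX:MaxPermSize only applies on Java 7 and earlier; Java 8+ ignores it.
export MAVEN_OPTS="-Xmx3g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
echo "$MAVEN_OPTS"

# Then re-run the same distribution build from the Spark source root:
# ./make-distribution.sh --name continuum-custom-spark-1.5 --tgz -Pyarn \
#   -Phive -Phive-thriftserver -Phadoop-2.4 -Dhadoop.version=2.4.0
```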

On Tue, Sep 8, 2015 at 3:02 PM, Benjamin Zaitlen <quasi...@gmail.com> wrote:
> Ah, right.  Should've caught that.
>
> The docs seem to recommend 2gb.  Should that be increased as well?
>
> --Ben
>
> On Tue, Sep 8, 2015 at 9:33 AM, Sean Owen <so...@cloudera.com> wrote:
>>
>> It shows you there that Maven is out of memory. Give it more heap. I use
>> 3gb.
>>
>> On Tue, Sep 8, 2015 at 1:53 PM, Benjamin Zaitlen <quasi...@gmail.com>
>> wrote:
>> > Hi All,
>> >
>> > I'm trying to build a distribution off of the latest in master, and I
>> > keep getting errors on MQTT and the build fails. I'm running the build
>> > on an m1.large, which has 7.5 GB of RAM, with no other major processes
>> > running.
>> >
>> >> MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
>> >> ./make-distribution.sh  --name continuum-custom-spark-1.5 --tgz -Pyarn
>> >> -Phive -Phive-thriftserver -Phadoop-2.4 -Dhadoop.version=2.4.0
>> >
>> >
>> >
>> >> [INFO] Spark Project GraphX ............................... SUCCESS [ 33.345 s]
>> >> [INFO] Spark Project Streaming ............................ SUCCESS [01:08 min]
>> >> [INFO] Spark Project Catalyst ............................. SUCCESS [01:39 min]
>> >> [INFO] Spark Project SQL .................................. SUCCESS [02:06 min]
>> >> [INFO] Spark Project ML Library ........................... SUCCESS [02:16 min]
>> >> [INFO] Spark Project Tools ................................ SUCCESS [  4.087 s]
>> >> [INFO] Spark Project Hive ................................. SUCCESS [01:28 min]
>> >> [INFO] Spark Project REPL ................................. SUCCESS [ 16.291 s]
>> >> [INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 13.671 s]
>> >> [INFO] Spark Project YARN ................................. SUCCESS [ 20.554 s]
>> >> [INFO] Spark Project Hive Thrift Server ................... SUCCESS [ 14.332 s]
>> >> [INFO] Spark Project Assembly ............................. SUCCESS [03:33 min]
>> >> [INFO] Spark Project External Twitter ..................... SUCCESS [ 14.208 s]
>> >> [INFO] Spark Project External Flume Sink .................. SUCCESS [ 11.535 s]
>> >> [INFO] Spark Project External Flume ....................... SUCCESS [ 19.010 s]
>> >> [INFO] Spark Project External Flume Assembly .............. SUCCESS [  5.210 s]
>> >> [INFO] Spark Project External MQTT ........................ FAILURE [01:10 min]
>> >> [INFO] Spark Project External MQTT Assembly ............... SKIPPED
>> >> [INFO] Spark Project External ZeroMQ ...................... SKIPPED
>> >> [INFO] Spark Project External Kafka ....................... SKIPPED
>> >> [INFO] Spark Project Examples ............................. SKIPPED
>> >> [INFO] Spark Project External Kafka Assembly .............. SKIPPED
>> >> [INFO] ------------------------------------------------------------------------
>> >> [INFO] BUILD FAILURE
>> >> [INFO] ------------------------------------------------------------------------
>> >> [INFO] Total time: 22:55 min
>> >> [INFO] Finished at: 2015-09-07T22:42:57+00:00
>> >> [INFO] Final Memory: 240M/455M
>> >> [INFO] ------------------------------------------------------------------------
>> >> [ERROR] GC overhead limit exceeded -> [Help 1]
>> >> [ERROR]
>> >> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>> >> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> >> [ERROR]
>> >> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>> >> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
>> >> + return 1
>> >> + exit 1
>> >
>> >
>> > Any thoughts would be extremely helpful.
>> >
>> > --Ben
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
