[ https://issues.apache.org/jira/browse/SPARK-1879?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14001451#comment-14001451 ]
Matei Zaharia edited comment on SPARK-1879 at 5/19/14 7:25 AM:
---------------------------------------------------------------
BTW the warning on Java 8 is the following:
{code}
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m;
support was removed in 8.0
{code}
Not *that* scary. I think we can document it, live with it for a bit, and
possibly test java -version in a later release.
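For illustration, here's a minimal sketch of that version test (hypothetical,
not Spark's actual launcher logic; the object name is made up). It reads the
java.specification.version property, which is "1.7" on Java 7 and "1.8" on
Java 8, and only emits the flag when the JVM still has a PermGen:
{code}
object PermGenFlag {
  def main(args: Array[String]): Unit = {
    // "java.specification.version" is "1.7" on Java 7, "1.8" on Java 8,
    // and a plain "9", "11", ... on Java 9 and later.
    val spec = System.getProperty("java.specification.version")
    val major =
      if (spec.startsWith("1.")) spec.stripPrefix("1.").takeWhile(_.isDigit).toInt
      else spec.takeWhile(_.isDigit).toInt
    // PermGen, and with it MaxPermSize, was removed in Java 8.
    val opts = if (major < 8) Seq("-XX:MaxPermSize=128m") else Seq.empty
    println(s"extra JVM options: ${opts.mkString(" ")}")
  }
}
{code}
The real check would have to run in the launch scripts before the JVM starts,
but the parsing is the same idea.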
> Default PermGen size too small when using Hadoop2 and Hive
> ----------------------------------------------------------
>
> Key: SPARK-1879
> URL: https://issues.apache.org/jira/browse/SPARK-1879
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Reporter: Matei Zaharia
> Assignee: Matei Zaharia
> Priority: Critical
> Fix For: 1.0.0
>
>
> If you launch a spark-shell with Hadoop 2 and Hive on the classpath and try
> to use Hive in it, PermGen usage quickly reaches 85 MB after a few commands,
> at which point the JVM gives up and freezes. We should pass a -XX:MaxPermSize
> option to prevent this. Unfortunately, passing it produces a warning on Java 8,
> but that's still better than not passing it.
> I don't think this affects anything other than the shell; it's just the
> combination of the Scala compiler + Hive + Hadoop 2 that pushes things over the
> edge.
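> As a rough way to see the pressure described above (a hedged sketch, not Spark
> code; the object name is invented), the standard management beans expose the
> PermGen pool from inside the same JVM:
> {code}
> import java.lang.management.ManagementFactory
> import scala.collection.JavaConverters._
>
> object PermGenWatcher {
>   def main(args: Array[String]): Unit = {
>     // On Java 7 the pool is named e.g. "PS Perm Gen" or "CMS Perm Gen";
>     // on Java 8 nothing matches, since PermGen was replaced by Metaspace.
>     ManagementFactory.getMemoryPoolMXBeans.asScala
>       .filter(_.getName.toLowerCase.contains("perm"))
>       .foreach { pool =>
>         val usage = pool.getUsage
>         val usedMb = usage.getUsed / (1024 * 1024)
>         val maxStr =
>           if (usage.getMax < 0) "unbounded"
>           else s"${usage.getMax / (1024 * 1024)} MB"
>         println(s"${pool.getName}: ${usedMb} MB used, max ${maxStr}")
>       }
>   }
> }
> {code}
> Polling this while running a few Hive commands in the shell shows usage
> climbing toward the default cap, which is what motivates setting
> -XX:MaxPermSize explicitly.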