ed. You should set
> MaxPermSize if anything, not PermSize. However the error indicates you are
> not using Java 8 everywhere on your cluster, and that's a potentially
> bigger problem.
>
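The point above is that a PermGen error cannot come from a Java 8 JVM (PermGen was removed in JDK 8), so at least one node must be running an older runtime. A minimal sketch for spotting that from a `java -version` banner string (the helper name and sample version strings are mine, not from the thread):

```shell
# Hypothetical helper: classify a JVM version string as pre-Java-8
# (which still has PermGen) or Java 8+ (which does not).
# Assumes the legacy 1.x version scheme used through Java 8.
is_pre_java8() {
  # $1 is the version string, e.g. "1.7.0_80" or "1.8.0_102"
  case "$1" in
    1.[0-7].*) echo "pre-8" ;;
    *)         echo "8+" ;;
  esac
}

is_pre_java8 "1.7.0_80"   # prints: pre-8
is_pre_java8 "1.8.0_102"  # prints: 8+
```

Running that check against the banner reported by each node would show which host still has the older JVM.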
> On Thu, Oct 13, 2016 at 10:26 AM Shady Xu wrote:
>
>> Solved the problem by
to specify them when submitting the
Spark job, which is weird. I don't know whether it has anything to do with
py4j as I am not familiar with it.
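For reference, one way to pass such JVM options at submit time looks like the sketch below. The exact options from the original fix are elided upthread; `-XX:MaxPermSize` and the app name `app.py` are my assumptions, and note that `-XX:MaxPermSize` only has an effect on Java 7 and earlier, since Java 8 removed PermGen entirely (matching the advice above).

```shell
# Sketch only: build the submit command as a string and echo it, so this
# runs anywhere. --driver-java-options and spark.executor.extraJavaOptions
# are real spark-submit knobs; the PermGen flag and app.py are assumptions.
SUBMIT_CMD="spark-submit \
  --driver-java-options -XX:MaxPermSize=512m \
  --conf spark.executor.extraJavaOptions=-XX:MaxPermSize=512m \
  app.py"

echo "$SUBMIT_CMD"
```

The driver option must go on the command line (or in `spark-defaults.conf`) rather than be set inside the application, because the driver JVM has already started by the time application code runs.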
2016-10-13 17:00 GMT+08:00 Shady Xu :
> Hi,
>
> I have a problem when running Spark SQL by PySpark on Java 8. Below is the
> log.
Hi,
I have a problem when running Spark SQL by PySpark on Java 8. Below is the
log.
16/10/13 16:46:40 INFO spark.SparkContext: Starting job: sql at
NativeMethodAccessorImpl.java:-2
Exception in thread "dag-scheduler-event-loop"
java.lang.OutOfMemoryError: PermGen space
at java.lang.Class
tory /logs, but then it reported /bin/yarn
not found. It seems the installation and configuration of the CDH distribution
differ from the Apache one.
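On the CDH vs. Apache layout point: a parcel-based CDH install keeps the `yarn` launcher under `/opt/cloudera/parcels/CDH/bin` rather than `$HADOOP_HOME/bin` as in the Apache tarball, which would explain a hard-coded `/bin/yarn` path not being found. A small sketch of probing candidate locations (the helper and the demo layout are illustrative, not from the thread):

```shell
# Return the first executable yarn launcher among the candidate paths.
find_yarn() {
  for c in "$@"; do
    [ -x "$c" ] && { echo "$c"; return 0; }
  done
  return 1
}

# Demo with a fake directory layout so the sketch runs anywhere:
tmp=$(mktemp -d)
mkdir -p "$tmp/parcels/CDH/bin"
printf '#!/bin/sh\n' > "$tmp/parcels/CDH/bin/yarn"
chmod +x "$tmp/parcels/CDH/bin/yarn"

# Apache-style path is absent here, so the CDH-style path is returned:
find_yarn "$tmp/hadoop/bin/yarn" "$tmp/parcels/CDH/bin/yarn"
```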
2016-03-04 18:27 GMT+08:00 Steve Loughran :
>
> On 3 Mar 2016, at 09:17, Shady Xu wrote:
>
> Hi all,
>
> I am running Spa
Hi all,
I am running Spark in yarn-client mode, but every time I access the web UI,
the browser redirects me to one of the worker nodes and shows nothing. The
url looks like
http://hadoop-node31.company.com:8088/proxy/application_1453797301246_120264
.
I googled a lot and found some possible bugs
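One thing worth trying while debugging the proxy redirect: in yarn-client mode the driver runs on the submitting machine, so the Spark UI is also served there directly on `spark.ui.port` (default 4040), bypassing the YARN web proxy entirely. A sketch that just builds that direct URL (`DRIVER_HOST` is a placeholder for the host where spark-submit was run):

```shell
# Placeholder host: replace with the machine running the driver.
DRIVER_HOST="localhost"
# Default value of spark.ui.port; Spark probes upward (4041, 4042, ...)
# if several applications run on the same host.
SPARK_UI_PORT=4040

echo "http://${DRIVER_HOST}:${SPARK_UI_PORT}"   # prints: http://localhost:4040
```

If the UI loads fine at that direct address but not through the `:8088/proxy/...` URL, the problem is in the YARN proxy/redirect path rather than in Spark itself.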