Thanks Sean.

So how is PySpark supported, then? I thought PySpark needed JDK 1.6.

Chen

On Fri, Aug 21, 2015 at 11:16 AM, Sean Owen <so...@cloudera.com> wrote:

> Spark 1.4 requires Java 7.
>
> On Fri, Aug 21, 2015, 3:12 PM Chen Song <chen.song...@gmail.com> wrote:
>
>> I tried to build Spark 1.4.1 on CDH 5.4.0. Because we need to support
>> PySpark, I used JDK 1.6.
>>
>> I got the following error:
>>
>> [INFO] --- scala-maven-plugin:3.2.0:testCompile
>> (scala-test-compile-first) @ spark-streaming_2.10 ---
>>
>> java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable
>> : Unsupported major.minor version 51.0
>> at java.lang.ClassLoader.defineClass1(Native Method)
>> at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
>> at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>
>> I know that is because the Hadoop jars for CDH 5.4.0 are built with JDK 7.
>> Has anyone done this before?
>>
>> Thanks,
>>
>> --
>> Chen Song
>>
>>
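For what it's worth, the "major.minor version 51.0" in the quoted trace is the
class-file format version: 50 is Java 6 and 51 is Java 7, so the Hadoop classes
were compiled for Java 7 and cannot be loaded on a Java 6 JVM. A minimal sketch
for checking this yourself, assuming you first extract a .class file from the
Hadoop jar (e.g. org/apache/hadoop/io/LongWritable.class via jar xf), could
look like:

    import java.io.DataInputStream;
    import java.io.FileInputStream;

    // Reads the header of a compiled .class file and prints its version.
    // Class files start with the magic number 0xCAFEBABE, followed by
    // minor_version and major_version (two unsigned shorts each).
    // Major 50 = Java 6, 51 = Java 7, so "major.minor version 51.0"
    // means the class needs at least a Java 7 runtime.
    public class ClassVersion {
        public static void main(String[] args) throws Exception {
            DataInputStream in = new DataInputStream(new FileInputStream(args[0]));
            try {
                in.readInt();                          // magic number 0xCAFEBABE
                int minor = in.readUnsignedShort();    // minor_version
                int major = in.readUnsignedShort();    // major_version
                System.out.println("major=" + major + " minor=" + minor);
            } finally {
                in.close();
            }
        }
    }

Running this against a class extracted from the CDH 5.4.0 Hadoop jar should
report major=51, matching the error in the trace above.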


-- 
Chen Song
