Thanks
------------------ Original Message ------------------
From: "Andy Davidson" <a...@santacruzintegration.com>
Sent: Wednesday, January 6, 2016, 11:04 AM
To: "Sea" <261810...@qq.com>; "user" <user@spark.apache.org>
Subject: Re: How to use Java8

Hi Sea

From: Sea <261810...@qq.com>
Date: Tuesday, January 5, 2016 at 6:16 PM
To: "user @spark" <user@spark.apache.org>
Subject: How to use Java8

Hi, all

I want to support Java 8. I use JDK 1.8.0_65 in the production environment, but it doesn't work. Should I build Spark with JDK 1.8 and set <java.version>1.8</java.version> in pom.xml?

java.lang.UnsupportedClassVersionError: Unsupported major.minor version 52

Here are some notes I wrote about how to configure my data center to use Java 8. You'll probably need to do something like this. Your mileage may vary.

Andy

Setting JAVA_HOME
ref: configure env vars

Install Java 8 on all nodes (master and slaves).

Install Java 1.8 on the master:

$ ssh -i $KEY_FILE root@$SPARK_MASTER
# how was this package downloaded from Oracle? curl?
yum install jdk-8u65-linux-x64.rpm

Copy the rpm to the slaves and install Java 1.8 on them:

for i in `cat /root/spark-ec2/slaves`; do scp /home/ec2-user/jdk-8u65-linux-x64.rpm $i:; done
pssh -i -h /root/spark-ec2/slaves ls -l
pssh -i -h /root/spark-ec2/slaves yum install -y jdk-8u65-linux-x64.rpm

Remove the rpm from the slaves.
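As an aside on the error above: class-file major version 52 corresponds to Java 8 (51 = Java 7, 50 = Java 6), so UnsupportedClassVersionError means some JVM involved is still older than 1.8. You can read the version straight out of any .class file. The sketch below is a hedged example: it fabricates a minimal 8-byte class-file header under /tmp just to show where the field lives; on a real class you would point od (or javap -verbose) at the actual file.

```shell
# A .class file starts with magic CA FE BA BE, then minor version (2 bytes)
# and major version (2 bytes, big-endian). 0x34 = 52 = Java 8.
# Fabricated header for illustration (octal escapes for portability):
printf '\312\376\272\276\000\000\000\064' > /tmp/hdr.class

# The high byte of the major version is 0 here, so reading byte 7 alone
# (0-indexed) yields the major version as a decimal number.
major=$(od -An -j7 -N1 -t u1 /tmp/hdr.class | tr -d ' ')
echo "major version: $major"   # major version: 52 -> Java 8
```

On a real compiled class, `javap -verbose SomeClass | grep "major version"` reports the same number without any byte-poking.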
It is 153 MB:

pssh -i -h /root/spark-ec2/slaves rm jdk-8u65-linux-x64.rpm

Configure Spark to use Java 1.8
ref: configure env vars

Make a backup of the config file:

cp /root/spark/conf/spark-env.sh /root/spark/conf/spark-env.sh-`date +%Y-%m-%d:%H:%M:%S`
pssh -i -h /root/spark-ec2/slaves cp /root/spark/conf/spark-env.sh /root/spark/conf/spark-env.sh-`date +%Y-%m-%d:%H:%M:%S`
pssh -i -h /root/spark-ec2/slaves ls "/root/spark/conf/spark-env.sh*"

Edit /root/spark/conf/spark-env.sh and add:

export JAVA_HOME=/usr/java/latest

Copy spark-env.sh to the slaves:

pssh -i -h /root/spark-ec2/slaves grep JAVA_HOME /root/spark/conf/spark-env.sh
for i in `cat /root/spark-ec2/slaves`; do scp /root/spark/conf/spark-env.sh $i:/root/spark/conf/spark-env.sh; done
pssh -i -h /root/spark-ec2/slaves grep JAVA_HOME /root/spark/conf/spark-env.sh
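After pushing spark-env.sh everywhere, it is worth confirming that each node actually resolves to a 1.8 JVM before restarting Spark. A minimal sketch, run per node (note that `java -version` writes its banner to stderr, and the sed pattern below just pulls the major.minor pair out of a line like `java version "1.8.0_65"`):

```shell
# Extract major.minor from the java -version banner (which goes to stderr)
# and compare against the 1.8 we expect after the upgrade above.
ver=$(java -version 2>&1 | head -1 | sed 's/.*"\([0-9]*\.[0-9]*\).*/\1/')
[ "$ver" = "1.8" ] && echo "Java 8 OK" || echo "expected 1.8, found: $ver"
```

To fan the same check out to the slaves you could reuse the pssh pattern above, e.g. pssh -i -h /root/spark-ec2/slaves 'java -version'.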