Hi, I am trying to create a Spark cluster using the spark-ec2 script that will support Hadoop 2.5.0-cdh5.3.2 for HDFS as well as Hive. I created a cluster by adding --hadoop-major-version=2.5.0, which solved some of the errors I was getting. But now when I run a select query on Hive I get the following error:
Caused by: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status

Has anybody tried doing this? Is there a solution? I used this command to create my cluster:

./spark-ec2 --key-pair=awskey --identity-file=awskey.pem \
  --instance-type=m3.xlarge --spot-price=0.08 \
  --region=us-west-2 --zone=us-west-2c \
  --hadoop-major-version=2.5.0-cdh5.3.2 \
  --spark-version=1.3.0 --slaves=1 \
  launch spark-cluster

Thank you for the help.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Create-a-Spark-cluster-with-cloudera-CDH-5-2-support-tp22168.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
