Hi, I also ran into the same problem with Shark 0.9.1 and fixed it by sbt clean/packaging Shark against the right Hadoop version. You can run the following commands to get it done:
    cd shark
    export SHARK_HADOOP_VERSION=$(/root/ephemeral-hdfs/bin/hadoop version | head -n1 | cut -d" " -f 2-)
    sbt/sbt clean
    sbt/sbt package
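To be clear about what the export is doing: it captures the version string from the first line of "hadoop version" output, so the build matches the Hadoop that is actually running on the cluster. A quick sanity check, where the version string shown is only an illustration and yours will differ:

    # example output -- the exact version string depends on your cluster
    $ /root/ephemeral-hdfs/bin/hadoop version | head -n1
    Hadoop 2.0.0-cdh4.2.0
    $ /root/ephemeral-hdfs/bin/hadoop version | head -n1 | cut -d" " -f 2-
    2.0.0-cdh4.2.0

Both of your traces look like the same underlying mismatch: "Server IPC version 7 cannot communicate with client version 4" means a Hadoop 1.x client (IPC version 4) is talking to a Hadoop 2.x HDFS (IPC version 7), so rebuilding Shark against the cluster's Hadoop 2 should make both errors go away.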
On Fri, Apr 25, 2014 at 11:16 PM, jesseerdmann <jerdm...@umn.edu> wrote:

> I've run into a problem trying to launch a cluster using the provided ec2
> python script with --hadoop-major-version 2. The launch completes correctly
> with the exception of an Exception getting thrown for Tachyon 7 (I've
> included it at the end of the message, but that is not the focus and seems
> unrelated to my issue.)
>
> When I log in and try to run shark-withinfo, I get the following exception
> and I'm not sure where to go from here.
>
> Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
>     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:278)
>     at shark.SharkCliDriver$.main(SharkCliDriver.scala:128)
>     at shark.SharkCliDriver.main(SharkCliDriver.scala)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
>     at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:368)
>     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:270)
>     ... 2 more
> Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
>     at org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:53)
>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>     at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:365)
>     ... 3 more
> Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
>     at org.apache.hadoop.security.Groups.<init>(Groups.java:64)
>     at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
>     at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
>     at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:232)
>     at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:718)
>     at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:703)
>     at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:605)
>     at org.apache.hadoop.hive.shims.HadoopShimsSecure.getUGIForConf(HadoopShimsSecure.java:491)
>     at org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:51)
>     ... 6 more
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
>     ... 15 more
> Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V
>     at org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method)
>     at org.apache.hadoop.security.JniBasedUnixGroupsMapping.<clinit>(JniBasedUnixGroupsMapping.java:49)
>     at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:38)
>     ... 20 more
>
> For completeness, the Tachyon exception during cluster launch:
>
> Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException: Server IPC version 7 cannot communicate with client version 4
>     at tachyon.util.CommonUtils.runtimeException(CommonUtils.java:246)
>     at tachyon.UnderFileSystemHdfs.<init>(UnderFileSystemHdfs.java:73)
>     at tachyon.UnderFileSystemHdfs.getClient(UnderFileSystemHdfs.java:53)
>     at tachyon.UnderFileSystem.get(UnderFileSystem.java:53)
>     at tachyon.Format.main(Format.java:54)
> Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 7 cannot communicate with client version 4
>     at org.apache.hadoop.ipc.Client.call(Client.java:1070)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>     at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>     at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
>     at tachyon.UnderFileSystemHdfs.<init>(UnderFileSystemHdfs.java:69)
>     ... 3 more
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Shark-0-9-1-on-ec2-with-Hadoop-2-error-tp4837.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.