>
> Hi,
>
> This is my first program using Shark 0.9.1, and I am new to both Spark and
> Shark, so I don't know where I went wrong. It would be really helpful if
> someone out there could troubleshoot the problem.
> First, here is a glimpse of my code, which was developed in IntelliJ IDEA.
> It runs perfectly inside the editor.
>
> *Code:*
>
> def main(args: Array[String]) {
>   val sparkConf = new SparkConf()
>     .setAppName("SharkTest")
>     .setMaster("local")
>     .set("spark.executor.memory", "8g")
>     .set("spark.worker.memory", "8g")
>     .set("spark.executor.uri", "http://IP/spark/spark-0.9.1.tar.gz")
>     .set("spark.mesos.coarse", "true")
>     .setJars(List(args(1) + "/shark-assembly-0.9.1-hadoop2.0.0-cdh4.5.0.jar"))
>   val shc = SharkEnv.initWithSharkContext(sparkConf)
>   val q1 = "CREATE EXTERNAL TABLE IF NOT EXISTS table1(startIpNum string," +
>     "endIpNum string,locId string) ROW FORMAT DELIMITED FIELDS TERMINATED " +
>     "BY '" + args(3) + "' LOCATION '" + args(2) + "' "
>   val q3 = "SELECT * FROM table1"
>   shc.runSql(q1)
>   shc.runSql(q3)
>   shc.sql2rdd(q3).map { resultSet =>
>     resultSet.colname2indexMap.values
>       .map(index => resultSet(index))
>       .reduce((a, b) => a + "," + b)
>   }.saveAsTextFile(args(4))
>   shc.sql("DROP TABLE IF EXISTS table1")
> }
>
> *build.sbt:*
>
>
> import AssemblyKeys._
>
> assemblySettings
>
> name := "appname"
>
> version := "1.0"
>
> scalaVersion := "2.10.3"
>
> mainClass := Some("classname")
>
> libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % "0.9.1",
> "edu.berkeley.cs.shark" %% "shark" % "0.9.1",
> "org.apache.hive" % "hive-anttasks" % "0.11.0",
> "org.apache.hive" % "hive-beeline" % "0.11.0",
> "org.apache.hive" % "hive-cli" % "0.11.0",
> "org.apache.hive" % "hive-common" % "0.11.0",
> "org.apache.hive" % "hive-exec" % "0.11.0",
> "org.apache.hive" % "hive-hbase-handler" % "0.11.0",
> "org.apache.hive" % "hive-hwi" % "0.11.0",
> "org.apache.hive" % "hive-jdbc" % "0.11.0",
> "org.apache.hive" % "hive-metastore" % "0.11.0",
> "org.apache.hive" % "hive-serde" % "0.11.0",
> "org.apache.hive" % "hive-service" % "0.11.0",
> "org.apache.hive" % "hive-shims" % "0.11.0",
> "org.datanucleus" % "datanucleus-core" % "3.2.2",
> "org.datanucleus" % "datanucleus-rdbms" % "3.2.1",
> "org.datanucleus" % "datanucleus-api-jdo" % "3.2.1",
> "org.datanucleus" % "datanucleus-enhancer" % "3.1.1",
> "org.apache.derby" % "derby" % "10.10.1.1",
> "org.apache.hadoop" % "hadoop-client" % "2.0.0-cdh4.5.0")
>
> resolvers ++= Seq("Akka Repository" at "http://repo.akka.io/releases/",
> "Cloudera Repository" at "
> https://repository.cloudera.com/artifactory/cloudera-repos/")
>
> mergeStrategy in assembly := {
>   case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
>   case m if m.toLowerCase.matches("meta-inf.*\\.sf$") => MergeStrategy.discard
>   case "log4j.properties" => MergeStrategy.discard
>   case m if m.toLowerCase.startsWith("meta-inf/services/") => MergeStrategy.filterDistinctLines
>   case "reference.conf" => MergeStrategy.concat
>   case _ => MergeStrategy.first
> }
>
> sbt-assembly plugin version: 0.10.2
>
> The problem occurs only when I try to create a jar from the code.
>
> Steps followed to create the jar:
> 1. sbt clean
> 2. sbt assembly
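>
> (In case it is relevant: my understanding is that sbt-assembly writes a
> Main-Class entry into the fat jar's manifest from the mainClass setting
> scoped to the assembly task, so perhaps my unscoped "mainClass :=" line
> above is part of the issue. A sketch of what I could try instead, though
> the scoping here is my guess and not something I have verified:
>
>   // in build.sbt: scope the main class to the assembly task so the
>   // assembled jar's manifest gets a Main-Class attribute
>   mainClass in assembly := Some("classname")
> )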
>
> When I try to run the jar with the command "java -jar <jarName.jar>
> <parameters>", I get the error "invalid or corrupt jar".
> The same jar is accepted when executed as "java -cp <jarname.jar>
> <classname> <parameters>", but in that case a Hive exception occurs:
> "unable to fetch the table tablename"
>
>
> 14/07/26 12:21:39 INFO Driver: <PERFLOG method=TimeToSubmit>
> 14/07/26 12:21:39 INFO Driver: <PERFLOG method=compile>
> 14/07/26 12:21:39 INFO ParseDriver: Parsing command: CREATE EXTERNAL TABLE
> IF NOT EXISTS table1(startIpNum string,endIpNum string,locId string) ROW
> FORMAT DELIMITED FIELDS TERMINATED BY ',' LOCATION
> '/home/user/foldername/Input/SharkTest'
> 14/07/26 12:21:39 INFO ParseDriver: Parse Completed
> 14/07/26 12:21:40 INFO SharkSemanticAnalyzer: Starting Semantic Analysis
> 14/07/26 12:21:40 INFO SharkSemanticAnalyzer: Creating table table1
> position=36
> 14/07/26 12:21:40 INFO HiveMetaStore: 0: Opening raw store with
> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 14/07/26 12:21:40 INFO ObjectStore: ObjectStore, initialize called
> org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table
> table1
> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:957)
> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:904)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:9328)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8647)
> at
> shark.parse.SharkSemanticAnalyzer.analyzeInternal(SharkSemanticAnalyzer.scala:105)
> at
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:279)
> at shark.SharkDriver.compile(SharkDriver.scala:215)
>
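> (One thing I have not done is provide a hive-site.xml on the classpath; I
> assume Hive is therefore falling back to an embedded Derby metastore, which
> the "Opening raw store" line above seems to suggest. If a metastore config
> is required, I imagine it would look roughly like the following, with
> placeholder paths, though I have not verified this:
>
>   <configuration>
>     <property>
>       <name>javax.jdo.option.ConnectionURL</name>
>       <value>jdbc:derby:;databaseName=/tmp/metastore_db;create=true</value>
>     </property>
>     <property>
>       <name>hive.metastore.warehouse.dir</name>
>       <value>/tmp/hive/warehouse</value>
>     </property>
>   </configuration>
> )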
> I would appreciate any comments about the cause of the above exception.
>
> Regards,
>
> Bilna P
>
>