Finally, I got it working with spark-sql-perf-0.4.3:

./bin/spark-shell --jars \
/home/dcos/spark-sql-perf-0.4.3/target/scala-2.11/spark-sql-perf_2.11-0.4.3.jar \
--executor-cores 4 --executor-memory 10G --master spark://master1:7077

If I don't pass "--jars", I get the error I mentioned.
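As an alternative to repeating --jars on every spark-shell invocation, the same jar can be listed in conf/spark-defaults.conf. This is only a sketch assuming the jar path above; spark.jars ships the listed jars to both the driver and the executors, which is what the missing-dependency errors below suggest was needed:

```
# conf/spark-defaults.conf -- sketch; adjust the path to your install.
# spark.jars adds the listed jars to the driver and executor classpaths,
# equivalent to passing --jars on every invocation.
spark.jars  /home/dcos/spark-sql-perf-0.4.3/target/scala-2.11/spark-sql-perf_2.11-0.4.3.jar
```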
2016-07-29 21:17 GMT+08:00 Olivier Girardot :
> I have the same kind of issue (not using spark-sql-perf), just trying to
> deploy 2.0.0 on Mesos.
> I'll keep you posted as I investigate.
>
>
>
> On Wed, Jul 27, 2016 1:06 PM, kevin kiss.kevin...@gmail.com wrote:
>
>> hi, all:
>> I want to run the TPC-DS 99-query benchmark on Spark 2.0.
>> I use https://github.com/databricks/spark-sql-perf
>>
>> With the master version, when I run: val tpcds = new TPCDS(sqlContext =
>> sqlContext) I get this error:
>>
>> scala> val tpcds = new TPCDS (sqlContext = sqlContext)
>> error: missing or invalid dependency detected while loading class file
>> 'Benchmarkable.class'.
>> Could not access term typesafe in package com,
>> because it (or its dependencies) are missing. Check your build definition
>> for
>> missing or conflicting dependencies. (Re-run with -Ylog-classpath to see
>> the problematic classpath.)
>> A full rebuild may help if 'Benchmarkable.class' was compiled against an
>> incompatible version of com.
>> error: missing or invalid dependency detected while loading class file
>> 'Benchmarkable.class'.
>> Could not access term scalalogging in value com.typesafe,
>> because it (or its dependencies) are missing. Check your build definition
>> for
>> missing or conflicting dependencies. (Re-run with -Ylog-classpath to see
>> the problematic classpath.)
>> A full rebuild may help if 'Benchmarkable.class' was compiled against an
>> incompatible version of com.typesafe.
>>
>> With spark-sql-perf-0.4.3, when I run:
>> tables.genData("hdfs://master1:9000/tpctest", "parquet", true, false,
>> false, false, false) I get this error:
>>
>> Generating table catalog_sales in database to
>> hdfs://master1:9000/tpctest/catalog_sales with save mode Overwrite.
>> 16/07/27 18:59:59 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0,
>> slave1): java.lang.ClassCastException: cannot assign instance of
>> scala.collection.immutable.List$SerializationProxy to field
>> org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type
>> scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
>>
>>
>
> *Olivier Girardot* | Associé
> o.girar...@lateral-thoughts.com
> +33 6 24 09 17 94
>