How to run specific sparkSQL test with maven

2014-08-01 Thread 田毅
Hi everyone!

Could anyone tell me how to run a specific Spark SQL test suite with Maven?

For example:

I want to test HiveCompatibilitySuite.

I ran "mvn test -Dtest=HiveCompatibilitySuite", but it did not work.
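As it turns out, -Dtest only matches the JUnit tests run by surefire; ScalaTest
suites in Spark's Maven build are selected with -DwildcardSuites (assuming the
scalatest-maven-plugin configuration in the root pom). Something along these
lines should work, with the Hive module enabled via -Phive if the build requires
it and the sibling modules installed first (e.g. mvn install -DskipTests):

    mvn -Phive -pl sql/hive test \
      -DwildcardSuites=org.apache.spark.sql.hive.execution.HiveCompatibilitySuite \
      -Dtest=none

Here -Dtest=none keeps surefire from running any Java tests, so only the named
suite executes.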

BTW, is there any documentation on how to set up a test environment for Spark SQL?

I got the error below when I ran the test. It seems that the
HiveCompatibilitySuite needs a Hadoop and Hive environment, am I right?

Relative path in absolute URI: file:$%7Bsystem:test.tmp.dir%7D/tmp_showcrt1

Re: How to run specific sparkSQL test with maven

2014-08-01 Thread Michael Armbrust

 It seems that the HiveCompatibilitySuite needs a Hadoop and Hive
 environment, am I right?

 Relative path in absolute URI:
 file:$%7Bsystem:test.tmp.dir%7D/tmp_showcrt1


You should only need Hadoop and Hive if you are creating new tests for which
the answers need to be computed. Existing tests run against cached answers.
There are details about the configuration here:
https://github.com/apache/spark/tree/master/sql
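
For the case where new golden answers do have to be computed, the README linked
above sketches an environment-variable based setup roughly along these lines
(the paths are placeholders; the exact variables are listed in the README):

    export HADOOP_HOME=/path/to/hadoop
    export HIVE_HOME=/path/to/hive/dist
    export HIVE_DEV_HOME=/path/to/hive/source

With those set, the comparison tests can compute answers against a real Hive
installation when a cached answer file is missing.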


Re: How to run specific sparkSQL test with maven

2014-08-01 Thread Cheng Lian
It’s also useful to set hive.exec.mode.local.auto to true to accelerate the
test.
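
hive.exec.mode.local.auto is a standard Hive property, so one place it can live
is a hive-site.xml on the test classpath, assuming the test Hive context picks
that file up:

    <property>
      <name>hive.exec.mode.local.auto</name>
      <value>true</value>
    </property>

With auto local mode on, Hive runs the small test queries with the local job
runner instead of submitting MapReduce jobs, which is where the speedup comes
from.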