See <https://builds.apache.org/job/Phoenix-master/714/changes>
Changes:

[ndimiduk] PHOENIX-1909 queryserver.py accept PHOENIX_QUERYSERVER_OPTS

[maryannxue] PHOENIX-1773 Backward compatibility for joins fail after PHOENIX-1489

------------------------------------------
[...truncated 1814 lines...]
[WARNING]  org.apache.phoenix:phoenix-spark:4.4.0-SNAPSHOT requires scala version: 2.10.4
[WARNING]  org.scalatest:scalatest_2.10:2.2.4 requires scala version: 2.10.4
[WARNING]  org.scalamock:scalamock-scalatest-support_2.10:3.1.4 requires scala version: 2.10.4
[WARNING]  org.scalamock:scalamock-core_2.10:3.1.4 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  org.spark-project.akka:akka-remote_2.10:2.3.4-spark requires scala version: 2.10.4
[WARNING]  org.spark-project.akka:akka-actor_2.10:2.3.4-spark requires scala version: 2.10.4
[WARNING]  org.spark-project.akka:akka-slf4j_2.10:2.3.4-spark requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.3.0 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/src/it/scala>:-1: info: compiling
[INFO] Compiling 1 source files to <https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/test-classes> at 1429844490618
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.apache.hadoop.mapred.MiniMRCluster not found - continuing with a stub.
[WARNING] 8 warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 10 s
[INFO]
[INFO] --- maven-compiler-plugin:3.0:testCompile (default-testCompile) @ phoenix-spark ---
[INFO] Changes detected - recompiling the module!
[INFO]
[INFO] --- maven-surefire-plugin:2.18:test (default-test) @ phoenix-spark ---
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ phoenix-spark ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (attach-sources) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-master/714/artifact/phoenix-spark/target/phoenix-spark-4.4.0-SNAPSHOT-sources.jar>
[INFO]
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-master/714/artifact/phoenix-spark/target/phoenix-spark-4.4.0-SNAPSHOT-tests.jar>
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-master/714/artifact/phoenix-spark/target/phoenix-spark-4.4.0-SNAPSHOT.jar>
[INFO]
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ phoenix-spark ---
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @ phoenix-spark ---
WARNING: -c has been deprecated and will be reused for a different (but still very cool) purpose in ScalaTest 2.0. Please change all uses of -c to -P.
Discovery starting.
Discovery completed in 317 milliseconds.
Run starting. Expected test count is: 17
PhoenixSparkIT:
Formatting using clusterid: testClusterID
0    [sparkDriver-akka.actor.default-dispatcher-2] INFO  Remoting - Starting remoting
245  [sparkDriver-akka.actor.default-dispatcher-2] INFO  Remoting - Remoting started; listening on addresses :[akka.tcp://sparkdri...@pietas.apache.org:49838]
- Can create valid SQL
- Can convert Phoenix schema
- Can create schema RDD and execute query *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="TABLE1"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$3.apply$mcV$sp(PhoenixSparkIT.scala:134)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$3.apply(PhoenixSparkIT.scala:131)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$3.apply(PhoenixSparkIT.scala:131)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  ...
- Can create schema RDD and execute query on case sensitive table (no config) *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="table3"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$4.apply$mcV$sp(PhoenixSparkIT.scala:157)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$4.apply(PhoenixSparkIT.scala:153)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$4.apply(PhoenixSparkIT.scala:153)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  ...
- Can create schema RDD and execute constrained query *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="TABLE1"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$5.apply$mcV$sp(PhoenixSparkIT.scala:172)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$5.apply(PhoenixSparkIT.scala:169)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$5.apply(PhoenixSparkIT.scala:169)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  ...
- Using a predicate referring to a non-existent column should fail *** FAILED ***
  Expected exception java.lang.RuntimeException to be thrown, but org.apache.phoenix.schema.TableNotFoundException was thrown. (PhoenixSparkIT.scala:194)
- Can create schema RDD with predicate that will never match *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="table3"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$7.apply$mcV$sp(PhoenixSparkIT.scala:213)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$7.apply(PhoenixSparkIT.scala:210)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$7.apply(PhoenixSparkIT.scala:210)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  ...
- Can create schema RDD with complex predicate *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="DATE_PREDICATE_TEST_TABLE"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$8.apply$mcV$sp(PhoenixSparkIT.scala:229)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$8.apply(PhoenixSparkIT.scala:226)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$8.apply(PhoenixSparkIT.scala:226)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  ...
- Can query an array table *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="ARRAY_TEST_TABLE"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$9.apply$mcV$sp(PhoenixSparkIT.scala:250)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$9.apply(PhoenixSparkIT.scala:247)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$9.apply(PhoenixSparkIT.scala:247)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  ...
- Can read a table as an RDD
- Can save to phoenix table
- Can save Java and Joda dates to Phoenix (no config)
- Can infer schema without defining columns *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="TABLE2"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$13.apply$mcV$sp(PhoenixSparkIT.scala:335)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$13.apply(PhoenixSparkIT.scala:333)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$13.apply(PhoenixSparkIT.scala:333)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  ...
- Spark SQL can use Phoenix as a data source with no schema specified *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="TABLE1"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:40)
  at org.apache.spark.sql.sources.LogicalRelation.<init>(LogicalRelation.scala:30)
  at org.apache.spark.sql.SQLContext.load(SQLContext.scala:680)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$14.apply$mcV$sp(PhoenixSparkIT.scala:343)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$14.apply(PhoenixSparkIT.scala:341)
  ...
- Spark SQL can use Phoenix as a data source with PrunedFilteredScan *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="TABLE1"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:40)
  at org.apache.spark.sql.sources.LogicalRelation.<init>(LogicalRelation.scala:30)
  at org.apache.spark.sql.SQLContext.load(SQLContext.scala:680)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$15.apply$mcV$sp(PhoenixSparkIT.scala:352)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$15.apply(PhoenixSparkIT.scala:350)
  ...
- Can persist a dataframe using 'DataFrame.saveToPhoenix' *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="TABLE1"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:40)
  at org.apache.spark.sql.sources.LogicalRelation.<init>(LogicalRelation.scala:30)
  at org.apache.spark.sql.SQLContext.load(SQLContext.scala:680)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$16.apply$mcV$sp(PhoenixSparkIT.scala:371)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$16.apply(PhoenixSparkIT.scala:368)
  ...
- Can persist a dataframe using 'DataFrame.save() *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName="TABLE1"
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:241)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:323)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:350)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:40)
  at org.apache.spark.sql.sources.LogicalRelation.<init>(LogicalRelation.scala:30)
  at org.apache.spark.sql.SQLContext.load(SQLContext.scala:680)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$17.apply$mcV$sp(PhoenixSparkIT.scala:399)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$17.apply(PhoenixSparkIT.scala:391)
  ...
Run completed in 38 seconds, 401 milliseconds.
Total number of tests run: 17
Suites: completed 2, aborted 0
Tests: succeeded 5, failed 12, canceled 0, ignored 0, pending 0
*** 12 TESTS FAILED ***
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Phoenix .................................... SUCCESS [1.932s]
[INFO] Phoenix Core ...................................... SUCCESS [25:53.928s]
[INFO] Phoenix - Flume ................................... SUCCESS [1:01.604s]
[INFO] Phoenix - Pig ..................................... SUCCESS [2:20.686s]
[INFO] Phoenix Query Server Client ....................... SUCCESS [0.993s]
[INFO] Phoenix Query Server .............................. SUCCESS [1:47.522s]
[INFO] Phoenix - Spark ................................... FAILURE [1:14.635s]
[INFO] Phoenix Assembly .................................. SKIPPED
[INFO] Phoenix - Pherf ................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 32:21.814s
[INFO] Finished at: Fri Apr 24 03:02:30 UTC 2015
[INFO] Final Memory: 77M/962M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (integration-test) on project phoenix-spark: There are test failures -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-spark
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Sending artifact delta relative to Phoenix | Master #701
Archived 1104 artifacts
Archive block size is 32768
Received 19387 blocks and 494976864 bytes
Compression is 56.2%
Took 3 min 44 sec
Updating PHOENIX-1773
Updating PHOENIX-1489
Updating PHOENIX-1909
Recording test results