See <https://builds.apache.org/job/Phoenix-master/812/changes>
Changes:

[ravimagham] PHOENIX-2036 - PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime

------------------------------------------
[...truncated 123609 lines...]

[INFO] --- maven-failsafe-plugin:2.18:integration-test (NeedTheirOwnClusterTests) @ phoenix-pherf ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-pherf/target/failsafe-reports>
[INFO] parallel='none', perCoreThreadCount=true, threadCount=0, useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, threadCountMethods=0, parallelOptimized=true

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO] --- maven-failsafe-plugin:2.18:verify (ClientManagedTimeTests) @ phoenix-pherf ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-pherf/target/failsafe-reports>
[INFO] --- maven-failsafe-plugin:2.18:verify (HBaseManagedTimeTests) @ phoenix-pherf ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-pherf/target/failsafe-reports>
[INFO] --- maven-failsafe-plugin:2.18:verify (NeedTheirOwnClusterTests) @ phoenix-pherf ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-pherf/target/failsafe-reports>
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ phoenix-pherf ---
[INFO] Installing <https://builds.apache.org/job/Phoenix-master/812/artifact/phoenix-pherf/target/phoenix-pherf-4.5.0-SNAPSHOT.jar> to /home/jenkins/.m2/repository/org/apache/phoenix/phoenix-pherf/4.5.0-SNAPSHOT/phoenix-pherf-4.5.0-SNAPSHOT.jar
[INFO] Installing <https://builds.apache.org/job/Phoenix-master/ws/phoenix-pherf/pom.xml> to /home/jenkins/.m2/repository/org/apache/phoenix/phoenix-pherf/4.5.0-SNAPSHOT/phoenix-pherf-4.5.0-SNAPSHOT.pom
[INFO] Installing <https://builds.apache.org/job/Phoenix-master/812/artifact/phoenix-pherf/target/phoenix-pherf-4.5.0-SNAPSHOT-sources.jar> to /home/jenkins/.m2/repository/org/apache/phoenix/phoenix-pherf/4.5.0-SNAPSHOT/phoenix-pherf-4.5.0-SNAPSHOT-sources.jar
[INFO] Installing <https://builds.apache.org/job/Phoenix-master/812/artifact/phoenix-pherf/target/phoenix-pherf-4.5.0-SNAPSHOT-tests.jar> to /home/jenkins/.m2/repository/org/apache/phoenix/phoenix-pherf/4.5.0-SNAPSHOT/phoenix-pherf-4.5.0-SNAPSHOT-tests.jar
[INFO] Installing <https://builds.apache.org/job/Phoenix-master/812/artifact/phoenix-pherf/target/phoenix-pherf-4.5.0-SNAPSHOT-jar-with-dependencies.jar> to /home/jenkins/.m2/repository/org/apache/phoenix/phoenix-pherf/4.5.0-SNAPSHOT/phoenix-pherf-4.5.0-SNAPSHOT-jar-with-dependencies.jar
[INFO] Installing <https://builds.apache.org/job/Phoenix-master/ws/phoenix-pherf/target/phoenix-pherf-4.5.0-SNAPSHOT-cluster.zip> to /home/jenkins/.m2/repository/org/apache/phoenix/phoenix-pherf/4.5.0-SNAPSHOT/phoenix-pherf-4.5.0-SNAPSHOT-cluster.zip

[INFO] ------------------------------------------------------------------------
[INFO] Building Phoenix - Spark 4.5.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ phoenix-spark ---
[INFO] Deleting <https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target>
[INFO] --- maven-checkstyle-plugin:2.13:check (validate) @ phoenix-spark ---
[INFO] Starting audit...
Audit done.
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ phoenix-spark ---
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ phoenix-spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/src/main/resources>
[INFO] Copying 3 resources
[INFO] --- scala-maven-plugin:3.2.0:add-source (scala-compile-first) @ phoenix-spark ---
[INFO] Add Source directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/src/main/scala>
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ phoenix-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.apache.phoenix:phoenix-spark:4.5.0-SNAPSHOT requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  org.spark-project.akka:akka-remote_2.10:2.3.4-spark requires scala version: 2.10.4
[WARNING]  org.spark-project.akka:akka-actor_2.10:2.3.4-spark requires scala version: 2.10.4
[WARNING]  org.spark-project.akka:akka-slf4j_2.10:2.3.4-spark requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.3.0 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/src/main/scala>:-1: info: compiling
[INFO] Compiling 10 source files to <https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/classes> at 1436036293837
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: there were 5 feature warning(s); re-run with -feature for details
[WARNING] 8 warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 9 s
[INFO] --- maven-compiler-plugin:3.0:compile (default-compile) @ phoenix-spark ---
[INFO] Changes detected - recompiling the module!
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ phoenix-spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] Copying 3 resources
[INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile) @ phoenix-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.apache.phoenix:phoenix-spark:4.5.0-SNAPSHOT requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  org.spark-project.akka:akka-remote_2.10:2.3.4-spark requires scala version: 2.10.4
[WARNING]  org.spark-project.akka:akka-actor_2.10:2.3.4-spark requires scala version: 2.10.4
[WARNING]  org.spark-project.akka:akka-slf4j_2.10:2.3.4-spark requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.3.0 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/src/it/scala>:-1: info: compiling
[INFO] Compiling 1 source files to <https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/test-classes> at 1436036303787
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[WARNING] warning: Class org.apache.hadoop.mapred.MiniMRCluster not found - continuing with a stub.
[WARNING] 8 warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 10 s
[INFO] --- maven-compiler-plugin:3.0:testCompile (default-testCompile) @ phoenix-spark ---
[INFO] Changes detected - recompiling the module!
[INFO] --- maven-surefire-plugin:2.18:test (default-test) @ phoenix-spark ---
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ phoenix-spark ---
[INFO] Tests are skipped.
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (attach-sources) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-master/812/artifact/phoenix-spark/target/phoenix-spark-4.5.0-SNAPSHOT-sources.jar>
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-master/812/artifact/phoenix-spark/target/phoenix-spark-4.5.0-SNAPSHOT-tests.jar>
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-master/812/artifact/phoenix-spark/target/phoenix-spark-4.5.0-SNAPSHOT.jar>
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ phoenix-spark ---
[INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @ phoenix-spark ---
WARNING: -c has been deprecated and will be reused for a different (but still very cool) purpose in ScalaTest 2.0. Please change all uses of -c to -P.
Discovery starting.
Discovery completed in 277 milliseconds.
Run starting.
Expected test count is: 18
PhoenixSparkIT:
Formatting using clusterid: testClusterID
0 [RpcServer.reader=1,bindAddress=penates.apache.org,port=56467] INFO SecurityLogger.org.apache.hadoop.hbase.Server - Connection from 67.195.81.186 port: 35277 with version info: version: "1.1.0" url: "git://hw11397.local/Volumes/hbase-1.1.0RC2/hbase" revision: "e860c66d41ddc8231004b646098a58abca7fb523" user: "ndimiduk" date: "Tue May 12 14:35:17 PDT 2015" src_checksum: "bcf4ec64372fbd348e6a97dc281c3b0f"
[...further near-identical SecurityLogger connection entries, interleaved throughout the run, omitted...]
25182 [sparkDriver-akka.actor.default-dispatcher-2] INFO Remoting - Starting remoting
25362 [sparkDriver-akka.actor.default-dispatcher-2] INFO Remoting - Remoting started; listening on addresses :[akka.tcp://sparkdri...@penates.apache.org:47269]
- Can create valid SQL
- Can convert Phoenix schema
- Can create schema RDD and execute query
- Can create schema RDD and execute query on case sensitive table (no config) *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=TABLE3
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:260)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:331)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:357)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$4.apply$mcV$sp(PhoenixSparkIT.scala:157)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$4.apply(PhoenixSparkIT.scala:153)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$4.apply(PhoenixSparkIT.scala:153)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  ...
- Can create schema RDD and execute constrained query
- Using a predicate referring to a non-existent column should fail *** FAILED ***
  Expected exception java.lang.RuntimeException to be thrown, but org.apache.phoenix.schema.TableNotFoundException was thrown. (PhoenixSparkIT.scala:194)
- Can create schema RDD with predicate that will never match *** FAILED ***
  org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=TABLE3
  at org.apache.phoenix.schema.PMetaDataImpl.getTable(PMetaDataImpl.java:260)
  at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:331)
  at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:357)
  at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
  at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
  at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$7.apply$mcV$sp(PhoenixSparkIT.scala:213)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$7.apply(PhoenixSparkIT.scala:210)
  at org.apache.phoenix.spark.PhoenixSparkIT$$anonfun$7.apply(PhoenixSparkIT.scala:210)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  ...
- Can create schema RDD with complex predicate
- Can query an array table
- Can read a table as an RDD
- Can save to phoenix table
- Can save Java and Joda dates to Phoenix (no config)
- Can infer schema without defining columns
- Spark SQL can use Phoenix as a data source with no schema specified
- Spark SQL can use Phoenix as a data source with PrunedFilteredScan
- Can persist a dataframe using 'DataFrame.saveToPhoenix'
- Can persist a dataframe using 'DataFrame.save()'
- Can save arrays back to phoenix
Run completed in 1 minute, 7 seconds.
Total number of tests run: 18
Suites: completed 2, aborted 0
Tests: succeeded 15, failed 3, canceled 0, ignored 0, pending 0
*** 3 TESTS FAILED ***
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] Apache Phoenix .................................... SUCCESS [3.370s]
[INFO] Phoenix Core ...................................... SUCCESS [1:30:50.117s]
[INFO] Phoenix - Flume ................................... SUCCESS [1:11.082s]
[INFO] Phoenix - Pig ..................................... SUCCESS [3:21.533s]
[INFO] Phoenix Query Server Client ....................... SUCCESS [1.377s]
[INFO] Phoenix Query Server .............................. SUCCESS [1:59.474s]
[INFO] Phoenix - Pherf ................................... SUCCESS [1:50.036s]
[INFO] Phoenix - Spark ................................... FAILURE [1:42.517s]
[INFO] Phoenix Assembly .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:41:00.163s
[INFO] Finished at: Sat Jul 04 18:59:52 UTC 2015
[INFO] Final Memory: 73M/1015M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (integration-test) on project phoenix-spark: There are test failures -> [Help 1]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-spark
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Sending artifact delta relative to Phoenix | Master #790
Archived 1049 artifacts
Archive block size is 32768
Received 13250 blocks and 531978371 bytes
Compression is 44.9%
Took 2 min 37 sec
Updating PHOENIX-2036
Recording test results
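All three failures point at SQL identifier normalization, the area the PHOENIX-2036 change touches: the "case sensitive table (no config)" test works with a quoted, lower-case table, but the lookup that reaches PhoenixRuntime.getTable fails on the upper-cased name TABLE3. A minimal sketch of the general SQL normalization rule that likely explains the mismatch (the normalize helper below is hypothetical and illustrative only; Phoenix's actual implementation lives elsewhere and handles more cases):

```java
// Illustrative model of SQL identifier normalization: unquoted identifiers
// are folded to upper case, while double-quoted identifiers keep their case.
// A table created as "table3" therefore cannot be found via the unquoted
// name table3, which normalizes to TABLE3 -- matching the
// TableNotFoundException (tableName=TABLE3) in the log above.
public class IdentifierNormalization {

    /** Strip surrounding double quotes (preserving case), or upper-case an unquoted name. */
    static String normalize(String identifier) {
        if (identifier.length() >= 2
                && identifier.startsWith("\"")
                && identifier.endsWith("\"")) {
            // Quoted identifier: case is preserved exactly as written.
            return identifier.substring(1, identifier.length() - 1);
        }
        // Unquoted identifier: SQL treats it case-insensitively, folded to upper case.
        return identifier.toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(normalize("table3"));     // TABLE3
        System.out.println(normalize("\"table3\"")); // table3
    }
}
```

Under this rule, passing a table name that has already been normalized (or quoting it before lookup) keeps the create and read paths consistent, which is what the PHOENIX-2036 change description suggests for PhoenixConfigurationUtil.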