See <https://builds.apache.org/job/Phoenix-master/987/changes>

Changes:

[jtaylor] PHOENIX-2462 Add tephra test dependency to spark module

------------------------------------------
[...truncated 69335 lines...]
[DEBUG]   (f) project = MavenProject: 
org.apache.phoenix:phoenix-spark:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/pom.xml>
[DEBUG]   (f) session = org.apache.maven.execution.MavenSession@124786a2
[DEBUG]   (f) skipIfEmpty = false
[DEBUG]   (f) testClassesDirectory = 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/test-classes>
[DEBUG]   (f) useDefaultManifestFile = false
[DEBUG] -- end configuration --
[DEBUG] isUp2date: false (Destination 
<https://builds.apache.org/job/Phoenix-master/987/artifact/phoenix-spark/target/phoenix-spark-4.7.0-HBase-1.1-SNAPSHOT-tests.jar>
 not found.)
[INFO] Building jar: 
<https://builds.apache.org/job/Phoenix-master/987/artifact/phoenix-spark/target/phoenix-spark-4.7.0-HBase-1.1-SNAPSHOT-tests.jar>
[DEBUG] adding directory META-INF/
[DEBUG] adding entry META-INF/MANIFEST.MF
[DEBUG] adding directory org/
[DEBUG] adding directory org/apache/
[DEBUG] adding directory org/apache/phoenix/
[DEBUG] adding directory org/apache/phoenix/spark/
[DEBUG] adding entry META-INF/LICENSE
[DEBUG] adding entry META-INF/DEPENDENCIES
[DEBUG] adding entry META-INF/NOTICE
[DEBUG] adding entry log4j.xml
[DEBUG] adding entry setup.sql
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$14.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$23.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$19$$anonfun$24.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$7.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$6.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$4.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$2.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$5.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$19.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$21.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$beforeAll$1.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkITHelper$.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$9.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$20.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$16.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$22.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$1.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$12.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkITHelper.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$8.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$3.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$15.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$5$$anonfun$apply$mcV$sp$1.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$18.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$13.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$10$$anonfun$apply$mcV$sp$2.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$10.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$17.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixSparkIT$$anonfun$11.class
[DEBUG] adding directory META-INF/maven/
[DEBUG] adding directory META-INF/maven/org.apache.phoenix/
[DEBUG] adding directory META-INF/maven/org.apache.phoenix/phoenix-spark/
[DEBUG] adding entry META-INF/maven/org.apache.phoenix/phoenix-spark/pom.xml
[DEBUG] adding entry 
META-INF/maven/org.apache.phoenix/phoenix-spark/pom.properties
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ phoenix-spark ---
[DEBUG] Configuring mojo org.apache.maven.plugins:maven-jar-plugin:2.4:jar from 
plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-jar-plugin:2.4, 
parent: sun.misc.Launcher$AppClassLoader@76d88aa2]
[DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-jar-plugin:2.4:jar' 
with basic configurator -->
[DEBUG]   (s) addDefaultSpecificationEntries = true
[DEBUG]   (s) addDefaultImplementationEntries = true
[DEBUG]   (s) manifest = 
org.apache.maven.archiver.ManifestConfiguration@14f20328
[DEBUG]   (f) archive = 
org.apache.maven.archiver.MavenArchiveConfiguration@1cad41d6
[DEBUG]   (f) classesDirectory = 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/classes>
[DEBUG]   (f) defaultManifestFile = 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/classes/META-INF/MANIFEST.MF>
[DEBUG]   (f) finalName = phoenix-spark-4.7.0-HBase-1.1-SNAPSHOT
[DEBUG]   (f) forceCreation = false
[DEBUG]   (f) outputDirectory = 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target>
[DEBUG]   (f) project = MavenProject: 
org.apache.phoenix:phoenix-spark:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/pom.xml>
[DEBUG]   (f) session = org.apache.maven.execution.MavenSession@124786a2
[DEBUG]   (f) skipIfEmpty = false
[DEBUG]   (f) useDefaultManifestFile = false
[DEBUG] -- end configuration --
[DEBUG] isUp2date: false (Destination 
<https://builds.apache.org/job/Phoenix-master/987/artifact/phoenix-spark/target/phoenix-spark-4.7.0-HBase-1.1-SNAPSHOT.jar>
 not found.)
[INFO] Building jar: 
<https://builds.apache.org/job/Phoenix-master/987/artifact/phoenix-spark/target/phoenix-spark-4.7.0-HBase-1.1-SNAPSHOT.jar>
[DEBUG] adding directory META-INF/
[DEBUG] adding entry META-INF/MANIFEST.MF
[DEBUG] adding directory org/
[DEBUG] adding directory org/apache/
[DEBUG] adding directory org/apache/phoenix/
[DEBUG] adding directory org/apache/phoenix/spark/
[DEBUG] adding entry META-INF/LICENSE
[DEBUG] adding entry META-INF/DEPENDENCIES
[DEBUG] adding entry META-INF/NOTICE
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixRecordWritable.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixRelation.class
[DEBUG] adding entry org/apache/phoenix/spark/package$.class
[DEBUG] adding entry 
org/apache/phoenix/spark/DataFrameFunctions$$anonfun$2$$anonfun$apply$1.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixRelation$.class
[DEBUG] adding entry org/apache/phoenix/spark/DefaultSource.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixRelation$$anonfun$org$apache$phoenix$spark$PhoenixRelation$$buildFilter$1$$anonfun$apply$1.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixRecordWritable$$anonfun$readFields$1.class
[DEBUG] adding entry 
org/apache/phoenix/spark/DataFrameFunctions$$anonfun$2$$anonfun$apply$1$$anonfun$apply$2.class
[DEBUG] adding entry org/apache/phoenix/spark/ConfigurationUtil.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixRDD$$anonfun$toDataFrame$1.class
[DEBUG] adding entry org/apache/phoenix/spark/SparkContextFunctions.class
[DEBUG] adding entry org/apache/phoenix/spark/ConfigurationUtil$.class
[DEBUG] adding entry org/apache/phoenix/spark/SparkSqlContextFunctions.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixRDD$$anonfun$toDataFrame$1$$anonfun$2.class
[DEBUG] adding entry 
org/apache/phoenix/spark/ProductRDDFunctions$$anonfun$1$$anonfun$apply$1.class
[DEBUG] adding entry org/apache/phoenix/spark/ProductRDDFunctions.class
[DEBUG] adding entry org/apache/phoenix/spark/DataFrameFunctions.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixRDD$$anonfun$compute$1.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixRDD$$anonfun$printPhoenixConfig$1.class
[DEBUG] adding entry 
org/apache/phoenix/spark/DataFrameFunctions$$anonfun$2.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixRelation$$anonfun$org$apache$phoenix$spark$PhoenixRelation$$buildFilter$1.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixRDD$$anonfun$phoenixSchemaToCatalystSchema$1.class
[DEBUG] adding entry 
org/apache/phoenix/spark/PhoenixRecordWritable$$anonfun$write$1.class
[DEBUG] adding entry 
org/apache/phoenix/spark/SparkContextFunctions$$anonfun$phoenixTableAsRDD$1.class
[DEBUG] adding entry 
org/apache/phoenix/spark/ProductRDDFunctions$$anonfun$1$$anonfun$apply$1$$anonfun$apply$2.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixRDD$.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixRDD$$anonfun$1.class
[DEBUG] adding entry org/apache/phoenix/spark/PhoenixRDD.class
[DEBUG] adding entry org/apache/phoenix/spark/package.class
[DEBUG] adding entry 
org/apache/phoenix/spark/ProductRDDFunctions$$anonfun$1.class
[DEBUG] adding entry 
org/apache/phoenix/spark/DataFrameFunctions$$anonfun$1.class
[DEBUG] adding directory META-INF/maven/
[DEBUG] adding directory META-INF/maven/org.apache.phoenix/
[DEBUG] adding directory META-INF/maven/org.apache.phoenix/phoenix-spark/
[DEBUG] adding entry META-INF/maven/org.apache.phoenix/phoenix-spark/pom.xml
[DEBUG] adding entry 
META-INF/maven/org.apache.phoenix/phoenix-spark/pom.properties
[INFO] 
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ 
phoenix-spark ---
[DEBUG] Configuring mojo 
org.apache.maven.plugins:maven-site-plugin:3.2:attach-descriptor from plugin 
realm ClassRealm[plugin>org.apache.maven.plugins:maven-site-plugin:3.2, parent: 
sun.misc.Launcher$AppClassLoader@76d88aa2]
[DEBUG] Configuring mojo 
'org.apache.maven.plugins:maven-site-plugin:3.2:attach-descriptor' with basic 
configurator -->
[DEBUG]   (f) basedir = 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark>
[DEBUG]   (f) inputEncoding = UTF-8
[DEBUG]   (f) localRepository =        id: local
      url: file:///home/jenkins/.m2/repository/
   layout: none

[DEBUG]   (f) outputEncoding = UTF-8
[DEBUG]   (f) pomPackagingOnly = true
[DEBUG]   (f) reactorProjects = [MavenProject: 
org.apache.phoenix:phoenix:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/pom.xml>, MavenProject: 
org.apache.phoenix:phoenix-core:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/pom.xml>, 
MavenProject: org.apache.phoenix:phoenix-flume:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-flume/pom.xml>, 
MavenProject: org.apache.phoenix:phoenix-pig:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-pig/pom.xml>, 
MavenProject: org.apache.phoenix:phoenix-server-client:4.7.0-HBase-1.1-SNAPSHOT 
@ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-server-client/pom.xml>, 
MavenProject: org.apache.phoenix:phoenix-server:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-server/pom.xml>, 
MavenProject: org.apache.phoenix:phoenix-pherf:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-pherf/pom.xml>, 
MavenProject: org.apache.phoenix:phoenix-spark:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/pom.xml>, 
MavenProject: org.apache.phoenix:phoenix-assembly:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-assembly/pom.xml>, 
MavenProject: 
org.apache.phoenix:phoenix-tracing-webapp:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-tracing-webapp/pom.xml>]
[DEBUG]   (f) siteDirectory = 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/src/site>
[DEBUG]   (f) project = MavenProject: 
org.apache.phoenix:phoenix-spark:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/pom.xml>
[DEBUG] -- end configuration --
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @ phoenix-spark 
---
[DEBUG] Configuring mojo org.scalatest:scalatest-maven-plugin:1.0:test from 
plugin realm ClassRealm[plugin>org.scalatest:scalatest-maven-plugin:1.0, 
parent: sun.misc.Launcher$AppClassLoader@76d88aa2]
[DEBUG] Configuring mojo 'org.scalatest:scalatest-maven-plugin:1.0:test' with 
basic configurator -->
[DEBUG]   (f) argLine = -Xmx1536m -XX:MaxPermSize=512m 
-XX:ReservedCodeCacheSize=512m
[DEBUG]   (f) debugForkedProcess = false
[DEBUG]   (f) debuggerPort = 5005
[DEBUG]   (f) filereports = WDF TestSuite.txt
[DEBUG]   (f) forkMode = once
[DEBUG]   (f) forkedProcessTimeoutInSeconds = 0
[DEBUG]   (f) junitxml = .
[DEBUG]   (f) logForkedProcessCommand = false
[DEBUG]   (f) outputDirectory = 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/classes>
[DEBUG]   (f) parallel = true
[DEBUG]   (f) project = MavenProject: 
org.apache.phoenix:phoenix-spark:4.7.0-HBase-1.1-SNAPSHOT @ 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/pom.xml>
[DEBUG]   (f) reportsDirectory = 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/surefire-reports>
[DEBUG]   (f) tagsToExclude = Integration-Test
[DEBUG]   (f) testOutputDirectory = 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/test-classes>
[DEBUG] -- end configuration --
[DEBUG] [-R, 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/classes> 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/test-classes>, 
-l, Integration-Test, -c, -o, -fWDF, 
<https://builds.apache.org/job/Phoenix-master/987/artifact/phoenix-spark/target/surefire-reports/TestSuite.txt>, 
-u, 
<https://builds.apache.org/job/Phoenix-master/987/artifact/phoenix-spark/target/surefire-reports/.>]
[DEBUG] Forking ScalaTest via: /bin/sh -c cd 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark> && java 
-Dbasedir=<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark> 
-Xmx1536m -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m 
org.scalatest.tools.Runner -R 
'<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/classes> 
<https://builds.apache.org/job/Phoenix-master/ws/phoenix-spark/target/test-classes>'
 -l Integration-Test -c -o -fWDF 
<https://builds.apache.org/job/Phoenix-master/987/artifact/phoenix-spark/target/surefire-reports/TestSuite.txt>
 -u 
<https://builds.apache.org/job/Phoenix-master/987/artifact/phoenix-spark/target/surefire-reports/.>
WARNING: -c has been deprecated and will be reused for a different (but still 
very cool) purpose in ScalaTest 2.0. Please change all uses of -c to -P.
Discovery starting.
Discovery completed in 292 milliseconds.
Run starting. Expected test count is: 22
PhoenixSparkIT:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/home/jenkins/.m2/repository/ch/qos/logback/logback-classic/1.0.9/logback-classic-1.0.9.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Formatting using clusterid: testClusterID
0    [RpcServer.reader=1,bindAddress=proserpina.apache.org,port=54514] INFO  
SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 67.195.81.189 
port: 55254 with version info: version: "1.1.0" url: 
"git://hw11397.local/Volumes/hbase-1.1.0RC2/hbase" revision: 
"e860c66d41ddc8231004b646098a58abca7fb523" user: "ndimiduk" date: "Tue May 12 
14:35:17 PDT 2015" src_checksum: "bcf4ec64372fbd348e6a97dc281c3b0f"
991  [RpcServer.reader=1,bindAddress=proserpina.apache.org,port=33528] INFO  
SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 67.195.81.189 
port: 43721 with version info: version: "1.1.0" url: 
"git://hw11397.local/Volumes/hbase-1.1.0RC2/hbase" revision: 
"e860c66d41ddc8231004b646098a58abca7fb523" user: "ndimiduk" date: "Tue May 12 
14:35:17 PDT 2015" src_checksum: "bcf4ec64372fbd348e6a97dc281c3b0f"
1287 [RpcServer.reader=2,bindAddress=proserpina.apache.org,port=33528] INFO  
SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 67.195.81.189 
port: 43723 with version info: version: "1.1.0" url: 
"git://hw11397.local/Volumes/hbase-1.1.0RC2/hbase" revision: 
"e860c66d41ddc8231004b646098a58abca7fb523" user: "ndimiduk" date: "Tue May 12 
14:35:17 PDT 2015" src_checksum: "bcf4ec64372fbd348e6a97dc281c3b0f"
2328 [RpcServer.reader=3,bindAddress=proserpina.apache.org,port=33528] INFO  
SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 67.195.81.189 
port: 43730 with version info: version: "1.1.0" url: 
"git://hw11397.local/Volumes/hbase-1.1.0RC2/hbase" revision: 
"e860c66d41ddc8231004b646098a58abca7fb523" user: "ndimiduk" date: "Tue May 12 
14:35:17 PDT 2015" src_checksum: "bcf4ec64372fbd348e6a97dc281c3b0f"
2630 [RpcServer.reader=4,bindAddress=proserpina.apache.org,port=33528] INFO  
SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 67.195.81.189 
port: 43732 with version info: version: "1.1.0" url: 
"git://hw11397.local/Volumes/hbase-1.1.0RC2/hbase" revision: 
"e860c66d41ddc8231004b646098a58abca7fb523" user: "ndimiduk" date: "Tue May 12 
14:35:17 PDT 2015" src_checksum: "bcf4ec64372fbd348e6a97dc281c3b0f"
Exception encountered when invoking run on a nested suite - the temporary 
folder has not yet been created *** ABORTED ***
  java.lang.IllegalStateException: the temporary folder has not yet been 
created
  at org.junit.rules.TemporaryFolder.getRoot(TemporaryFolder.java:145)
  at 
org.junit.rules.TemporaryFolder.newFolder(TemporaryFolder.java:130)
  at 
org.apache.phoenix.query.BaseTest.setupTxManager(BaseTest.java:521)
  at 
org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:548)
  at 
org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:608)
  at 
org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:604)
  at 
org.apache.phoenix.end2end.BaseHBaseManagedTimeIT.doSetup(BaseHBaseManagedTimeIT.java:54)
  at 
org.apache.phoenix.spark.PhoenixSparkITHelper$.doSetup(PhoenixSparkIT.scala:43)
  at 
org.apache.phoenix.spark.PhoenixSparkIT.beforeAll(PhoenixSparkIT.scala:67)
  at 
org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
  ...
Run completed in 10 seconds, 425 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 1
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
*** 1 SUITE ABORTED ***
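One plausible reading of the abort above: `BaseTest.setupTxManager` calls `newFolder()` on a JUnit `TemporaryFolder` whose `create()` never ran, because the ScalaTest `beforeAll` path invokes the setup code directly rather than through a JUnit runner that would honor the rule annotation. The sketch below models the `TemporaryFolder` lifecycle only — the class is a hypothetical stand-in, not the real JUnit source, and the diagnosis is an assumption from the stack trace:

```java
import java.io.File;
import java.nio.file.Files;

// Hypothetical model of org.junit.rules.TemporaryFolder's lifecycle:
// accessors throw IllegalStateException until create() has run, which is
// exactly the failure mode in the aborted suite above.
class TempFolderSketch {
    private File root;  // stays null until create() is invoked

    void create() throws Exception {
        root = Files.createTempDirectory("junit").toFile();
    }

    File newFolder(String name) {
        if (root == null) {
            // Same message as the exception in the log
            throw new IllegalStateException(
                "the temporary folder has not yet been created");
        }
        File f = new File(root, name);
        f.mkdir();
        return f;
    }

    public static void main(String[] args) throws Exception {
        TempFolderSketch t = new TempFolderSketch();
        boolean threw = false;
        try {
            t.newFolder("txmanager");  // mirrors setupTxManager's call
        } catch (IllegalStateException e) {
            threw = true;              // create() was never run
        }
        System.out.println("threw before create(): " + threw);
        t.create();                    // a JUnit runner would do this for us
        System.out.println("works after create(): "
            + t.newFolder("txmanager").isDirectory());
    }
}
```

Under this reading, the fix belongs in the Phoenix test setup (create the folder explicitly before use) rather than in the Spark module itself.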
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix .................................... SUCCESS [3.031s]
[INFO] Phoenix Core ...................................... SUCCESS 
[1:11:17.735s]
[INFO] Phoenix - Flume ................................... SUCCESS [1:22.810s]
[INFO] Phoenix - Pig ..................................... SUCCESS [2:49.875s]
[INFO] Phoenix Query Server Client ....................... SUCCESS [1.547s]
[INFO] Phoenix Query Server .............................. SUCCESS [1:44.807s]
[INFO] Phoenix - Pherf ................................... SUCCESS [1:58.202s]
[INFO] Phoenix - Spark ................................... FAILURE [43.616s]
[INFO] Phoenix Assembly .................................. SKIPPED
[INFO] Phoenix - Tracing Web Application ................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:20:02.152s
[INFO] Finished at: Mon Nov 30 01:43:54 UTC 2015
[INFO] Final Memory: 77M/749M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test 
(integration-test) on project phoenix-spark: There are test failures -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal 
org.scalatest:scalatest-maven-plugin:1.0:test (integration-test) on project 
phoenix-spark: There are test failures
        at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:213)
        at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
        at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
        at 
org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
        at 
org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
        at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
        at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
        at 
org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
        at 
org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
        at 
org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures
        at org.scalatest.tools.maven.TestMojo.execute(TestMojo.java:107)
        at 
org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
        at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
        ... 19 more
[ERROR] 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-spark
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Compressed 1.54 GB of artifacts by 32.2% relative to #975
Updating PHOENIX-2462
Recording test results
