See <https://builds.apache.org/job/Geode-spark-connector/25/changes>

Changes:

[klund] GEODE-837: update tests from JUnit3 to JUnit4

[klund] GEODE-837: delete temporary script

[klund] GEODE-837: update eclipse formatter to match intellij formatter

[ukohlmeyer] GEODE-1377: Refactoring as per review comments

[ukohlmeyer] GEODE-1377: Updating JavaDocs to point to the correct property

[ukohlmeyer] GEODE-1377: Renaming of DistributedSystemConfigProperties to

[klund] GEODE-837: add JUnit4 category

[klund] GEODE-1516: update Eclipse and IntelliJ handling of imports

[klund] GEODE-1416: rename profiles to be Apache Geode

[upthewaterspout] Adding etc/eclipseOrganizeImports.importorder to rat excludes

------------------------------------------
[...truncated 1557 lines...]

[info] Resolving org.apache.hadoop#hadoop-yarn-server-nodemanager;2.2.0 ...
[info] Resolving org.apache.hadoop#hadoop-yarn-server;2.2.0 ...
[info] Resolving org.apache.spark#spark-sql_2.10;1.3.0 ...
[info] Resolving org.apache.spark#spark-catalyst_2.10;1.3.0 ...
[info] Resolving org.scala-lang#scala-compiler;2.10.4 ...
[info] Resolving org.scalamacros#quasiquotes_2.10;2.0.1 ...
[info] Resolving com.twitter#parquet-column;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-common;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-encoding;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-generator;1.6.0rc3 ...
[info] Resolving commons-codec#commons-codec;1.5 ...
[info] Resolving com.twitter#parquet-hadoop;1.6.0rc3 ...
[info] Resolving com.twitter#parquet-format;2.2.0-rc1 ...
[info] Resolving com.twitter#parquet-jackson;1.6.0rc3 ...
[info] Resolving org.codehaus.jackson#jackson-mapper-asl;1.9.11 ...
[info] Resolving org.codehaus.jackson#jackson-core-asl;1.9.11 ...
[info] Resolving org.jodd#jodd-core;3.6.3 ...
[info] Resolving commons-net#commons-net;3.1 ...
[info] Resolving org.scoverage#scalac-scoverage-runtime_2.10;1.0.4 ...
[info] Resolving org.scoverage#scalac-scoverage-plugin_2.10;1.0.4 ...
[info] Resolving org.scala-lang#jline;2.10.4 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] there were 7 feature warning(s); re-run with -feature for details
[warn] one warning found
[warn] Note: Some input files use unchecked or unsafe operations.
[warn] Note: Recompile with -Xlint:unchecked for details.
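These two notes come from scalac and javac respectively, and each names the flag that would print the full details. A minimal sketch of surfacing them, assuming the flags are added in the connector's sbt build (standard sbt setting keys, not copied from this project's build files):

    scalacOptions += "-feature"          // expands the scalac "feature warning(s)" above
    javacOptions  += "-Xlint:unchecked"  // expands the javac unchecked/unsafe note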
[info] Packaging /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/scala-2.10/geode-spark-connector_2.10-0.5.0.jar ...
[info] Compiling 1 Scala source and 5 Java sources to /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-demos/basic-demos/target/scala-2.10/classes...
[info] Done packaging.
[info] Packaging /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-demos/basic-demos/target/scala-2.10/basic-demos_2.10-0.5.0.jar ...
[info] Done packaging.
[success] Total time: 67 s, completed Jun 10, 2016 1:35:28 PM
[info] Compiling 11 Scala sources and 1 Java source to /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/scala-2.10/test-classes...
[warn] there were 2 feature warning(s); re-run with -feature for details
[warn] one warning found
[info 2016/06/10 13:35:53.015 UTC <pool-7-thread-13-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0x107] StructStreamingResultSender: 10 rows, type=(java.lang.Integer, java.lang.String), type.size=159, data.size=151, row.avg.size=15.1

[info 2016/06/10 13:35:53.757 UTC <pool-7-thread-13-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0x107] StructStreamingResultSender: 10 rows, type=(java.lang.Integer, java.lang.String), type.size=0, data.size=151, row.avg.size=15.1

[info 2016/06/10 13:35:53.761 UTC <pool-7-thread-13-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0x107] StructStreamingResultSender: 0 rows, type=null, type.size=0, data.size=0, row.avg.size=NaN

[info 2016/06/10 13:35:53.791 UTC <pool-7-thread-13-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0x107] StructStreamingResultSender: 10000 rows, type=(java.lang.Integer), type.size=131, data.size=60015, row.avg.size=6.0

[info 2016/06/10 13:35:53.862 UTC <ForkJoinPool-1-worker-43> tid=0x116] sender2: 150 rows, type=(java.lang.Integer), type.size=131, data.size=901, row.avg.size=6.0

[info 2016/06/10 13:35:53.863 UTC <ForkJoinPool-1-worker-29> tid=0x115] sender1: 150 rows, type=(java.lang.Integer), type.size=131, data.size=901, row.avg.size=6.0

[info 2016/06/10 13:35:53.879 UTC <ForkJoinPool-1-worker-43> tid=0x116] sender1: 500 rows, type=(java.lang.Integer), type.size=131, data.size=3001, row.avg.size=6.0

[info 2016/06/10 13:35:53.927 UTC <pool-7-thread-13-ScalaTest-running-StructStreamingResultSenderAndCollectorTest> tid=0x107] StructStreamingResultSender: 1000 rows, type=(java.lang.String, java.lang.String), type.size=159, data.size=23792, row.avg.size=23.8

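The row.avg.size in these sender lines is simply data.size over the row count, which also explains the NaN on the zero-row transfer. A minimal sketch of that arithmetic, grounded only in the numbers logged above (rowAvgSize is a hypothetical helper, not the connector's code):

    // row.avg.size = data.size / rows:
    //   151.0   / 10    = 15.1   (matches the 10-row lines)
    //   60015.0 / 10000 = 6.0015 (logged rounded as 6.0)
    //   0.0     / 0     = NaN    (matches the 0-row line)
    def rowAvgSize(dataSize: Int, rows: Int): Double = dataSize.toDouble / rows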
[info] StructStreamingResultSenderAndCollectorTest:
[info] - transfer simple data
[info] - transfer simple data with no type info
[info] - transfer data with 0 row
[info] - transfer data with 10K rows
[info] - transfer data with 10K rows with 2 sender
[info] - transfer data with 10K rows with 2 sender with error
[info] - transfer data with Exception
[info] - transfer string pair data with 200 rows
[info] - DataSerializer usage
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/.ivy2/cache/org.apache.logging.log4j/log4j-slf4j-impl/jars/log4j-slf4j-impl-2.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
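This warning is benign here (SLF4J settles on the log4j-slf4j-impl binding), but it can be silenced by keeping a single binding on the classpath. A hedged sbt sketch, assuming the redundant slf4j-log4j12 binding arrives transitively via the Spark dependency resolved above (not verified against this project's build files):

    // Exclude the extra SLF4J binding so only log4j-slf4j-impl remains.
    libraryDependencies += ("org.apache.spark" %% "spark-sql" % "1.3.0")
      .exclude("org.slf4j", "slf4j-log4j12")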
[info 2016/06/10 13:36:00.506 UTC <pool-7-thread-13-ScalaTest-running-GeodeRDDFunctionsTest> tid=0x107] Save RDD id=0 to region test

[info 2016/06/10 13:36:00.558 UTC <pool-7-thread-13-ScalaTest-running-GeodeRDDFunctionsTest> tid=0x107] Save RDD id=0 to region test

[info 2016/06/10 13:36:00.603 UTC <pool-7-thread-13-ScalaTest-running-GeodeRDDFunctionsTest> tid=0x107] Save RDD id=0 to region test

[info 2016/06/10 13:36:00.642 UTC <pool-7-thread-13-ScalaTest-running-GeodeRDDFunctionsTest> tid=0x107] Save RDD id=0 to region test

[info] GeodeRDDFunctionsTest:
[info] - test PairRDDFunction Implicit
[info] - test RDDFunction Implicit
[info] - test GeodePairRDDWriter
[info] - test GeodeNonPairRDDWriter
[info] - test PairRDDFunctions.saveToGeode
[info] - test PairRDDFunctions.saveToGeode w/ opConf
[info] - test RDDFunctions.saveToGeode
[info] - test RDDFunctions.saveToGeode w/ opConf
r.type=java.lang.String r=List(/obj_obj_region)
[info] QueryParserTest:
[info] - select * from /r1
[info] - select c2 from /r1
[info] - select key, value from /r1.entries
[info] - select c1, c2 from /r1 where col1 > 100 and col2 <= 120 or c3 = 2
[info] - select * from /r1/r2 where c1 >= 200
[info] - import io.pivotal select c1, c2, c3 from /r1/r2, /r3/r4 where c1 <= 15 and c2 = 100
[info] - SELECT distinct f1, f2 FROM /r1/r2 WHere f = 100
[info] - IMPORT io.pivotal.geode IMPORT com.mypackage SELECT key,value FROM /root/sub.entries WHERE status = 'active' ORDER BY id desc
[info] - select distinct p.ID, p.status from /region p where p.ID > 5 order by p.status
[info] - SELECT DISTINCT * FROM /QueryRegion1 r1, /QueryRegion2 r2 WHERE r1.ID = r2.ID
[info] - SELECT id, "type", positions, status FROM /obj_obj_region WHERE status = 'active'
[info] - SELECT r.id, r."type", r.positions, r.status FROM /obj_obj_region r, r.positions.values f WHERE r.status = 'active' and f.secId = 'MSFT'
[info] GeodeConnectionConfTest:
[info] - apply(SparkConf) w/ GeodeLocator property and empty geodeProps
[info] - apply(SparkConf) w/ GeodeLocator property and geode properties
[info] - apply(SparkConf) w/o GeodeLocator property
[info] - apply(SparkConf) w/ invalid GeodeLocator property
[info] - apply(locatorStr, geodeProps) w/ valid locatorStr and non geodeProps
[info] - apply(locatorStr, geodeProps) w/ valid locatorStr and non-empty geodeProps
[info] - apply(locatorStr, geodeProps) w/ invalid locatorStr
[info] - constructor w/ empty (host,port) pairs
[info] - getConnection() normal
[info] - getConnection() failure
[info] DefaultGeodeConnectionManagerTest:
[info] - DefaultGeodeConnectionFactory get/closeConnection
[info] - DefaultGeodeConnectionFactory newConnection(...) throws RuntimeException
[info] - DefaultGeodeConnectionFactory close() w/ non-exist connection
[warn 2016/06/10 13:36:01.883 UTC <pool-7-thread-13-ScalaTest-running-GeodeRDDPartitionerTest> tid=0x107] Invalid preferred partitioner name dummy.

[info] GeodeRDDPartitionerTest:
[info] - default partitioned region partitioner
[info] - default replicated region partitioner
[info] - GeodeRDDPartitioner.apply method
[info] - OnePartitionPartitioner
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & no empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 1 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 2 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1 & 5 empty bucket
[info] - ServerSplitsPartitioner.doPartitions(): n=1, 4 empty-bucket, non-continuous IDs
[info] - ServerSplitsPartitioner.doPartitions(): n=2, no empty buckets, 3 servers have 1, 2, and 3 buckets
[info] - ServerSplitsPartitioner.doPartitions(): n=3, no empty buckets, 4 servers have 0, 2, 3, and 4 buckets
[info] - ServerSplitsPartitioner.partitions(): metadata = None
[info] - ServerSplitsPartitioner.partitions(): replicated region
[info] - ServerSplitsPartitioner.partitions(): partitioned region w/o data
[info] - ServerSplitsPartitioner.partitions(): partitioned region w/ some data
[info 2016/06/10 13:36:02.002 UTC <pool-7-thread-13-ScalaTest-running-GeodeFunctionDeployerTest> tid=0x107] Invalid jar file:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/file:/somemissingjarfilethatdoesnot.exist

[info 2016/06/10 13:36:02.004 UTC <pool-7-thread-13-ScalaTest-running-GeodeFunctionDeployerTest> tid=0x107] Invalid jar file:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/file:/somemissingjarfilethatdoesnot.exist

[info 2016/06/10 13:36:02.005 UTC <pool-7-thread-13-ScalaTest-running-GeodeFunctionDeployerTest> tid=0x107] Invalid jar file:/x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/file:/somemissingjarfilethatdoesnot.exist

[info] GeodeFunctionDeployerTest:
[info] - jmx url creation
[info] - missing jar file
[info] - deploy with missing jar
[info] - successful mocked deploy
host name: hemera
canonical host name: hemera.apache.org
canonical host name 2: hemera.apache.org
[info] LocatorHelperTest:
[info] - locatorStr2HostPortPair hostname w/o domain
[info] - locatorStr2HostPortPair hostname w/ domain
[info] - locatorStr2HostPortPair w/ invalid host name
[info] - locatorStr2HostPortPair w/ valid port
[info] - locatorStr2HostPortPair w/ invalid port
[info] - parseLocatorsString with valid locator(s)
[info] - parseLocatorsString with invalid locator(s)
[info] - pickPreferredGeodeServers: shared servers and one gf-server per host
[info] - pickPreferredGeodeServers: shared servers, one gf-server per host, un-sorted list
[info] - pickPreferredGeodeServers: shared servers and two gf-server per host
[info] - pickPreferredGeodeServers: shared servers, two gf-server per host, un-sorted server list
[info] - pickPreferredGeodeServers: no shared servers and one gf-server per host
[info] - pickPreferredGeodeServers: no shared servers, one gf-server per host, and less gf-server
[info] - pickPreferredGeodeServers: ad-hoc
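The locatorStr2HostPortPair cases above parse Geode locator strings, which name a host and port in the host[port] form. A minimal sketch of that parsing under that assumption (illustrative only, not the connector's LocatorHelper implementation):

    // Parses "host[port]" locator strings, e.g. "geode-host[10334]".
    val Locator = """([\w\-\.]+)\[(\d{1,5})\]""".r

    def hostPortPair(locatorStr: String): Option[(String, Int)] =
      locatorStr.trim match {
        case Locator(host, port) => Some((host, port.toInt))
        case _                   => None // invalid host name or port
      }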
[info] ConnectorImplicitsTest:
[info] - implicit map2Properties
[info] - Test Implicit SparkContext Conversion
[info] - Test Implicit SQLContext Conversion
[info 2016/06/10 13:36:02.355 UTC <pool-7-thread-13-ScalaTest-running-GeodeRegionRDDTest> tid=0x107] RDD id=0 region=test conn=, env=Map()

[info 2016/06/10 13:36:02.376 UTC <pool-7-thread-13-ScalaTest-running-GeodeRegionRDDTest> tid=0x107] RDD id=0 region=test conn=, env=Map(preferred.partitioner -> OnePartition)

[info 2016/06/10 13:36:02.396 UTC <pool-7-thread-13-ScalaTest-running-GeodeRegionRDDTest> tid=0x107] RDD id=0 region=test conn=, env=Map()

[info 2016/06/10 13:36:02.435 UTC <pool-7-thread-13-ScalaTest-running-GeodeRegionRDDTest> tid=0x107] RDD id=0 region=test conn=, env=Map()

[info] GeodeRegionRDDTest:
[info] - create GeodeRDD with non-existing region
[info] - getPartitions with non-existing region
[info] - getPartitions with replicated region and not preferred env
[info] - getPartitions with replicated region and preferred OnePartitionPartitioner
[info] - getPartitions with partitioned region and not preferred env
[info] - GeodeRDD.compute() method
[info 2016/06/10 13:36:02.708 UTC <pool-7-thread-13> tid=0x107] Save DStream region=testregion conn=

[info 2016/06/10 13:36:02.798 UTC <pool-7-thread-13> tid=0x107] Save RDD id=0 to region testregion

[info 2016/06/10 13:36:02.839 UTC <pool-7-thread-13> tid=0x107] Save DStream region=testregion conn=

[info 2016/06/10 13:36:02.909 UTC <pool-7-thread-13> tid=0x107] Save RDD id=0 to region testregion

[info] JavaAPITest:
[info] - testSparkContextFunction
[info] - testJavaPairDStreamFunctions
[info] - testSQLContextFunction
[info] - testJavaSparkContextFunctions
[info] - testJavaRDDFunctions
[info] - testJavaDStreamFunctions
[info] - testJavaPairDStreamFunctionsWithTuple2DStream
[info] - testJavaPairRDDFunctions
[info 2016/06/10 13:36:02.949 UTC <pool-7-thread-13-ScalaTest-running-GeodeDStreamFunctionsTest> tid=0x107] Save DStream region=test conn=

[info 2016/06/10 13:36:02.965 UTC <pool-7-thread-13-ScalaTest-running-GeodeDStreamFunctionsTest> tid=0x107] Save DStream region=test conn=

[info] GeodeDStreamFunctionsTest:
[info] - test GeodePairDStreamFunctions Implicit
[info] - test GeodeDStreamFunctions Implicit
[info] - test GeodePairDStreamFunctions.saveToGeode()
[info] - test GeodeDStreamFunctions.saveToGeode()
[info] ScalaTest
[info] Run completed in 19 seconds, 652 milliseconds.
[info] Total number of tests run: 96
[info] Suites: completed 12, aborted 0
[info] Tests: succeeded 96, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 96, Failed 0, Errors 0, Passed 96
[success] Total time: 34 s, completed Jun 10, 2016 1:36:03 PM
[info] Compiling 9 Scala sources and 4 Java sources to /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/target/scala-2.10/it-classes...
[error] /x1/jenkins/jenkins-slave/workspace/Geode-spark-connector/geode-spark-connector/geode-spark-connector/src/it/java/ittest/io/pivotal/geode/spark/connector/JavaApiIntegrationTest.java:20: error: cannot find symbol
[error] import com.gemstone.gemfire.distributed.DistributedSystemConfigProperties;
[error]                                        ^
[error]   symbol:   class DistributedSystemConfigProperties
[error]   location: package com.gemstone.gemfire.distributed
[error] 1 error
[error] (geode-spark-connector/it:compile) javac returned nonzero exit code
[error] Total time: 11 s, completed Jun 10, 2016 1:36:13 PM
Build step 'Execute shell' marked build as failure
Recording test results
Skipped archiving because build is not successful
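The compile error ties directly to the GEODE-1377 entry in this build's change list: DistributedSystemConfigProperties was renamed, but the integration test under src/it still imports the old name. The unit tests passed because integration sources are compiled in a separate it:compile step. A sketch of the shape of the fix at JavaApiIntegrationTest.java:20, where ConfigurationProperties stands in for the new name since the change list above truncates it:

    // Old import, quoted from the error output above:
    import com.gemstone.gemfire.distributed.DistributedSystemConfigProperties;
    // Would become an import of the renamed class (new name assumed here):
    import com.gemstone.gemfire.distributed.ConfigurationProperties;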