See 
<https://builds.apache.org/job/beam_PostCommit_Python37/1300/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-8701] Remove unused commons-io_1x dependency

[iemejia] [BEAM-8701] Update commons-io to version 2.6


------------------------------------------
[...truncated 2.43 MB...]
[CHAIN MapPartition (MapPartition at [3]write/Write/WriteImpl/{WriteBundles, 
Pair, WindowInto(WindowIntoFn)}) -> FlatMap (FlatMap at ExtractOutput[0]) -> 
Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: 
write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN 
MapPartition (MapPartition at [3]write/Write/WriteImpl/{WriteBundles, Pair, 
WindowInto(WindowIntoFn)}) -> FlatMap (FlatMap at ExtractOutput[0]) -> Map (Key 
Extractor) -> GroupCombine (GroupCombine at GroupCombine: 
write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (2/2) 
(9c328326757b2b20aef0b9e6658cf9db).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task CHAIN 
MapPartition (MapPartition at [3]write/Write/WriteImpl/{WriteBundles, Pair, 
WindowInto(WindowIntoFn)}) -> FlatMap (FlatMap at ExtractOutput[0]) -> Map (Key 
Extractor) -> GroupCombine (GroupCombine at GroupCombine: 
write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) 
94e18b5e567c5320709307d5cd8944b3.
[CHAIN MapPartition (MapPartition at [3]write/Write/WriteImpl/{WriteBundles, 
Pair, WindowInto(WindowIntoFn)}) -> FlatMap (FlatMap at ExtractOutput[0]) -> 
Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: 
write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task CHAIN MapPartition (MapPartition at 
[3]write/Write/WriteImpl/{WriteBundles, Pair, WindowInto(WindowIntoFn)}) -> 
FlatMap (FlatMap at ExtractOutput[0]) -> Map (Key Extractor) -> GroupCombine 
(GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key 
Extractor) (2/2) (9c328326757b2b20aef0b9e6658cf9db) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task CHAIN 
MapPartition (MapPartition at [3]write/Write/WriteImpl/{WriteBundles, Pair, 
WindowInto(WindowIntoFn)}) -> FlatMap (FlatMap at ExtractOutput[0]) -> Map (Key 
Extractor) -> GroupCombine (GroupCombine at GroupCombine: 
write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) 
9c328326757b2b20aef0b9e6658cf9db.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [3]write/Write/WriteImpl/{WriteBundles, Pair, 
WindowInto(WindowIntoFn)}) -> FlatMap (FlatMap at ExtractOutput[0]) -> Map (Key 
Extractor) -> GroupCombine (GroupCombine at GroupCombine: 
write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/2) 
(94e18b5e567c5320709307d5cd8944b3) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [3]write/Write/WriteImpl/{WriteBundles, Pair, 
WindowInto(WindowIntoFn)}) -> FlatMap (FlatMap at ExtractOutput[0]) -> Map (Key 
Extractor) -> GroupCombine (GroupCombine at GroupCombine: 
write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (2/2) 
(9c328326757b2b20aef0b9e6658cf9db) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at 
ExtractOutput[0]) (2/2) (3dee20bd6fe4b70a9339018cae29393e) switched from 
CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at 
ExtractOutput[0]) (2/2) (3dee20bd6fe4b70a9339018cae29393e) switched from 
SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) (2/2) (attempt #0) to 
812a031f-10d5-4aaa-8a9c-02d721011fed @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) (2/2).
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at ExtractOutput[0]) 
(2/2) (3dee20bd6fe4b70a9339018cae29393e) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak 
safety net for task CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at ExtractOutput[0]) 
(2/2) (3dee20bd6fe4b70a9339018cae29393e) [DEPLOYING]
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) (2/2) (3dee20bd6fe4b70a9339018cae29393e) 
[DEPLOYING].
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) (2/2) (3dee20bd6fe4b70a9339018cae29393e) 
[DEPLOYING].
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at ExtractOutput[0]) 
(2/2) (3dee20bd6fe4b70a9339018cae29393e) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at 
ExtractOutput[0]) (2/2) (3dee20bd6fe4b70a9339018cae29393e) switched from 
DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at 
write/Write/WriteImpl/GroupByKey) (2/2) (da6ada38b62bfcf26058bac05cca1e17) 
switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (2/2) 
(da6ada38b62bfcf26058bac05cca1e17).
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at ExtractOutput[0]) 
(2/2) (3dee20bd6fe4b70a9339018cae29393e) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) (2/2) (3dee20bd6fe4b70a9339018cae29393e).
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at ExtractOutput[0]) 
(2/2) (3dee20bd6fe4b70a9339018cae29393e) [FINISHED]
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) 
(2/2) (da6ada38b62bfcf26058bac05cca1e17) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) 3dee20bd6fe4b70a9339018cae29393e.
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at 
write/Write/WriteImpl/GroupByKey) (1/2) (d7de2e81df8cb138f27233e4c5ebede4) 
switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/2) 
(d7de2e81df8cb138f27233e4c5ebede4).
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) 
(1/2) (d7de2e81df8cb138f27233e4c5ebede4) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at 
ExtractOutput[0]) (1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) switched from 
CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupReduce 
(GroupReduce at write/Write/WriteImpl/GroupByKey) 
da6ada38b62bfcf26058bac05cca1e17.
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at 
ExtractOutput[0]) (1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) switched from 
SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) (1/2) (attempt #0) to 
812a031f-10d5-4aaa-8a9c-02d721011fed @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupReduce 
(GroupReduce at write/Write/WriteImpl/GroupByKey) 
d7de2e81df8cb138f27233e4c5ebede4.
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at 
ExtractOutput[0]) (2/2) (3dee20bd6fe4b70a9339018cae29393e) switched from 
RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce 
(GroupReduce at write/Write/WriteImpl/GroupByKey) (2/2) 
(da6ada38b62bfcf26058bac05cca1e17) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce 
(GroupReduce at write/Write/WriteImpl/GroupByKey) (1/2) 
(d7de2e81df8cb138f27233e4c5ebede4) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) (1/2).
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at ExtractOutput[0]) 
(1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak 
safety net for task CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at ExtractOutput[0]) 
(1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) [DEPLOYING]
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) (1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) 
[DEPLOYING].
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) (1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) 
[DEPLOYING].
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at ExtractOutput[0]) 
(1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at 
ExtractOutput[0]) (1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) switched from 
DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at ExtractOutput[0]) 
(1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) (1/2) (1bf66e6eb46fd5f995b00f90f68a9a31).
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at ExtractOutput[0]) 
(1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap 
(FlatMap at ExtractOutput[0]) 1bf66e6eb46fd5f995b00f90f68a9a31.
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [1]write/Write/WriteImpl/Extract) -> FlatMap (FlatMap at 
ExtractOutput[0]) (1/2) (1bf66e6eb46fd5f995b00f90f68a9a31) switched from 
RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/PreFinalize) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/PreFinalize) -> FlatMap (FlatMap at ExtractOutput[0]) 
(1/2) (8e6ff47418836b2f9856786096b8f00c) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/PreFinalize) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/PreFinalize) -> FlatMap 
(FlatMap at ExtractOutput[0]) (1/2) (8e6ff47418836b2f9856786096b8f00c).
[CHAIN MapPartition (MapPartition at [1]write/Write/WriteImpl/PreFinalize) -> 
FlatMap (FlatMap at ExtractOutput[0]) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task CHAIN MapPartition (MapPartition at 
[1]write/Write/WriteImpl/PreFinalize) -> FlatMap (FlatMap at ExtractOutput[0]) 
(1/2) (8e6ff47418836b2f9856786096b8f00c) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task CHAIN 
MapPartition (MapPartition at [1]write/Write/WriteImpl/PreFinalize) -> FlatMap 
(FlatMap at ExtractOutput[0]) 8e6ff47418836b2f9856786096b8f00c.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition 
(MapPartition at [1]write/Write/WriteImpl/PreFinalize) -> FlatMap (FlatMap at 
ExtractOutput[0]) (1/2) (8e6ff47418836b2f9856786096b8f00c) switched from 
RUNNING to FINISHED.

> Task :sdks:python:test-suites:direct:py37:hdfsIntegrationTest
namenode_1  | Jan 07, 2020 3:40:39 PM 
com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class 
org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Jan 07, 2020 3:40:39 PM 
com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Jan 07, 2020 3:40:39 PM 
com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 
09/02/2011 11:17 AM'
namenode_1  | Jan 07, 2020 3:40:41 PM com.sun.jersey.spi.inject.Errors 
processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource 
and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG     http://namenode:50070 "GET 
/webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG     Uploading 1 files using 1 thread(s).
test_1      | DEBUG     Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO      Writing to '/kinglear.txt'.
test_1      | DEBUG     Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG     http://namenode:50070 "PUT 
/webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG     Starting new HTTP connection (1): datanode:50075
datanode_1  | 20/01/07 15:40:43 INFO datanode.webhdfs: 172.28.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 20/01/07 15:40:43 INFO hdfs.StateChange: BLOCK* allocate 
blk_1073741825_1001, replicas=172.28.0.3:50010 for /kinglear.txt
datanode_1  | 20/01/07 15:40:43 INFO datanode.DataNode: Receiving 
BP-1736364195-172.28.0.2-1578411583220:blk_1073741825_1001 src: 
/172.28.0.3:59032 dest: /172.28.0.3:50010
datanode_1  | 20/01/07 15:40:43 INFO DataNode.clienttrace: src: 
/172.28.0.3:59032, dest: /172.28.0.3:50010, bytes: 157283, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_-1301359970_67, offset: 0, srvID: 
4a8467d2-2836-426b-92bf-66430cca2b5c, blockid: 
BP-1736364195-172.28.0.2-1578411583220:blk_1073741825_1001, duration: 15456399
datanode_1  | 20/01/07 15:40:43 INFO datanode.DataNode: PacketResponder: 
BP-1736364195-172.28.0.2-1578411583220:blk_1073741825_1001, 
type=LAST_IN_PIPELINE terminating
namenode_1  | 20/01/07 15:40:44 INFO namenode.FSNamesystem: BLOCK* 
blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) 
in file /kinglear.txt
namenode_1  | 20/01/07 15:40:44 INFO namenode.EditLogFileOutputStream: Nothing 
to flush
namenode_1  | 20/01/07 15:40:44 INFO hdfs.StateChange: DIR* completeFile: 
/kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1301359970_67
test_1      | DEBUG     Upload of 'kinglear.txt' to '/kinglear.txt' complete.
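The 307-then-PUT exchange above is the standard two-step WebHDFS CREATE
handshake: the NameNode answers the create request with a redirect naming a
DataNode, and the client then writes the bytes there. A minimal sketch in
Python using the requests library, assuming the same namenode/datanode host
names as the compose services above:

    import requests

    # Step 1: ask the NameNode to create the file. It stores no data itself;
    # it replies 307 with a Location header pointing at a DataNode.
    create_url = ("http://namenode:50070/webhdfs/v1/kinglear.txt"
                  "?user.name=root&overwrite=true&op=CREATE")
    resp = requests.put(create_url, allow_redirects=False)
    assert resp.status_code == 307
    datanode_url = resp.headers["Location"]

    # Step 2: send the file contents to that DataNode; 201 Created on success.
    with open("kinglear.txt", "rb") as f:
        assert requests.put(datanode_url, data=f).status_code == 201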
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline 
using the default runner: DirectRunner.
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7f1b12da2170> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7f1b12da2290> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f1b12da2320> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f1b12da23b0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f1b12da2440> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f1b12da2560> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f1b12da25f0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f1b12da2680> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f1b12da2710> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f1b12da28c0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f1b12da2950> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f1b12da29e0> ====================
test_1      | INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f1afdc67ed0> for environment urn: "beam:env:embedded_python:v1"
test_1      | 
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_1_split/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((((ref_PCollection_PCollection_1_split/Read)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_split_7))+(ref_AppliedPTransform_pair_with_one_8))+(group/Write)
datanode_1  | 20/01/07 15:40:55 INFO datanode.webhdfs: 172.28.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Impulse_19)+(ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2591>)_20))+(ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22))+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23))+(ref_PCollection_PCollection_12/Write))+(ref_PCollection_PCollection_13/Write)
namenode_1  | 20/01/07 15:40:57 INFO namenode.FSEditLog: Number of 
transactions: 7 Total time for transactions(ms): 21 Number of transactions 
batched in Syncs: 1 Number of syncs: 6 SyncTimes(ms): 37 
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((((((group/Read)+(ref_AppliedPTransform_count_13))+(ref_AppliedPTransform_format_14))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24))+(ref_AppliedPTransform_write/Write/WriteImpl/Pair_25))+(ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26))+(write/Write/WriteImpl/GroupByKey/Write)
test_1      | WARNING:apache_beam.io.hadoopfilesystem:Mime types are not 
supported. Got non-default mime_type: text/plain
datanode_1  | 20/01/07 15:40:58 INFO datanode.webhdfs: 172.28.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-1945543a316411eaacba0242ac1c0004/488b20ac-fb87-4e99-bd7b-e83a0aa37943.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 20/01/07 15:40:58 INFO hdfs.StateChange: BLOCK* allocate 
blk_1073741826_1002, replicas=172.28.0.3:50010 for 
/beam-temp-py-wordcount-integration-1945543a316411eaacba0242ac1c0004/488b20ac-fb87-4e99-bd7b-e83a0aa37943.py-wordcount-integration
datanode_1  | 20/01/07 15:40:58 INFO datanode.DataNode: Receiving 
BP-1736364195-172.28.0.2-1578411583220:blk_1073741826_1002 src: 
/172.28.0.3:59958 dest: /172.28.0.3:50010
datanode_1  | 20/01/07 15:40:58 INFO DataNode.clienttrace: src: 
/172.28.0.3:59958, dest: /172.28.0.3:50010, bytes: 48944, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_1581175773_69, offset: 0, srvID: 
4a8467d2-2836-426b-92bf-66430cca2b5c, blockid: 
BP-1736364195-172.28.0.2-1578411583220:blk_1073741826_1002, duration: 3816079
datanode_1  | 20/01/07 15:40:58 INFO datanode.DataNode: PacketResponder: 
BP-1736364195-172.28.0.2-1578411583220:blk_1073741826_1002, 
type=LAST_IN_PIPELINE terminating
namenode_1  | 20/01/07 15:40:58 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-1945543a316411eaacba0242ac1c0004/488b20ac-fb87-4e99-bd7b-e83a0aa37943.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_1581175773_69
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((write/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/Extract_31))+(ref_PCollection_PCollection_20/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_PCollection_PCollection_12/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32))+(ref_PCollection_PCollection_21/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_PCollection_PCollection_12/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33)
test_1      | INFO:apache_beam.io.filebasedsink:Starting finalize_write threads 
with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:apache_beam.io.filebasedsink:Renamed 1 shards in 0.12 
seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python37-1300_test_1 exited with code 0
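For context, the read/split/pair_with_one/group/count/format/write stage
names in the fused stages above map onto a wordcount pipeline along these
lines; a rough sketch only (paths and exact transforms are illustrative, not
the test's actual source):

    import re

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
         | "read" >> beam.io.ReadFromText("hdfs://kinglear.txt")
         | "split" >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
         | "pair_with_one" >> beam.Map(lambda word: (word, 1))
         | "group" >> beam.GroupByKey()
         | "count" >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | "format" >> beam.Map(lambda kv: "%s: %d" % kv)
         | "write" >> beam.io.WriteToText("hdfs://py-wordcount-integration"))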
Stopping hdfs_it-jenkins-beam_postcommit_python37-1300_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python37-1300_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python37-1300_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python37-1300_namenode_1 ... done
Aborting on container exit...

real    2m0.724s
user    0m1.232s
sys     0m0.147s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python37-1300 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python37-1300_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-1300_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-1300_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-1300_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-1300_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-1300_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python37-1300_test_net

real    0m5.309s
user    0m0.685s
sys     0m0.105s

> Task :sdks:python:test-suites:direct:py37:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner 
>>> --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=build/apache-beam.tar.gz 
>>> --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>  
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: 
>>> --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test,apache_beam.io.gcp.bigquery_read_it_test,apache_beam.io.gcp.bigquery_write_it_test,apache_beam.io.gcp.datastore.v1new.datastore_write_it_test
>>>  --nocapture --processes=8 --process-timeout=4500
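In the Python SDK, a flag list like the one echoed above is parsed with
PipelineOptions; a small sketch of the mechanism (flag values shortened for
the example):

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions([
        "--runner=TestDirectRunner",
        "--project=apache-beam-testing",
        "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it",
    ])
    # Typed views expose subsets of the same underlying flags.
    gcp = options.view_as(GoogleCloudOptions)
    print(gcp.project, gcp.temp_location)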
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/build/gradleenv/1398941891/lib/python3.7/site-packages/setuptools/dist.py:476: UserWarning: Normalizing '2.19.0.dev' to '2.19.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery.py:1416: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:257: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery.py:1603: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:153: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery.py:1603: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery.py:1416: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py:769: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
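The BeamDeprecationWarnings above come from SDK code that still reads options
back off the pipeline object; user code should instead keep a reference to
the PipelineOptions the pipeline was constructed with, roughly:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(["--temp_location=gs://my-bucket/tmp"])
    p = beam.Pipeline(options=options)

    # Deprecated pattern (what the warnings flag): reaching back through a
    # PCollection or Pipeline for its options, e.g.
    #     pcoll.pipeline.options.view_as(GoogleCloudOptions).temp_location
    # Supported pattern: use the options object the pipeline was built with.
    temp_location = options.view_as(GoogleCloudOptions).temp_location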
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/nosetests.xml
----------------------------------------------------------------------
Ran 17 tests in 23.778s

OK (SKIP=1)
FATAL: command execution failed
hudson.remoting.ChannelClosedException: Channel "unknown": Remote call on JNLP4-connect connection from 165.171.154.104.bc.googleusercontent.com/104.154.171.165:49196 failed. The channel is closing down or has closed down
        at hudson.remoting.Channel.call(Channel.java:950)
        at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
        at com.sun.proxy.$Proxy141.isAlive(Unknown Source)
        at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1150)
        at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1142)
        at hudson.Launcher$ProcStarter.join(Launcher.java:470)
        at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
        at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
        at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:741)
        at hudson.model.Build$BuildExecution.build(Build.java:206)
        at hudson.model.Build$BuildExecution.doRun(Build.java:163)
        at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
        at hudson.model.Run.execute(Run.java:1815)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:97)
        at hudson.model.Executor.run(Executor.java:429)
Caused by: java.nio.channels.ClosedChannelException
        at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
        at org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
        at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
        at org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
        at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:181)
        at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.switchToNoSecure(SSLEngineFilterLayer.java:283)
        at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processWrite(SSLEngineFilterLayer.java:503)
        at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processQueuedWrites(SSLEngineFilterLayer.java:248)
        at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doSend(SSLEngineFilterLayer.java:200)
        at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doCloseSend(SSLEngineFilterLayer.java:213)
        at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.doCloseSend(ProtocolStack.java:784)
        at org.jenkinsci.remoting.protocol.ApplicationLayer.doCloseWrite(ApplicationLayer.java:173)
        at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer$ByteBufferCommandTransport.closeWrite(ChannelApplicationLayer.java:314)
        at hudson.remoting.Channel.close(Channel.java:1452)
        at hudson.remoting.Channel.close(Channel.java:1405)
        at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:847)
        at hudson.slaves.SlaveComputer.access$800(SlaveComputer.java:108)
        at hudson.slaves.SlaveComputer$3.run(SlaveComputer.java:756)
        at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
        at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-12 is offline; cannot locate JDK 1.8 (latest)
