See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6787/display/redirect?page=changes>
Changes:

[huangry] Add instructions to post-commit policy web page, according to

[huangry] Update website/src/contribute/postcommits-policies-details.md

------------------------------------------
[...truncated 308.32 KB...]
namenode_1 | INFO: Root resource classes found:
namenode_1 |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1 | Dec 10, 2018 1:18:04 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1 | INFO: Provider classes found:
namenode_1 |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1 |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1 | Dec 10, 2018 1:18:04 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1 | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1 | Dec 10, 2018 1:18:05 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1 | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1 | WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1 | WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1 | WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1 | WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1     | DEBUG http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1     | DEBUG Uploading 1 files using 1 thread(s).
test_1     | DEBUG Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1     | INFO Writing to '/kinglear.txt'.
test_1     | DEBUG Resolved path '/kinglear.txt' to '/kinglear.txt'.
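The upload traced above is performed by the hdfscli Python client inside the test container. A minimal sketch of the equivalent client code, assuming the `hdfs` (hdfscli) package; the namenode URL and user come from the log, everything else is illustrative:

```python
# Sketch of the upload the test harness performs above, assuming the
# hdfscli package (`pip install hdfs`). upload() drives the two-step
# WebHDFS CREATE visible in the entries that follow: a PUT to the
# namenode answered with a 307 redirect, then the data write to the
# datanode on port 50075.
from hdfs import InsecureClient

client = InsecureClient('http://namenode:50070', user='root')
client.upload('/kinglear.txt', 'kinglear.txt', overwrite=True)
```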
test_1 | DEBUG http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0 test_1 | DEBUG Starting new HTTP connection (1): datanode:50075 datanode_1 | 18/12/10 13:18:05 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201 namenode_1 | 18/12/10 13:18:05 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.3:50010 for /kinglear.txt datanode_1 | 18/12/10 13:18:05 INFO datanode.DataNode: Receiving BP-595030248-172.18.0.2-1544447836759:blk_1073741825_1001 src: /172.18.0.3:35098 dest: /172.18.0.3:50010 datanode_1 | 18/12/10 13:18:06 INFO DataNode.clienttrace: src: /172.18.0.3:35098, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_135923172_67, offset: 0, srvID: ad2f573f-ab75-4b5d-bfc9-20bbb58a664f, blockid: BP-595030248-172.18.0.2-1544447836759:blk_1073741825_1001, duration: 16169505 datanode_1 | 18/12/10 13:18:06 INFO datanode.DataNode: PacketResponder: BP-595030248-172.18.0.2-1544447836759:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating namenode_1 | 18/12/10 13:18:06 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /kinglear.txt namenode_1 | 18/12/10 13:18:06 INFO namenode.EditLogFileOutputStream: Nothing to flush namenode_1 | 18/12/10 13:18:06 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_135923172_67 test_1 | DEBUG Upload of 'kinglear.txt' to '/kinglear.txt' complete. test_1 | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner. test_1 | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7faf1032c500> ==================== test_1 | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7faf1032c398> ==================== test_1 | INFO:root:==================== <function lift_combiners at 0x7faf106996e0> ==================== test_1 | INFO:root:==================== <function expand_gbk at 0x7faf10699488> ==================== test_1 | INFO:root:==================== <function sink_flattens at 0x7faf10699578> ==================== test_1 | INFO:root:==================== <function greedily_fuse at 0x7faf10641410> ==================== test_1 | INFO:root:==================== <function impulse_to_input at 0x7faf106999b0> ==================== test_1 | INFO:root:==================== <function inject_timer_pcollections at 0x7faf106417d0> ==================== test_1 | INFO:root:==================== <function sort_stages at 0x7faf10641578> ==================== test_1 | INFO:root:==================== <function window_pcollection_coders at 0x7faf106415f0> ==================== test_1 | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write) test_1 | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write))) datanode_1 | 18/12/10 13:18:07 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200 test_1 | INFO:root:Running 
(((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write))) test_1 | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain datanode_1 | 18/12/10 13:18:09 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-08bab586fc7e11e880bb0242ac120004/5e06c397-54a3-4c68-8bfd-379b2ff72dd5.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201 namenode_1 | 18/12/10 13:18:09 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-08bab586fc7e11e880bb0242ac120004/5e06c397-54a3-4c68-8bfd-379b2ff72dd5.py-wordcount-integration datanode_1 | 18/12/10 13:18:09 INFO datanode.DataNode: Receiving BP-595030248-172.18.0.2-1544447836759:blk_1073741826_1002 src: /172.18.0.3:35116 dest: /172.18.0.3:50010 datanode_1 | 18/12/10 13:18:09 INFO DataNode.clienttrace: src: /172.18.0.3:35116, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1393186134_69, offset: 0, srvID: ad2f573f-ab75-4b5d-bfc9-20bbb58a664f, blockid: BP-595030248-172.18.0.2-1544447836759:blk_1073741826_1002, duration: 5879629 datanode_1 | 18/12/10 13:18:09 INFO datanode.DataNode: PacketResponder: BP-595030248-172.18.0.2-1544447836759:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating namenode_1 | 18/12/10 13:18:09 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-08bab586fc7e11e880bb0242ac120004/5e06c397-54a3-4c68-8bfd-379b2ff72dd5.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_1393186134_69 test_1 | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write)) test_1 | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write) test_1 | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27) test_1 | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1 test_1 | INFO:root:Renamed 1 shards in 0.14 seconds. test_1 | INFO:root:number of empty lines: 1663 test_1 | INFO:root:average word length: 4 hdfs_it-jenkins-beam_postcommit_python_verify-6787_test_1 exited with code 0 Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ... Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ... Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ... done Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ... done Aborting on container exit... real 1m17.745s user 0m1.005s sys 0m0.191s + finally + docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6787 --no-ansi down Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_test_1 ... Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ... Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ... Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_test_1 ... done Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ... 
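The fused stages logged above come from the Beam Python wordcount example running on the DirectRunner, after the optimization phases (lift_combiners, greedily_fuse, and so on) listed earlier. For orientation, a minimal sketch of a pipeline that yields those transform labels; the paths and options are illustrative, not the exact test invocation:

```python
# Sketch of a wordcount pipeline matching the transform labels in the log
# ('read', 'split', 'pair_with_one', 'group', 'count', 'format', 'write').
# Paths are placeholders; the real test reads /kinglear.txt from the
# dockerized HDFS and needs the HDFS pipeline options configured.
import re
import apache_beam as beam

with beam.Pipeline() as p:  # no runner option given, so DirectRunner is used
  (p
   | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
   | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
   | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
   | 'group' >> beam.GroupByKey()
   | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
   | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
   | 'write' >> beam.io.WriteToText('hdfs://py-wordcount-integration'))
```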
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ...
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ...
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ... done
Aborting on container exit...

real	1m17.745s
user	0m1.005s
sys	0m0.191s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6787 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_test_1 ...
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ...
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ...
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_test_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-6787_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-6787_test_net

real	0m0.649s
user	0m0.259s
sys	0m0.086s
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 2 mins 28.165 secs.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:postCommitIT
Caching disabled for task ':beam-sdks-python:postCommitIT': Caching has not been enabled for the task
Task ':beam-sdks-python:postCommitIT' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=IT"
Successfully started process 'command 'sh''

###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline
if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.
echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS" python setup.py nosetests \ --test-pipeline-options="$PIPELINE_OPTS" \ $TEST_OPTS >>> RUNNING integration tests with pipeline options: >>> --runner=TestDataflowRunner --project=apache-beam-testing >>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it >>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it >>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output >>> --sdk_location=build/apache-beam.tar.gz >>> --requirements_file=postcommit_requirements.txt --num_workers=1 >>> --sleep_secs=20 <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0' normalized_version, running nosetests running egg_info writing requirements to apache_beam.egg-info/requires.txt writing apache_beam.egg-info/PKG-INFO writing top-level names to apache_beam.egg-info/top_level.txt writing dependency_links to apache_beam.egg-info/dependency_links.txt writing entry points to apache_beam.egg-info/entry_points.txt reading manifest file 'apache_beam.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no files found matching 'README.md' warning: no files found matching 'NOTICE' warning: no files found matching 'LICENSE' writing manifest file 'apache_beam.egg-info/SOURCES.txt' test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... 
ok ---------------------------------------------------------------------- XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml> ---------------------------------------------------------------------- Ran 18 tests in 2753.798s OK Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-16782271963442100812?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_26_06-5215672516099076790?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_33_29-8186462617952180136?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_39_15-4753622910723972224?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_45_07-2327722867894913170?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_51_43-10077632862119004298?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_58_15-13670819857058327589?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_48-1874236607803171656?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-491409402266755890?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_47-8603110114272425538?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_31_27-6974289105382371380?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_37_40-16107392902980871693?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_43_33-12224031943308507181?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-8989471799269995084?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_43-9164888784766002425?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_24_52-14940826842618409191?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_32_15-1197881686568611222?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-15971653544795751510?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_26_52-11007745241646050502?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_34_14-7056313100898702335?project=apache-beam-testing. 
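Each of the tests listed above picks up the --test-pipeline-options flag through Beam's TestPipeline helper; a hedged sketch of the pattern, with the test class and method names hypothetical:

```python
# Hypothetical IT test skeleton showing how --test-pipeline-options is
# consumed. TestPipeline(is_integration_test=True) parses the flag passed
# to nosetests and skips the test when it is absent; @attr('IT') is what
# the --attr=IT nose filter above selects on.
import unittest
from nose.plugins.attrib import attr
from apache_beam.testing.test_pipeline import TestPipeline

class MyExampleIT(unittest.TestCase):
  @attr('IT')
  def test_example_it(self):
    test_pipeline = TestPipeline(is_integration_test=True)
    # ... build and run the pipeline using test_pipeline's options
```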
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-16782271963442100812?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_26_06-5215672516099076790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_33_29-8186462617952180136?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_39_15-4753622910723972224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_45_07-2327722867894913170?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_51_43-10077632862119004298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_58_15-13670819857058327589?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_48-1874236607803171656?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-491409402266755890?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_47-8603110114272425538?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_31_27-6974289105382371380?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_37_40-16107392902980871693?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_43_33-12224031943308507181?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-8989471799269995084?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_43-9164888784766002425?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_24_52-14940826842618409191?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_32_15-1197881686568611222?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_44-15971653544795751510?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_26_52-11007745241646050502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_34_14-7056313100898702335?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_40_05-14487547893361462227?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_18_45-11602252167180115616?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_05_26_46-11781859920948038434?project=apache-beam-testing.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 45 mins 54.621 secs.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 159

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 49m 22s

6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/3jpomnohtxsio

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
