Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1114

2018-07-28 Thread Apache Jenkins Server
See 


--
[...truncated 18.49 MB...]
Jul 29, 2018 12:40:15 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Write 
mutations to Spanner as step s40
Jul 29, 2018 12:40:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-0729004011-88df246b/output/results/staging/
Jul 29, 2018 12:40:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <115993 bytes, hash THpzdAR6HHNaq6cwtMDiuw> to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-0729004011-88df246b/output/results/staging/pipeline-THpzdAR6HHNaq6cwtMDiuw.pb

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures 
STANDARD_OUT
Dataflow SDK version: 2.7.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures 
STANDARD_ERROR
Jul 29, 2018 12:40:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-28_17_40_16-1285898957841904133?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures 
STANDARD_OUT
Submitted job: 2018-07-28_17_40_16-1285898957841904133

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures 
STANDARD_ERROR
Jul 29, 2018 12:40:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2018-07-28_17_40_16-1285898957841904133
Jul 29, 2018 12:40:16 AM 
org.apache.beam.runners.dataflow.TestDataflowRunner run
INFO: Running Dataflow job 2018-07-28_17_40_16-1285898957841904133 with 0 
expected assertions.
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:16.059Z: Autoscaling is enabled for job 
2018-07-28_17_40_16-1285898957841904133. The number of workers will be between 
1 and 1000.
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:16.093Z: Autoscaling was automatically enabled for 
job 2018-07-28_17_40_16-1285898957841904133.
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:18.833Z: Checking required Cloud APIs are enabled.
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:19.169Z: Checking permissions granted to controller 
Service Account.
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:24.687Z: Worker configuration: n1-standard-1 in 
us-central1-b.
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:25.152Z: Expanding CoGroupByKey operations into 
optimizable parts.
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:25.424Z: Expanding GroupByKey operations into 
optimizable parts.
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:25.470Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:25.725Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:25.764Z: Elided trivial flatten 
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:25.810Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Wait.OnSignal/Wait/Map into SpannerIO.Write/Write 
mutations to Cloud Spanner/Create seed/Read(CreateSource)
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:25.845Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Read information schema into SpannerIO.Write/Write 
mutations to Cloud Spanner/Wait.OnSignal/Wait/Map
Jul 29, 2018 12:40:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-29T00:40:25.888Z: Fusing consumer 

Build failed in Jenkins: beam_PostCommit_Python_Verify #5605

2018-07-28 Thread Apache Jenkins Server
See 


--
[...truncated 1.08 MB...]
test_repr (apache_beam.typehints.typehints_test.GeneratorHintTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.IterableHintTestCase) 
... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_tuple_compatibility 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_must_be_iterable 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_invalid_composite_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_invalid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_valid_composite_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_valid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_enforce_kv_type_constraint 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_invalid_elem_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_must_be_set 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_composite_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_simple_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_any_argument_type_hint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_basic_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_functions_as_regular_generator 
(apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... 
ok
test_compatibility_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... 

Build failed in Jenkins: beam_PreCommit_Java_Cron #159

2018-07-28 Thread Apache Jenkins Server
See 


--
[...truncated 12.13 MB...]

org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testUnnestLiteral 
STANDARD_ERROR
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `EXPR$0`.`EXPR$0`
FROM UNNEST(ARRAY['a', 'b', 'c']) AS `EXPR$0`
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(EXPR$0=[$0])
  Uncollect
LogicalProject(EXPR$0=[ARRAY('a', 'b', 'c')])
  LogicalValues(tuples=[[{ 0 }]])

Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0=[{inputs}], EXPR$0=[$t0])
  BeamUncollectRel
BeamCalcRel(expr#0=[{inputs}], expr#1=['a'], expr#2=['b'], 
expr#3=['c'], expr#4=[ARRAY($t1, $t2, $t3)], EXPR$0=[$t4])
  BeamValuesRel(tuples=[[{ 0 }]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testUnnestNamedLiteral 
STANDARD_ERROR
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `t`.`f_string`
FROM UNNEST(ARRAY['a', 'b', 'c']) AS `t` (`f_string`)
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(f_string=[$0])
  Uncollect
LogicalProject(EXPR$0=[ARRAY('a', 'b', 'c')])
  LogicalValues(tuples=[[{ 0 }]])

Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0=[{inputs}], f_string=[$t0])
  BeamUncollectRel
BeamCalcRel(expr#0=[{inputs}], expr#1=['a'], expr#2=['b'], 
expr#3=['c'], expr#4=[ARRAY($t1, $t2, $t3)], EXPR$0=[$t4])
  BeamValuesRel(tuples=[[{ 0 }]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > 
testSelectSingleRowFromArrayOfRows STANDARD_ERROR
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `PCOLLECTION`.`f_arrayOfRows`[1]
FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(EXPR$0$0=[ITEM($1, 1).f_rowString], EXPR$0$1=[ITEM($1, 
1).f_rowInt])
  BeamIOSourceRel(table=[[beam, PCOLLECTION]])

Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[ITEM($t1, $t2)], 
expr#4=[$t3.f_rowString], expr#5=[$t3.f_rowInt], EXPR$0$0=[$t4], EXPR$0$1=[$t5])
  BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testProjectArrayField 
STANDARD_ERROR
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `PCOLLECTION`.`f_int`, `PCOLLECTION`.`f_stringArr`
FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(f_int=[$0], f_stringArr=[$1])
  BeamIOSourceRel(table=[[beam, PCOLLECTION]])

Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..1=[{inputs}], proj#0..1=[{exprs}])
  BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > 
testNestedRowArrayFieldAccess STANDARD_ERROR
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedArray`
FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(f_nestedArray=[$4])
  LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], 
f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne], 
f_nestedArray=[$1.f_nestedArray])
BeamIOSourceRel(table=[[beam, PCOLLECTION]])

Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedArray], 
f_nestedArray=[$t2])
  BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > 
testRowConstructorBraces STANDARD_ERROR
Jul 29, 2018 12:09:43 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT 1 AS `f_int`, ROW(3, 'BB', `PCOLLECTION`.`f_int` + 1) AS 

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1113

2018-07-28 Thread Apache Jenkins Server
See 


--
[...truncated 20.51 MB...]
INFO: 2018-07-28T18:46:38.961Z: Autoscaling is enabled for job 
2018-07-28_11_46_38-14507219244388826776. The number of workers will be between 
1 and 1000.
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:39.000Z: Autoscaling was automatically enabled for 
job 2018-07-28_11_46_38-14507219244388826776.
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:41.905Z: Checking required Cloud APIs are enabled.
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:42.096Z: Checking permissions granted to controller 
Service Account.
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:46.186Z: Worker configuration: n1-standard-1 in 
us-central1-b.
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:46.672Z: Expanding CoGroupByKey operations into 
optimizable parts.
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:46.939Z: Expanding GroupByKey operations into 
optimizable parts.
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:46.991Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:47.286Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:47.326Z: Elided trivial flatten 
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:47.375Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Wait.OnSignal/Wait/Map into SpannerIO.Write/Write 
mutations to Cloud Spanner/Create seed/Read(CreateSource)
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:47.414Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Read information schema into SpannerIO.Write/Write 
mutations to Cloud Spanner/Wait.OnSignal/Wait/Map
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:47.454Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
 into SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:47.502Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) 
into SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:47.540Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
 into SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:47.588Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
 into SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
Jul 28, 2018 6:46:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T18:46:47.628Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
 into 

Build failed in Jenkins: beam_PostCommit_Python_Verify #5604

2018-07-28 Thread Apache Jenkins Server
See 


--
[...truncated 1.08 MB...]
test_repr (apache_beam.typehints.typehints_test.GeneratorHintTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.IterableHintTestCase) 
... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_tuple_compatibility 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_must_be_iterable 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_invalid_composite_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_invalid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_valid_composite_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_valid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_enforce_kv_type_constraint 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_invalid_elem_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_must_be_set 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_composite_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_simple_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_any_argument_type_hint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_basic_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_functions_as_regular_generator 
(apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... 
ok
test_compatibility_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... 

[jira] [Work logged] (BEAM-4225) Integrate Nexmark with perfkit dashboards

2018-07-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-4225?focusedWorklogId=128475=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-128475
 ]

ASF GitHub Bot logged work on BEAM-4225:


Author: ASF GitHub Bot
Created on: 28/Jul/18 18:29
Start Date: 28/Jul/18 18:29
Worklog Time Spent: 10m 
  Work Description: Ardagan commented on issue #4976: [BEAM-4225] Add 
Nexmark PostCommit runs for spark, flink and direct runner and export to 
Bigquery
URL: https://github.com/apache/beam/pull/4976#issuecomment-408626655
 
 
   I see. Thank you for explaining.
   
   I believe this PR is not the best place for a design discussion, but I will 
still leave some ideas here. If you find any of them useful, we can move the 
discussion to the dev list.
   
   I believe we should follow the common pattern of sandboxing our pre/post 
commit tests so that the same set of tests can run in parallel. It adds some 
implementation complexity, but brings big benefits, of which I find these the 
biggest:
   1. We can run tests in parallel.
   2. If test cases have dependencies, it becomes really hard to maintain state 
properly as the number of tests grows.
   
   I wonder if we can change the test to create a BQ table with a unique name 
when the test (test set) starts and remove it once the test (test set) 
completes. This would decouple the tests enough to allow independent parallel 
execution of test sets.
   
   And if these tests are meant to aggregate performance statistics, then we 
should look at renaming the job to PerformanceTests, or have a separate job 
that does this work. I know that Alan Myrvold is working on dashboarding tools 
and aggregates Jenkins job statistics into a BQ table. You might want to sync 
with him; the tools he adds can probably be reused/generalized for this case.
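
   A minimal sketch of the unique-table-per-run idea above, assuming the 
google-cloud-bigquery client; the project, dataset, and schema names are 
illustrative placeholders, not values taken from the PR:

   import uuid

   from google.cloud import bigquery


   def create_run_table(client, dataset="nexmark_tmp"):
       # One table per test run, e.g. nexmark_tmp.results_<random hex>.
       table_id = "{}.{}.results_{}".format(
           client.project, dataset, uuid.uuid4().hex)
       schema = [
           bigquery.SchemaField("query", "STRING"),
           bigquery.SchemaField("runtime_sec", "FLOAT"),
       ]
       return client.create_table(bigquery.Table(table_id, schema=schema))


   def drop_run_table(client, table):
       # Clean up so parallel runs never share state.
       client.delete_table(table, not_found_ok=True)


   if __name__ == "__main__":
       client = bigquery.Client(project="apache-beam-testing")
       table = create_run_table(client)
       try:
           pass  # run the test set here, writing its results to `table`
       finally:
           drop_run_table(client, table)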
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 128475)
Time Spent: 11h 40m  (was: 11.5h)

> Integrate Nexmark with perfkit dashboards
> -
>
> Key: BEAM-4225
> URL: https://issues.apache.org/jira/browse/BEAM-4225
> Project: Beam
>  Issue Type: Improvement
>  Components: examples-nexmark
>Reporter: Etienne Chauchot
>Assignee: Etienne Chauchot
>Priority: Major
> Fix For: Not applicable
>
>  Time Spent: 11h 40m
>  Remaining Estimate: 0h
>
> The aim is to run Nexmark as post-commits and export the results to the 
> performance dashboards:
> see the threads:
> [https://lists.apache.org/thread.html/9f8fe1c6df7d8bfe2697332e69722ca4edd2810adc6a914cdf32da29@%3Cdev.beam.apache.org%3E]
> https://lists.apache.org/thread.html/701196efd6e74b7715785d43019a4a73e8a093997f59662fdadf8f2a@%3Cdev.beam.apache.org%3E



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-4006) Futurize and fix python 2 compatibility for transforms subpackage

2018-07-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-4006?focusedWorklogId=128455=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-128455
 ]

ASF GitHub Bot logged work on BEAM-4006:


Author: ASF GitHub Bot
Created on: 28/Jul/18 16:09
Start Date: 28/Jul/18 16:09
Worklog Time Spent: 10m 
  Work Description: Fematich commented on a change in pull request #5729: 
[BEAM-4006] Futurize transforms subpackage
URL: https://github.com/apache/beam/pull/5729#discussion_r205946917
 
 

 ##
 File path: sdks/python/apache_beam/transforms/window.py
 ##
 @@ -246,10 +263,33 @@ def __init__(self, value, timestamp):
     self.value = value
     self.timestamp = Timestamp.of(timestamp)
 
-  def __cmp__(self, other):
-    if type(self) is not type(other):
-      return cmp(type(self), type(other))
-    return cmp((self.value, self.timestamp), (other.value, other.timestamp))
+  def __eq__(self, other):
 
 Review comment:
   Using `total_ordering` results in unexpected behavior; concretely, the test 
`test_reshuffle_windows_unchanged` fails.
   
   I have tried to locate the exact cause by implementing all of the comparison 
operators (with the `total_ordering` decorator in place) and then leaving them 
out one by one:
   
   1) Adding the `total_ordering` decorator itself doesn't introduce issues.
   2) Using `total_ordering` to fill in only `__lt__` works; other combinations 
always fail.
   3) I am currently testing the operators by manually applying the conversion 
rules defined by 
[total_ordering](https://github.com/python/cpython/blob/master/Lib/functools.py)
 to see if I can locate the exact problem.
   
   @tvalentyn: Given [the 
note](https://docs.python.org/3/library/functools.html#functools.total_ordering)
 on the performance impact of the total_ordering decorator, it might make sense 
to implement all of the operators instead of using the decorator? That already 
works; in the meantime I will continue the testing (step 3) to see if I can 
give more info.
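
   For reference, a hedged sketch of the "implement all operators" option 
discussed above: spelling out every rich comparison on TimestampedValue rather 
than relying on functools.total_ordering. This is illustrative only, not the 
code that ended up in the PR:

   class TimestampedValue(object):
     def __init__(self, value, timestamp):
       self.value = value
       self.timestamp = timestamp  # the real class wraps this in Timestamp.of()

     def _key(self):
       return (self.value, self.timestamp)

     def __eq__(self, other):
       return type(self) is type(other) and self._key() == other._key()

     def __ne__(self, other):
       # Needed explicitly on Python 2; Python 3 derives it from __eq__.
       return not self == other

     def __lt__(self, other):
       if type(self) is not type(other):
         return type(self).__name__ < type(other).__name__
       return self._key() < other._key()

     def __le__(self, other):
       return self < other or self == other

     def __gt__(self, other):
       return not self <= other

     def __ge__(self, other):
       return not self < other

     def __hash__(self):
       return hash(self._key())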


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 128455)
Time Spent: 13h  (was: 12h 50m)

> Futurize and fix python 2 compatibility for transforms subpackage
> -
>
> Key: BEAM-4006
> URL: https://issues.apache.org/jira/browse/BEAM-4006
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Robbe
>Assignee: Matthias Feys
>Priority: Major
>  Time Spent: 13h
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #1112

2018-07-28 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_Verify #5603

2018-07-28 Thread Apache Jenkins Server
See 


--
[...truncated 1.08 MB...]
test_repr (apache_beam.typehints.typehints_test.GeneratorHintTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.IterableHintTestCase) 
... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_tuple_compatibility 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_must_be_iterable 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_invalid_composite_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_invalid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_valid_composite_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_valid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_enforce_kv_type_constraint 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_invalid_elem_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_must_be_set 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_composite_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_simple_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_any_argument_type_hint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_basic_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_functions_as_regular_generator 
(apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... 
ok
test_compatibility_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... 

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #306

2018-07-28 Thread Apache Jenkins Server
See 


Changes:

[qinyeli] Introducing version into CacheManager

[lukasz.gajowy] [BEAM-5037] Fix deserialization problem due to transient field

[lukasz.gajowy] Remove redundant static field

[aaltay] [BEAM-4747] mkdirs if they don't exist in localfilesystem (#5903)

--
[...truncated 1.41 MB...]
INFO: Will copy temporary file 
FileResult{tempFilename=/tmp/groovy-generated-802024266811206-tmpdir/word-count-beam/.temp-beam-2018-07-28_11-23-25-1/3ce33b8b-c8b5-4f79-b986-4f834d56beee,
 shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@ca9ffe2, 
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, 
onTimeIndex=0}} to final location 
/tmp/groovy-generated-802024266811206-tmpdir/word-count-beam/counts-3-of-5
Jul 28, 2018 11:23:29 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-802024266811206-tmpdir/word-count-beam/.temp-beam-2018-07-28_11-23-25-1/39ce244c-1201-4008-81a8-7f1664d48630
Jul 28, 2018 11:23:29 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-802024266811206-tmpdir/word-count-beam/.temp-beam-2018-07-28_11-23-25-1/3ce33b8b-c8b5-4f79-b986-4f834d56beee
Jul 28, 2018 11:23:29 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-802024266811206-tmpdir/word-count-beam/.temp-beam-2018-07-28_11-23-25-1/4321fe46-7494-4335-ab92-944e2ba681d2
Jul 28, 2018 11:23:29 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-802024266811206-tmpdir/word-count-beam/.temp-beam-2018-07-28_11-23-25-1/e27bd382-144e-47a9-8152-43631f45927d
Jul 28, 2018 11:23:29 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-802024266811206-tmpdir/word-count-beam/.temp-beam-2018-07-28_11-23-25-1/838b8397-e703-4bc8-916e-5896a8c0ee8e
grep Foundation counts*
counts-4-of-5:Foundation: 1
Verified Foundation: 1
[SUCCESS]
:beam-runners-direct-java:runQuickstartJavaDirect (Thread[Daemon 
worker,5,main]) completed. Took 1 mins 33.832 secs.

> Task :beam-runners-google-cloud-dataflow-java:runQuickstartJavaDataflow
CommandException: No URLs matched: 
gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/count*
No files
mvn compile exec:java -q   
-Dexec.mainClass=org.apache.beam.examples.WordCount   
-Dexec.args="--runner=DataflowRunner
--project=apache-beam-testing
--gcpTempLocation=gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/tmp

--output=gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/counts
--inputFile=gs://apache-beam-samples/shakespeare/*" 
-Pdataflow-runner
Using maven /home/jenkins/tools/maven/apache-maven-3.5.2
Jul 28, 2018 11:24:49 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jul 28, 2018 11:24:49 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 114 files. Enable logging at DEBUG level to see which 
files will be staged.
Jul 28, 2018 11:24:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Jul 28, 2018 11:24:51 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 114 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Jul 28, 2018 11:24:52 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/tmp/groovy-generated-4569284166536255079-tmpdir/word-count-beam/target/classes 
to 
gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/tmp/staging/classes-9m45Wl3vzvv4Vts-0ReIbw.jar
Jul 28, 2018 11:24:53 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 113 files cached, 1 files newly uploaded
Jul 28, 2018 11:24:53 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ReadLines/Read as step s1
Jul 28, 2018 11:24:53 AM org.apache.beam.sdk.io.FileBasedSource 
getEstimatedSizeBytes
INFO: Filepattern gs://apache-beam-samples/shakespeare/* matched 43 files with 
total size 5284696
Jul 28, 2018 

[jira] [Work logged] (BEAM-4225) Integrate Nexmark with perfkit dashboards

2018-07-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-4225?focusedWorklogId=128407=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-128407
 ]

ASF GitHub Bot logged work on BEAM-4225:


Author: ASF GitHub Bot
Created on: 28/Jul/18 08:52
Start Date: 28/Jul/18 08:52
Worklog Time Spent: 10m 
  Work Description: lgajowy edited a comment on issue #4976: [BEAM-4225] 
Add Nexmark PostCommit runs for spark, flink and direct runner and export to 
Bigquery
URL: https://github.com/apache/beam/pull/4976#issuecomment-408592102
 
 
   @Ardagan the idea was not to mix data gathered from running the job on 
master (periodic runs) with data gathered from other branches (runs from PRs), 
which would cause "false alarms" on plots, e.g. 
[here](https://apache-beam-testing.appspot.com/explore?dashboard=5099379773931520).
 IMO we could enable phrase triggering here as long as the data from other 
branches lands in some other BigQuery table. 
   
   AFAIK we can accomplish this (quite easily, I guess) in two ways: 
- add a special "switch" in the Jenkins job that changes the destination 
BigQuery table based on the branch (pseudo code: if branch == master then 
table == bqTableA else table == bqTableB); see the sketch after this comment
- add another Jenkins job that is parametrized differently and is phrase 
triggered only
   
   I agree (and recent events confirm it) that we should also have a way to 
check that the Nexmark tests themselves aren't broken. The second solution is 
probably better, because if the job fails it is the phrase-triggered job, and 
we could configure it not to send emails to commits@ but to publish the status 
only on GitHub PR checks (and possibly change some other useful parameters - I 
didn't dig into this very deeply).
   
   All this is something that should be done with the "PerformanceTest" tasks, 
but unfortunately we didn't do it yet.
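
   A rough sketch of the first option (the "switch"), under the assumption that 
the job can read the branch it is building from Jenkins' GIT_BRANCH environment 
variable; the table names are placeholders, not the real dashboard tables:

   import os


   def destination_table(branch=None):
       # Jenkins' Git plugin typically exports GIT_BRANCH, e.g. "origin/master".
       branch = branch or os.environ.get("GIT_BRANCH", "")
       if branch.endswith("master"):
           return "nexmark_master"  # table backing the perfkit dashboards
       return "nexmark_prs"         # hypothetical side table for PR-triggered runs


   print(destination_table("origin/master"))   # -> nexmark_master
   print(destination_table("origin/pr-4976"))  # -> nexmark_prs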


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 128407)
Time Spent: 11.5h  (was: 11h 20m)

> Integrate Nexmark with perfkit dashboards
> -
>
> Key: BEAM-4225
> URL: https://issues.apache.org/jira/browse/BEAM-4225
> Project: Beam
>  Issue Type: Improvement
>  Components: examples-nexmark
>Reporter: Etienne Chauchot
>Assignee: Etienne Chauchot
>Priority: Major
> Fix For: Not applicable
>
>  Time Spent: 11.5h
>  Remaining Estimate: 0h
>
> The aim is to run Nexmark as post-commits and export the results to the 
> performance dashboards:
> see the threads:
> [https://lists.apache.org/thread.html/9f8fe1c6df7d8bfe2697332e69722ca4edd2810adc6a914cdf32da29@%3Cdev.beam.apache.org%3E]
> https://lists.apache.org/thread.html/701196efd6e74b7715785d43019a4a73e8a093997f59662fdadf8f2a@%3Cdev.beam.apache.org%3E



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-4225) Integrate Nexmark with perfkit dashboards

2018-07-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-4225?focusedWorklogId=128406=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-128406
 ]

ASF GitHub Bot logged work on BEAM-4225:


Author: ASF GitHub Bot
Created on: 28/Jul/18 08:45
Start Date: 28/Jul/18 08:45
Worklog Time Spent: 10m 
  Work Description: lgajowy edited a comment on issue #4976: [BEAM-4225] 
Add Nexmark PostCommit runs for spark, flink and direct runner and export to 
Bigquery
URL: https://github.com/apache/beam/pull/4976#issuecomment-408592102
 
 
   @Ardagan the idea was not to mix data gathered from running the job on 
master (periodic runs) with data gathered from other branches (runs from PRs), 
which would cause "false alarms" on plots, e.g. 
[here](https://apache-beam-testing.appspot.com/explore?dashboard=5099379773931520).
 IMO we could enable phrase triggering here as long as the data from other 
branches lands in some other BigQuery table. 
   
   AFAIK we can accomplish this (quite easily, I guess) in two ways: 
- add a special "switch" in the Jenkins job that changes the destination 
BigQuery table based on the branch (pseudo code: if branch == master then 
table == bqTableA else table == bqTableB)
- add another Jenkins job that is parametrized differently and is phrase 
triggered only
   
   I agree (and recent events confirm it) that we should also have a way to 
check that the Nexmark tests themselves aren't broken. The second solution is 
probably better, because if the job fails it is the phrase-triggered job, and 
we could configure it not to send emails to commits@ but to publish the status 
only on GitHub PR checks (and possibly change some other useful parameters - I 
didn't dig into this very deeply).


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 128406)
Time Spent: 11h 20m  (was: 11h 10m)

> Integrate Nexmark with perfkit dashboards
> -
>
> Key: BEAM-4225
> URL: https://issues.apache.org/jira/browse/BEAM-4225
> Project: Beam
>  Issue Type: Improvement
>  Components: examples-nexmark
>Reporter: Etienne Chauchot
>Assignee: Etienne Chauchot
>Priority: Major
> Fix For: Not applicable
>
>  Time Spent: 11h 20m
>  Remaining Estimate: 0h
>
> The aim is to run Nexmark as post-commits and export the results to the 
> performance dashboards:
> see the threads:
> [https://lists.apache.org/thread.html/9f8fe1c6df7d8bfe2697332e69722ca4edd2810adc6a914cdf32da29@%3Cdev.beam.apache.org%3E]
> https://lists.apache.org/thread.html/701196efd6e74b7715785d43019a4a73e8a093997f59662fdadf8f2a@%3Cdev.beam.apache.org%3E



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-4225) Integrate Nexmark with perfkit dashboards

2018-07-28 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-4225?focusedWorklogId=128405=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-128405
 ]

ASF GitHub Bot logged work on BEAM-4225:


Author: ASF GitHub Bot
Created on: 28/Jul/18 08:27
Start Date: 28/Jul/18 08:27
Worklog Time Spent: 10m 
  Work Description: lgajowy commented on issue #4976: [BEAM-4225] Add 
Nexmark PostCommit runs for spark, flink and direct runner and export to 
Bigquery
URL: https://github.com/apache/beam/pull/4976#issuecomment-408592102
 
 
   @Ardagan the idea was not to mix data gathered from running the job on 
master (periodic runs) with data gathered from other branches (runs from PRs), 
which would cause "false alarms" on plots, e.g. 
[here](https://apache-beam-testing.appspot.com/explore?dashboard=5099379773931520).
 IMO we could enable phrase triggering here as long as the data from other 
branches lands in some other BigQuery table. 
   
   AFAIK we can accomplish this (quite easily, I guess) in two ways: 
- add a special "switch" in the Jenkins job that changes the destination 
BigQuery table based on the branch (pseudo code: if branch == master then 
table == bqTableA else table == bqTableB)
- add another Jenkins job that is parametrized differently and is phrase 
triggered only
   
   I agree (and recent events confirm it) that we should also have a way to 
check that the Nexmark tests themselves aren't broken. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 128405)
Time Spent: 11h 10m  (was: 11h)

> Integrate Nexmark with perfkit dashboards
> -
>
> Key: BEAM-4225
> URL: https://issues.apache.org/jira/browse/BEAM-4225
> Project: Beam
>  Issue Type: Improvement
>  Components: examples-nexmark
>Reporter: Etienne Chauchot
>Assignee: Etienne Chauchot
>Priority: Major
> Fix For: Not applicable
>
>  Time Spent: 11h 10m
>  Remaining Estimate: 0h
>
> The aim is to run Nexmark as post-commits and export the results to the 
> performance dashboards:
> see the threads:
> [https://lists.apache.org/thread.html/9f8fe1c6df7d8bfe2697332e69722ca4edd2810adc6a914cdf32da29@%3Cdev.beam.apache.org%3E]
> https://lists.apache.org/thread.html/701196efd6e74b7715785d43019a4a73e8a093997f59662fdadf8f2a@%3Cdev.beam.apache.org%3E



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1111

2018-07-28 Thread Apache Jenkins Server
See 


--
[...truncated 19.25 MB...]
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures 
STANDARD_ERROR
Jul 28, 2018 6:44:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-27_23_44_39-7644345744185480523?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures 
STANDARD_OUT
Submitted job: 2018-07-27_23_44_39-7644345744185480523

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures 
STANDARD_ERROR
Jul 28, 2018 6:44:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2018-07-27_23_44_39-7644345744185480523
Jul 28, 2018 6:44:39 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
run
INFO: Running Dataflow job 2018-07-27_23_44_39-7644345744185480523 with 0 
expected assertions.
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:39.086Z: Autoscaling is enabled for job 
2018-07-27_23_44_39-7644345744185480523. The number of workers will be between 
1 and 1000.
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:39.123Z: Autoscaling was automatically enabled for 
job 2018-07-27_23_44_39-7644345744185480523.
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:41.772Z: Checking required Cloud APIs are enabled.
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:42.087Z: Checking permissions granted to controller 
Service Account.
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:46.894Z: Worker configuration: n1-standard-1 in 
us-central1-b.
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:47.323Z: Expanding CoGroupByKey operations into 
optimizable parts.
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:47.579Z: Expanding GroupByKey operations into 
optimizable parts.
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:47.627Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:47.916Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:47.952Z: Elided trivial flatten 
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:48.088Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Wait.OnSignal/Wait/Map into SpannerIO.Write/Write 
mutations to Cloud Spanner/Create seed/Read(CreateSource)
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:48.127Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Read information schema into SpannerIO.Write/Write 
mutations to Cloud Spanner/Wait.OnSignal/Wait/Map
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:48.170Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
 into SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
Jul 28, 2018 6:44:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-07-28T06:44:48.213Z: Fusing consumer SpannerIO.Write/Write 
mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) 
into SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
Jul 28, 2018 

Build failed in Jenkins: beam_PostCommit_Python_Verify #5602

2018-07-28 Thread Apache Jenkins Server
See 


--
[...truncated 1.08 MB...]
test_repr (apache_beam.typehints.typehints_test.GeneratorHintTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.IterableHintTestCase) 
... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_tuple_compatibility 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_must_be_iterable 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_invalid_composite_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_invalid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_valid_composite_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_valid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_enforce_kv_type_constraint 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_invalid_elem_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_must_be_set 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_composite_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_simple_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_any_argument_type_hint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_basic_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_functions_as_regular_generator 
(apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... 
ok
test_compatibility_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... 

Jenkins build is back to normal : beam_PreCommit_Java_Cron #156

2018-07-28 Thread Apache Jenkins Server
See