Jenkins build is back to normal : beam_PerformanceTests_MongoDBIO_IT #12

2018-04-05 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_XmlIOIT #106

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[rmannibucau] [BEAM-3409] waitUntilFinish should wait teardown even for the direct

[ehudm] Update Python Gradle tasks to run in a venv.

[ehudm] Add Gradle based Python precommit.

[aaltay] [BEAM-3250] Migrate Apex and Gearpump ValidatesRunner tests to Gradle

[robertwb] Secure GRPC channel for SDK worker (#4984)

--
[...truncated 26.46 KB...]
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.protobuf.nano:protobuf-javanano:jar:3.0.0-alpha-5 
from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core:jar:1.0.2 from the shaded 
jar.
[INFO] Excluding org.json:json:jar:20160810 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-spanner:jar:0.20.0b-beta from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding 

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #17

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[rmannibucau] [BEAM-3409] waitUntilFinish should wait teardown even for the direct

[ehudm] Update Python Gradle tasks to run in a venv.

[ehudm] Add Gradle based Python precommit.

[aaltay] [BEAM-3250] Migrate Apex and Gearpump ValidatesRunner tests to Gradle

[robertwb] Secure GRPC channel for SDK worker (#4984)

--
[...truncated 2.18 KB...]
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2837958353079201861.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2866252505365865488.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1522904463735
namespace "filebasedioithdfs-1522904463735" created
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins5737188218089572141.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1522904463735
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
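The kubeconfig copy, namespace creation, and context switch performed by the generated step scripts above can be collected into one function. This is a sketch only: the namespace prefix and kubeconfig handling are taken from this job's log, and `date +%s%3N` (GNU date) stands in for the millisecond suffix Jenkins appends.

```shell
#!/bin/bash
set -euo pipefail

# Sketch of the per-build Kubernetes setup this job performs.
setup_test_namespace() {
  local kubeconfig=$1                                 # e.g. a copy of /home/jenkins/.kube/config
  local namespace="filebasedioithdfs-$(date +%s%3N)"  # unique, millisecond-stamped name
  kubectl --kubeconfig="$kubeconfig" create namespace "$namespace"
  kubectl --kubeconfig="$kubeconfig" config set-context \
    "$(kubectl config current-context)" --namespace="$namespace"
  echo "$namespace"
}
```

Scoping each build to a throwaway namespace keeps concurrent performance jobs on the shared io-datastores cluster from colliding.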
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3879090292638939439.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7553835627086643546.sh
+ rm -rf .env
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7408367566939172021.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins766282711874605416.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/20/d7/04a0b689d3035143e2ff288f4b9ee4bf6ed80585cc121c90bfd85a1a8c2e/setuptools-39.0.1-py2.py3-none-any.whl#md5=ca299c7acd13a72e1171a3697f2b99bc
Downloading/unpacking pip from 
https://pypi.python.org/packages/ac/95/a05b56bb975efa78d3557efa36acaf9cf5d2fd0ee0062060493687432e03/pip-9.0.3-py2.py3-none-any.whl#md5=d512ceb964f38ba31addb8142bc657cb
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
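Each of these performance jobs repeats the same workspace bootstrap: remove the old PerfKitBenchmarker checkout and virtualenv, recreate the virtualenv, pin up-to-date setuptools/pip, then clone and install PerfKitBenchmarker. A sketch with the generated `/tmp/jenkins*.sh` step scripts inlined (running it requires network access):

```shell
#!/bin/bash
set -xe

# Sketch of the repeated workspace bootstrap from these job logs.
bootstrap_pkb() {
  rm -rf PerfKitBenchmarker .env
  virtualenv .env --system-site-packages
  .env/bin/pip install --upgrade setuptools pip
  git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
  .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
}
```

The pip upgrade matters here: the workers start from pip 1.5.4 / setuptools 2.2, which predate wheel-cache behavior the requirements install relies on.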
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6863240451551082709.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins1051277673276700256.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: colorama; extra == 

Build failed in Jenkins: beam_PerformanceTests_Spark #1552

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[rmannibucau] [BEAM-3409] waitUntilFinish should wait teardown even for the direct

[ehudm] Update Python Gradle tasks to run in a venv.

[ehudm] Add Gradle based Python precommit.

[aaltay] [BEAM-3250] Migrate Apex and Gearpump ValidatesRunner tests to Gradle

[robertwb] Secure GRPC channel for SDK worker (#4984)

--
[...truncated 95.26 KB...]
'apache-beam-testing:bqjob_r68c50cdb89187fb4_01629475d320_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-04-05 06:19:34,283 57196319 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.
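The repeated `bq load` failures above come from `--autodetect` inferring the `timestamp` field as FLOAT (the results file carries epoch seconds as a bare number) while the existing `beam_performance.pkb_results` column is TIMESTAMP, so every retry hits the same "Invalid schema update". A minimal sketch of one workaround, assuming a result dict with an epoch-seconds `timestamp` (field names are illustrative): serialize the timestamp as a formatted string, which autodetect reads as TIMESTAMP.

```python
import datetime
import json

def to_bq_row(result):
    """Serialize one result as a newline-delimited-JSON row for `bq load`.

    Converts an epoch-seconds float into a "YYYY-MM-DD HH:MM:SS" string so
    BigQuery autodetect infers TIMESTAMP rather than FLOAT for the column.
    """
    row = dict(result)
    row["timestamp"] = datetime.datetime.utcfromtimestamp(
        row["timestamp"]).strftime("%Y-%m-%d %H:%M:%S")
    return json.dumps(row, sort_keys=True)

line = to_bq_row({"metric": "run_time", "value": 12.3, "timestamp": 1522908000.0})
print(line)
# prints {"metric": "run_time", "timestamp": "2018-04-05 06:00:00", "value": 12.3}
```

Alternatively, dropping `--autodetect` in favor of an explicit `--schema` would pin the column type regardless of how the JSON encodes it.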

2018-04-05 06:19:59,159 57196319 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-05 06:20:01,271 57196319 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: Upload complete.
Waiting on bqjob_r2f9a87590e4b9b5e_016294763c37_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r2f9a87590e4b9b5e_016294763c37_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r2f9a87590e4b9b5e_016294763c37_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-04-05 06:20:01,272 57196319 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-05 06:20:28,236 57196319 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-05 06:20:30,238 57196319 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: Upload complete.
Waiting on bqjob_r61f26089bce9f163_01629476add2_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r61f26089bce9f163_01629476add2_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r61f26089bce9f163_01629476add2_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-04-05 06:20:30,238 57196319 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-05 06:20:52,603 57196319 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-05 06:20:54,769 57196319 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: Upload complete.
Waiting on bqjob_rd2c66bb87b51091_016294770cfc_1 ... (0s) Current status: 
RUNNING 

Jenkins build is still unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #5284

2018-04-05 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #103

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[rmannibucau] [BEAM-3409] waitUntilFinish should wait teardown even for the direct

[ehudm] Update Python Gradle tasks to run in a venv.

[ehudm] Add Gradle based Python precommit.

[aaltay] [BEAM-3250] Migrate Apex and Gearpump ValidatesRunner tests to Gradle

[robertwb] Secure GRPC channel for SDK worker (#4984)

--
[...truncated 1.33 KB...]
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins3684671170551440007.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a 
--verbosity=debug
DEBUG: Running gcloud.container.clusters.get-credentials with 
Namespace(__calliope_internal_deepest_parser=ArgumentParser(prog='gcloud.container.clusters.get-credentials',
 usage=None, description='Updates a kubeconfig file with appropriate 
credentials to point\nkubectl at a Container Engine Cluster. By default, 
credentials\nare written to HOME/.kube/config. You can provide an 
alternate\npath by setting the KUBECONFIG environment variable.\n\nSee 
[](https://cloud.google.com/container-engine/docs/kubectl) for\nkubectl 
documentation.', version=None, formatter_class=, conflict_handler='error', add_help=False), 
account=None, api_version=None, authority_selector=None, 
authorization_token_file=None, 
calliope_command=, command_path=['gcloud', 'container', 'clusters', 
'get-credentials'], configuration=None, credential_file_override=None, 
document=None, flatten=None, format=None, h=None, help=None, http_timeout=None, 
log_http=None, name='io-datastores', project=None, quiet=None, 
trace_email=None, trace_log=None, trace_token=None, user_output_enabled=None, 
verbosity='debug', version=None, zone='us-central1-a').
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins2888967128889343168.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins3286287107919754640.sh
+ kubectl 
--kubeconfig=
 create namespace hadoopinputformatioit-1522904473673
namespace "hadoopinputformatioit-1522904473673" created
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins312848425247962558.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=hadoopinputformatioit-1522904473673
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins8061087835858783299.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins3498118597165469817.sh
+ rm -rf .env
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins5620653085581022050.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins801367948690061949.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/20/d7/04a0b689d3035143e2ff288f4b9ee4bf6ed80585cc121c90bfd85a1a8c2e/setuptools-39.0.1-py2.py3-none-any.whl#md5=ca299c7acd13a72e1171a3697f2b99bc
Downloading/unpacking pip from 
https://pypi.python.org/packages/ac/95/a05b56bb975efa78d3557efa36acaf9cf5d2fd0ee0062060493687432e03/pip-9.0.3-py2.py3-none-any.whl#md5=d512ceb964f38ba31addb8142bc657cb
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins7425282899250112399.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #10

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[rmannibucau] [BEAM-3409] waitUntilFinish should wait teardown even for the direct

[ehudm] Update Python Gradle tasks to run in a venv.

[ehudm] Add Gradle based Python precommit.

[aaltay] [BEAM-3250] Migrate Apex and Gearpump ValidatesRunner tests to Gradle

[robertwb] Secure GRPC channel for SDK worker (#4984)

--
[...truncated 902 B...]
Checking out Revision 50139b4395584513099094445f57c495b515 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 50139b4395584513099094445f57c495b515
Commit message: "Merge pull request #4790 from 
rmannibucau/fix/BEAM-3409_wait-for-teardown-execution-in-direct-runner"
 > git rev-list --no-walk 5caa883f191ca4cd9158694ae94a673edbc7b4d5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6423070961003326956.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a 
--verbosity=debug
DEBUG: Running gcloud.container.clusters.get-credentials with 
Namespace(__calliope_internal_deepest_parser=ArgumentParser(prog='gcloud.container.clusters.get-credentials',
 usage=None, description='Updates a kubeconfig file with appropriate 
credentials to point\nkubectl at a Container Engine Cluster. By default, 
credentials\nare written to HOME/.kube/config. You can provide an 
alternate\npath by setting the KUBECONFIG environment variable.\n\nSee 
[](https://cloud.google.com/container-engine/docs/kubectl) for\nkubectl 
documentation.', version=None, formatter_class=, conflict_handler='error', add_help=False), 
account=None, api_version=None, authority_selector=None, 
authorization_token_file=None, 
calliope_command=, command_path=['gcloud', 'container', 'clusters', 
'get-credentials'], configuration=None, credential_file_override=None, 
document=None, flatten=None, format=None, h=None, help=None, http_timeout=None, 
log_http=None, name='io-datastores', project=None, quiet=None, 
trace_email=None, trace_log=None, trace_token=None, user_output_enabled=None, 
verbosity='debug', version=None, zone='us-central1-a').
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7624480610345280474.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7481394402757567812.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1522904463774
namespace "filebasedioithdfs-1522904463774" created
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8482952389076382691.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1522904463774
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins5181750691448407041.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2168162618368449583.sh
+ rm -rf .env
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins4997376092010102745.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2888464945787120100.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/20/d7/04a0b689d3035143e2ff288f4b9ee4bf6ed80585cc121c90bfd85a1a8c2e/setuptools-39.0.1-py2.py3-none-any.whl#md5=ca299c7acd13a72e1171a3697f2b99bc
Downloading/unpacking pip from 
https://pypi.python.org/packages/ac/95/a05b56bb975efa78d3557efa36acaf9cf5d2fd0ee0062060493687432e03/pip-9.0.3-py2.py3-none-any.whl#md5=d512ceb964f38ba31addb8142bc657cb
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4

Build failed in Jenkins: beam_PerformanceTests_JDBC #414

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[rmannibucau] [BEAM-3409] waitUntilFinish should wait teardown even for the direct

[ehudm] Update Python Gradle tasks to run in a venv.

[ehudm] Add Gradle based Python precommit.

[aaltay] [BEAM-3250] Migrate Apex and Gearpump ValidatesRunner tests to Gradle

[robertwb] Secure GRPC channel for SDK worker (#4984)

--
[...truncated 47.18 KB...]
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.protobuf.nano:protobuf-javanano:jar:3.0.0-alpha-5 
from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core:jar:1.0.2 from the shaded 
jar.
[INFO] Excluding org.json:json:jar:20160810 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-spanner:jar:0.20.0b-beta from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from 
the shaded jar.
[INFO] 

[jira] [Work logged] (BEAM-3250) Migrate ValidatesRunner Jenkins PostCommits to Gradle

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3250?focusedWorklogId=87897=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-87897
 ]

ASF GitHub Bot logged work on BEAM-3250:


Author: ASF GitHub Bot
Created on: 05/Apr/18 06:04
Start Date: 05/Apr/18 06:04
Worklog Time Spent: 10m 
  Work Description: herohde commented on a change in pull request #5029: 
[BEAM-3250] Migrate Dataflow ValidatesRunner test to Gradle
URL: https://github.com/apache/beam/pull/5029#discussion_r179356996
 
 

 ##
 File path: runners/google-cloud-dataflow-java/build.gradle
 ##
 @@ -70,23 +77,60 @@ dependencies {
   shadow library.java.jackson_annotations
   shadow library.java.jackson_databind
   shadow library.java.slf4j_api
-  testCompile library.java.hamcrest_core
-  testCompile library.java.junit
-  testCompile 
project(":sdks:java:io:google-cloud-platform").sourceSets.test.output
-  testCompile project(path: ":sdks:java:core", configuration: "shadowTest")
-  testCompile 
project(":sdks:java:extensions:google-cloud-platform-core").sourceSets.test.output
-  testCompile library.java.guava_testlib
-  testCompile library.java.slf4j_jdk14
-  testCompile library.java.mockito_core
-  testCompile library.java.google_cloud_dataflow_java_proto_library_all
-  testCompile library.java.datastore_v1_protos
-  testCompile library.java.jackson_dataformat_yaml
+  shadowTest library.java.hamcrest_core
+  shadowTest library.java.junit
+  shadowTest 
project(":sdks:java:io:google-cloud-platform").sourceSets.test.output
+  shadowTest project(path: ":sdks:java:core", configuration: "shadowTest")
+  shadowTest 
project(":sdks:java:extensions:google-cloud-platform-core").sourceSets.test.output
+  shadowTest library.java.guava_testlib
+  shadowTest library.java.slf4j_jdk14
+  shadowTest library.java.mockito_core
+  shadowTest library.java.google_cloud_dataflow_java_proto_library_all
+  shadowTest library.java.datastore_v1_protos
+  shadowTest library.java.jackson_dataformat_yaml
+  validatesRunner project(path: ":sdks:java:core", configuration: "shadowTest")
+  validatesRunner project(path: project.path, configuration: "shadow")
 }
 
 test {
   systemProperties = [ "beamUseDummyRunner" : "true" ]
 }
 
+task validatesRunnerTest(type: Test) {
+  group = "Verification"
+  def dataflowProject = project.findProperty('dataflowProject') ?: 
'apache-beam-testing'
+  def dataflowTempRoot = project.findProperty('dataflowTempRoot') ?: 
'gs://temp-storage-for-validates-runner-tests/'
+  systemProperty "beamTestPipelineOptions", JsonOutput.toJson([
+  "--runner=TestDataflowRunner",
+  "--project=${dataflowProject}",
+  "--tempRoot=${dataflowTempRoot}",
+  ])
+
+
+  classpath = configurations.validatesRunner
+  testClassesDirs = 
files(project(":sdks:java:core").sourceSets.test.output.classesDirs)
+  useJUnit {
+includeCategories 'org.apache.beam.sdk.testing.ValidatesRunner'
+excludeCategories 'org.apache.beam.sdk.testing.LargeKeys$Above10MB'
 
 Review comment:
   I got it from here: 
https://github.com/apache/beam/blob/a2a5d3d7aa59b5cfde1c47a6286bcb3ccd7f8c85/runners/google-cloud-dataflow-java/pom.xml#L283
   
   Happy to remove them, if I misread the pom.
   
   On a separate note, I see I missed the parallelism -- it's executing 
sequentially (which was useful for testing it), but is not what we want in 
Jenkins.
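One way to address the sequential execution noted in the comment above is Gradle's per-task test forking. A sketch only, against the `validatesRunnerTest` task shown in the diff; the fork count is an illustrative value, not taken from the PR:

```groovy
// Hypothetical follow-up in runners/google-cloud-dataflow-java/build.gradle:
// fork several test JVMs so ValidatesRunner tests run in parallel on Jenkins.
task validatesRunnerTest(type: Test) {
  group = "Verification"
  maxParallelForks = 4   // illustrative; tune to the Jenkins executor size
  // ... systemProperty / classpath / useJUnit configuration as in the diff ...
}
```

This mirrors the forked-execution setting the Maven Surefire configuration provided before the Gradle migration.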


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 87897)
Time Spent: 2.5h  (was: 2h 20m)

> Migrate ValidatesRunner Jenkins PostCommits to Gradle
> -
>
> Key: BEAM-3250
> URL: https://issues.apache.org/jira/browse/BEAM-3250
> Project: Beam
>  Issue Type: Sub-task
>  Components: build-system, testing
>Reporter: Luke Cwik
>Assignee: Henning Rohde
>Priority: Major
>  Time Spent: 2.5h
>  Remaining Estimate: 0h
>
> Update these targets to execute ValidatesRunner tests: 
> https://github.com/apache/beam/search?l=Groovy=ValidatesRunner==%E2%9C%93



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #5283

2018-04-05 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Python #1108

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[rmannibucau] [BEAM-3409] waitUntilFinish should wait teardown even for the 
direct

[ehudm] Update Python Gradle tasks to run in a venv.

[ehudm] Add Gradle based Python precommit.

[aaltay] [BEAM-3250] Migrate Apex and Gearpump ValidatesRunner tests to Gradle

[robertwb] Secure GRPC channel for SDK worker (#4984)

--
[...truncated 1.68 KB...]
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2623527860617858768.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins9169363831443975735.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8961877075134670843.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/20/d7/04a0b689d3035143e2ff288f4b9ee4bf6ed80585cc121c90bfd85a1a8c2e/setuptools-39.0.1-py2.py3-none-any.whl#md5=ca299c7acd13a72e1171a3697f2b99bc
Downloading/unpacking pip from 
https://pypi.python.org/packages/ac/95/a05b56bb975efa78d3557efa36acaf9cf5d2fd0ee0062060493687432e03/pip-9.0.3-py2.py3-none-any.whl#md5=d512ceb964f38ba31addb8142bc657cb
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5224391872517450872.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8989479736936102922.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Requirement already satisfied: numpy==1.13.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 22))
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: colorama; extra == "windows" in 
/usr/lib/python2.7/dist-packages (from colorlog[windows]==2.6.0->-r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: xmltodict in 
/home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: requests-ntlm>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: requests>=2.9.1 in 
/usr/local/lib/python2.7/dist-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: ntlm-auth>=1.0.2 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #11

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[rmannibucau] [BEAM-3409] waitUntilFinish should wait teardown even for the 
direct

[ehudm] Update Python Gradle tasks to run in a venv.

[ehudm] Add Gradle based Python precommit.

[aaltay] [BEAM-3250] Migrate Apex and Gearpump ValidatesRunner tests to Gradle

[robertwb] Secure GRPC channel for SDK worker (#4984)

--
[...truncated 270.94 KB...]
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:248)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:235)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy60.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy61.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:248)
at 
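The root cause in the trace above is `java.net.ConnectException: Connection
refused` while opening a file on HDFS, i.e. the namenode RPC endpoint was
unreachable from the test. A small probe like the following can separate
cluster availability from Beam issues; the host and port are placeholders
(8020 is a common default namenode RPC port), not values from this job:

```python
import socket

def namenode_reachable(host, port, timeout=2.0):
    """Return True if a TCP connection to the given HDFS namenode
    RPC endpoint succeeds within the timeout, else False."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError, timeouts, DNS errors
        return False

# Placeholder endpoint; a real check would use the cluster's fs.defaultFS.
print(namenode_reachable("localhost", 8020))
```

Running such a probe before the IT starts would make a "Connection refused"
failure like the one above attributable to the HDFS cluster rather than the
pipeline code.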

[jira] [Work logged] (BEAM-3973) Allow to disable batch API in SpannerIO

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3973?focusedWorklogId=88021&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88021
 ]

ASF GitHub Bot logged work on BEAM-3973:


Author: ASF GitHub Bot
Created on: 05/Apr/18 12:49
Start Date: 05/Apr/18 12:49
Worklog Time Spent: 10m 
  Work Description: iemejia commented on issue #4946: [BEAM-3973] Adds a 
parameter to the Cloud Spanner read connector that can disable batch API
URL: https://github.com/apache/beam/pull/4946#issuecomment-378924305
 
 
   @chamikaramj It should be OK now. Can you please take a second look and 
merge if you agree?




Issue Time Tracking
---

Worklog Id: (was: 88021)
Time Spent: 2h 10m  (was: 2h)

> Allow to disable batch API in SpannerIO
> ---
>
> Key: BEAM-3973
> URL: https://issues.apache.org/jira/browse/BEAM-3973
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Mairbek Khadikov
>Assignee: Mairbek Khadikov
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> In 2.4.0, SpannerIO#read was migrated to use the batch API. The batch API 
> provides abstractions to scale out reads from Spanner, but it requires the 
> query to be root-partitionable. Root-partitionable queries cover the majority 
> of use cases; however, there are cases where running an arbitrary query is 
> useful, for example reading all the table names from information_schema.* 
> and then reading the content of those tables in the next step. 





Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Dataflow #5285

2018-04-05 Thread Apache Jenkins Server
See 




[jira] [Comment Edited] (BEAM-4016) Direct runner incorrect lifecycle, @SplitRestriction should execute after @Setup on SplittableDoFn

2018-04-05 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-4016?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16426988#comment-16426988
 ] 

Ismaël Mejía edited comment on BEAM-4016 at 4/5/18 2:13 PM:


[~jkff] Can you confirm if the lifecycle should be as I suggest?


was (Author: iemejia):
[~jkff] Can you confirm if the lifecycle is as I mention?

> Direct runner incorrect lifecycle, @SplitRestriction should execute after 
> @Setup on SplittableDoFn
> --
>
> Key: BEAM-4016
> URL: https://issues.apache.org/jira/browse/BEAM-4016
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Affects Versions: 2.4.0
>Reporter: Ismaël Mejía
>Assignee: Thomas Groh
>Priority: Major
> Attachments: sdf-splitrestriction-lifeycle-test.patch
>
>
> @SplitRestriction is the method where we can split an SDF in advance. It makes 
> sense to execute it after the @Setup method, given that connections are usually 
> established at @Setup and can be used to ask the different data stores about 
> their partitioning strategy. I added a test for this in the 
> SplittableDoFnTest.SDFWithLifecycle test.
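
The ordering argued for above can be made concrete with a toy harness. This is
plain Python, not Beam's DirectRunner or its SDF API; all class and method names
here are illustrative stand-ins for the annotated @Setup, @SplitRestriction,
and @ProcessElement methods:

```python
# Toy illustration of the lifecycle order the ticket argues for:
# setup() first (e.g. open a connection), then split_restriction()
# (which may use that connection to ask the store how to partition),
# then process() once per split. Names are illustrative, not Beam's API.
class ToySplittableDoFn:
    def __init__(self):
        self.calls = []
        self.connection = None

    def setup(self):
        self.connection = "open"          # stand-in for a real connection
        self.calls.append("setup")

    def split_restriction(self, element):
        assert self.connection == "open"  # splitting may rely on setup's work
        self.calls.append("split_restriction")
        return [(element, i) for i in range(2)]

    def process(self, split):
        self.calls.append("process")
        return split

def run(fn, element):
    # The ordering a runner should honor: setup -> split -> process.
    fn.setup()
    splits = fn.split_restriction(element)
    return [fn.process(s) for s in splits]

fn = ToySplittableDoFn()
run(fn, "table")
print(fn.calls)  # ['setup', 'split_restriction', 'process', 'process']
```

If a runner invoked split_restriction before setup, the assertion inside it
would fail, which is the kind of misordering the attached patch tests for.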





[jira] [Updated] (BEAM-4016) Direct runner incorrect lifecycle, @SplitRestriction should execute after @Setup on SplittableDoFn

2018-04-05 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-4016?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated BEAM-4016:
---
Description: The method annotated with @SplitRestriction is the method 
where we can define the RestrictionTrackers (splits) in advance in a SDF. It 
makes sense to execute this after the @Setup method given that usually 
connections are established at Setup and can be used to ask the different data 
stores about the partitioning strategy. I added a test for this in the 
SplittableDoFnTest.SDFWithLifecycle test.  (was: SplitRestriction is the method 
where we can split in advance a SDF. It makes sense to execute this after the 
@Setup method given that usually connections are established at Setup and can 
be used to ask the different data stores about the partitioning strategy. I 
added a test for this in the SplittableDoFnTest.SDFWithLifecycle test.)

> Direct runner incorrect lifecycle, @SplitRestriction should execute after 
> @Setup on SplittableDoFn
> --
>
> Key: BEAM-4016
> URL: https://issues.apache.org/jira/browse/BEAM-4016
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Affects Versions: 2.4.0
>Reporter: Ismaël Mejía
>Assignee: Thomas Groh
>Priority: Major
> Attachments: sdf-splitrestriction-lifeycle-test.patch
>
>
> The method annotated with @SplitRestriction is the method where we can define 
> the RestrictionTrackers (splits) in advance in a SDF. It makes sense to 
> execute this after the @Setup method given that usually connections are 
> established at Setup and can be used to ask the different data stores about 
> the partitioning strategy. I added a test for this in the 
> SplittableDoFnTest.SDFWithLifecycle test.





[jira] [Work logged] (BEAM-3973) Allow to disable batch API in SpannerIO

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3973?focusedWorklogId=88024&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88024
 ]

ASF GitHub Bot logged work on BEAM-3973:


Author: ASF GitHub Bot
Created on: 05/Apr/18 12:50
Start Date: 05/Apr/18 12:50
Worklog Time Spent: 10m 
  Work Description: iemejia commented on a change in pull request #4946: 
[BEAM-3973] Adds a parameter to the Cloud Spanner read connector that can 
disable batch API
URL: https://github.com/apache/beam/pull/4946#discussion_r179449485
 
 

 ##
 File path: 
sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/spanner/SpannerReadIT.java
 ##
 @@ -193,6 +184,59 @@ public void testQuery() throws Exception {
 p.run();
   }
 
+  private SpannerConfig createSpannerConfig() {
+    return SpannerConfig.create()
+        .withProjectId(project)
+        .withInstanceId(options.getInstanceId())
+        .withDatabaseId(databaseName);
+  }
+
+  @Test
+  public void testReadAllRecordsInDb() throws Exception {
+    DatabaseClient databaseClient = getDatabaseClient();
+
+    List<Mutation> mutations = new ArrayList<>();
 
 Review comment:
   I was referring to this part too; what I expected was more along the lines 
of a makeTableData method like [BigtableIOTest's 
one](https://github.com/apache/beam/blob/50a84326581941bc1edf573a0ad2b798ecb0f6a1/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIOTest.java#L995),
 given that all the tests use the same data.
   




Issue Time Tracking
---

Worklog Id: (was: 88024)
Time Spent: 2h 20m  (was: 2h 10m)

> Allow to disable batch API in SpannerIO
> ---
>
> Key: BEAM-3973
> URL: https://issues.apache.org/jira/browse/BEAM-3973
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Mairbek Khadikov
>Assignee: Mairbek Khadikov
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 20m
>  Remaining Estimate: 0h
>
> In 2.4.0, SpannerIO#read was migrated to use the batch API. The batch API 
> provides abstractions to scale out reads from Spanner, but it requires the 
> query to be root-partitionable. Root-partitionable queries cover the majority 
> of use cases; however, there are cases where running an arbitrary query is 
> useful, for example reading all the table names from information_schema.* 
> and then reading the content of those tables in the next step. 





Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #9

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[coheigea] Simplify the Beam and/or SQL Expressions

--
[...truncated 109.74 MB...]
INFO: Un-registering task and sending final execution state FINISHED to 
JobManager for task Combine.perKey(TestCombineFnWithContext) -> 
PAssert$167/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$167/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$167/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$167/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (c2669920972c101b4181a7420d58cacb)
Apr 05, 2018 2:01:23 PM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
PAssert$167/GroupGlobally/GroupDummyAndContents -> 
PAssert$167/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$167/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$167/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$167/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$167/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(1/1) (907a8d564a5c2679de75d029329ce543) [FINISHED]
Apr 05, 2018 2:01:23 PM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: Sum/ProduceDefault/ParMultiDo(Anonymous) (1/1) 
(23f98dde903c7271fb8ee1065f961e4f) switched from RUNNING to FINISHED.
Apr 05, 2018 2:01:23 PM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: Source: PAssert$166/GroupGlobally/Create.Values/Read(CreateSource) -> 
PAssert$166/GroupGlobally/WindowIntoDummy/Window.Assign.out (1/1) 
(389abca6d5b5c886228332a81c13eba2) switched from RUNNING to FINISHED.
Apr 05, 2018 2:01:23 PM org.apache.flink.runtime.client.JobClientActor 
logAndPrintMessage
INFO: 04/05/2018 14:01:23   Sum/ProduceDefault/ParMultiDo(Anonymous)(1/1) 
switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_OUT
04/05/2018 14:01:23 Sum/ProduceDefault/ParMultiDo(Anonymous)(1/1) switched 
to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_ERROR
Apr 05, 2018 2:01:23 PM org.apache.flink.runtime.client.JobClientActor 
logAndPrintMessage
INFO: 04/05/2018 14:01:23   Source: 
PAssert$166/GroupGlobally/Create.Values/Read(CreateSource) -> 
PAssert$166/GroupGlobally/WindowIntoDummy/Window.Assign.out(1/1) switched to 
FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_OUT
04/05/2018 14:01:23 Source: 
PAssert$166/GroupGlobally/Create.Values/Read(CreateSource) -> 
PAssert$166/GroupGlobally/WindowIntoDummy/Window.Assign.out(1/1) switched to 
FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_ERROR
Apr 05, 2018 2:01:23 PM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: PAssert$166/GroupGlobally/GatherAllOutputs/GroupByKey -> 
PAssert$166/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$166/GroupGlobally/RewindowActuals/Window.Assign.out -> 
PAssert$166/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous) (1/1) 
(6475a05ca4d4877ed5ff93ead99b522f) switched from RUNNING to FINISHED.
Apr 05, 2018 2:01:23 PM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: PAssert$166/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$166/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$166/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$166/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (1/1) (b01c99d58587991a24d45b4ecc012023) switched from 
RUNNING to FINISHED.
Apr 05, 2018 2:01:23 PM org.apache.flink.runtime.client.JobClientActor 
logAndPrintMessage
INFO: 04/05/2018 14:01:23   
PAssert$166/GroupGlobally/GatherAllOutputs/GroupByKey -> 
PAssert$166/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$166/GroupGlobally/RewindowActuals/Window.Assign.out -> 
PAssert$166/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous)(1/1) 
switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_OUT
04/05/2018 14:01:23 PAssert$166/GroupGlobally/GatherAllOutputs/GroupByKey 
-> 
PAssert$166/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$166/GroupGlobally/RewindowActuals/Window.Assign.out -> 
PAssert$166/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous)(1/1) 
switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_ERROR
Apr 05, 2018 2:01:23 PM 

[jira] [Created] (BEAM-4016) Direct runner incorrect lifecycle, @SplitRestriction should execute after @Setup on SplittableDoFn

2018-04-05 Thread JIRA
Ismaël Mejía created BEAM-4016:
--

 Summary: Direct runner incorrect lifecycle, @SplitRestriction 
should execute after @Setup on SplittableDoFn
 Key: BEAM-4016
 URL: https://issues.apache.org/jira/browse/BEAM-4016
 Project: Beam
  Issue Type: Bug
  Components: runner-direct
Affects Versions: 2.4.0
Reporter: Ismaël Mejía
Assignee: Thomas Groh


@SplitRestriction is the method where we can split an SDF in advance. It makes 
sense to execute it after the @Setup method, given that connections are usually 
established at @Setup and can be used to ask the different data stores about 
their partitioning strategy. I added a test for this in the 
SplittableDoFnTest.SDFWithLifecycle test.





[beam] 01/01: Merge pull request #5031 from coheigea/simplify_expressions

2018-04-05 Thread jbonofre
This is an automated email from the ASF dual-hosted git repository.

jbonofre pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit a3252ee2def23da453d8c5643e6431fd117413b8
Merge: 50139b4 5a0b95f
Author: Jean-Baptiste Onofré 
AuthorDate: Thu Apr 5 15:48:43 2018 +0200

Merge pull request #5031 from coheigea/simplify_expressions

Simplify the Beam and/or SQL Expressions

 .../sql/impl/interpreter/operator/logical/BeamSqlAndExpression.java   | 4 ++--
 .../sql/impl/interpreter/operator/logical/BeamSqlOrExpression.java| 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
jbono...@apache.org.


[beam] branch master updated (50139b4 -> a3252ee)

2018-04-05 Thread jbonofre
This is an automated email from the ASF dual-hosted git repository.

jbonofre pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 50139b4  Merge pull request #4790 from 
rmannibucau/fix/BEAM-3409_wait-for-teardown-execution-in-direct-runner
 add 5a0b95f  Simplify the Beam and/or SQL Expressions
 new a3252ee  Merge pull request #5031 from coheigea/simplify_expressions

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../sql/impl/interpreter/operator/logical/BeamSqlAndExpression.java   | 4 ++--
 .../sql/impl/interpreter/operator/logical/BeamSqlOrExpression.java| 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)



[jira] [Updated] (BEAM-4016) Direct runner incorrect lifecycle, @SplitRestriction should execute after @Setup on SplittableDoFn

2018-04-05 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-4016?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated BEAM-4016:
---
Attachment: sdf-splitrestriction-lifeycle-test.patch

> Direct runner incorrect lifecycle, @SplitRestriction should execute after 
> @Setup on SplittableDoFn
> --
>
> Key: BEAM-4016
> URL: https://issues.apache.org/jira/browse/BEAM-4016
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Affects Versions: 2.4.0
>Reporter: Ismaël Mejía
>Assignee: Thomas Groh
>Priority: Major
> Attachments: sdf-splitrestriction-lifeycle-test.patch
>
>
> @SplitRestriction is the method where we can split an SDF in advance. It makes 
> sense to execute it after the @Setup method, given that connections are usually 
> established at @Setup and can be used to ask the different data stores about 
> their partitioning strategy. I added a test for this in the 
> SplittableDoFnTest.SDFWithLifecycle test.





[jira] [Commented] (BEAM-4016) Direct runner incorrect lifecycle, @SplitRestriction should execute after @Setup on SplittableDoFn

2018-04-05 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-4016?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16426988#comment-16426988
 ] 

Ismaël Mejía commented on BEAM-4016:


[~jkff] Can you confirm if the lifecycle is as I mention?

> Direct runner incorrect lifecycle, @SplitRestriction should execute after 
> @Setup on SplittableDoFn
> --
>
> Key: BEAM-4016
> URL: https://issues.apache.org/jira/browse/BEAM-4016
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Affects Versions: 2.4.0
>Reporter: Ismaël Mejía
>Assignee: Thomas Groh
>Priority: Major
> Attachments: sdf-splitrestriction-lifeycle-test.patch
>
>
> @SplitRestriction is the method where we can split an SDF in advance. It makes 
> sense to execute it after the @Setup method, given that connections are usually 
> established at @Setup and can be used to ask the different data stores about 
> their partitioning strategy. I added a test for this in the 
> SplittableDoFnTest.SDFWithLifecycle test.





Build failed in Jenkins: beam_PerformanceTests_Spark #1553

2018-04-05 Thread Apache Jenkins Server
See 


--
[...truncated 89.06 KB...]
'apache-beam-testing:bqjob_r18d3283aa90acab3_016295c112ef_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-04-05 12:21:23,089 749927a7 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-05 12:21:52,240 749927a7 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-05 12:21:54,358 749927a7 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: Upload complete.
Waiting on bqjob_r706f890cafe4d83b_016295c18cf7_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r706f890cafe4d83b_016295c18cf7_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r706f890cafe4d83b_016295c18cf7_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-04-05 12:21:54,359 749927a7 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-05 12:22:13,138 749927a7 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-05 12:22:15,423 749927a7 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: Upload complete.
Waiting on bqjob_r5242409a03e833a9_016295c1dea3_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r5242409a03e833a9_016295c1dea3_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r5242409a03e833a9_016295c1dea3_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-04-05 12:22:15,423 749927a7 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-05 12:22:32,152 749927a7 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-05 12:22:34,184 749927a7 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: Upload complete.
Waiting on bqjob_r935a56fd3bc2b16_016295c228e2_1 ... (0s) Current status: 
RUNNING 
Waiting on bqjob_r935a56fd3bc2b16_016295c228e2_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r935a56fd3bc2b16_016295c228e2_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 

[jira] [Work logged] (BEAM-3973) Allow to disable batch API in SpannerIO

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3973?focusedWorklogId=88019&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88019
 ]

ASF GitHub Bot logged work on BEAM-3973:


Author: ASF GitHub Bot
Created on: 05/Apr/18 12:46
Start Date: 05/Apr/18 12:46
Worklog Time Spent: 10m 
  Work Description: iemejia commented on a change in pull request #4946: 
[BEAM-3973] Adds a parameter to the Cloud Spanner read connector that can 
disable batch API
URL: https://github.com/apache/beam/pull/4946#discussion_r179449485
 
 

 ##
 File path: 
sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/spanner/SpannerReadIT.java
 ##
 @@ -193,6 +184,59 @@ public void testQuery() throws Exception {
 p.run();
   }
 
+  private SpannerConfig createSpannerConfig() {
+    return SpannerConfig.create()
+        .withProjectId(project)
+        .withInstanceId(options.getInstanceId())
+        .withDatabaseId(databaseName);
+  }
+
+  @Test
+  public void testReadAllRecordsInDb() throws Exception {
+    DatabaseClient databaseClient = getDatabaseClient();
+
+    List<Mutation> mutations = new ArrayList<>();
 
 Review comment:
   I was also referring to this part, and what I expected was more along the 
lines of a makeTableData method like [BigtableIOTest's 
one](https://github.com/apache/beam/blob/50a84326581941bc1edf573a0ad2b798ecb0f6a1/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIOTest.java#L995).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 88019)
Time Spent: 2h  (was: 1h 50m)

> Allow to disable batch API in SpannerIO
> ---
>
> Key: BEAM-3973
> URL: https://issues.apache.org/jira/browse/BEAM-3973
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Mairbek Khadikov
>Assignee: Mairbek Khadikov
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h
>  Remaining Estimate: 0h
>
> In 2.4.0, SpannerIO#read was migrated to use the batch API. The batch API 
> provides abstractions to scale out reads from Spanner, but it requires the 
> query to be root-partitionable. Root-partitionable queries cover the majority 
> of use cases; however, there are cases where running an arbitrary query is 
> useful. For example, reading all the table names from 
> information_schema.* and then reading the contents of those tables in the 
> next step. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to normal : beam_PostCommit_Python_ValidatesRunner_Dataflow #1263

2018-04-05 Thread Apache Jenkins Server
See 




[jira] [Assigned] (BEAM-4014) Migrate MavenInstall Jenkins PostCommits to Gradle

2018-04-05 Thread Thomas Groh (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4014?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Groh reassigned BEAM-4014:
-

Assignee: Thomas Groh

> Migrate MavenInstall Jenkins PostCommits to Gradle
> --
>
> Key: BEAM-4014
> URL: https://issues.apache.org/jira/browse/BEAM-4014
> Project: Beam
>  Issue Type: Sub-task
>  Components: build-system, testing
>Reporter: Henning Rohde
>Assignee: Thomas Groh
>Priority: Major
>






[jira] [Work logged] (BEAM-2823) Beam Windows MavenInstall tests failing

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2823?focusedWorklogId=88114&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88114
 ]

ASF GitHub Bot logged work on BEAM-2823:


Author: ASF GitHub Bot
Created on: 05/Apr/18 16:22
Start Date: 05/Apr/18 16:22
Worklog Time Spent: 10m 
  Work Description: alanmyrvold commented on issue #5033: [BEAM-2823] 
Delete failing Beam Windows MavenInstall tests
URL: https://github.com/apache/beam/pull/5033#issuecomment-378994225
 
 
   +R: @aaltay 




Issue Time Tracking
---

Worklog Id: (was: 88114)
Time Spent: 20m  (was: 10m)

> Beam Windows MavenInstall tests failing
> ---
>
> Key: BEAM-2823
> URL: https://issues.apache.org/jira/browse/BEAM-2823
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core
>Affects Versions: 2.1.0
> Environment: Windows
>Reporter: Reuven Lax
>Assignee: Jason Kuster
>Priority: Minor
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> https://builds.apache.org/view/A-D/view/Beam/job/beam_PostCommit_Java_MavenInstall_Windows/417
> Install fails with
> java.io.FileNotFoundException: 
> F:\jenkins\jenkins-slave\workspace\beam_PostCommit_Java_MavenInstall_Windows\sdks\common\runner-api\target\protoc-plugins\protoc-3.2.0-windows-x86_64.exe
>  (The process cannot access the file because it is being used by another 
> process)





[jira] [Work logged] (BEAM-2823) Beam Windows MavenInstall tests failing

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2823?focusedWorklogId=88113&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88113
 ]

ASF GitHub Bot logged work on BEAM-2823:


Author: ASF GitHub Bot
Created on: 05/Apr/18 16:22
Start Date: 05/Apr/18 16:22
Worklog Time Spent: 10m 
  Work Description: alanmyrvold opened a new pull request #5033: 
[BEAM-2823] Delete failing Beam Windows MavenInstall tests
URL: https://github.com/apache/beam/pull/5033
 
 
   Delete failing Beam Windows MavenInstall tests
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [ ] Make sure there is a [JIRA 
issue](https://issues.apache.org/jira/projects/BEAM/issues/) filed for the 
change (usually before you start working on it).  Trivial changes like typos do 
not require a JIRA issue.  Your pull request should address just this issue, 
without pulling in other changes.
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue.
- [ ] Write a pull request description that is detailed enough to 
understand:
  - [ ] What the pull request does
  - [ ] Why it does it
  - [ ] How it does it
  - [ ] Why this approach
- [ ] Each commit in the pull request should have a meaningful subject line 
and body.
- [ ] Run `mvn clean verify` to make sure basic checks pass. A more 
thorough check will be performed on your pull request automatically.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   




Issue Time Tracking
---

Worklog Id: (was: 88113)
Time Spent: 10m
Remaining Estimate: 0h

> Beam Windows MavenInstall tests failing
> ---
>
> Key: BEAM-2823
> URL: https://issues.apache.org/jira/browse/BEAM-2823
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core
>Affects Versions: 2.1.0
> Environment: Windows
>Reporter: Reuven Lax
>Assignee: Jason Kuster
>Priority: Minor
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> https://builds.apache.org/view/A-D/view/Beam/job/beam_PostCommit_Java_MavenInstall_Windows/417
> Install fails with
> java.io.FileNotFoundException: 
> F:\jenkins\jenkins-slave\workspace\beam_PostCommit_Java_MavenInstall_Windows\sdks\common\runner-api\target\protoc-plugins\protoc-3.2.0-windows-x86_64.exe
>  (The process cannot access the file because it is being used by another 
> process)





[jira] [Work logged] (BEAM-3910) Support floating point values in Go SDK

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3910?focusedWorklogId=88075&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88075
 ]

ASF GitHub Bot logged work on BEAM-3910:


Author: ASF GitHub Bot
Created on: 05/Apr/18 15:11
Start Date: 05/Apr/18 15:11
Worklog Time Spent: 10m 
  Work Description: wcn3 commented on issue #4941: BEAM-3910: Add float 
support for the Go SDK.
URL: https://github.com/apache/beam/pull/4941#issuecomment-378970726
 
 
   PTAL, this is ready for review.




Issue Time Tracking
---

Worklog Id: (was: 88075)
Time Spent: 1h 10m  (was: 1h)
Remaining Estimate: 22h 50m  (was: 23h)

> Support floating point values in Go SDK
> ---
>
> Key: BEAM-3910
> URL: https://issues.apache.org/jira/browse/BEAM-3910
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-go
>Reporter: Bill Neubauer
>Assignee: Bill Neubauer
>Priority: Major
>   Original Estimate: 24h
>  Time Spent: 1h 10m
>  Remaining Estimate: 22h 50m
>
> The Go SDK supports all the integer types of the language, but does not 
> support floats.
> My plan for coding is to use the same technique the gob package uses, which 
> results in a compact encoding for simple values.
> [https://golang.org/src/encoding/gob/encode.go?#L210|https://golang.org/src/encoding/gob/encode.go#L210]
>  with rationale explained in 
> https://golang.org/pkg/encoding/gob/#hdr-Encoding_Details
> The resulting uint is then encoded using the existing coders in coderx.
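The gob-style technique described above can be sketched in a few lines. This is a minimal illustration of the byte-reversal idea, not the Go SDK's actual coder (the function names here are my own):

```go
package main

import (
	"fmt"
	"math"
	"math/bits"
)

// encodeFloat applies the gob technique: take the IEEE 754 bits of the
// float64 and byte-reverse them, so that simple values such as 17.0 end
// in zero bytes that a varint-style unsigned coder can drop.
func encodeFloat(f float64) uint64 {
	return bits.ReverseBytes64(math.Float64bits(f))
}

// decodeFloat inverts the transformation.
func decodeFloat(u uint64) float64 {
	return math.Float64frombits(bits.ReverseBytes64(u))
}

func main() {
	enc := encodeFloat(17.0)
	fmt.Printf("%#x\n", enc) // only the low-order bytes are set
	fmt.Println(decodeFloat(enc))
}
```

The reversal is what makes the encoding compact: 17.0 has bit pattern 0x4031000000000000, so after reversal only two low-order bytes remain non-zero for the unsigned coder to store.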





[jira] [Created] (BEAM-4019) Refactor HBaseIO splitting to produce ByteKeyRange objects

2018-04-05 Thread JIRA
Ismaël Mejía created BEAM-4019:
--

 Summary: Refactor HBaseIO splitting to produce ByteKeyRange objects
 Key: BEAM-4019
 URL: https://issues.apache.org/jira/browse/BEAM-4019
 Project: Beam
  Issue Type: Improvement
  Components: io-java-hbase
Reporter: Ismaël Mejía
Assignee: Ismaël Mejía


This allows the splitting logic to be reused by a future SDF-based 
implementation as part of its @SplitRestriction method.





[jira] [Created] (BEAM-4018) Add a ByteKeyRangeTracker based on RestrictionTracker for SDF

2018-04-05 Thread JIRA
Ismaël Mejía created BEAM-4018:
--

 Summary: Add a ByteKeyRangeTracker based on RestrictionTracker for 
SDF
 Key: BEAM-4018
 URL: https://issues.apache.org/jira/browse/BEAM-4018
 Project: Beam
  Issue Type: New Feature
  Components: sdk-java-core
Reporter: Ismaël Mejía
Assignee: Ismaël Mejía


We can have a RestrictionTracker for ByteKey ranges as part of the core SDK so 
it can be reused by future SDF-based IOs such as Bigtable and HBase, among others.





[jira] [Work logged] (BEAM-3848) SolrIO: Improve retrying mechanism in client writes

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3848?focusedWorklogId=88070&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88070
 ]

ASF GitHub Bot logged work on BEAM-3848:


Author: ASF GitHub Bot
Created on: 05/Apr/18 14:43
Start Date: 05/Apr/18 14:43
Worklog Time Spent: 10m 
  Work Description: timrobertson100 commented on issue #4905: [BEAM-3848] 
Enables ability to retry Solr writes on error (SolrIO)
URL: https://github.com/apache/beam/pull/4905#issuecomment-378960656
 
 
   @iemejia - I'll test whether the recent commits to master fix the hack for 
awaiting thread termination when we come to rebase this PR, if that's ok with 
you?




Issue Time Tracking
---

Worklog Id: (was: 88070)
Time Spent: 6h 40m  (was: 6.5h)

> SolrIO: Improve retrying mechanism in client writes
> ---
>
> Key: BEAM-3848
> URL: https://issues.apache.org/jira/browse/BEAM-3848
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-solr
>Affects Versions: 2.2.0, 2.3.0
>Reporter: Tim Robertson
>Assignee: Tim Robertson
>Priority: Minor
>  Time Spent: 6h 40m
>  Remaining Estimate: 0h
>
> A busy Solr server is prone to return a RemoteSolrException on writes, which 
> currently fails a complete task (e.g. a partition of a Spark RDD being 
> written to Solr).
> A good addition would be the ability to retry the batch in flight, rather 
> than failing fast, since failing fast will most likely trigger a much larger 
> retry of more writes.





[jira] [Created] (BEAM-4020) Add an HBaseIO implementation based on SDF

2018-04-05 Thread JIRA
Ismaël Mejía created BEAM-4020:
--

 Summary: Add an HBaseIO implementation based on SDF
 Key: BEAM-4020
 URL: https://issues.apache.org/jira/browse/BEAM-4020
 Project: Beam
  Issue Type: New Feature
  Components: io-java-hbase
Reporter: Ismaël Mejía
Assignee: Ismaël Mejía


Since the support from runners is still limited, it is probably wise to create 
a first IO based on the current SDF batch implementation in Java to 
validate/test it against a real data store. Since HBase's partitioning model is 
quite straightforward, it is a perfect candidate.





[jira] [Created] (BEAM-4021) "No such file or directory" in beam_PreCommit_Python_GradleBuild

2018-04-05 Thread Udi Meiri (JIRA)
Udi Meiri created BEAM-4021:
---

 Summary: "No such file or directory" in 
beam_PreCommit_Python_GradleBuild
 Key: BEAM-4021
 URL: https://issues.apache.org/jira/browse/BEAM-4021
 Project: Beam
  Issue Type: Bug
  Components: testing
Reporter: Udi Meiri
Assignee: Udi Meiri


Seems to only happen in this working directory:
{{/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild}}
but not this:
{{/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild@2}}


{{ERROR: invocation failed (errno 2), args: 
['/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/target/.tox/py27-cython2/bin/pip',
 'install', 'cython==0.26.1'], cwd: 
/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python}}
{{ Traceback (most recent call last):}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/bin/tox",
 line 11, in <module>}}
{{ sys.exit(run_main())}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
 line 40, in run_main}}
{{ main(args)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
 line 46, in main}}
{{ retcode = Session(config).runcommand()}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
 line 415, in runcommand}}
{{ return self.subcommand_test()}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
 line 599, in subcommand_test}}
{{ if self.setupenv(venv):}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
 line 491, in setupenv}}
{{ status = venv.update(action=action)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
 line 171, in update}}
{{ self.hook.tox_testenv_install_deps(action=action, venv=self)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/__init__.py",
 line 617, in __call__}}
{{ return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/__init__.py",
 line 222, in _hookexec}}
{{ return self._inner_hookexec(hook, methods, kwargs)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/__init__.py",
 line 216, in <lambda>}}
{{ firstresult=hook.spec_opts.get('firstresult'),}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/callers.py",
 line 201, in _multicall}}
{{ return outcome.get_result()}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/callers.py",
 line 77, in get_result}}
{{ _reraise(*ex) # noqa}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/callers.py",
 line 180, in _multicall}}
{{ res = hook_impl.function(*args)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
 line 452, in tox_testenv_install_deps}}
{{ venv._install(deps, action=action)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
 line 331, in _install}}
{{ action=action)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
 line 303, in run_install_command}}
{{ action=action, redirect=self.session.report.verbosity < 2)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
 line 409, in _pcall}}
{{ redirect=redirect, ignore_ret=ignore_ret)}}
{{ File 

[jira] [Work logged] (BEAM-3355) Make Go SDK runtime harness hooks pluggable

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3355?focusedWorklogId=88080&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88080
 ]

ASF GitHub Bot logged work on BEAM-3355:


Author: ASF GitHub Bot
Created on: 05/Apr/18 15:17
Start Date: 05/Apr/18 15:17
Worklog Time Spent: 10m 
  Work Description: wcn3 commented on issue #4311: [BEAM-3355] Diagnostic 
interfaces
URL: https://github.com/apache/beam/pull/4311#issuecomment-378972753
 
 
   PTAL, this is ready for review.
   
   This PR also fixes the Dataflow job submission process to work with the 
options changes made to support the Flink runner. I've verified CPU profiling 
works on Cloud Dataflow. The session runner interface needs to be reconsidered, 
as one monolithic output file is not a good match to most cloud storage 
systems. Changing the output to chunked files is the most likely improvement, 
but requires changes to the session runner. I've opened BEAM-4015 to track this 
development, so the session hook is TBD in Dataflow for now.




Issue Time Tracking
---

Worklog Id: (was: 88080)
Time Spent: 4.5h  (was: 4h 20m)

> Make Go SDK runtime harness hooks pluggable
> ---
>
> Key: BEAM-3355
> URL: https://issues.apache.org/jira/browse/BEAM-3355
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Bill Neubauer
>Priority: Minor
>  Time Spent: 4.5h
>  Remaining Estimate: 0h
>
> We currently hardcode CPU profiling and session recording in the harness. We 
> should make them pluggable instead.





[jira] [Work logged] (BEAM-3848) SolrIO: Improve retrying mechanism in client writes

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3848?focusedWorklogId=88090&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88090
 ]

ASF GitHub Bot logged work on BEAM-3848:


Author: ASF GitHub Bot
Created on: 05/Apr/18 15:33
Start Date: 05/Apr/18 15:33
Worklog Time Spent: 10m 
  Work Description: iemejia commented on issue #4905: [BEAM-3848] Enables 
ability to retry Solr writes on error (SolrIO)
URL: https://github.com/apache/beam/pull/4905#issuecomment-378977996
 
 
   Yes, please rebase and squash the extra commits. I will take a look just 
after that (again, sorry for my delay).




Issue Time Tracking
---

Worklog Id: (was: 88090)
Time Spent: 6h 50m  (was: 6h 40m)

> SolrIO: Improve retrying mechanism in client writes
> ---
>
> Key: BEAM-3848
> URL: https://issues.apache.org/jira/browse/BEAM-3848
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-solr
>Affects Versions: 2.2.0, 2.3.0
>Reporter: Tim Robertson
>Assignee: Tim Robertson
>Priority: Minor
>  Time Spent: 6h 50m
>  Remaining Estimate: 0h
>
> A busy Solr server is prone to return a RemoteSolrException on writes, which 
> currently fails a complete task (e.g. a partition of a Spark RDD being 
> written to Solr).
> A good addition would be the ability to retry the batch in flight, rather 
> than failing fast, since failing fast will most likely trigger a much larger 
> retry of more writes.





[jira] [Created] (BEAM-4017) Go session runner should write multiple files

2018-04-05 Thread Bill Neubauer (JIRA)
Bill Neubauer created BEAM-4017:
---

 Summary: Go session runner should write multiple files
 Key: BEAM-4017
 URL: https://issues.apache.org/jira/browse/BEAM-4017
 Project: Beam
  Issue Type: Improvement
  Components: sdk-go
Reporter: Bill Neubauer
Assignee: Bill Neubauer


The Go session runner allows a worker to "play back" a previous execution, 
which can be useful for debugging or profiling sessions. However, the recording 
facility produces one file for the entire lifetime of the worker. While this is 
useful for local debugging, it won't work well for workers at scale.

Having the session capture facility make the output chunkable will help larger 
systems scale. I suggest that the interface for session writing be expanded 
from an io.WriteCloser to include a sequence number that systems can use to 
produce an ordered set of files for playback.





[jira] [Work logged] (BEAM-4021) "No such file or directory" in beam_PreCommit_Python_GradleBuild

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4021?focusedWorklogId=88112&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88112
 ]

ASF GitHub Bot logged work on BEAM-4021:


Author: ASF GitHub Bot
Created on: 05/Apr/18 16:19
Start Date: 05/Apr/18 16:19
Worklog Time Spent: 10m 
  Work Description: udim opened a new pull request #5032: [BEAM-4021] 
Always recreate tox virtualenvs.
URL: https://github.com/apache/beam/pull/5032
 
 
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [ ] Make sure there is a [JIRA 
issue](https://issues.apache.org/jira/projects/BEAM/issues/) filed for the 
change (usually before you start working on it).  Trivial changes like typos do 
not require a JIRA issue.  Your pull request should address just this issue, 
without pulling in other changes.
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue.
- [ ] Write a pull request description that is detailed enough to 
understand:
  - [ ] What the pull request does
  - [ ] Why it does it
  - [ ] How it does it
  - [ ] Why this approach
- [ ] Each commit in the pull request should have a meaningful subject line 
and body.
- [ ] Run `mvn clean verify` to make sure basic checks pass. A more 
thorough check will be performed on your pull request automatically.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   




Issue Time Tracking
---

Worklog Id: (was: 88112)
Time Spent: 10m
Remaining Estimate: 0h

> "No such file or directory" in beam_PreCommit_Python_GradleBuild
> 
>
> Key: BEAM-4021
> URL: https://issues.apache.org/jira/browse/BEAM-4021
> Project: Beam
>  Issue Type: Bug
>  Components: testing
>Reporter: Udi Meiri
>Assignee: Udi Meiri
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> Seems to only happen in this working directory:
> {{/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild}}
> but not this:
> {{/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild@2}}
> {{ERROR: invocation failed (errno 2), args: 
> ['/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/target/.tox/py27-cython2/bin/pip',
>  'install', 'cython==0.26.1'], cwd: 
> /home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python}}
> {{ Traceback (most recent call last):}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/bin/tox",
>  line 11, in <module>}}
> {{ sys.exit(run_main())}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
>  line 40, in run_main}}
> {{ main(args)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
>  line 46, in main}}
> {{ retcode = Session(config).runcommand()}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
>  line 415, in runcommand}}
> {{ return self.subcommand_test()}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
>  line 599, in subcommand_test}}
> {{ if self.setupenv(venv):}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
>  line 491, in setupenv}}
> {{ status = venv.update(action=action)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
>  line 171, in update}}
> {{ self.hook.tox_testenv_install_deps(action=action, venv=self)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/__init__.py",
>  line 617, in __call__}}

[jira] [Work logged] (BEAM-2823) Beam Windows MavenInstall tests failing

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2823?focusedWorklogId=88115&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88115
 ]

ASF GitHub Bot logged work on BEAM-2823:


Author: ASF GitHub Bot
Created on: 05/Apr/18 16:25
Start Date: 05/Apr/18 16:25
Worklog Time Spent: 10m 
  Work Description: aaltay closed pull request #5033: [BEAM-2823] Delete 
failing Beam Windows MavenInstall tests
URL: https://github.com/apache/beam/pull/5033
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git 
a/.test-infra/jenkins/job_beam_PostCommit_Java_MavenInstall_Windows.groovy 
b/.test-infra/jenkins/job_beam_PostCommit_Java_MavenInstall_Windows.groovy
deleted file mode 100644
index 8dfaa85f9f3..0000000
--- a/.test-infra/jenkins/job_beam_PostCommit_Java_MavenInstall_Windows.groovy
+++ /dev/null
@@ -1,46 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-import common_job_properties
-
-// This is the Java postcommit which runs maven install targeting Jenkins 
running on Windows.
-mavenJob('beam_PostCommit_Java_MavenInstall_Windows') {
-  description('Runs postcommit tests on Windows for the Java SDK.')
-
-  // Execute concurrent builds if necessary.
-  concurrentBuild()
-
-  // Set common parameters. Note the usage of the Windows label to filter 
Jenkins executors.
-  common_job_properties.setTopLevelMainJobProperties(delegate, 'master', 100, 
'Windows')
-
-  // Set Maven parameters. Note the usage of the Windows Maven installation
-  common_job_properties.setMavenConfig(delegate, 'Maven 3.5.2 (Windows)')
-
-  // Sets that this is a PostCommit job.
-  // TODO(BEAM-1042, BEAM-1045, BEAM-2269, BEAM-2299) Turn notifications back 
on once fixed.
-  common_job_properties.setPostCommit(delegate, '0 */6 * * *', false, '', 
false)
-
-  // Allows triggering this build against pull requests.
-  common_job_properties.enablePhraseTriggeringFromPullRequest(
-  delegate,
-  'Java SDK Windows PostCommit Tests',
-  'Run Java Windows PostCommit')
-
-  // Maven goals for this job.
-  goals('-B -e -Prelease,direct-runner -DrepoToken=$COVERALLS_REPO_TOKEN 
-DpullRequest=$ghprbPullId help:effective-settings clean install')
-}


 




Issue Time Tracking
---

Worklog Id: (was: 88115)
Time Spent: 0.5h  (was: 20m)

> Beam Windows MavenInstall tests failing
> ---
>
> Key: BEAM-2823
> URL: https://issues.apache.org/jira/browse/BEAM-2823
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core
>Affects Versions: 2.1.0
> Environment: Windows
>Reporter: Reuven Lax
>Assignee: Jason Kuster
>Priority: Minor
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> https://builds.apache.org/view/A-D/view/Beam/job/beam_PostCommit_Java_MavenInstall_Windows/417
> Install fails with
> java.io.FileNotFoundException: 
> F:\jenkins\jenkins-slave\workspace\beam_PostCommit_Java_MavenInstall_Windows\sdks\common\runner-api\target\protoc-plugins\protoc-3.2.0-windows-x86_64.exe
>  (The process cannot access the file because it is being used by another 
> process)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-2852) Add support for Kafka as source/sink on Nexmark

2018-04-05 Thread Alexey Romanenko (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2852?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexey Romanenko reassigned BEAM-2852:
--

Assignee: Alexey Romanenko  (was: Kai Jiang)

> Add support for Kafka as source/sink on Nexmark
> ---
>
> Key: BEAM-2852
> URL: https://issues.apache.org/jira/browse/BEAM-2852
> Project: Beam
>  Issue Type: Improvement
>  Components: testing
>Reporter: Ismaël Mejía
>Assignee: Alexey Romanenko
>Priority: Minor
>  Labels: newbie, nexmark, starter
>  Time Spent: 2h 50m
>  Remaining Estimate: 0h
>






[jira] [Commented] (BEAM-2852) Add support for Kafka as source/sink on Nexmark

2018-04-05 Thread Alexey Romanenko (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2852?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16427272#comment-16427272
 ] 

Alexey Romanenko commented on BEAM-2852:


I'd like to take this one since there has been no update for a while. I closed the old PR and 
created a new one, keeping all of [~vectorijk]'s authorship on the original 
commits (I only squashed them into one).

> Add support for Kafka as source/sink on Nexmark
> ---
>
> Key: BEAM-2852
> URL: https://issues.apache.org/jira/browse/BEAM-2852
> Project: Beam
>  Issue Type: Improvement
>  Components: testing
>Reporter: Ismaël Mejía
>Assignee: Kai Jiang
>Priority: Minor
>  Labels: newbie, nexmark, starter
>  Time Spent: 2h 50m
>  Remaining Estimate: 0h
>






[jira] [Work logged] (BEAM-3437) Support schema in PCollections

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3437?focusedWorklogId=88169&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88169
 ]

ASF GitHub Bot logged work on BEAM-3437:


Author: ASF GitHub Bot
Created on: 05/Apr/18 17:54
Start Date: 05/Apr/18 17:54
Worklog Time Spent: 10m 
  Work Description: reuvenlax commented on issue #4964: [BEAM-3437] 
Introduce Schema class, and use it in BeamSQL
URL: https://github.com/apache/beam/pull/4964#issuecomment-379022343
 
 
   @akedin TypeName and FieldType are in Schema.java. If you think we should 
merge, is that an LGTM for now?




Issue Time Tracking
---

Worklog Id: (was: 88169)
Time Spent: 7h 40m  (was: 7.5h)

> Support schema in PCollections
> --
>
> Key: BEAM-3437
> URL: https://issues.apache.org/jira/browse/BEAM-3437
> Project: Beam
>  Issue Type: Wish
>  Components: beam-model
>Reporter: Jean-Baptiste Onofré
>Assignee: Jean-Baptiste Onofré
>Priority: Major
>  Time Spent: 7h 40m
>  Remaining Estimate: 0h
>
> As discussed with some people on the team, it would be great to add schema 
> support in {{PCollections}}. It would allow us:
> 1. To expect a given data type in {{PTransforms}}
> 2. To improve some runners with additional features (I'm thinking of the Spark 
> runner with data frames, for instance).
> A technical draft document has been created: 
> https://docs.google.com/document/d/1tnG2DPHZYbsomvihIpXruUmQ12pHGK0QIvXS1FOTgRc/edit?disco=BhykQIs=5a203b46=comment_email_document
> I also started a PoC on a branch; I will update this Jira with a "discussion" 
> PR.
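As a rough sketch of what the description above is after — transforms that can rely on named, typed fields rather than opaque elements — here is a hypothetical, Beam-free illustration (the `Transaction` type and `total_by_user` function are invented for this example):

```python
from typing import NamedTuple


# Hypothetical row type: a "schema" here is just named, typed fields,
# loosely analogous to what a schema-aware PCollection element would carry.
class Transaction(NamedTuple):
    user: str
    amount: float


def total_by_user(rows):
    """Sum amounts per user, relying on field names instead of opaque tuples."""
    totals = {}
    for row in rows:
        totals[row.user] = totals.get(row.user, 0.0) + row.amount
    return totals


rows = [Transaction("alice", 3.0), Transaction("bob", 2.5), Transaction("alice", 1.0)]
print(total_by_user(rows))  # {'alice': 4.0, 'bob': 2.5}
```

A runner that knows the schema could, in principle, map such rows onto columnar representations (e.g. Spark data frames), which is point 2 of the description.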





Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #6386

2018-04-05 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-3983) BigQuery writes from pure SQL

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3983?focusedWorklogId=88163&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88163
 ]

ASF GitHub Bot logged work on BEAM-3983:


Author: ASF GitHub Bot
Created on: 05/Apr/18 17:40
Start Date: 05/Apr/18 17:40
Worklog Time Spent: 10m 
  Work Description: apilloud commented on issue #4991: [BEAM-3983] [SQL] 
Tables interface supports BigQuery
URL: https://github.com/apache/beam/pull/4991#issuecomment-379018105
 
 
   run java precommit




Issue Time Tracking
---

Worklog Id: (was: 88163)
Time Spent: 1h  (was: 50m)

> BigQuery writes from pure SQL
> -
>
> Key: BEAM-3983
> URL: https://issues.apache.org/jira/browse/BEAM-3983
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Andrew Pilloud
>Assignee: Andrew Pilloud
>Priority: Major
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> It would be nice if you could write to BigQuery in SQL without writing any 
> java code. For example:
> {code:java}
> INSERT INTO bigquery SELECT * FROM PCOLLECTION{code}





[beam] 01/01: Merge pull request #5033 from alanmyrvold/alan-delete-windows

2018-04-05 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 446a3752718dc3353832e64a38f25ee1803e75da
Merge: a3252ee 921ffe3
Author: Ahmet Altay 
AuthorDate: Thu Apr 5 09:25:03 2018 -0700

Merge pull request #5033 from alanmyrvold/alan-delete-windows

[BEAM-2823] Delete failing Beam Windows MavenInstall tests

 ...eam_PostCommit_Java_MavenInstall_Windows.groovy | 46 --
 1 file changed, 46 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


[beam] branch master updated (a3252ee -> 446a375)

2018-04-05 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from a3252ee  Merge pull request #5031 from coheigea/simplify_expressions
 add 921ffe3  [BEAM-2823] Delete failing Beam Windows MavenInstall tests
 new 446a375  Merge pull request #5033 from alanmyrvold/alan-delete-windows

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 ...eam_PostCommit_Java_MavenInstall_Windows.groovy | 46 --
 1 file changed, 46 deletions(-)
 delete mode 100644 
.test-infra/jenkins/job_beam_PostCommit_Java_MavenInstall_Windows.groovy

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #10

2018-04-05 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-3981) Futurize and fix python 2 compatibility for coders package

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3981?focusedWorklogId=88146&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88146
 ]

ASF GitHub Bot logged work on BEAM-3981:


Author: ASF GitHub Bot
Created on: 05/Apr/18 17:23
Start Date: 05/Apr/18 17:23
Worklog Time Spent: 10m 
  Work Description: RobbeSneyders commented on issue #4990: [BEAM-3981] 
[WIP] Futurize and fix python 2 compatibility for coders subpackage
URL: https://github.com/apache/beam/pull/4990#issuecomment-378065718
 
 
   Thanks for the review @charlesccychen.
   
   Some general points based on your feedback and my answers:
   
   - The imports:
   `from __future__ import absolute_import`
   `from __future__ import division`
   `from __future__ import print_function`
were added at the top of each updated module to prevent regressions before 
full Python 3 support is added. This way, no new code can be added that relies 
on, for instance, the old Python 2 division semantics. Another benefit is 
consistent division and print behavior across modules.
   
   - `from builtins import ...` imports from future.builtins on python 2 and 
has no effect on python 3. future.builtins contains a bunch of backported 
python 3 builtins for compatibility.
   
   - The bytes type annotation was removed in the stream Cython files because 
it was breaking due to a mismatch between the Cython bytes type and the future 
bytes type. This is not meant to be merged as-is, but I wanted to submit the 
pull request with working code to get feedback on this. I have tried replacing 
bytes with a memory view as explained 
[here](http://cython.readthedocs.io/en/latest/src/tutorial/strings.html#accepting-strings-from-python-code),
 but this resulted in a packaging error. Any help on this is appreciated.
   
   - The Cython version was upgraded from 0.26.1 to 0.28.1 because of an 
incompatibility between cython and future types. I have not noticed any 
backward incompatibility.
   
   - The `is` type checks were replaced by isinstance checks because the 
future.builtins types are all subclasses of the standard Python classes. 
However, this is a lot slower. I could revert this change if I use six again 
for compatibility; the drawback is that `str` and `bytes` would then behave 
differently across modules.
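The `builtins` imports and the isinstance-based checks described above can be sketched as follows (illustrative only; this runs unchanged on Python 3, where `from builtins import str` is a stdlib no-op):

```python
from __future__ import absolute_import, division, print_function

from builtins import str  # future's backported str on Python 2; stdlib no-op on Python 3


def classify(value):
    # isinstance() accepts future's subclassed builtins on Python 2,
    # whereas an identity check like `type(value) is str` would reject them.
    # Note: on plain Python 2, bytes is an alias of str, so bytes is checked first.
    if isinstance(value, bytes):
        return "bytes"
    if isinstance(value, str):
        return "text"
    return "other"


print(classify(u"abc"), classify(b"abc"), classify(3))  # text bytes other
```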




Issue Time Tracking
---

Worklog Id: (was: 88146)
Time Spent: 5h 10m  (was: 5h)

> Futurize and fix python 2 compatibility for coders package
> --
>
> Key: BEAM-3981
> URL: https://issues.apache.org/jira/browse/BEAM-3981
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Robbe
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 5h 10m
>  Remaining Estimate: 0h
>
> Run automatic conversion with futurize tool on coders subpackage and fix 
> python 2 compatibility. This prepares the subpackage for python 3 support.





Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #5286

2018-04-05 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Python #1111

2018-04-05 Thread Apache Jenkins Server
See 


--
[...truncated 1.64 KB...]
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3683524889524858700.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3967160867358285744.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5201572955204619984.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/20/d7/04a0b689d3035143e2ff288f4b9ee4bf6ed80585cc121c90bfd85a1a8c2e/setuptools-39.0.1-py2.py3-none-any.whl#md5=ca299c7acd13a72e1171a3697f2b99bc
Downloading/unpacking pip from 
https://pypi.python.org/packages/ac/95/a05b56bb975efa78d3557efa36acaf9cf5d2fd0ee0062060493687432e03/pip-9.0.3-py2.py3-none-any.whl#md5=d512ceb964f38ba31addb8142bc657cb
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5198094055685461328.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6805073336516709053.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Requirement already satisfied: numpy==1.13.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 22))
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: colorama; extra == "windows" in 
/usr/lib/python2.7/dist-packages (from colorlog[windows]==2.6.0->-r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: xmltodict in 
/home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: requests-ntlm>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: requests>=2.9.1 in 
/usr/local/lib/python2.7/dist-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: ntlm-auth>=1.0.2 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: cryptography>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: urllib3<1.23,>=1.21.1 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 

Jenkins build is back to normal : beam_PostCommit_Python_Verify #4603

2018-04-05 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-3437) Support schema in PCollections

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3437?focusedWorklogId=88175&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88175
 ]

ASF GitHub Bot logged work on BEAM-3437:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:07
Start Date: 05/Apr/18 18:07
Worklog Time Spent: 10m 
  Work Description: reuvenlax commented on issue #4964: [BEAM-3437] 
Introduce Schema class, and use it in BeamSQL
URL: https://github.com/apache/beam/pull/4964#issuecomment-379026543
 
 
   On Thu, Apr 5, 2018 at 11:02 AM Kenn Knowles 
   wrote:
   
   > It seems like this is a good idea that needs lots of baking. That will
   > work best once it is in. How about we build a document with notes on
   > follow-ups or an umbrella JIRA with subtasks? Otherwise I'm concerned the
   > collection of things we want to look into more specifically may get lost.
   >
   > Being totally frank, the code seems fine while the fundamentals of what a
   > schema is are where I still have the most questions, especially as pertains
   > to portability. At the portability layer, encodings (coders) and types are
   > synonymous. In a particular language, there is the language's types that
   > come from coders. Then each SQL dialect has its own notion of standard
   > types that need not correspond to any general purpose language's. And of
   > course Avro and Proto have their own encoding-to-language mappings to
   > contend with. I really don't think Beam should add another.
   >
   
   FYI, the simple answer is that at the portability layer the only type is
   Row - individual schema fields don't exist as separate types at the
   portability layer. And in truth the fact that we currently "encode" fields
   using coders is a potentially temporary implementation detail.
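As a purely hypothetical illustration of the "encode fields using coders" detail mentioned above (the field type names and wire format below are invented for this sketch and are not Beam's actual Row encoding):

```python
import struct


def encode_row(row, schema):
    """Encode each field with a per-type 'coder': big-endian int32s and
    length-prefixed UTF-8 strings (both invented for illustration)."""
    out = b""
    for value, typ in zip(row, schema):
        if typ == "int32":
            out += struct.pack(">i", value)
        elif typ == "string":
            data = value.encode("utf-8")
            out += struct.pack(">i", len(data)) + data
        else:
            raise ValueError("unknown field type: %s" % typ)
    return out


print(encode_row((7, "hi"), ["int32", "string"]))  # b'\x00\x00\x00\x07\x00\x00\x00\x02hi'
```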
   
   > So I want to get this in as experimental but continue work before we have
   > lots of dependencies on schemas. So [image: :lgtm:]
   > 

   > pending a followup document or JIRA.
   > --
   >
   > Reviewed 79 of 150 files at r1, 4 of 20 files at r4, 11 of 29 files at r5,
   > 1 of 11 files at r6, 20 of 32 files at r7, 1 of 1 files at r8, 32 of 33
   > files at r9.
   > Review status: all files reviewed at latest revision, 3 unresolved
   > discussions.
   > --
   >
   > *sdks/java/core/src/main/java/org/apache/beam/sdk/coders/RowCoder.java,
   > line 39 at r9
   > 

   > (raw file
   > 
):*
   >
   > @Experimentalpublic class RowCoder extends CustomCoder {
   >   private static final Map CODER_MAP = 
ImmutableMap.builder()
   >
   > These should probably be defaults, not hardcoded.
   > --
   >
   > *sdks/java/core/src/main/java/org/apache/beam/sdk/coders/RowCoder.java,
   > line 52 at r9
   > 

   > (raw file
   > 
):*
   >
   >   .build();
   >
   >   private static final Map ESTIMATED_FIELD_SIZES =
   >
   > Units in the name - at usage sites it will not be clear what they are.
   > --
   >
   > *sdks/java/core/src/main/java/org/apache/beam/sdk/coders/RowCoder.java,
   > line 79 at r9
   > 

   > (raw file
   > 
):*
   >
   >* Return the estimated serialized size of a give row object.
   >*/
   >   public static long estimatedSizeBytes(Row row) {
   >
   > And given the particular field coders being per-instance, this would be a
   > non-static method, etc.
   > --
   >
   > *sdks/java/core/src/main/java/org/apache/beam/sdk/io/FileIO.java, line 296
   > at r9
   > 

   > (raw file
   > 
):*
   >
   >  * PCollection transactions = ...;
   >  * transactions.apply(FileIO.

[jira] [Work logged] (BEAM-3973) Allow to disable batch API in SpannerIO

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3973?focusedWorklogId=88179&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88179
 ]

ASF GitHub Bot logged work on BEAM-3973:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:15
Start Date: 05/Apr/18 18:15
Worklog Time Spent: 10m 
  Work Description: chamikaramj commented on a change in pull request 
#4946: [BEAM-3973] Adds a parameter to the Cloud Spanner read connector that 
can disable batch API
URL: https://github.com/apache/beam/pull/4946#discussion_r179554941
 
 

 ##
 File path: 
sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/spanner/SpannerIO.java
 ##
 @@ -329,12 +333,26 @@ public ReadAll withTimestampBound(TimestampBound 
timestampBound) {
   return toBuilder().setTimestampBound(timestampBound).build();
 }
 
+/** If true the uses Cloud Spanner batch API. */
 
 Review comment:
   Can you clarify in the documentation that batching is the default ? 
Alternatively, how about just having a method withoutBatching() that can be 
used to disable batching ?
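The two API shapes under discussion — a boolean `withBatching(boolean)` versus a dedicated `withoutBatching()` — can be contrasted with a minimal builder sketch (illustrative Python, not the actual SpannerIO API):

```python
class Read:
    """Minimal immutable builder sketch; batching is the default."""

    def __init__(self, batching=True):
        self._batching = batching

    def with_batching(self, enabled):
        # Boolean-parameter style: the caller passes True/False explicitly.
        return Read(batching=enabled)

    def without_batching(self):
        # Dedicated opt-out style: reads more clearly at call sites.
        return Read(batching=False)


print(Read().without_batching()._batching)  # False
```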




Issue Time Tracking
---

Worklog Id: (was: 88179)
Time Spent: 2h 40m  (was: 2.5h)

> Allow to disable batch API in SpannerIO
> ---
>
> Key: BEAM-3973
> URL: https://issues.apache.org/jira/browse/BEAM-3973
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Mairbek Khadikov
>Assignee: Mairbek Khadikov
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 40m
>  Remaining Estimate: 0h
>
> In 2.4.0, SpannerIO#read has been migrated to use the batch API. The batch API 
> provides abstractions to scale out reads from Spanner, but it requires the 
> query to be root-partitionable. Root-partitionable queries cover the majority 
> of use cases; however, there are cases where running an arbitrary query is 
> useful. For example, reading all the table names from information_schema.* 
> and then reading the content of those tables in the next step.





[jira] [Work logged] (BEAM-3973) Allow to disable batch API in SpannerIO

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3973?focusedWorklogId=88178&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88178
 ]

ASF GitHub Bot logged work on BEAM-3973:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:15
Start Date: 05/Apr/18 18:15
Worklog Time Spent: 10m 
  Work Description: chamikaramj commented on a change in pull request 
#4946: [BEAM-3973] Adds a parameter to the Cloud Spanner read connector that 
can disable batch API
URL: https://github.com/apache/beam/pull/4946#discussion_r179554941
 
 

 ##
 File path: 
sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/spanner/SpannerIO.java
 ##
 @@ -329,12 +333,26 @@ public ReadAll withTimestampBound(TimestampBound 
timestampBound) {
   return toBuilder().setTimestampBound(timestampBound).build();
 }
 
+/** If true the uses Cloud Spanner batch API. */
 
 Review comment:
   Can you clarify in the documentation that batching is the default ? 
Alternatively, how about just having a method withoutBatching() that can be 
used to disable batching ?




Issue Time Tracking
---

Worklog Id: (was: 88178)
Time Spent: 2.5h  (was: 2h 20m)

> Allow to disable batch API in SpannerIO
> ---
>
> Key: BEAM-3973
> URL: https://issues.apache.org/jira/browse/BEAM-3973
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Mairbek Khadikov
>Assignee: Mairbek Khadikov
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2.5h
>  Remaining Estimate: 0h
>
> In 2.4.0, SpannerIO#read has been migrated to use the batch API. The batch API 
> provides abstractions to scale out reads from Spanner, but it requires the 
> query to be root-partitionable. Root-partitionable queries cover the majority 
> of use cases; however, there are cases where running an arbitrary query is 
> useful. For example, reading all the table names from information_schema.* 
> and then reading the content of those tables in the next step.





[jira] [Work logged] (BEAM-3983) BigQuery writes from pure SQL

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3983?focusedWorklogId=88180&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88180
 ]

ASF GitHub Bot logged work on BEAM-3983:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:19
Start Date: 05/Apr/18 18:19
Worklog Time Spent: 10m 
  Work Description: apilloud commented on issue #4991: [BEAM-3983] [SQL] 
Tables interface supports BigQuery
URL: https://github.com/apache/beam/pull/4991#issuecomment-379030336
 
 
   @xumingming Will you have time to look at this in the next day or two?




Issue Time Tracking
---

Worklog Id: (was: 88180)
Time Spent: 1h 20m  (was: 1h 10m)

> BigQuery writes from pure SQL
> -
>
> Key: BEAM-3983
> URL: https://issues.apache.org/jira/browse/BEAM-3983
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Andrew Pilloud
>Assignee: Andrew Pilloud
>Priority: Major
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> It would be nice if you could write to BigQuery in SQL without writing any 
> java code. For example:
> {code:java}
> INSERT INTO bigquery SELECT * FROM PCOLLECTION{code}





Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #12

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[coheigea] Simplify the Beam and/or SQL Expressions

[amyrvold] [BEAM-2823] Delete failing Beam Windows MavenInstall tests

--
[...truncated 190.61 KB...]
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy61.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:248)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:235)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy60.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy61.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #11

2018-04-05 Thread Apache Jenkins Server
See 


--
[...truncated 106.46 MB...]
04/05/2018 18:29:19 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> ToKeyedWorkItem(1/1) switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_ERROR
Apr 05, 2018 6:29:19 PM grizzled.slf4j.Logger info
INFO: Un-registering task and sending final execution state FINISHED to 
JobManager for task Combine.perKey(TestCombineFnWithContext) -> 
PAssert$167/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$167/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$167/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$167/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (23fe728b76fd27a953fa03719b7e13e6)
Apr 05, 2018 6:29:19 PM org.apache.flink.runtime.client.JobClientActor 
logAndPrintMessage
INFO: 04/05/2018 18:29:19   ToKeyedWorkItem(1/1) switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_OUT
04/05/2018 18:29:19 ToKeyedWorkItem(1/1) switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_ERROR
Apr 05, 2018 6:29:19 PM org.apache.flink.runtime.client.JobClientActor 
logAndPrintMessage
INFO: 04/05/2018 18:29:19   
PAssert$166/GroupGlobally/GatherAllOutputs/GroupByKey -> 
PAssert$166/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$166/GroupGlobally/RewindowActuals/Window.Assign.out -> 
PAssert$166/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous)(1/1) 
switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_OUT
04/05/2018 18:29:19 PAssert$166/GroupGlobally/GatherAllOutputs/GroupByKey 
-> 
PAssert$166/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$166/GroupGlobally/RewindowActuals/Window.Assign.out -> 
PAssert$166/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous)(1/1) 
switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_ERROR
Apr 05, 2018 6:29:19 PM org.apache.flink.runtime.client.JobClientActor 
logAndPrintMessage
INFO: 04/05/2018 18:29:19   
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> ToKeyedWorkItem(1/1) switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_OUT
04/05/2018 18:29:19 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> ToKeyedWorkItem(1/1) switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > testSimpleCombineWithContextEmpty 
STANDARD_ERROR
Apr 05, 2018 6:29:19 PM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: 
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous)
 -> (Map, Map) (1/1) (0450691c89e9d27e887afc91ef37370e) switched from RUNNING 
to FINISHED.
Apr 05, 2018 6:29:19 PM org.apache.flink.runtime.client.JobClientActor 
logAndPrintMessage
INFO: 04/05/2018 18:29:19   
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous)
 -> (Map, Map)(1/1) switched to FINISHED 

org.apache.beam.sdk.transforms.CombineTest > 

Jenkins build is back to normal : beam_PerformanceTests_TextIOIT_HDFS #19

2018-04-05 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-3437) Support schema in PCollections

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3437?focusedWorklogId=88174&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88174
 ]

ASF GitHub Bot logged work on BEAM-3437:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:02
Start Date: 05/Apr/18 18:02
Worklog Time Spent: 10m 
  Work Description: kennknowles commented on issue #4964: [BEAM-3437] 
Introduce Schema class, and use it in BeamSQL
URL: https://github.com/apache/beam/pull/4964#issuecomment-379024918
 
 
   It seems like this is a good idea that needs lots of baking. That will work 
best once it is in. How about we build a document with notes on follow-ups or 
an umbrella JIRA with subtasks? Otherwise I'm concerned the collection of 
things we want to look into more specifically may get lost.
   
   Being totally frank, the code seems fine while the fundamentals of what a 
schema is are where I still have the most questions, especially as pertains to 
portability. At the portability layer, encodings (coders) and types are 
synonymous. In a particular language, there is the language's types that come 
from coders. Then each SQL dialect has its own notion of standard types that 
need not correspond to any general purpose language's. And of course Avro and 
Proto have their own encoding-to-language mappings to contend with. I really 
don't think Beam should add another.
   
   So I want to get this in as experimental but continue work before we have 
lots of dependencies on schemas. So LGTM, pending a followup document or JIRA.
   
   ---
   
   Reviewed 79 of 150 files at r1, 4 of 20 files at r4, 11 of 29 files at r5, 1 
of 11 files at r6, 20 of 32 files at r7, 1 of 1 files at r8, 32 of 33 files at 
r9.
   Review status: all files reviewed at latest revision, 3 unresolved 
discussions.
   
   ---
   
   *[sdks/java/core/src/main/java/org/apache/beam/sdk/coders/RowCoder.java, 
line 39 at 
r9](https://beta.reviewable.io/reviews/apache/beam/4964#-L9LipYfDPrPNLA5iboh:-L9LipYfDPrPNLA5iboi:b-kcvp9c)
 ([raw 
file](https://github.com/apache/beam/blob/d28693b35568d8ebee30301329c77b2cc2feaf26/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/RowCoder.java#L39)):*
   > ```Java
   > @Experimental
   > public class RowCoder extends CustomCoder {
   >   private static final Map CODER_MAP = 
ImmutableMap.builder()
   > ```
   
   These should probably be defaults, not hardcoded.
   
   ---
   
   *[sdks/java/core/src/main/java/org/apache/beam/sdk/coders/RowCoder.java, 
line 52 at 
r9](https://beta.reviewable.io/reviews/apache/beam/4964#-L9LiaqH42LqLGYe_yRb:-L9LiaqH42LqLGYe_yRc:bz5ztdg)
 ([raw 
file](https://github.com/apache/beam/blob/d28693b35568d8ebee30301329c77b2cc2feaf26/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/RowCoder.java#L52)):*
   > ```Java
   >   .build();
   > 
   >   private static final Map ESTIMATED_FIELD_SIZES =
   > ```
   
   Units in the name - at usage sites it will not be clear what they are.
   
   ---
   
   *[sdks/java/core/src/main/java/org/apache/beam/sdk/coders/RowCoder.java, 
line 79 at 
r9](https://beta.reviewable.io/reviews/apache/beam/4964#-L9Lit5n4nmMBm0PsKNI:-L9Lit5n4nmMBm0PsKNJ:bft3uyf)
 ([raw 
file](https://github.com/apache/beam/blob/d28693b35568d8ebee30301329c77b2cc2feaf26/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/RowCoder.java#L79)):*
   > ```Java
   >* Return the estimated serialized size of a given row object.
   >*/
   >   public static long estimatedSizeBytes(Row row) {
   > ```
   
   And given the particular field coders being per-instance, this would be a 
non-static method, etc.
   
   ---
   
   *[sdks/java/core/src/main/java/org/apache/beam/sdk/io/FileIO.java, line 296 
at 
r9](https://beta.reviewable.io/reviews/apache/beam/4964#-L9LiPXj0UF0Su-q34K4:-L9LiPXj0UF0Su-q34K5:b-kxeoth)
 ([raw 
file](https://github.com/apache/beam/blob/d28693b35568d8ebee30301329c77b2cc2feaf26/sdks/java/core/src/main/java/org/apache/beam/sdk/io/FileIO.java#L296)):*
   > ```Java
   >  * PCollection transactions = ...;
   >  * transactions.apply(FileIO.writeDynamic()
   >  * .by(Transaction::getTypeName)
   > ```
   
   There are a bunch of tiny renames that I don't really get.
   
   ---
   
   *[sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/Schema.java, line 
18 at 
r9](https://beta.reviewable.io/reviews/apache/beam/4964#-L9H0EwC1Ny6SIRPGLDb:-L9H0EwC1Ny6SIRPGLDc:b-8ioiqz)
 ([raw 
file](https://github.com/apache/beam/blob/d28693b35568d8ebee30301329c77b2cc2feaf26/sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/Schema.java#L18)):*
   > ```Java
   >  * limitations under the License.
   >  */
   > package org.apache.beam.sdk.schemas;
   > ```
   
   Just to keep things simple - is it useful to have a package boundary here? I 

Build failed in Jenkins: beam_PerformanceTests_Python #1110

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[coheigea] Simplify the Beam and/or SQL Expressions

[amyrvold] [BEAM-2823] Delete failing Beam Windows MavenInstall tests

--
[...truncated 60.67 KB...]
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:copy-resources (copy-go-cmd-source) @ 
beam-sdks-go ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 6 resources
[INFO] 
[INFO] --- maven-assembly-plugin:3.1.0:single (export-go-pkg-sources) @ 
beam-sdks-go ---
[INFO] Reading assembly descriptor: descriptor.xml
[INFO] Building zip: 

[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ beam-sdks-go ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:get (go-get-imports) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go get google.golang.org/grpc 
golang.org/x/oauth2/google google.golang.org/api/storage/v1 
github.com/spf13/cobra cloud.google.com/go/bigquery 
google.golang.org/api/googleapi google.golang.org/api/dataflow/v1b3
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go build -buildmode=default -o 

 github.com/apache/beam/sdks/go/cmd/beamctl
[INFO] The Result file has been successfully created : 

[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build-linux-amd64) @ beam-sdks-go 
---
[INFO] Prepared command line : bin/go build -buildmode=default -o 

 github.com/apache/beam/sdks/go/cmd/beamctl
[INFO] The Result file has been successfully created : 

[INFO] 
[INFO] --- maven-checkstyle-plugin:3.0.0:check (default) @ beam-sdks-go ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.6:test (go-test) @ beam-sdks-go ---
[INFO] Prepared command line : bin/go test ./...
[INFO] 
[INFO] -Exec.Out-
[INFO] ?github.com/apache/beam/sdks/go/cmd/beamctl  [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/beamctl/cmd  [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/specialize   [no test files]
[INFO] ?github.com/apache/beam/sdks/go/cmd/symtab   [no test files]
[INFO] ok   github.com/apache/beam/sdks/go/pkg/beam 0.025s
[INFO] ok   github.com/apache/beam/sdks/go/pkg/beam/artifact0.092s
[INFO] 
[ERROR] 
[ERROR] -Exec.Err-
[ERROR] # github.com/apache/beam/sdks/go/pkg/beam/util/gcsx
[ERROR] github.com/apache/beam/sdks/go/pkg/beam/util/gcsx/gcs.go:46:37: 
undefined: option.WithoutAuthentication
[ERROR] 
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Beam :: Parent .. SUCCESS [  4.305 s]
[INFO] Apache Beam :: SDKs :: Java :: Build Tools . SUCCESS [  3.523 s]
[INFO] Apache Beam :: Model ... SUCCESS [  0.095 s]
[INFO] Apache Beam :: Model :: Pipeline ... SUCCESS [ 11.329 s]
[INFO] Apache Beam :: Model :: Job Management . SUCCESS [  3.391 s]
[INFO] Apache Beam :: Model :: Fn Execution ... SUCCESS [  3.756 s]
[INFO] Apache Beam :: SDKs  SUCCESS [  0.160 s]
[INFO] Apache Beam :: SDKs :: Go .. FAILURE [ 30.832 s]
[INFO] Apache Beam :: SDKs :: Go :: Container . SKIPPED
[INFO] Apache Beam :: SDKs :: Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Core  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Fn Execution  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Google Cloud Platform Core 
SKIPPED
[INFO] Apache Beam :: Runners . SKIPPED
[INFO] Apache Beam :: Runners :: Core Construction Java ... SKIPPED
[INFO] Apache Beam :: Runners :: Core Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Harness . SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Container ... SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Amazon Web Services SKIPPED
[INFO] Apache Beam :: Runners :: Local Java Core .. SKIPPED
[INFO] Apache Beam :: Runners :: Direct Java .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: AMQP 

[jira] [Work logged] (BEAM-3983) BigQuery writes from pure SQL

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3983?focusedWorklogId=88177&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88177
 ]

ASF GitHub Bot logged work on BEAM-3983:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:10
Start Date: 05/Apr/18 18:10
Worklog Time Spent: 10m 
  Work Description: apilloud commented on issue #4947: [BEAM-3983] [SQL] 
Add utils for converting to BigQuery types
URL: https://github.com/apache/beam/pull/4947#issuecomment-379027439
 
 
   R: @kennknowles The first two commits here are API changes in #4991. The 
second two support bigquery.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 88177)
Time Spent: 1h 10m  (was: 1h)

> BigQuery writes from pure SQL
> -
>
> Key: BEAM-3983
> URL: https://issues.apache.org/jira/browse/BEAM-3983
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Andrew Pilloud
>Assignee: Andrew Pilloud
>Priority: Major
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> It would be nice if you could write to BigQuery in SQL without writing any 
> java code. For example:
> {code:java}
> INSERT INTO bigquery SELECT * FROM PCOLLECTION{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to normal : beam_PerformanceTests_Compressed_TextIOIT_HDFS #13

2018-04-05 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_JDBC #416

2018-04-05 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-3956) Stacktraces from exceptions in user code should be preserved in the Python SDK

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3956?focusedWorklogId=88187&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88187
 ]

ASF GitHub Bot logged work on BEAM-3956:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:26
Start Date: 05/Apr/18 18:26
Worklog Time Spent: 10m 
  Work Description: shoyer commented on issue #4959: [BEAM-3956] Preserve 
stacktraces for Python exceptions
URL: https://github.com/apache/beam/pull/4959#issuecomment-379032629
 
 
   The test suite passes, but Jenkins appears to be failing to install Cython:
   ERROR: invocation failed (errno 2), args: 
['/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/.tox/py27-cython2/bin/pip',
 'install', 'cython==0.26.1'], cwd: 
/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python
   
   I think this is unrelated to my changes?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 88187)
Time Spent: 5h  (was: 4h 50m)

> Stacktraces from exceptions in user code should be preserved in the Python SDK
> --
>
> Key: BEAM-3956
> URL: https://issues.apache.org/jira/browse/BEAM-3956
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Stephan Hoyer
>Priority: Major
>  Time Spent: 5h
>  Remaining Estimate: 0h
>
> Currently, Beam's Python SDK loses stacktraces for exceptions. It does 
> helpfully add a tag like "[while running StageA]" to exception error 
> messages, but that doesn't include the stacktrace of Python functions being 
> called.
> Including the full stacktraces would make a big difference for the ease of 
> debugging Beam pipelines when things go wrong.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Spark #1554

2018-04-05 Thread Apache Jenkins Server
See 


Changes:

[coheigea] Simplify the Beam and/or SQL Expressions

[amyrvold] [BEAM-2823] Delete failing Beam Windows MavenInstall tests

--
[...truncated 63.66 KB...]
2018-04-05 18:25:43,633 4bc708fa MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-05 18:26:04,478 4bc708fa MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-05 18:26:08,195 4bc708fa MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r6553e47c8066ccbc_0162970efe7d_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r6553e47c8066ccbc_0162970efe7d_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r6553e47c8066ccbc_0162970efe7d_1 ... (0s) Current status: DONE   
2018-04-05 18:26:08,195 4bc708fa MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-05 18:26:35,031 4bc708fa MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-05 18:26:38,524 4bc708fa MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r4e4811c2a93dddad_0162970f7567_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r4e4811c2a93dddad_0162970f7567_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r4e4811c2a93dddad_0162970f7567_1 ... (0s) Current status: DONE   
2018-04-05 18:26:38,524 4bc708fa MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-05 18:27:00,614 4bc708fa MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-05 18:27:26,501 4bc708fa MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r5f50b3ec3ef08f27_0162970fd936_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r5f50b3ec3ef08f27_0162970fd936_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r5f50b3ec3ef08f27_0162970fd936_1 ... (1s) Current status: RUNNING 

 Waiting on bqjob_r5f50b3ec3ef08f27_0162970fd936_1 ... (2s) Current status: 
RUNNING 
 Waiting on bqjob_r5f50b3ec3ef08f27_0162970fd936_1 ... (3s) 
Current status: RUNNING 
 Waiting on 
bqjob_r5f50b3ec3ef08f27_0162970fd936_1 ... (4s) Current status: RUNNING 

 Waiting on bqjob_r5f50b3ec3ef08f27_0162970fd936_1 ... (5s) Current status: 
RUNNING 
 Waiting on bqjob_r5f50b3ec3ef08f27_0162970fd936_1 ... (6s) 
Current status: RUNNING 
 Waiting on 
bqjob_r5f50b3ec3ef08f27_0162970fd936_1 ... (7s) Current status: RUNNING 

 Waiting on bqjob_r5f50b3ec3ef08f27_0162970fd936_1 

[jira] [Work logged] (BEAM-3355) Make Go SDK runtime harness hooks pluggable

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3355?focusedWorklogId=88193&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88193
 ]

ASF GitHub Bot logged work on BEAM-3355:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:46
Start Date: 05/Apr/18 18:46
Worklog Time Spent: 10m 
  Work Description: lostluck commented on a change in pull request #4311: 
[BEAM-3355] Diagnostic interfaces
URL: https://github.com/apache/beam/pull/4311#discussion_r179564266
 
 

 ##
 File path: sdks/go/pkg/beam/core/util/hooks/hooks.go
 ##
 @@ -0,0 +1,184 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements.  See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License.  You may obtain a copy of the License at
+//
+//http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Hooks allow runners to tailor execution of the worker to allow for 
customization
+// of features used by the harness.
+//
+// Examples of customization:
+//
+// gRPC integration
+// session recording
+// profile recording
+//
+// Registration methods for hooks must be called prior to calling beam.Init()
+// Request methods for hooks must be called as part of building the pipeline
+// request for the runner's Execute method.
+
+package hooks
+
+import (
+   "bytes"
+   "context"
+   "encoding/csv"
+   "encoding/json"
+   "fmt"
+   "os"
+   "strings"
+
+   "github.com/apache/beam/sdks/go/pkg/beam/core/runtime"
+   "github.com/apache/beam/sdks/go/pkg/beam/log"
+   fnpb "github.com/apache/beam/sdks/go/pkg/beam/model/fnexecution_v1"
+)
+
+var (
+   hookRegistry = make(map[string]HookFactory)
+   enabledHooks = make(map[string][]string)
+   activeHooks  = make(map[string]Hook)
+)
+
+// A Hook is a set of hooks to run at various stages of executing a
+// pipeline.
+type Hook struct {
+   // Init is called once at the startup of the worker.
+   Init InitHook
+   // Req is called each time the worker handles a FnAPI instruction 
request.
+   Req RequestHook
+   // Resp is called each time the worker generates a FnAPI instruction 
response.
+   Resp ResponseHook
+}
+
+// InitHook is a hook that is called when the harness
+// initializes.
+type InitHook func(context.Context) error
+
+// HookFactory is a function that produces a Hook from the supplied arguments.
+type HookFactory func([]string) Hook
+
+// RegisterHook registers a Hook for the
+// supplied identifier.
+func RegisterHook(name string, h HookFactory) {
+   hookRegistry[name] = h
+}
+
+// RunInitHooks runs the init hooks.
+func RunInitHooks(ctx context.Context) error {
+   // If an init hook fails to complete, the invariants of the
+   // system are compromised and we can't run a workflow.
+   // The hooks can run in any order. They should not be
+   // interdependent or interfere with each other.
+   for _, h := range activeHooks {
+   if h.Init != nil {
+   if err := h.Init(ctx); err != nil {
+   return err
+   }
+   }
+   }
+   return nil
+}
+
+// RequestHook is called when handling a Fn API instruction.
 
 Review comment:
   
   Fn API -> FnAPI and the same throughout.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 88193)

> Make Go SDK runtime harness hooks pluggable
> ---
>
> Key: BEAM-3355
> URL: https://issues.apache.org/jira/browse/BEAM-3355
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Bill Neubauer
>Priority: Minor
>  Time Spent: 4h 50m
>  Remaining Estimate: 0h
>
> We currently hardcode cpu profiling and session recording in the harness. We 
> should make it pluggable instead.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3355) Make Go SDK runtime harness hooks pluggable

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3355?focusedWorklogId=88191&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88191
 ]

ASF GitHub Bot logged work on BEAM-3355:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:46
Start Date: 05/Apr/18 18:46
Worklog Time Spent: 10m 
  Work Description: lostluck commented on a change in pull request #4311: 
[BEAM-3355] Diagnostic interfaces
URL: https://github.com/apache/beam/pull/4311#discussion_r179564264
 
 

 ##
 File path: sdks/go/pkg/beam/core/util/hooks/hooks.go
 ##
 @@ -0,0 +1,184 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements.  See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License.  You may obtain a copy of the License at
+//
+//http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Hooks allow runners to tailor execution of the worker to allow for 
customization
+// of features used by the harness.
+//
+// Examples of customization:
+//
+// gRPC integration
+// session recording
+// profile recording
+//
+// Registration methods for hooks must be called prior to calling beam.Init()
+// Request methods for hooks must be called as part of building the pipeline
+// request for the runner's Execute method.
+
+package hooks
+
+import (
+   "bytes"
+   "context"
+   "encoding/csv"
+   "encoding/json"
+   "fmt"
+   "os"
+   "strings"
+
+   "github.com/apache/beam/sdks/go/pkg/beam/core/runtime"
+   "github.com/apache/beam/sdks/go/pkg/beam/log"
+   fnpb "github.com/apache/beam/sdks/go/pkg/beam/model/fnexecution_v1"
+)
+
+var (
+   hookRegistry = make(map[string]HookFactory)
+   enabledHooks = make(map[string][]string)
+   activeHooks  = make(map[string]Hook)
+)
+
+// A Hook is a set of hooks to run at various stages of executing a
+// pipeline.
+type Hook struct {
+   // Init is called once at the startup of the worker.
+   Init InitHook
+   // Req is called each time the worker handles a FnAPI instruction 
request.
+   Req RequestHook
+   // Resp is called each time the worker generates a FnAPI instruction 
response.
+   Resp ResponseHook
+}
+
+// InitHook is a hook that is called when the harness
+// initializes.
+type InitHook func(context.Context) error
+
+// HookFactory is a function that produces a Hook from the supplied arguments.
+type HookFactory func([]string) Hook
+
+// RegisterHook registers a Hook for the
+// supplied identifier.
+func RegisterHook(name string, h HookFactory) {
+   hookRegistry[name] = h
+}
+
+// RunInitHooks runs the init hooks.
+func RunInitHooks(ctx context.Context) error {
+   // If an init hook fails to complete, the invariants of the
+   // system are compromised and we can't run a workflow.
+   // The hooks can run in any order. They should not be
+   // interdependent or interfere with each other.
+   for _, h := range activeHooks {
+   if h.Init != nil {
+   if err := h.Init(ctx); err != nil {
+   return err
+   }
+   }
+   }
+   return nil
+}
+
+// RequestHook is called when handling a Fn API instruction.
+type RequestHook func(context.Context, *fnpb.InstructionRequest) error
+
+// RunRequestHooks runs the hooks that handle a FnAPI request.
+func RunRequestHooks(ctx context.Context, req *fnpb.InstructionRequest) {
+   // The request hooks should not modify the request.
+   // TODO(wcn): pass the request by value to enforce? That's a perf hit.
+   // I'd rather trust users to do the right thing.
+   for n, h := range activeHooks {
+   if h.Req != nil {
+   if err := h.Req(ctx, req); err != nil {
+   log.Infof(ctx, "request hook %s failed: %v", n, 
err)
+   }
+   }
+   }
+}
+
+// ResponseHook is called when sending a Fn API instruction response.
+type ResponseHook func(context.Context, *fnpb.InstructionRequest, 
*fnpb.InstructionResponse) error
+
+// RunResponseHooks runs the hooks that handle a FnAPI response.
+func RunResponseHooks(ctx context.Context, req *fnpb.InstructionRequest, resp 
*fnpb.InstructionResponse) {
+   for n, h := range activeHooks {
+   if h.Resp != 

[jira] [Work logged] (BEAM-3355) Make Go SDK runtime harness hooks pluggable

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3355?focusedWorklogId=88192&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88192
 ]

ASF GitHub Bot logged work on BEAM-3355:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:46
Start Date: 05/Apr/18 18:46
Worklog Time Spent: 10m 
  Work Description: lostluck commented on a change in pull request #4311: 
[BEAM-3355] Diagnostic interfaces
URL: https://github.com/apache/beam/pull/4311#discussion_r179564267
 
 

 ##
 File path: sdks/go/pkg/beam/core/util/hooks/hooks.go
 ##
 @@ -0,0 +1,184 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements.  See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License.  You may obtain a copy of the License at
+//
+//http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Hooks allow runners to tailor execution of the worker to allow for 
customization
 
 Review comment:
   
   Consider removing the blank line between this comment block and the package 
declaration, and changing this first sentence to:
   
   Package hooks allows runners to tailor execution of the worker harness.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 88192)
Time Spent: 4h 50m  (was: 4h 40m)

> Make Go SDK runtime harness hooks pluggable
> ---
>
> Key: BEAM-3355
> URL: https://issues.apache.org/jira/browse/BEAM-3355
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Bill Neubauer
>Priority: Minor
>  Time Spent: 4h 50m
>  Remaining Estimate: 0h
>
> We currently hardcode cpu profiling and session recording in the harness. We 
> should make it pluggable instead.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3355) Make Go SDK runtime harness hooks pluggable

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3355?focusedWorklogId=88195&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88195
 ]

ASF GitHub Bot logged work on BEAM-3355:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:46
Start Date: 05/Apr/18 18:46
Worklog Time Spent: 10m 
  Work Description: lostluck commented on a change in pull request #4311: 
[BEAM-3355] Diagnostic interfaces
URL: https://github.com/apache/beam/pull/4311#discussion_r179564273
 
 

 ##
 File path: sdks/go/pkg/beam/util/grpcx/hook.go
 ##
 @@ -0,0 +1,88 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements.  See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License.  You may obtain a copy of the License at
+//
+//http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package grpcx
+
+import (
+   "context"
+   "fmt"
+   "time"
+
+   "github.com/apache/beam/sdks/go/pkg/beam/core/util/hooks"
+   "google.golang.org/grpc"
+)
+
+// Hook allows a runner to customize various aspects of gRPC
+// communication with the FnAPI harness. Each member of the struct
+// is optional; the default behavior will be used if a value is not
+// supplied.
+type Hook struct {
+   // Dialer allows the runner to customize the gRPC dialing behavior.
+   Dialer func(context.Context, string, time.Duration) (*grpc.ClientConn, error)
+   // TODO(wcn): expose other hooks here.
+}
+
+type HookFactory func([]string) Hook
 
 Review comment:
   
   Exported types should have a comment.
   
   // HookFactory configures a hook with the provided arguments.




Issue Time Tracking
---

Worklog Id: (was: 88195)
Time Spent: 5h 10m  (was: 5h)

> Make Go SDK runtime harness hooks pluggable
> ---
>
> Key: BEAM-3355
> URL: https://issues.apache.org/jira/browse/BEAM-3355
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Bill Neubauer
>Priority: Minor
>  Time Spent: 5h 10m
>  Remaining Estimate: 0h
>
> We currently hardcode cpu profiling and session recording in the harness. We 
> should make it pluggable instead.





[jira] [Work logged] (BEAM-3355) Make Go SDK runtime harness hooks pluggable

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3355?focusedWorklogId=88194&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88194
 ]

ASF GitHub Bot logged work on BEAM-3355:


Author: ASF GitHub Bot
Created on: 05/Apr/18 18:46
Start Date: 05/Apr/18 18:46
Worklog Time Spent: 10m 
  Work Description: lostluck commented on a change in pull request #4311: 
[BEAM-3355] Diagnostic interfaces
URL: https://github.com/apache/beam/pull/4311#discussion_r179564270
 
 

 ##
 File path: sdks/go/pkg/beam/core/runtime/harness/session.go
 ##
 @@ -213,3 +186,59 @@ func recordFooter() error {
},
})
 }
+
+// CaptureHook writes the messaging content consumed and
+// produced by the worker, allowing the data to be used as
+// an input for the session runner. Since workers can exist
+// in a variety of environments, this allows the runner
+// to tailor the behavior best for its particular needs.
+type CaptureHook io.WriteCloser
+
+// CaptureHookFactory produces a CaptureHook from the supplied
+// options.
+type CaptureHookFactory func([]string) CaptureHook
+
+var captureHookRegistry = make(map[string]CaptureHookFactory)
+var enabledCaptureHook string
+
+func init() {
+   hf := func(opts []string) hooks.Hook {
+   return hooks.Hook{
+   Init: func(_ context.Context) error {
 
 Review comment:
   
   As implemented, all capture hooks run at worker initialization time.
   
   Is there any further implementation needed?




Issue Time Tracking
---

Worklog Id: (was: 88194)
Time Spent: 5h  (was: 4h 50m)

> Make Go SDK runtime harness hooks pluggable
> ---
>
> Key: BEAM-3355
> URL: https://issues.apache.org/jira/browse/BEAM-3355
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Bill Neubauer
>Priority: Minor
>  Time Spent: 5h
>  Remaining Estimate: 0h
>
> We currently hardcode cpu profiling and session recording in the harness. We 
> should make it pluggable instead.





Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Dataflow #5287

2018-04-05 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-4015) GCS artifact proxy is breaking the build when credentials are not available

2018-04-05 Thread Henning Rohde (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4015?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16427479#comment-16427479
 ] 

Henning Rohde commented on BEAM-4015:
-

This is a compile error. I've seen it before when the Go libraries are too old. 
The Gradle (but not Maven) build locks the dependency versions, so the simplest 
path forward is probably to remove the Maven one. 

Is this your personal machine, btw? Alternatively, "go get -u all" should also 
fix it.

> GCS artifact proxy is breaking the build when credentials are not available
> ---
>
> Key: BEAM-4015
> URL: https://issues.apache.org/jira/browse/BEAM-4015
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Reporter: Ismaël Mejía
>Assignee: Henning Rohde
>Priority: Blocker
>
> When running the maven build in a machine without the valid auth credentials 
> the module breaks like this:
> {code:bash}
> {{[INFO] --- mvn-golang-wrapper:2.1.6:build (go-build) @ 
> beam-runners-gcp-gcsproxy ---}}
> {{[INFO] Prepared command line : bin/go build -buildmode=default -o 
> /home/ismael/upstream/beam/runners/gcp/gcsproxy/target/gcsproxy 
> github.com/apache/beam/cmd/gcsproxy}}
> {{[ERROR] }}
> {{[ERROR] -Exec.Err-}}
> {{[ERROR] # github.com/apache/beam/sdks/go/pkg/beam/util/gcsx}}
> {{[ERROR] github.com/apache/beam/sdks/go/pkg/beam/util/gcsx/gcs.go:46:37: 
> undefined: option.WithoutAuthentication}}
> {{[ERROR] }}
> {{}}
> {{[INFO] Apache Beam :: Runners :: Google Cloud Platform :: GCS artifact 
> proxy FAILURE [  1.038 s]}}
> {{}}
> {{[INFO] BUILD FAILURE}}
> {{}}
> {{[ERROR] Failed to execute goal 
> com.igormaznitsa:mvn-golang-wrapper:2.1.6:build (go-build) on project 
> beam-runners-gcp-gcsproxy: Can't find generated target file : 
> /home/ismael/upstream/beam/runners/gcp/gcsproxy/target/gcsproxy -> [Help 1]}}
> {{[ERROR] }}
> {code}





[jira] [Work logged] (BEAM-3848) SolrIO: Improve retrying mechanism in client writes

2018-04-05 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3848?focusedWorklogId=88203&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-88203
 ]

ASF GitHub Bot logged work on BEAM-3848:


Author: ASF GitHub Bot
Created on: 05/Apr/18 19:24
Start Date: 05/Apr/18 19:24
Worklog Time Spent: 10m 
  Work Description: timrobertson100 commented on issue #4905: [BEAM-3848] 
Enables ability to retry Solr writes on error (SolrIO)
URL: https://github.com/apache/beam/pull/4905#issuecomment-379049114
 
 
   @iemejia - that should be rebased, squashed and presented as one commit. 
Please grab me on Slack if you see otherwise and I'll address it. I'm afraid 
a rebase onto origin/master to include Romain's recent PRs did not fix the 
thread problem, so `testWriteRetry()` remains with the hack to await thread 
termination and the manual unregistering of `SolrZKClient` from thread-leak 
detection. Perhaps we/Romain/I could diagnose and tackle that later?




Issue Time Tracking
---

Worklog Id: (was: 88203)
Time Spent: 7h  (was: 6h 50m)

> SolrIO: Improve retrying mechanism in client writes
> ---
>
> Key: BEAM-3848
> URL: https://issues.apache.org/jira/browse/BEAM-3848
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-solr
>Affects Versions: 2.2.0, 2.3.0
>Reporter: Tim Robertson
>Assignee: Tim Robertson
>Priority: Minor
>  Time Spent: 7h
>  Remaining Estimate: 0h
>
> A busy SOLR server is prone to return RemoteSOLRException on writing which 
> currently fails a complete task (e.g. a partition of a spark RDD being 
> written to SOLR).
> A good addition would be the ability to provide a retrying mechanism for the 
> batch in flight, rather than failing fast, which will most likely trigger a 
> much larger retry of more writes.





[jira] [Created] (BEAM-4022) beam_PreCommit_Python_MavenInstall failing with cython error

2018-04-05 Thread Ankur Goenka (JIRA)
Ankur Goenka created BEAM-4022:
--

 Summary: beam_PreCommit_Python_MavenInstall failing with cython error
 Key: BEAM-4022
 URL: https://issues.apache.org/jira/browse/BEAM-4022
 Project: Beam
  Issue Type: Bug
  Components: testing
Reporter: Ankur Goenka
Assignee: Udi Meiri


Seems to only happen in this working directory:
{{/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild}}
but not this:
{{/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild@2}}


{{ERROR: invocation failed (errno 2), args: 
['/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/target/.tox/py27-cython2/bin/pip',
 'install', 'cython==0.26.1'], cwd: 
/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python}}
{{ Traceback (most recent call last):}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/bin/tox",
 line 11, in }}
{{ sys.exit(run_main())}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
 line 40, in run_main}}
{{ main(args)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
 line 46, in main}}
{{ retcode = Session(config).runcommand()}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
 line 415, in runcommand}}
{{ return self.subcommand_test()}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
 line 599, in subcommand_test}}
{{ if self.setupenv(venv):}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
 line 491, in setupenv}}
{{ status = venv.update(action=action)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
 line 171, in update}}
{{ self.hook.tox_testenv_install_deps(action=action, venv=self)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/__init__.py",
 line 617, in __call__}}
{{ return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/__init__.py",
 line 222, in _hookexec}}
{{ return self._inner_hookexec(hook, methods, kwargs)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/__init__.py",
 line 216, in }}
{{ firstresult=hook.spec_opts.get('firstresult'),}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/callers.py",
 line 201, in _multicall}}
{{ return outcome.get_result()}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/callers.py",
 line 77, in get_result}}
{{ _reraise(*ex) # noqa}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/callers.py",
 line 180, in _multicall}}
{{ res = hook_impl.function(*args)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
 line 452, in tox_testenv_install_deps}}
{{ venv._install(deps, action=action)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
 line 331, in _install}}
{{ action=action)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
 line 303, in run_install_command}}
{{ action=action, redirect=self.session.report.verbosity < 2)}}
{{ File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
 line 409, in _pcall}}
{{ redirect=redirect, ignore_ret=ignore_ret)}}
{{ File 

[jira] [Assigned] (BEAM-4022) beam_PreCommit_Python_MavenInstall failing with cython error

2018-04-05 Thread Ankur Goenka (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4022?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ankur Goenka reassigned BEAM-4022:
--

Assignee: Ankur Goenka  (was: Udi Meiri)

> beam_PreCommit_Python_MavenInstall failing with cython error
> 
>
> Key: BEAM-4022
> URL: https://issues.apache.org/jira/browse/BEAM-4022
> Project: Beam
>  Issue Type: Bug
>  Components: testing
>Reporter: Ankur Goenka
>Assignee: Ankur Goenka
>Priority: Major
>
> Seems to only happen in this working directory:
> {{/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild}}
> but not this:
> {{/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild@2}}
> {{ERROR: invocation failed (errno 2), args: 
> ['/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/target/.tox/py27-cython2/bin/pip',
>  'install', 'cython==0.26.1'], cwd: 
> /home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python}}
> {{ Traceback (most recent call last):}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/bin/tox",
>  line 11, in }}
> {{ sys.exit(run_main())}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
>  line 40, in run_main}}
> {{ main(args)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
>  line 46, in main}}
> {{ retcode = Session(config).runcommand()}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
>  line 415, in runcommand}}
> {{ return self.subcommand_test()}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
>  line 599, in subcommand_test}}
> {{ if self.setupenv(venv):}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/session.py",
>  line 491, in setupenv}}
> {{ status = venv.update(action=action)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
>  line 171, in update}}
> {{ self.hook.tox_testenv_install_deps(action=action, venv=self)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/__init__.py",
>  line 617, in __call__}}
> {{ return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/__init__.py",
>  line 222, in _hookexec}}
> {{ return self._inner_hookexec(hook, methods, kwargs)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/__init__.py",
>  line 216, in }}
> {{ firstresult=hook.spec_opts.get('firstresult'),}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/callers.py",
>  line 201, in _multicall}}
> {{ return outcome.get_result()}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/callers.py",
>  line 77, in get_result}}
> {{ _reraise(*ex) # noqa}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/pluggy/callers.py",
>  line 180, in _multicall}}
> {{ res = hook_impl.function(*args)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
>  line 452, in tox_testenv_install_deps}}
> {{ venv._install(deps, action=action)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
>  line 331, in _install}}
> {{ action=action)}}
> {{ File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_GradleBuild/src/sdks/python/build/gradleenv/local/lib/python2.7/site-packages/tox/venv.py",
>  line 303, in run_install_command}}
> {{ 

[jira] [Updated] (BEAM-4022) beam_PreCommit_Python_MavenInstall failing with cython error

2018-04-05 Thread Ankur Goenka (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4022?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ankur Goenka updated BEAM-4022:
---
Description: 
beam_PreCommit_Python_MavenInstall is failing (link to the failing job: 
https://builds.apache.org/view/A-D/view/ActiveMQ/job/beam_PreCommit_Python_MavenInstall/4425/console
 )

 
OK (skipped=44)
py27-gcp runtests: commands[5] | 
/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/run_tox_cleanup.sh
py27-cython2 create: 
/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/.tox/py27-cython2
py27-cython2 installdeps: cython==0.26.1
ERROR: invocation failed (errno 2), args: 
['/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/.tox/py27-cython2/bin/pip',
 'install', 'cython==0.26.1'], cwd: 
/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python
Traceback (most recent call last):
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/bin/tox",
 line 11, in 
sys.exit(run_main())
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/session.py",
 line 40, in run_main
main(args)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/session.py",
 line 46, in main
retcode = Session(config).runcommand()
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/session.py",
 line 415, in runcommand
return self.subcommand_test()
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/session.py",
 line 599, in subcommand_test
if self.setupenv(venv):
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/session.py",
 line 491, in setupenv
status = venv.update(action=action)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/venv.py",
 line 171, in update
self.hook.tox_testenv_install_deps(action=action, venv=self)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/pluggy/__init__.py",
 line 617, in __call__
return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/pluggy/__init__.py",
 line 222, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/pluggy/__init__.py",
 line 216, in 
firstresult=hook.spec_opts.get('firstresult'),
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/pluggy/callers.py",
 line 201, in _multicall
return outcome.get_result()
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/pluggy/callers.py",
 line 77, in get_result
_reraise(*ex)  # noqa
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/pluggy/callers.py",
 line 180, in _multicall
res = hook_impl.function(*args)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/venv.py",
 line 452, in tox_testenv_install_deps
venv._install(deps, action=action)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/venv.py",
 line 331, in _install
action=action)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/venv.py",
 line 303, in run_install_command
action=action, redirect=self.session.report.verbosity < 2)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/venv.py",
 line 409, in _pcall
redirect=redirect, ignore_ret=ignore_ret)
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_MavenInstall@2/src/sdks/python/target/python/lib/python2.7/site-packages/tox/session.py",
 
