youngoli commented on a change in pull request #13678:
URL: https://github.com/apache/beam/pull/13678#discussion_r553011334
##########
File path: sdks/go/test/run_validatesrunner_tests.sh
##########
@@ -16,39 +16,68 @@
# limitations under the License.
# This script executes ValidatesRunner tests including launching any additional
-# services needed, such as job services or expansion services. The following
-# runners are supported, and selected via a flag:
+# services needed, such as job services or expansion services. This script
+# should be executed from the root of the Beam repository.
#
+# The following runners are supported, and selected via a flag:
# --runner {portable|direct|flink} (default: portable)
# Select which runner to execute tests on. This flag also determines which
# services to start up and which tests may be skipped.
# direct - Go SDK Direct Runner
# portable - (default) Python Portable Runner (aka. Reference Runner or FnAPI Runner)
# flink - Java Flink Runner (local mode)
# spark - Java Spark Runner (local mode)
+# dataflow - Dataflow Runner
#
-# --flink_job_server_jar -> Filepath to jar, used if runner is Flink.
-# --spark_job_server_jar -> Filepath to jar, used if runner is Spark.
-# --endpoint -> Replaces jar filepath with existing job server endpoint.
+# General flags:
+# --timeout -> Timeout for the go test command, on a per-package level.
+# --endpoint -> An endpoint for an existing job server outside the script.
+# If present, job server jar flags are ignored.
+# --expansion_service_jar -> Filepath to jar for expansion service, for
+# runners that support cross-language.
+# --expansion_addr -> An endpoint for an existing expansion service outside
+# the script. If present, expansion_service_jar is ignored.
#
-# --expansion_service_jar -> Filepath to jar for expansion service.
-# --expansion_addr -> Replaces jar filepath with existing expansion service endpoint.
-#
-# Execute from the root of the repository. This script requires that necessary
-# services can be built from the repository.
+# Runner-specific flags:
+# Flink
+# --flink_job_server_jar -> Filepath to jar, used if runner is Flink.
+# Spark
+# --spark_job_server_jar -> Filepath to jar, used if runner is Spark.
+# Dataflow
+# --dataflow_project -> GCP project to run Dataflow jobs on.
+# --project -> Same project as dataflow_project, but in URL format, e.g.
+# "us.gcr.io/&lt;project&gt;" would use the project "&lt;project&gt;".
+# --region -> GCP region to run Dataflow jobs on.
+# --gcs_location -> GCS URL for storing temporary files for Dataflow jobs.
+# --dataflow_worker_jar -> The Dataflow worker jar to use when running jobs.
+# If not specified, the script attempts to retrieve a previously built
+# jar from the appropriate gradle module, which may not succeed.
set -e
set -v
+# Default runner.
RUNNER=portable
+# Default timeout. This timeout is applied per-package, as tests in different
+# packages are executed in parallel.
+TIMEOUT=1h
+
+# Where to store integration test outputs.
+GCS_LOCATION=gs://clouddfe-danoliveira/temp-storage-go-end-to-end-tests
Review comment:
Whoops, forgot to change that. These should match the old framework's
defaults, changing it.
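For context, flag handling of the kind documented in the hunk above is typically done with a `case`/`shift` loop over the positional parameters. The sketch below is an assumption about the structure, not the script's actual code; only the `portable` and `1h` defaults and the flag names are taken from the header comments:

```shell
# Minimal sketch of the documented flag parsing (structure assumed).
# Defaults taken from the header: portable runner, 1h per-package timeout.
RUNNER=portable
TIMEOUT=1h

while [[ $# -gt 0 ]]; do
  case "$1" in
    --runner)
      RUNNER="$2"; shift 2 ;;
    --timeout)
      TIMEOUT="$2"; shift 2 ;;
    --endpoint)
      ENDPOINT="$2"; shift 2 ;;
    *)
      # In this sketch, unrecognized flags are simply skipped.
      shift ;;
  esac
done

echo "runner=$RUNNER timeout=$TIMEOUT"
```

Invoked with no arguments this leaves the documented defaults in place; invoked as `--runner flink --timeout 30m` it would set `RUNNER=flink` and `TIMEOUT=30m`.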
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]