[
https://issues.apache.org/jira/browse/BEAM-7872?focusedWorklogId=298720&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-298720
]
ASF GitHub Bot logged work on BEAM-7872:
----------------------------------------
Author: ASF GitHub Bot
Created on: 21/Aug/19 14:34
Start Date: 21/Aug/19 14:34
Worklog Time Spent: 10m
Work Description: kamilwu commented on pull request #9213: [BEAM-7872]
Simpler Flink cluster set up in load tests
URL: https://github.com/apache/beam/pull/9213#discussion_r316220221
##########
File path: .test-infra/jenkins/Flink.groovy
##########
@@ -19,10 +19,74 @@
import CommonJobProperties as common
import CommonTestProperties.SDK
-class Infrastructure {
+class Flink {
+  private static final String repositoryRoot = 'gcr.io/apache-beam-testing/beam_portability'
+  private static final String dockerTag = 'latest'
+  private static final String jobServerImageTag = "${repositoryRoot}/flink-job-server:${dockerTag}"
+  private static final String flinkVersion = '1.7'
+  private static final String flinkDownloadUrl = 'https://archive.apache.org/dist/flink/flink-1.7.0/flink-1.7.0-bin-hadoop28-scala_2.11.tgz'
+
+  private static def job
+  private static String jobName
+
+  /**
+   * Returns SDK Harness image tag to be used as an environment_config in the job definition.
+   *
+   * @param sdk - SDK
+   */
+  static String getSDKHarnessImageTag(SDK sdk) {
+    switch (sdk) {
+      case CommonTestProperties.SDK.PYTHON:
+        return "${repositoryRoot}/python:${dockerTag}"
+      case CommonTestProperties.SDK.JAVA:
+        return "${repositoryRoot}/java:${dockerTag}"
+      default:
+        String sdkName = sdk.name().toLowerCase()
+        throw new IllegalArgumentException("${sdkName} SDK is not supported")
+    }
+  }
+
+  /**
+   * Creates Flink cluster and specifies cleanup steps.
+   *
+   * @param job - jenkins job
+   * @param jobName - string to be used as a base for cluster name
+   * @param sdk - SDK
+   * @param workerCount - the initial number of worker nodes excluding one extra node for Flink's Job Manager
+   * @param slotsPerTaskmanager - the number of slots per Flink task manager
+   */
+  static Flink setUp(job, String jobName, SDK sdk, Integer workerCount, Integer slotsPerTaskmanager = 1)
Review comment:
> Maybe it's even better to create a separate class for supplying docker images, e.g. Portability.groovy?
That was an interesting suggestion, thank you. I decided to extract those
responsibilities and create two new classes: `DockerPublisher` and
`SDKHarnessPublisher`. `DockerPublisher` has a public `publish` method which
takes a gradle task and the name and tag of an image as arguments.
`SDKHarnessPublisher` has a similar method which takes only the SDK.
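To make that concrete, here is a rough sketch of how the two classes could be shaped, going only by the description above. The `publish` parameter list (gradle task, image name, image tag) comes from this comment; the job-context parameter, the `./gradlew` invocation, the `-P` property names and the container task path are guesses:

```groovy
// Hypothetical sketch only -- not the code from the PR.
class DockerPublisher {
  private static final String repositoryRoot = 'gcr.io/apache-beam-testing/beam_portability'

  // Registers a build step that builds and pushes an image through gradle,
  // then returns the full image reference. The -P property names are assumptions.
  static String publish(def job, String gradleTask, String imageName, String imageTag) {
    job.steps {
      shell("./gradlew ${gradleTask} -Pdocker-repository-root=${repositoryRoot} -Pdocker-tag=${imageTag}")
    }
    return "${repositoryRoot}/${imageName}:${imageTag}"
  }
}

class SDKHarnessPublisher {
  // Publishes the SDK harness image for a single SDK; the gradle task path is assumed.
  static String publish(def job, CommonTestProperties.SDK sdk) {
    String sdkName = sdk.name().toLowerCase()
    return DockerPublisher.publish(job, ":sdks:${sdkName}:container:docker", sdkName, 'latest')
  }
}
```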
> sometimes users of this interface may need a cluster without sdk harness and the job server
Fair point. I reorganised Flink's interface; it now has the following
methods (a usage sketch follows this list):
- a public constructor for initializing the object
- `prepareJobServer`
- `setUp` - specifies start and teardown steps
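Purely as an illustration of that reorganised interface: the three method names come from the list above, while the constructor arguments are guessed from the original `setUp` signature in the diff:

```groovy
// Hypothetical usage: method names from the list above, all arguments assumed.
def flink = new Flink(job, 'beam-loadtests-python-gbk-flink', CommonTestProperties.SDK.PYTHON, 5)
flink.prepareJobServer()   // only needed by tests that use the job server
flink.setUp()              // registers the cluster start and teardown steps
```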
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 298720)
Time Spent: 6h 20m (was: 6h 10m)
> Simpler Flink cluster set up in load tests
> ------------------------------------------
>
> Key: BEAM-7872
> URL: https://issues.apache.org/jira/browse/BEAM-7872
> Project: Beam
> Issue Type: Sub-task
> Components: testing
> Reporter: Kamil Wasilewski
> Assignee: Kamil Wasilewski
> Priority: Major
> Time Spent: 6h 20m
> Remaining Estimate: 0h
>
> Creating a new load test running on the Flink runner could be easier by providing
> a single `setUp` function which would encapsulate the process of creating a
> Flink cluster and registering teardown steps.
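For reference, a minimal sketch of the intended call site, using the `setUp` signature shown in the review diff above; the job name and worker count are invented:

```groovy
// Illustrative call only: the signature comes from the diff above,
// the job name and worker count are made up.
Flink.setUp(job, 'beam-loadtests-python-gbk-flink', CommonTestProperties.SDK.PYTHON, 5)
```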
--
This message was sent by Atlassian Jira
(v8.3.2#803003)