[
https://issues.apache.org/jira/browse/BEAM-7368?focusedWorklogId=256935&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-256935
]
ASF GitHub Bot logged work on BEAM-7368:
----------------------------------------
Author: ASF GitHub Bot
Created on: 10/Jun/19 16:23
Start Date: 10/Jun/19 16:23
Worklog Time Spent: 10m
Work Description: lgajowy commented on pull request #8636: [BEAM-7368]
Flink + Python + gbk load test
URL: https://github.com/apache/beam/pull/8636#discussion_r292083831
##########
File path: .test-infra/dataproc/init-actions/flink.sh
##########
@@ -121,7 +123,17 @@ function configure_flink() {
grep 'spark\.executor\.cores' /etc/spark/conf/spark-defaults.conf \
| tail -n1 \
| cut -d'=' -f2)
- local flink_taskmanager_slots="$(($spark_executor_cores * 2))"
Review comment:
I "paraphrased" the default behavior of Dataproc while it was generating the
spark-default.conf file. LMK if it's not too cryptic.
Another approach we could have here is to simply set 1 slot by default and
provide slot number through metadata otherwise. Not cryptic and explicit.
Which approach feels better to you?
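A minimal sketch of that second option, assuming a hypothetical metadata
attribute name "flink-taskmanager-slots" (not taken from this PR), could look
roughly like this:

  # Default to 1 task slot; allow an override via a GCE instance metadata attribute.
  function get_flink_taskmanager_slots() {
    local metadata_slots
    # -f makes curl exit non-zero and print nothing when the attribute is not set,
    # so the ":-1" fallback below applies.
    metadata_slots=$(curl -f -s -H "Metadata-Flavor: Google" \
      "http://metadata.google.internal/computeMetadata/v1/instance/attributes/flink-taskmanager-slots")
    echo "${metadata_slots:-1}"
  }

  # Inside configure_flink() this would replace the cores-based computation:
  # local flink_taskmanager_slots="$(get_flink_taskmanager_slots)"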
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 256935)
Time Spent: 10h 20m (was: 10h 10m)
> Run Python GBK load tests on portable Flink runner
> --------------------------------------------------
>
> Key: BEAM-7368
> URL: https://issues.apache.org/jira/browse/BEAM-7368
> Project: Beam
> Issue Type: Sub-task
> Components: testing
> Reporter: Lukasz Gajowy
> Assignee: Lukasz Gajowy
> Priority: Major
> Time Spent: 10h 20m
> Remaining Estimate: 0h
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)