[ 
https://issues.apache.org/jira/browse/BEAM-7664?focusedWorklogId=282608&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-282608
 ]

ASF GitHub Bot logged work on BEAM-7664:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 25/Jul/19 11:21
            Start Date: 25/Jul/19 11:21
    Worklog Time Spent: 10m 
      Work Description: lgajowy commented on pull request #9106: [BEAM-7664] add more Python GBK Flink test cases
URL: https://github.com/apache/beam/pull/9106#discussion_r307243626
 
 

 ##########
 File path: .test-infra/jenkins/job_LoadTests_GBK_Flink_Python.groovy
 ##########
 @@ -49,26 +48,162 @@ def testConfiguration = { datasetName ->
                         input_options       : '\'{"num_records": 200000000,"key_size": 1,"value_size":9}\'',
                         iterations          : 1,
                         fanout              : 1,
-                        parallelism         : parallelism,
+                        parallelism         : 5,
                         job_endpoint: 'localhost:8099',
                         environment_config : pythonHarnessImageTag,
                         environment_type: 'DOCKER'
 
                 ]
-        ]}
+        ],
+        [
+                title        : 'Load test: 2GB of 100B records',
+                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner       : CommonTestProperties.Runner.PORTABLE,
+                sdk          : CommonTestProperties.SDK.PYTHON,
+                jobProperties: [
+                        job_name            : "load_tests_Python_Flink_Batch_GBK_2_${now}",
+                        publish_to_big_query: false,
+                        project             : 'apache-beam-testing',
+                        metrics_dataset     : datasetName,
+                        metrics_table       : "python_flink_batch_GBK_2",
+                        input_options       : '\'{"num_records": 20000000,"key_size": 10,"value_size":90}\'',
+                        iterations          : 1,
+                        fanout              : 1,
+                        parallelism         : 5,
+                        job_endpoint: 'localhost:8099',
+                        environment_config : pythonHarnessImageTag,
+                        environment_type: 'DOCKER'
+
+                ]
+        ],
+        [
+                title        : 'Load test: 2GB of 100kB records',
+                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner       : CommonTestProperties.Runner.PORTABLE,
+                sdk          : CommonTestProperties.SDK.PYTHON,
+                jobProperties: [
+                        job_name            : "load_tests_Python_Flink_Batch_GBK_3_${now}",
+                        publish_to_big_query: false,
+                        project             : 'apache-beam-testing',
+                        metrics_dataset     : datasetName,
+                        metrics_table       : "python_flink_batch_GBK_3",
+                        input_options       : '\'{"num_records": 2000,"key_size": 100000,"value_size":900000}\'',
+                        iterations          : 1,
+                        fanout              : 1,
+                        parallelism         : 5,
+                        job_endpoint: 'localhost:8099',
+                        environment_config : pythonHarnessImageTag,
+                        environment_type: 'DOCKER'
+
+                ]
+        ],
+        [
+                title        : 'Load test: reiterate 4 times 10kB values',
+                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner       : CommonTestProperties.Runner.PORTABLE,
+                sdk          : CommonTestProperties.SDK.PYTHON,
+                jobProperties: [
+                        job_name            : "load_tests_Python_Flink_Batch_GBK_6_${now}",
+                        publish_to_big_query: false,
+                        project             : 'apache-beam-testing',
+                        metrics_dataset     : datasetName,
+                        metrics_table       : "python_flink_batch_GBK_5",
+                        input_options       : '\'{"num_records": 20000000,"key_size": 10,"value_size":90, "num_hot_keys": 200, "hot_key_fraction": 1}\'',
+                        iterations          : 4,
+                        fanout              : 1,
+                        parallelism         : 5,
+                        job_endpoint: 'localhost:8099',
+                        environment_config : pythonHarnessImageTag,
+                        environment_type: 'DOCKER'
+
+                ]
+        ],
+        [
+                title        : 'Load test: reiterate 4 times 2MB values',
+                itClass      : 'apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey',
+                runner       : CommonTestProperties.Runner.PORTABLE,
+                sdk          : CommonTestProperties.SDK.PYTHON,
+                jobProperties: [
+                        job_name            : "load_tests_Python_Flink_Batch_GBK_7_${now}",
+                        publish_to_big_query: false,
+                        project             : 'apache-beam-testing',
+                        metrics_dataset     : datasetName,
+                        metrics_table       : "python_flink_batch_GBK_5",
+                        input_options       : '\'{"num_records": 20000000,"key_size": 10,"value_size":90, "num_hot_keys": 10, "hot_key_fraction": 1}\'',
+                        iterations          : 4,
+                        fanout              : 1,
+                        parallelism         : 5,
+                        job_endpoint: 'localhost:8099',
+                        environment_config : pythonHarnessImageTag,
+                        environment_type: 'DOCKER'
+
+                ]
+        ]
+    ]}
+
+    def testConfigurationWithSixteenWorkers = { datasetName -> [
 
 Review comment:
   I think we should try to keep the configurations together in one list if possible. Then do something like (pseudocode, just to sketch the idea):
   
   ```
    setup5WorkerFlink()
    def testsToRun = testConfigurations.findAll { it -> it.jobProperties.parallelism == 5 }
    run(testsToRun)
   
    scaleCluster(16 /* workers */)
    testsToRun = testConfigurations.findAll { it -> it.jobProperties.parallelism == 16 }
    run(testsToRun)
   ```
   
   In case we have some more sophisticated criteria, we have all the data in one place and can construct the query however we want. :)
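   For illustration only (an editor's sketch, not part of the original review comment): the filtering idea above as self-contained Groovy. The two-entry configuration list is abbreviated and `scaleFlinkCluster` is a hypothetical placeholder, not a helper that exists in `.test-infra`:
   
   ```groovy
   // Sketch: keep all test configurations in one list, then filter by target parallelism.
   def testConfigurations = [
           [title: 'Load test: 2GB of 10B records', jobProperties: [parallelism: 5]],
           [title: 'Load test: 2GB of 100B records, 16 workers', jobProperties: [parallelism: 16]],
   ]
   
   // Hypothetical helper; a real job would (re)provision the Flink cluster here.
   def scaleFlinkCluster = { int workers -> println "scaling cluster to ${workers} workers" }
   
   [5, 16].each { workers ->
       scaleFlinkCluster(workers)
       // findAll with == (comparison, not assignment) selects the matching configs.
       def testsToRun = testConfigurations.findAll { it.jobProperties.parallelism == workers }
       testsToRun.each { println "running: ${it.title}" }
   }
   ```
   
   Any richer selection criteria (SDK, runner, record size) would just be additional conditions inside the `findAll` closure.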
   
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

    Worklog Id:     (was: 282608)
    Time Spent: 8h 20m  (was: 8h 10m)

> Add the rest of GBK tests in Jenkins job [Flink]
> ------------------------------------------------
>
>                 Key: BEAM-7664
>                 URL: https://issues.apache.org/jira/browse/BEAM-7664
>             Project: Beam
>          Issue Type: Sub-task
>          Components: testing
>            Reporter: Kamil Wasilewski
>            Assignee: Kasia Kucharczyk
>            Priority: Major
>          Time Spent: 8h 20m
>  Remaining Estimate: 0h
>
> In the file .test-infra/jenkins/job_LoadTests_GBK_Flink_Python.groovy



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
