[ https://issues.apache.org/jira/browse/BEAM-9085?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17029201#comment-17029201 ]

Valentyn Tymofieiev commented on BEAM-9085:
-------------------------------------------

My repro was running
{noformat}
python setup.py nosetests --nocapture --test-pipeline-options=" --iterations=10 
--number_of_counters=1 --number_of_counter_operations=1 
--project=big-query-project --publish_to_big_query=false 
--metrics_dataset=python_load_tests --metrics_table=pardo --input_options='{ 
\"num_records\": 100000, \"key_size\": 10, \"value_size\":90, 
\"bundle_size_distribution_type\": \"const\", 
\"bundle_size_distribution_param\": 1, \"force_initial_num_bundles\": 0 }'" 
--tests apache_beam.testing.load_tests.pardo_test 2>/dev/null | sort -n -k4
{noformat}
with these changes applied:
https://github.com/apache/beam/commit/5e4bc432ee89f0d517b353c638fbc4a4f0eadb95
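
For context, here is a minimal sketch (my own simplification, not the actual
pardo_test code; CounterDoFn and the 'pardo_sketch' namespace are made up for
illustration) of the kind of DoFn this load test exercises: a ParDo that
increments a configurable number of Beam counters a configurable number of
times per element, mirroring the --number_of_counters and
--number_of_counter_operations options above.
{noformat}
# Minimal sketch, not the actual pardo_test code: a DoFn that bumps a
# configurable number of Beam counters per element.
import apache_beam as beam
from apache_beam.metrics import Metrics
from apache_beam.options.pipeline_options import PipelineOptions


class CounterDoFn(beam.DoFn):
  """Increments number_of_counters counters, number_of_counter_operations
  times each, for every element it processes."""

  def __init__(self, number_of_counters=1, number_of_counter_operations=1):
    # Counters are declared up front; 'pardo_sketch' is an arbitrary namespace.
    self._counters = [
        Metrics.counter('pardo_sketch', 'counter_%d' % i)
        for i in range(number_of_counters)
    ]
    self._number_of_counter_operations = number_of_counter_operations

  def process(self, element):
    for counter in self._counters:
      for _ in range(self._number_of_counter_operations):
        counter.inc()
    yield element


def run():
  # Small synthetic input standing in for the SyntheticSource-driven input
  # that the real load test configures via --input_options.
  with beam.Pipeline(options=PipelineOptions()) as p:
    _ = (p
         | 'Create' >> beam.Create([(b'key', b'value')] * 1000)
         | 'CountedParDo' >> beam.ParDo(CounterDoFn(1, 1)))


if __name__ == '__main__':
  run()
{noformat}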

> Investigate performance difference between Python 2/3 on Dataflow
> -----------------------------------------------------------------
>
>                 Key: BEAM-9085
>                 URL: https://issues.apache.org/jira/browse/BEAM-9085
>             Project: Beam
>          Issue Type: Bug
>          Components: sdk-py-core
>            Reporter: Kamil Wasilewski
>            Assignee: Valentyn Tymofieiev
>            Priority: Major
>
> Tests show that the performance of core Beam operations in Python 3.x on 
> Dataflow can be a few times slower than in Python 2.7. We should investigate 
> the cause of the problem.
> Currently, we have one ParDo test that is run in both Py3 and Py2 [1]. A 
> dashboard with runtime results can be found here [2].
> [1] sdks/python/apache_beam/testing/load_tests/pardo_test.py
> [2] https://apache-beam-testing.appspot.com/explore?dashboard=5678187241537536


