[
https://issues.apache.org/jira/browse/BEAM-10475?focusedWorklogId=521958&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-521958
]
ASF GitHub Bot logged work on BEAM-10475:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 09/Dec/20 00:37
Start Date: 09/Dec/20 00:37
Worklog Time Spent: 10m
Work Description: nehsyc commented on a change in pull request #13493:
URL: https://github.com/apache/beam/pull/13493#discussion_r538914797
##########
File path: sdks/python/apache_beam/transforms/util.py
##########
@@ -72,6 +72,9 @@
from apache_beam.transforms.window import NonMergingWindowFn
from apache_beam.transforms.window import TimestampCombiner
from apache_beam.transforms.window import TimestampedValue
+from apache_beam.typehints.sharded_key_type import ShardedKeyType
+from apache_beam.typehints.typehints import IterableTypeConstraint
+from apache_beam.typehints.typehints import TupleConstraint
Review comment:
Updated the code to use `Tuple` and `Iterable` directly. I couldn't add
`ShardedKeyType` to `apache_beam/typehints/__init__.py` due to a circular
import in coders:
```
ImportError while loading conftest
'/usr/local/google/home/sychen/Documents/GitHub/working_dir/beam/sdks/python/conftest.py'.
conftest.py:23: in <module>
from apache_beam.options import pipeline_options
apache_beam/__init__.py:95: in <module>
from apache_beam import coders
apache_beam/coders/__init__.py:19: in <module>
from apache_beam.coders.coders import *
apache_beam/coders/coders.py:52: in <module>
from apache_beam.typehints import typehints
apache_beam/typehints/__init__.py:25: in <module>
from apache_beam.typehints.sharded_key_type import ShardedKeyType
apache_beam/typehints/sharded_key_type.py:22: in <module>
from apache_beam.coders import typecoders
apache_beam/coders/typecoders.py:81: in <module>
from apache_beam.coders.coders import CoderElementType
E ImportError: cannot import name 'CoderElementType' from partially
initialized module 'apache_beam.coders.coders' (most likely due to a circular
import)
(/usr/local/google/home/sychen/Documents/GitHub/working_dir/beam/sdks/python/apache_beam/coders/coders.py)
```
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 521958)
Time Spent: 24h 10m (was: 24h)
> GroupIntoBatches with Runner-determined Sharding
> ------------------------------------------------
>
> Key: BEAM-10475
> URL: https://issues.apache.org/jira/browse/BEAM-10475
> Project: Beam
> Issue Type: Improvement
> Components: runner-dataflow
> Reporter: Siyuan Chen
> Assignee: Siyuan Chen
> Priority: P2
> Labels: GCP, performance
> Time Spent: 24h 10m
> Remaining Estimate: 0h
>
> [https://s.apache.org/sharded-group-into-batches]
> Improve the existing Beam transform, GroupIntoBatches, so that runners can
> choose different sharding strategies depending on how the data needs to be
> grouped. The goal is to co-locate the elements to be processed, reducing the
> per-element overhead that would otherwise be incurred, without losing the
> ability to scale the parallelism. The essential idea is to build a stateful
> DoFn with shardable state.
>
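The idea in the description can be sketched in plain Python (a toy simulation of the semantics, not Beam code; `num_shards` and the round-robin shard assignment are illustrative assumptions, since in the actual transform the sharding is runner-determined): each element's key is expanded to a (key, shard) pair, so batches for a hot key can be assembled on several workers in parallel instead of on one.

```python
import itertools
from collections import defaultdict

def group_into_sharded_batches(elements, batch_size, num_shards):
    """Toy model of sharded GroupIntoBatches: each (key, value) element is
    re-keyed to (key, shard) so a hot key's batches can form in parallel,
    then values are emitted in batches of at most batch_size per shard."""
    shard_counters = defaultdict(itertools.count)  # round-robin shard per key
    # Per-(key, shard) buffer, standing in for the state of a stateful DoFn.
    buffers = defaultdict(list)
    output = []
    for key, value in elements:
        shard = next(shard_counters[key]) % num_shards
        buf = buffers[(key, shard)]
        buf.append(value)
        if len(buf) == batch_size:  # batch is full: emit and clear the state
            output.append(((key, shard), list(buf)))
            buf.clear()
    for sharded_key, buf in buffers.items():  # flush leftovers (like a timer)
        if buf:
            output.append((sharded_key, list(buf)))
    return output

# A single hot key "k" is spread across 2 shards, batch size 2.
batches = group_into_sharded_batches(
    [("k", i) for i in range(6)], batch_size=2, num_shards=2)
```

In the real transform this surfaced in the Python SDK as a sharded-key variant of `GroupIntoBatches` (the PR above); the point of the sketch is only the re-keying: the runner can fan a hot key out over shards, while each (key, shard) pair still sees deterministic, stateful batching.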
--
This message was sent by Atlassian Jira
(v8.3.4#803005)