je-ik commented on a change in pull request #15550:
URL: https://github.com/apache/beam/pull/15550#discussion_r714561520



##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/translations.py
##########
@@ -357,12 +357,29 @@ def wrapper(self, *args):
 
 class TransformContext(object):
 
-  _KNOWN_CODER_URNS = set(
+  _COMMON_CODER_URNS = set(
       value.urn for (key, value) in common_urns.coders.__dict__.items()
       if not key.startswith('_')
       # Length prefix Rows rather than re-coding them.
   ) - set([common_urns.coders.ROW.urn])
 
+  _REQUIRED_CODER_URNS = set([
+      common_urns.coders.WINDOWED_VALUE.urn,
+      # For impulse.
+      common_urns.coders.BYTES.urn,
+      common_urns.coders.GLOBAL_WINDOW.urn,
+      # For GBK.
+      common_urns.coders.KV.urn,
+      common_urns.coders.ITERABLE.urn,
+      # For SDF.
+      common_urns.coders.DOUBLE.urn,

Review comment:
      This feels a little weird. Should we fold the requirements of specific 
transforms (even though they are primitives) into the generic requirements? I'd 
say the required coders should be the set without which *no* pipeline can ever 
run successfully. A pipeline using SDF might need DoubleCoder, but in that case 
both the pipeline (SDK) and the runner will have it among their supported 
components, right?
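
For illustration only, a minimal sketch of the split suggested above, assuming the module's existing `common_urns` import; the per-transform set names (`_GBK_CODER_URNS`, `_SDF_CODER_URNS`) are hypothetical and not part of this PR:

```python
# Sketch only: keep _REQUIRED_CODER_URNS limited to coders no pipeline can
# run without, and track transform-specific coders separately. The names
# _GBK_CODER_URNS and _SDF_CODER_URNS are hypothetical, not part of the PR.
from apache_beam.portability import common_urns

# Coders required for any pipeline at all.
_REQUIRED_CODER_URNS = set([
    common_urns.coders.WINDOWED_VALUE.urn,
    # For impulse.
    common_urns.coders.BYTES.urn,
    common_urns.coders.GLOBAL_WINDOW.urn,
])

# Transform-specific coders: only needed when the corresponding primitive
# is actually used, in which case both the SDK and the runner would
# already advertise them among their supported components.
_GBK_CODER_URNS = set([
    common_urns.coders.KV.urn,
    common_urns.coders.ITERABLE.urn,
])
_SDF_CODER_URNS = set([
    common_urns.coders.DOUBLE.urn,
])
```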



