[ https://issues.apache.org/jira/browse/BEAM-10708?focusedWorklogId=552078&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-552078 ]

ASF GitHub Bot logged work on BEAM-10708:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 13/Feb/21 00:44
            Start Date: 13/Feb/21 00:44
    Worklog Time Spent: 10m 
      Work Description: TheNeuralBit commented on a change in pull request 
#13944:
URL: https://github.com/apache/beam/pull/13944#discussion_r575590762



##########
File path: 
sdks/python/apache_beam/runners/portability/fn_api_runner/worker_handlers.py
##########
@@ -737,11 +738,15 @@ def __init__(self,
 
   def host_from_worker(self):
     # type: () -> str
-    if sys.platform == "darwin":
+    if sys.platform == 'darwin':
       # See https://docs.docker.com/docker-for-mac/networking/
       return 'host.docker.internal'
-    else:
-      return super(DockerSdkWorkerHandler, self).host_from_worker()
+    if sys.platform != 'win32' and is_in_notebook():

Review comment:
       I'm not sure this is always necessary when running inside a notebook. I 
was able to run SqlTransform in a notebook for our demo last fall: 
https://gist.github.com/TheNeuralBit/9c79d71cbc90a962e795b80ca54fa3c8#file-simpler-python-pipelines-demo-ipynb
   
   It seems like this is necessary in your case because the IPython kernel is 
running inside a Docker container, but that's not always true. In my case I was 
running the kernel directly on my desktop. Is there something else we can test 
here?

##########
File path: 
sdks/python/apache_beam/runners/portability/fn_api_runner/worker_handlers.py
##########
@@ -737,11 +738,15 @@ def __init__(self,
 
   def host_from_worker(self):
     # type: () -> str
-    if sys.platform == "darwin":
+    if sys.platform == 'darwin':
       # See https://docs.docker.com/docker-for-mac/networking/
       return 'host.docker.internal'
-    else:
-      return super(DockerSdkWorkerHandler, self).host_from_worker()
+    if sys.platform != 'win32' and is_in_notebook():

Review comment:
       Why the special case for Windows?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

    Worklog Id:     (was: 552078)
    Time Spent: 1h 20m  (was: 1h 10m)

> InteractiveRunner cannot execute pipeline with cross-language transform
> -----------------------------------------------------------------------
>
>                 Key: BEAM-10708
>                 URL: https://issues.apache.org/jira/browse/BEAM-10708
>             Project: Beam
>          Issue Type: Bug
>          Components: cross-language
>            Reporter: Brian Hulette
>            Assignee: Ning Kang
>            Priority: P2
>          Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> The InteractiveRunner crashes when given a pipeline that includes a 
> cross-language transform.
> Here's the example I tried to run in a jupyter notebook:
> {code:python}
> p = beam.Pipeline(InteractiveRunner())
> pc = (p | SqlTransform("""SELECT
>             CAST(1 AS INT) AS `id`,
>             CAST('foo' AS VARCHAR) AS `str`,
>             CAST(3.14  AS DOUBLE) AS `flt`"""))
> df = interactive_beam.collect(pc)
> {code}
> The problem occurs when 
> [pipeline_fragment.py|https://github.com/apache/beam/blob/dce1eb83b8d5137c56ac58568820c24bd8fda526/sdks/python/apache_beam/runners/interactive/pipeline_fragment.py#L66]
>  creates a copy of the pipeline by [writing it to proto and reading it 
> back|https://github.com/apache/beam/blob/dce1eb83b8d5137c56ac58568820c24bd8fda526/sdks/python/apache_beam/runners/interactive/pipeline_fragment.py#L120].
>  Reading it back fails because some of the pipeline is not written in Python.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
