[
https://issues.apache.org/jira/browse/BEAM-4437?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17546977#comment-17546977
]
Kenneth Knowles commented on BEAM-4437:
---------------------------------------
This issue has been migrated to https://github.com/apache/beam/issues/18783
> Maximum recursion depth with Apache Beam and Google Cloud SDK
> -------------------------------------------------------------
>
> Key: BEAM-4437
> URL: https://issues.apache.org/jira/browse/BEAM-4437
> Project: Beam
> Issue Type: Bug
> Components: sdk-py-core
> Affects Versions: 2.4.0
> Environment: LSB Version:
> core-9.20170808ubuntu1-noarch:printing-9.20170808ubuntu1-noarch:security-9.20170808ubuntu1-noarch
> Distributor ID: Ubuntu
> Description: Ubuntu 18.04 LTS
> Release: 18.04
> Codename: bionic
> Python 2.7.15rc1
> Google Cloud SDK 200.0.0
> app-engine-go
> app-engine-python 1.9.69
> app-engine-python-extras 1.9.69
> bq 2.0.33
> core 2018.04.30
> gsutil 4.31
> Reporter: Jonathan Perron
> Priority: P3
>
> I am working on a project that uses the Google Cloud SDK and App Engine.
> The project generates a Datastore which I need to access in order to
> copy parts of it to a BigQuery instance, similar to what is described in
> [https://blog.papercut.com/google-cloud-dataflow-data-migration/].
> I installed Apache Beam with pip install --upgrade apache_beam. However, once
> everything was set up to access the Datastore through the ndb models, trying
> to import Beam with from apache_beam import pipeline raised a maximum
> recursion depth error from inside the SDK, as shown in the traceback.
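A minimal sketch of the failure mode and a stop-gap check (the ndb/Beam import lines are illustrative and commented out; raising the limit with sys.setrecursionlimit is a diagnostic workaround assumed by the editor, not a fix confirmed in this issue):

```python
import sys

# Illustrative repro of the reported setup: the App Engine ndb models are
# loaded first, then Beam is imported. These two imports are sketches of the
# reporter's environment, not code taken verbatim from the issue:
# from google.appengine.ext import ndb
# import apache_beam as beam

# A "maximum recursion depth exceeded" RuntimeError means Python hit its
# recursion limit (typically 1000 on CPython) while the two SDKs resolved
# their imports. The limit can be inspected, and raised as a stop-gap while
# the underlying import conflict is investigated:
print("recursion limit:", sys.getrecursionlimit())
sys.setrecursionlimit(10000)  # stop-gap only; does not fix the conflict
```

Raising the limit only delays the failure if the imports genuinely recurse without bound; it is useful mainly to distinguish a deep-but-finite import chain from a true cycle.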
--
This message was sent by Atlassian Jira (v8.20.7#820007)