Found the problem: I forgot to apply the setup_file parameter in the pipeline options for my package module.
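For anyone hitting the same error: a minimal sketch of what applying setup_file looks like, assuming a Beam Python pipeline. The project ID, bucket, and paths below are illustrative placeholders, not values from this thread. Without --setup_file, the local package (here `dataflow_pipeline`, taken from the error message) is never shipped to the Dataflow workers, so unpickling a DoFn that references it fails with "ImportError: No module named ...".

```python
# Pipeline flags for a Dataflow run; all values are illustrative.
pipeline_args = [
    '--runner=DataflowRunner',
    '--project=my-gcp-project',            # illustrative project ID
    '--temp_location=gs://my-bucket/tmp',  # illustrative bucket
    # The missing option: tells Dataflow to build and install the
    # local package on each worker before running the pipeline.
    '--setup_file=./setup.py',
]

# These flags would then be handed to Beam, e.g.:
#   from apache_beam.options.pipeline_options import PipelineOptions
#   options = PipelineOptions(pipeline_args)
print(pipeline_args[-1])  # → --setup_file=./setup.py
```

This also explains why a manual run can work while the cron-triggered run fails: a manual launch from a machine that already has the package installed hides the missing dependency, but the workers started for the cron-triggered job only get what the pipeline options tell Dataflow to stage.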
On Monday, September 4, 2017 at 8:47:52 PM UTC+3, Yannick (Cloud Platform Support) wrote:
>
> Hello, if I understand correctly you have an App Engine application with
> code to create and run a Dataflow pipeline. When you use this application
> to run the pipeline manually it works with no issues, but a pipeline
> created by the exact same code and with the same pipeline parameters fails
> when its creation is triggered through cron?
>
> If so, that definitely shouldn't be happening. The most likely scenario is
> an issue somewhere in your code, and the best place to diagnose that would
> be on Stack Overflow using one of the tags
> <https://cloud.google.com/support/docs/stackexchange> monitored by our
> community technical support team.
>
> If you have clear indications that the issue is actually linked to a bug
> in Dataflow then you should report it on our Public Issue Tracker.
> <https://cloud.google.com/support/docs/issue-trackers>
>
> On Sunday, September 3, 2017 at 12:13:21 PM UTC-4, Leonid SpiralSolutions wrote:
>>
>> The application is deployed successfully and can be scheduled/launched
>> via cron.
>> It creates the Dataflow job successfully, and then the job fails.
>>
>> The flow reads data from BigQuery, filters rows, and writes to BigQuery.
>> The filter step fails (beam.Filter(....))
>>
>> *error details from google error report:*
>>
>> ImportError: No module named dataflow_pipeline
>> at _import_module (/usr/local/lib/python2.7/dist-packages/dill/dill.py:767)
>> at load_reduce (/usr/lib/python2.7/pickle.py:1133)
>> at load (/usr/lib/python2.7/pickle.py:858)
>> at load (/usr/local/lib/python2.7/dist-packages/dill/dill.py:266)
>> at loads (/usr/local/lib/python2.7/dist-packages/dill/dill.py:277)
>> at loads (/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py:225)
>> at dataflow_worker.operations.DoOperation.start (dataflow_worker/operations.c:9775) (operations.py:289)
>> at dataflow_worker.operations.DoOperation.start (dataflow_worker/operations.c:10574) (operations.py:284)
>> at dataflow_worker.operations.DoOperation.start (dataflow_worker/operations.c:10680) (operations.py:283)
>> at execute (/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py:166)
>> at do_work (/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py:581)
>>
>> When running Dataflow manually from the client, it works fine!
>> Any suggestions and comments would be really helpful!
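The --setup_file option only helps if there is a setup.py describing the local package. A minimal sketch of such a file, assuming the package directory is named `dataflow_pipeline` (the module name from the ImportError above); the name and version are illustrative:

```python
# setup.py at the pipeline's root; Dataflow runs this on each worker
# to install the local package before executing pipeline steps.
import setuptools

setuptools.setup(
    name='dataflow_pipeline',   # assumed package name, from the error message
    version='0.0.1',            # illustrative
    packages=setuptools.find_packages(),  # picks up dataflow_pipeline/
)
```

The package directory itself must contain an `__init__.py` for find_packages() to discover it.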
