Re: Command for Beam worker on Spark cluster

2019-11-12 Thread Kyle Weaver
…point, etc.

>> Where can I extract these parameters from? (In apache_beam Python code, those can be extracted from StartWorker request parameters)
>>
>> Also, how can the Spark executor find the port that the gRPC server is running on?
>>
>> *Sent:* Wednesday, Novemb…

Re: Command for Beam worker on Spark cluster

2019-11-07 Thread Matthew K.
*From:* "Kyle Weaver"
*To:* dev
*Subject:* Re: Command for Beam worker on Spark cluster

> Where can I extract these parameters from?

These parameters should be passed automatically when the process is run (note the use of `$*` in the example script): https://g…

Re: Command for Beam worker on Spark cluster

2019-11-06 Thread Kyle Weaver
> …find the port that the gRPC server is running on?
>
> *Sent:* Wednesday, November 06, 2019 at 5:07 PM
> *From:* "Kyle Weaver"
> *To:* dev
> *Subject:* Re: Command for Beam worker on Spark cluster
>
> In Docker mode, most everything's taken care of for you, but in process mode you have to do a lot of…

Re: Command for Beam worker on Spark cluster

2019-11-06 Thread Matthew K.
…how can the Spark executor find the port that the gRPC server is running on?

Sent: Wednesday, November 06, 2019 at 5:07 PM
From: "Kyle Weaver"
To: dev
Subject: Re: Command for Beam worker on Spark cluster

In Docker mode, most everything's taken care of for you, but in process mode you have to do a lot of setup…

Re: Command for Beam worker on Spark cluster

2019-11-06 Thread Kyle Weaver
In Docker mode, most everything's taken care of for you, but in process mode you have to do a lot of setup yourself. The command you're looking for is `sdks/python/container/build/target/launcher/linux_amd64/boot`. You will be required to have both that executable (which you can build from source…