Going back to the main subject of this thread, I just wanted to make things
clear for everyone.
It seems that everybody agrees that we will just deprecate the AWS SDK V1
connectors once the alternative is available, not remove them, and still
distribute artifacts [1] with new releases along with
Thank you for the suggestion. It will definitely reduce development effort.
On Thu, Nov 28, 2019, 6:51 PM Michał Walenia
wrote:
> I'm glad I was able to help :)
> And you might consider using WSL, it will allow you to run commands in
> Bash without starting a Docker container or a VM :)
> Good luck!
I'm glad I was able to help :)
And you might consider using WSL, it will allow you to run commands in Bash
without starting a Docker container or a VM :)
Good luck!
Michal
On Thu, Nov 28, 2019 at 2:46 PM Rehman Murad Ali <
rehman.murad...@venturedive.com> wrote:
> After running tasks with --info
After running the tasks with --info and --rerun-tasks, I found that the
application default credentials were not set in Docker. I have set
GOOGLE_APPLICATION_CREDENTIALS and now it is working fine. I can find the
job in the Dataflow console. Thank you for your guidance.
*Thanks & Regards*
*Rehman
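For anyone hitting the same problem, here is a minimal sketch of making
application default credentials visible to a build running in Docker. The key
path, image name, and task name below are hypothetical placeholders, not the
exact ones used in this thread:

```shell
# Hypothetical key path; adjust to your environment.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/dataflow-sa.json"

# Inside a container the key file must be mounted as well, e.g.
# (image and task names are made up):
#   docker run --rm \
#     -v "$GOOGLE_APPLICATION_CREDENTIALS":/tmp/key.json:ro \
#     -e GOOGLE_APPLICATION_CREDENTIALS=/tmp/key.json \
#     my-beam-dev-image ./gradlew <someTask> --info --rerun-tasks

# Confirm the variable is exported to child processes (Gradle, Docker, etc.):
sh -c 'test -n "$GOOGLE_APPLICATION_CREDENTIALS" && echo "credentials path set"'
```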
IMHO, the "rule of thumb" should be to accept a PR that was done by the
corresponding Jira's assignee (provided, of course, that it correctly addresses
and fixes the initial issue). Jira assignment should work like a "mutex" (as
Kenn said in another discussion) for the person working on it. Though, we need to
Nice, this should bring a great performance improvement for SQL. Thanks for
your work!
On Thu, Nov 28, 2019 at 6:33 AM Kenneth Knowles wrote:
> Nice! Thanks for the very thorough summary. I think this will be a really
> good thing for Beam. Most of the IO sources are very highly optimized for
>
It seems that Windows CMD doesn't play nicely with JSON. I'm not sure what
the problem in your Docker setup is, though; try running Gradle with --info
and --rerun-tasks. You'll see more output that way.
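As a sketch, the diagnostic invocation described above looks like this; the
task path is only a placeholder, not necessarily the task Rehman was running:

```shell
# --info raises Gradle's log verbosity; --rerun-tasks forces tasks to execute
# again even when Gradle considers them up to date.
# The task path below is a hypothetical example.
TASK=":runners:google-cloud-dataflow-java:someTask"
CMD="./gradlew $TASK --info --rerun-tasks"
echo "$CMD"
```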
On Thu, Nov 28, 2019 at 1:02 PM Rehman Murad Ali <
rehman.murad...@venturedive.com> wrote:
> Above
Agreed with Jan. This kind of use case requires having incoming elements
ordered by timestamp. The only effective solution is to delegate sorting to the
runner, which is currently impossible. Introducing an "annotation" that
would guarantee event-time order looks like a nice, clean way to solve this. :+1:
What kind of shell are you using? The screenshot suggests it's not CMD, but a
Bash shell on Windows(?) BTW, do you have permissions on the
apache-beam-testing project that allow you to start Dataflow jobs directly?
If you don't, change the project to your org's project, get the permissions,
and try again.
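A sketch of switching the target project, assuming the gcloud CLI is installed
and authenticated; the project id is a placeholder for your own org's project:

```shell
# Placeholder project id; replace with your org's project.
PROJECT_ID="my-org-project"

# Point local tooling at your own project instead of apache-beam-testing:
#   gcloud config set project "$PROJECT_ID"
# Then ask a project admin for the Dataflow Developer role (or equivalent)
# before launching jobs directly.
echo "target project: $PROJECT_ID"
```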
Hi all,
FYI, I closed the most recent one (with explanation and a sorry message):
https://github.com/apache/beam/pull/10025
Etienne
On 26/11/2019 17:06, Robert Bradshaw wrote:
On Tue, Nov 26, 2019 at 6:15 AM Etienne Chauchot wrote:
Hi guys,
I wanted your opinion about something:
I have 2
I am using the Windows CLI for this command. Moreover, I have tried setting up
Docker and running this command, which results in "Build Successful", but I
cannot find any jobs running in the cloud console. Here are the output
logs. Is there another way to run the Dataflow test case on Windows?