Sent: … PM
To: dev@beam.apache.org
Subject: Re: Azure(ADLS) compatibility on Beam with Spark runner
The Azure guys tried to use ADLS via Beam HDFS filesystem, but it seems they
didn't succeed.
The new approach we plan is to directly use the ADLS API.
I'll keep you posted.
Regards
JB
On 11/23/2017 07:42 AM, Milan wrote:
…tion or understanding ADL was a problem, it would have thrown an error like
"ADLFileSystem missing", or perhaps an access failure, or something similar.
Thoughts?
-Milan.
-Original Message-
From: Lukasz Cwik [mailto:lc...@google.com.INVALID]
Sent: Thursday, November 23, 2017 5:05 AM
To: dev@beam.apache.org
Subject: Re: Azure(ADLS) compatibility on Beam with Spark runner
In your example it seems as though your HDFS configuration doesn't contain
any ADL-specific configuration: "--hdfsConfiguration='[{\"fs.defaultFS\":
\"hdfs://home/sample.txt\"}]'"
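For comparison, an ADL-aware version of that flag might look like the sketch
below. This is an untested assumption, not a verified setup: "youraccount" is a
placeholder, and fs.adl.impl names the ADL filesystem class shipped in Hadoop's
hadoop-azure-datalake module (Hadoop 2.8+).

```shell
# Hypothetical sketch: point fs.defaultFS at the ADL account instead of hdfs://.
# "youraccount" is a placeholder account name.
--hdfsConfiguration='[{"fs.defaultFS": "adl://youraccount.azuredatalakestore.net",
                       "fs.adl.impl": "org.apache.hadoop.fs.adl.AdlFileSystem"}]'
```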
Do you have a core-site.xml or hdfs-site.xml configured as per:
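In case it helps, here is a minimal core-site.xml sketch for ADL access,
assuming the hadoop-azure-datalake module is on the classpath. The
fs.adl.oauth2.* names are Hadoop's service-to-service OAuth properties; every
value below is a placeholder to fill in for your own account.

```xml
<configuration>
  <!-- ADL filesystem implementation from hadoop-azure-datalake -->
  <property>
    <name>fs.adl.impl</name>
    <value>org.apache.hadoop.fs.adl.AdlFileSystem</value>
  </property>
  <property>
    <name>fs.AbstractFileSystem.adl.impl</name>
    <value>org.apache.hadoop.fs.adl.Adl</value>
  </property>
  <!-- Service-to-service OAuth2 credentials; all values are placeholders -->
  <property>
    <name>fs.adl.oauth2.access.token.provider.type</name>
    <value>ClientCredential</value>
  </property>
  <property>
    <name>fs.adl.oauth2.client.id</name>
    <value>YOUR_CLIENT_ID</value>
  </property>
  <property>
    <name>fs.adl.oauth2.credential</name>
    <value>YOUR_CLIENT_SECRET</value>
  </property>
  <property>
    <name>fs.adl.oauth2.refresh.url</name>
    <value>https://login.microsoftonline.com/YOUR_TENANT_ID/oauth2/token</value>
  </property>
</configuration>
```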
Hi,
Has anyone tried IO from/to an ADLS account on Beam with the Spark runner?
I tried this recently but was unable to make it work.
Steps that I tried:
1. Took an HDI + Spark 1.6 cluster with its default storage as an ADLS account.
2. Built Apache Beam on that. Built to include