That failure "Failed to construct Hadoop filesystem with configuration
Configuration: /home/conf/hadoop/core-site.xml,
/home/conf/hadoop/hdfs-site.xml" seems pretty interesting.
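One guess on my part, not something I've verified: the "No FileSystem for scheme: hdfs" part of that error can also mean the HDFS client classes themselves aren't on the runtime classpath, in which case a Hadoop client dependency alongside the Beam one might help (the version below is a placeholder - it should match your cluster):

```xml
<!-- Hypothetical addition to pom.xml; the version is a placeholder,
     use whatever matches the Hadoop version on your cluster -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.7.3</version>
  <scope>runtime</scope>
</dependency>
```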

I don't know much about hadoop/hdfs - is there config info you should be
putting there? Perhaps you have the *-site.xml files in a different
location? I believe I saw some discussion elsewhere (earlier on user@)
about using a standard hadoop environment variable to set the location to
read from.
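For example, something like this before launching - just a sketch, assuming the variable in question is HADOOP_CONF_DIR and that your config really does live under /home/conf/hadoop as the error message suggests:

```shell
# Assumption: HADOOP_CONF_DIR is the standard variable Hadoop tooling reads,
# and /home/conf/hadoop is the directory holding core-site.xml/hdfs-site.xml.
export HADOOP_CONF_DIR=/home/conf/hadoop
```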

S

On Fri, Jun 16, 2017 at 3:31 PM Claire Yuan <[email protected]>
wrote:

> Hi,
>    Here is what I got after I explicitly include the dependency for
>
>        <dependency>
>           <groupId>org.apache.beam</groupId>
>           <artifactId>beam-sdks-java-io-hadoop-file-system</artifactId>
>           <scope>runtime</scope>
>         </dependency>
> in my pom.xml:
> Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.5.0:java
> (default-cli) on project beam-examples-java: An exception occured while
> executing the Java class. null: InvocationTargetException: Failed to
> construct Hadoop filesystem with configuration Configuration:
> /home/conf/hadoop/core-site.xml, /home/conf/hadoop/hdfs-site.xml: No
> FileSystem for scheme: hdfs ->
> I am also wondering if I add it correctly?
>
> Claire
>
>
> On Friday, June 16, 2017 2:32 PM, Stephen Sisk <[email protected]> wrote:
>
>
> We've seen a couple of reports involving the "Unable to find registrar for
> hdfs" error.
>
> The other potential cause is a misconfigured HDFS setup, or Beam not being
> able to find the HDFS config.
>
> I filed https://issues.apache.org/jira/browse/BEAM-2457 - we don't
> believe this is a bug in beam, but a number of users seem to be running
> into the issues so there might be an undiagnosed issue or a common
> misconfiguration problem.
>
> Claire - if you figure out the root cause, it'd be helpful if you let us
> know what solved the issue so we can improve the error message you saw.
> (and if you can't figure it out, hopefully folks on this list will help you
> figure it out)
>
> S
>
> On Fri, Jun 16, 2017 at 1:58 PM Kenneth Knowles <[email protected]> wrote:
>
> Hi Claire,
>
> The 'hdfs' filesystem is registered when you include the artifact
> "org.apache.beam:beam-sdks-java-io-hadoop-file-system". Do you have this in
> your dependencies?
>
> Kenn
>
> On Fri, Jun 16, 2017 at 11:45 AM, Claire Yuan <[email protected]>
> wrote:
>
> Hi all,
>   I was following the instructions here Apache Apex Runner
> <https://beam.apache.org/documentation/runners/apex/> to submit the work
> to the cluster. The build seems to be successful. However, I could not
> find where the output is. I set my param in my maven command with:
> --output=/user/claire/output/
> and I checked with hadoop dfs -ls /home/claire/output/ but it seems no
> such directory was created.
> I also checked my local directory with
> --output=/home/claire/output/, and there was still no output there.
> Finally I set the output directory manually with:
> --output=hdfs:///user/claireyuan/output
> and it gave an exception: Failed to execute goal
> org.codehaus.mojo:exec-maven-plugin:1.5.0:java (default-cli) on project
> beam-examples-java: An exception occured while executing the Java class.
> null: InvocationTargetException: Unable to find registrar for hdfs -> [Help
> 1]
>
> I am wondering where I should check or modify my output directory to be?
>
> Claire
>
>
