Hi Alex,

What is the input directory path that you have specified in the
configuration?
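
If it helps narrow things down, you can load the same properties.xml the test
uses and print whatever it actually sets. A minimal sketch (the key filter
below is just a guess at how the property might be named):

import java.util.Map;

import org.apache.hadoop.conf.Configuration;

public class PrintConfiguredPaths {
  public static void main(String[] args) {
    // Load only the application's own properties file, exactly as the test does.
    Configuration conf = new Configuration(false);
    conf.addResource(PrintConfiguredPaths.class.getResourceAsStream("/META-INF/properties.xml"));

    // Print every property whose name looks like a directory or path setting,
    // so you can see which input directory the DAG will actually use.
    for (Map.Entry<String, String> entry : conf) {
      String key = entry.getKey().toLowerCase();
      if (key.contains("dir") || key.contains("path")) {
        System.out.println(entry.getKey() + " = " + entry.getValue());
      }
    }
  }
}
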
On Apr 3, 2016 7:57 AM, "McCullough, Alex" <[email protected]> wrote:

> I have the dependencies defined in the POM but get the same error when
> running in the IDE.
>
> <dependency>
>     <groupId>org.apache.hadoop</groupId>
>     <artifactId>hadoop-hdfs</artifactId>
>     <version>2.7.2</version>
>     <scope>provided</scope>
> </dependency>
>
> <dependency>
>     <groupId>org.apache.hadoop</groupId>
>     <artifactId>hadoop-client</artifactId>
>     <version>2.7.2</version>
>     <scope>provided</scope>
> </dependency>
>
> <dependency>
>     <groupId>org.apache.hadoop</groupId>
>     <artifactId>hadoop-common</artifactId>
>     <version>2.7.2</version>
>     <scope>provided</scope>
> </dependency>
>
> On 4/3/16, 10:33 AM, "Thomas Weise" <[email protected]> wrote:
>
> >Alex,
> >
> >Local mode in the IDE is fully embedded; it does not use any external
> >Hadoop install. Please try launching the app through the CLI (in local
> >mode), or include the HDFS dependencies in your application POM (with
> >provided scope).
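> >
> >If the error persists even with those dependencies in place, one thing
> >worth trying (a generic Hadoop-side workaround sketch, not an Apex-specific
> >fix; the namenode address below is a placeholder) is to map the "hdfs"
> >scheme explicitly on the Configuration you pass to prepareDAG:
> >
> >    Configuration conf = new Configuration(false);
> >    conf.addResource(getClass().getResourceAsStream("/META-INF/properties.xml"));
> >    // Point the "hdfs" scheme at its implementation class so FileSystem.get()
> >    // does not rely on the ServiceLoader metadata from hadoop-hdfs being
> >    // visible on the test classpath.
> >    conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
> >    // Placeholder for the cluster's namenode address.
> >    conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");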
> >
> >Thanks
> >
> >--
> >sent from mobile
> >On Apr 3, 2016 7:29 AM, "McCullough, Alex" <[email protected]> wrote:
> >
> >> Hello All,
> >>
> >> I am trying to run my local test application, but I get an error. The
> >> test application runs on my local machine and attempts to connect to a
> >> remote HDFS cluster.
> >>
> >> I have made sure the same version of Hadoop that is on the cluster is
> >> installed locally, with the same config. I can run local HDFS commands
> >> that connect to the cluster just fine and read the same files my
> >> application should be reading/writing.
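> >>
> >> A Java-level version of that same check would look roughly like this
> >> (the namenode address and path are placeholders, not the real values):
> >>
> >> import java.net.URI;
> >>
> >> import org.apache.hadoop.conf.Configuration;
> >> import org.apache.hadoop.fs.FileStatus;
> >> import org.apache.hadoop.fs.FileSystem;
> >> import org.apache.hadoop.fs.Path;
> >>
> >> public class HdfsConnectivityCheck {
> >>   public static void main(String[] args) throws Exception {
> >>     // Placeholder namenode address and input path; substitute the real ones.
> >>     Configuration conf = new Configuration();
> >>     FileSystem fs = FileSystem.get(URI.create("hdfs://namenode.example.com:8020"), conf);
> >>     for (FileStatus status : fs.listStatus(new Path("/user/alex/input"))) {
> >>       System.out.println(status.getPath());
> >>     }
> >>   }
> >> }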
> >>
> >> Thanks,
> >> Alex
> >>
> >>
> >>
> >> Error:
> >>
> >> java.lang.RuntimeException: java.io.IOException: No FileSystem for scheme: hdfs
> >>     at com.datatorrent.lib.io.fs.AbstractFileOutputOperator.setup(AbstractFileOutputOperator.java:334)
> >>     at com.capitalone.vault8.citadel.operators.impl.HdfsFileOutputOperator.setup(HdfsFileOutputOperator.java:71)
> >>     at com.capitalone.vault8.citadel.operators.impl.HdfsFileOutputOperator.setup(HdfsFileOutputOperator.java:22)
> >>     at com.datatorrent.stram.engine.Node.setup(Node.java:182)
> >>     at com.datatorrent.stram.engine.StreamingContainer.setupNode(StreamingContainer.java:1290)
> >>     at com.datatorrent.stram.engine.StreamingContainer.access$100(StreamingContainer.java:129)
> >>     at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1363)
> >> Caused by: java.io.IOException: No FileSystem for scheme: hdfs
> >>     at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2421)
> >>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2428)
> >>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
> >>     at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
> >>     at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:2455)
> >>     at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:414)
> >>     at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:422)
> >>     at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:404)
> >>     at com.capitalone.vault8.citadel.operators.impl.HdfsFileOutputOperator.getFSInstance(HdfsFileOutputOperator.java:90)
> >>     at com.datatorrent.lib.io.fs.AbstractFileOutputOperator.setup(AbstractFileOutputOperator.java:332)
> >>     ... 6 more
> >>
> >>
> >> Application Test Class:
> >>
> >>
> >> @Category(IntegrationTest.class)
> >> public class ApplicationTest {
> >>
> >>   @Test
> >>   public void testApplication() throws IOException, Exception {
> >>     try {
> >>       LocalMode lma = LocalMode.newInstance();
> >>       Configuration conf = new Configuration(false);
> >>       conf.addResource(this.getClass().getResourceAsStream("/META-INF/properties.xml"));
> >>       lma.prepareDAG(new Application(), conf);
> >>       LocalMode.Controller lc = lma.getController();
> >>       lc.run(200000); // runs for 200 seconds (200,000 ms) and quits
> >>       lc.shutdown();
> >>     } catch (ConstraintViolationException e) {
> >>       Assert.fail("constraint violations: " + e.getConstraintViolations());
> >>     }
> >>   }
> >> }
