That’s what I was afraid of!

We did something similar (set a Boolean flag to mark when the file path was 
set).
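
A minimal sketch of that flag-based guard, reusing the setup() quoted later in
this thread (the field name filePathInitialized is illustrative, and it assumes
the flag is checkpointed with the rest of the operator state so it survives a
redeploy):

private boolean filePathInitialized = false;

@Override
public void setup(Context.OperatorContext context) {
  // Append the application id only on the first setup; after a redeploy the
  // restored flag (and the already-modified filePath) prevent a second append.
  if (!filePathInitialized) {
    String applicationId = context.getValue(Context.DAGContext.APPLICATION_ID);
    setFilePath(getFilePath() + "/" + applicationId);
    filePathInitialized = true;
  }
  super.setup(context);
}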

Thanks for everyone’s help!

From: Munagala Ramanath <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Thursday, November 10, 2016 at 10:16 AM
To: "[email protected]" <[email protected]>
Subject: Re: error with AbstractFileOutputOperator rolling files from tmp

The application id is not known until the application starts running, so that
kind of substitution likely won't be possible.

Can you not simply check whether filePath already ends with the application id
before appending? e.g.:

String appid = context.getValue(Context.DAGContext.APPLICATION_ID);
if (!filePath.endsWith(appid)) {
  filePath = filePath + "/" + appid;
}
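
Placed inside the operator's setup(), following the structure quoted later in
this thread, that guard might look like this (a sketch, not the operator's
actual code):

@Override
public void setup(Context.OperatorContext context) {
  String appid = context.getValue(Context.DAGContext.APPLICATION_ID);
  // Append only if the checkpointed filePath does not already end with the
  // application id, so a redeploy does not produce .../appid/appid.
  if (!getFilePath().endsWith(appid)) {
    setFilePath(getFilePath() + "/" + appid);
  }
  super.setup(context);
}

Compared to keeping a boolean flag, this needs no extra state, though it assumes
the application id never legitimately appears as the last path segment for any
other reason.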

Ram



On Thu, Nov 10, 2016 at 5:34 AM, Feldkamp, Brandon (CONT)
<[email protected]> wrote:
Good point. I’m sure that’s most likely what happened.

Is there any way to reference the application id in properties.xml? I
tried the following, but it didn’t work:

<property>
    <name>dt.operator.fileOut.prop.filePath</name>
    <value>output/${dt.attr.APPLICATION_ID}</value>
</property>

Thanks!
Brandon

On 11/10/16, 12:33 AM, "Tushar Gosavi" <[email protected]> wrote:

    Was there any failure of the operator, or a redeploy of the operator? Do
    you have any killed containers before seeing this error?

    - On the first initialization of the operator, setup() correctly sets
      filePath to filePath + "/" + applicationId.

    - If the operator is redeployed (because of an upstream operator failure
      or a failure of this operator), setup() is called again and appends
      applicationId to the last saved value of filePath, so applicationId
      ends up in the path twice.
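
    A minimal standalone illustration of that sequence (hypothetical class; the
    path-building line mirrors the setup() quoted below):

    public class DoubleAppendDemo {
      public static void main(String[] args) {
        String appId = "application_1478724068939_0002";
        String filePath = "output";

        // first deployment: setup() appends the application id once
        filePath = filePath + "/" + appId;

        // redeploy: setup() runs again on the already-modified value and
        // appends the application id a second time
        filePath = filePath + "/" + appId;

        // prints output/<appId>/<appId>, matching the duplicated directory
        // in the failing HDFS path
        System.out.println(filePath);
      }
    }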

    - Tushar.


    On Thu, Nov 10, 2016 at 7:50 AM, Feldkamp, Brandon (CONT)
    <[email protected]> wrote:
    > Cut off part of the stack trace
    >
    > Abandoning deployment due to setup failure. java.lang.RuntimeException:
    > java.io.FileNotFoundException: File does not exist:
    > hdfs://.../output/application_1478724068939_0002/application_1478724068939_0002/output.txt.0.1478726546727.tmp
    >     at com.datatorrent.lib.io.fs.AbstractFileOutputOperator.setup(AbstractFileOutputOperator.java:418)
    >     at com.capitalone.cerberus.lazarus.operators.FileOutputOperator.setup(FileOutputOperator.java:58)
    >     at com.capitalone.cerberus.lazarus.operators.FileOutputOperator.setup(FileOutputOperator.java:27)
    >     at com.datatorrent.stram.engine.Node.setup(Node.java:187)
    >     at com.datatorrent.stram.engine.StreamingContainer.setupNode(StreamingContainer.java:1309)
    >     at com.datatorrent.stram.engine.StreamingContainer.access$100(StreamingContainer.java:130)
    >     at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1388)
    > Caused by: java.io.FileNotFoundException: File does not exist:
    > hdfs://.../output/application_1478724068939_0002/application_1478724068939_0002/output.txt.0.1478726546727.tmp
    >     at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1219)
    >     at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1211)
    >     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    >     at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1211)
    >     at com.datatorrent.lib.io.fs.AbstractFileOutputOperator.setup(AbstractFileOutputOperator.java:411)
    >     ... 6 more
    >
    > From: "Feldkamp, Brandon (CONT)" <[email protected]>
    > Reply-To: "[email protected]" <[email protected]>
    > Date: Wednesday, November 9, 2016 at 9:09 PM
    > To: "[email protected]" <[email protected]>
    > Subject: error with AbstractFileOutputOperator rolling files from tmp
    >
    > Hello,
    >
    > I’m seeing this error:
    >
    > hdfs://.../output/application_1478724068939_0002/application_1478724068939_0002/output.txt.0.1478726546727.tmp
    >     at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1219)
    >     at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1211)
    >     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    >     at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1211)
    >     at com.datatorrent.lib.io.fs.AbstractFileOutputOperator.setup(AbstractFileOutputOperator.java:411)
    >     ... 6 more
    >
    > For some reason “application_1478724068939_0002” is being added to the
    > path twice. Any idea why this could be happening?
    >
    >
    >
    > This is how we set up the path in our FileOutputOperator, which extends
    > AbstractFileOutputOperator:
    >
    >
    >
    > @Override
    > public void setup(Context.OperatorContext context) {
    >   …
    >
    >   // create directories based on application_id
    >   String applicationId = context.getValue(Context.DAGContext.APPLICATION_ID);
    >   setFilePath(getFilePath() + "/" + applicationId);
    >
    >   …
    >
    >   super.setup(context);
    > }
    >



