You should not have to copy that file. One way is to ship the properties file with a <file> tag in the action; Oozie will make sure the file is placed in the current working directory of the node executing the action, and you can then open it from your Java code as a local file.
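Something along these lines should work (an untested sketch, reusing the names from your workflow further down): the <file> element ships camus_non_avro.properties from its HDFS location into the container's working directory, and the -P argument then points at just the local file name instead of the hdfs:// URI.

<action name="camusNonAvroJob">
    <java>
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queueName}</value>
            </property>
        </configuration>
        <main-class>com.linkedin.camus.etl.kafka.CamusJob</main-class>
        <!-- CamusJob opens this with java.io.FileInputStream, so pass the
             local file name rather than the hdfs:// path -->
        <arg>-P</arg>
        <arg>camus_non_avro.properties</arg>
        <!-- copies the file from HDFS into the action's working directory;
             ${camusNonAvroJobProperties} is the HDFS path you already define
             in your job properties -->
        <file>${camusNonAvroJobProperties}</file>
    </java>
    <ok to="end"/>
    <error to="fail"/>
</action>

That way the properties file stays in the workflow application directory in HDFS and nothing has to be copied to the individual nodes.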
On Sunday, April 27, 2014 4:28 PM, Jonathan Hodges <[email protected]> wrote:

Okay, I figured out the issue. The Java class expects the properties file on the local filesystem instead of HDFS. Since a Java action can be executed on any of the Hadoop nodes, I had to copy this properties file to each Hadoop node's local filesystem. Even though this fixed the problem, I don't really like the solution. I would prefer to have the properties file encapsulated in the Oozie application directory in HDFS. Is there no way to pass local property files to Java actions without physically copying them to all the Hadoop nodes?

On Wed, Apr 23, 2014 at 10:44 AM, Jonathan Hodges <[email protected]> wrote:
> Hi,
>
> The following is my workflow.xml and properties file for the Java action.
>
> <workflow-app xmlns="uri:oozie:workflow:0.5" name="camus-wf">
>     <start to="camusNonAvroJob"/>
>     <action name="camusNonAvroJob">
>         <java>
>             <job-tracker>${jobTracker}</job-tracker>
>             <name-node>${nameNode}</name-node>
>             <configuration>
>                 <property>
>                     <name>mapred.job.name</name>
>                     <value>camusNonAvroJob</value>
>                 </property>
>                 <property>
>                     <name>mapred.job.queue.name</name>
>                     <value>${queueName}</value>
>                 </property>
>             </configuration>
>             <main-class>com.linkedin.camus.etl.kafka.CamusJob</main-class>
>             <arg>-P</arg>
>             <arg>${camusNonAvroJobProperties}</arg>
>         </java>
>         <ok to="end"/>
>         <error to="fail"/>
>     </action>
>     <kill name="fail">
>         <message>${wf:errorMessage(wf:lastErrorNode())}</message>
>     </kill>
>     <end name="end"/>
> </workflow-app>
>
> nameNode=hdfs://x.x.x.x:9000
> jobTracker=x.x.x.x:9001
> queueName=default
> oozie.wf.application.path=${nameNode}/user/${user.name}/oozie/camusNonAvroJob
> camusNonAvroJobProperties=${nameNode}/user/${user.name}/oozie/camusNonAvroJob/camus_non_avro.properties
>
> The forward slash '/' that gets dropped from the HDFS path (hdfs:// becomes hdfs:/ in the error below) is causing a file-not-found exception, and the MR job gets killed. I submit the job with:
>
> bin/oozie job -config /home/hadoop/oozie/camusNonAvroJob/camus-workflow.properties -run
>
> Failing Oozie Launcher, Main class [com.linkedin.camus.etl.kafka.CamusJob], main() threw exception,
> hdfs:/x.x.x.x:9000/user/hadoop/oozie/camusNonAvroJob/camus_non_avro.properties (No such file or directory)
> java.io.FileNotFoundException:
> hdfs:/x.x.x.x:9000/user/hadoop/oozie/camusNonAvroJob/camus_non_avro.properties (No such file or directory)
>     at java.io.FileInputStream.open(Native Method)
>     at java.io.FileInputStream.<init>(FileInputStream.java:146)
>     at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:602)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>     at com.linkedin.camus.etl.kafka.CamusJob.main(CamusJob.java:572)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
> Has anyone seen this before?
>
> Thanks in advance!
> Jonathan
