Hi there,

Please find below a portion of my oozie-site.xml:

<property>
        <name>oozie.service.HadoopAccessorService.hadoop.configurations</name>
        <value>*=/disk2/oozie/conf/hadoop-conf</value>
        <description>
            Comma separated AUTHORITY=HADOOP_CONF_DIR, where AUTHORITY is the HOST:PORT of
            the Hadoop service (JobTracker, HDFS). The wildcard '*' configuration is
            used when there is no exact match for an authority. The HADOOP_CONF_DIR contains
            the relevant Hadoop *-site.xml files. If the path is relative, it is looked up
            within the Oozie configuration directory; the path can also be absolute (e.g. to
            point to Hadoop client conf/ directories in the local filesystem).
        </description>
    </property>

    <property>
        <name>oozie.service.WorkflowAppService.system.libpath</name>
        <value>/user/oozie/sharelib/sharelib</value>
        <description>
            System library path to use for workflow applications.
            This path is added to a workflow application if its job properties set
            the property 'oozie.use.system.libpath' to true.
        </description>
    </property>
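For completeness, the workflow's job.properties enables the system libpath as described above (the host names and application path below are illustrative placeholders, not my actual values):

```properties
# job.properties (illustrative host names and paths)
nameNode=hdfs://namenode-host:8020
jobTracker=jobtracker-host:8032

# Tell Oozie to add the system sharelib to the workflow's classpath
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/${user.name}/apps/spark-wf
```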

In HDFS, at /user/oozie/sharelib/sharelib, I have the following content:

distcp  hcatalog  hive  hive2  mapreduce-streaming  oozie  pig  sharelib.properties  spark  sqoop

In the spark folder I have spark-assembly-1.5.2-hadoop2.6.0.jar.
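For reference, the Spark action in my workflow.xml looks roughly like the sketch below (the action name, class, and jar path are placeholders, not my actual values):

```xml
<!-- Sketch of an Oozie Spark action; master, class, and jar are placeholders -->
<action name="spark-node">
    <spark xmlns="uri:oozie:spark-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <master>yarn-cluster</master>
        <name>MySparkJob</name>
        <class>com.example.MySparkMain</class>
        <jar>${nameNode}/user/${user.name}/apps/spark-wf/lib/my-spark-job.jar</jar>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
</action>
```

With oozie.use.system.libpath=true set in job.properties, my understanding is that Oozie should add the jars under the spark sharelib directory (including the Spark assembly) to this action's classpath.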

Please let me know if this is the required setup; otherwise, what am I missing here?


Regards,
Rohit Mishra

> On 16-Jan-2017, at 1:33 pm, 권병창 <[email protected]> wrote:
> 
> Hi
> Try to make sure spark-assembly-1.5.2-hadoop2.6.0.jar is in the Oozie Spark 
> share lib; the Spark assembly jar must be located in the Oozie Spark share lib.
>  
> -----Original Message-----
> From: "Rohit Mishra" <[email protected]>
> To: <[email protected]>
> Cc: 
> Sent: 2017-01-16 (Mon) 15:04:26
> Subject: oozie issue java.lang.UnsupportedOperationException: Not implemented 
> by the TFS FileSystem implementation
>  
> Hello,
>  
> I am new to Hadoop, and I am having an issue running a Spark job in Oozie.
> I am able to run the Spark job on its own, but with Oozie, after the job is 
> launched, I get the following error:
>  
> 2017-01-12 13:51:57,696 INFO [main] org.apache.hadoop.service.AbstractService: Service org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED; cause: java.lang.UnsupportedOperationException: Not implemented by the TFS FileSystem implementation
> java.lang.UnsupportedOperationException: Not implemented by the TFS FileSystem implementation
>       at org.apache.hadoop.fs.FileSystem.getScheme(FileSystem.java:216)
>       at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2564)
>       at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2574)
>       at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
>       at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
>       at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
>       at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
>       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
>       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
>       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.getFileSystem(MRAppMaster.java:497)
>       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:281)
>       at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1499)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1496)
>       at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1429)
>  
> Spark version: spark-1.5.2-bin-hadoop2.6
> Hadoop: hadoop-2.6.2
> Hbase : hbase-1.1.5
> Oozie: oozie-4.2.0
>  
> A snapshot of my pom.xml is:
>  
> <dependency>
>    <groupId>org.apache.zookeeper</groupId>
>    <artifactId>zookeeper</artifactId>
>    <version>3.4.8</version>
>    <type>pom</type>
> </dependency>
> <dependency>
>    <groupId>org.apache.hbase</groupId>
>    <artifactId>hbase-common</artifactId>
>    <version>1.1.5</version>
>    <exclusions>
>       <exclusion>
>          <groupId>org.slf4j</groupId>
>          <artifactId>slf4j-log4j12</artifactId>
>       </exclusion>
>    </exclusions>
> </dependency>
> 
> <dependency>
>    <groupId>org.apache.hbase</groupId>
>    <artifactId>hbase-client</artifactId>
>    <version>1.1.5</version>
>    <exclusions>
>       <exclusion>
>          <groupId>org.slf4j</groupId>
>          <artifactId>slf4j-log4j12</artifactId>
>       </exclusion>
>    </exclusions>
> </dependency>
> 
> <dependency>
>    <groupId>org.apache.hbase</groupId>
>    <artifactId>hbase-server</artifactId>
>    <version>1.1.5</version>
>    <exclusions>
>       <exclusion>
>          <groupId>org.slf4j</groupId>
>          <artifactId>slf4j-log4j12</artifactId>
>       </exclusion>
>    </exclusions>
> </dependency>
> <dependency>
>    <groupId>org.apache.hbase</groupId>
>    <artifactId>hbase-testing-util</artifactId>
>    <version>1.1.5</version>
> </dependency>
> <dependency>
>    <groupId>org.apache.spark</groupId>
>    <artifactId>spark-core_2.11</artifactId>
>    <version>1.5.2</version>
>    <exclusions>
>       <exclusion>
>          <artifactId>javax.servlet</artifactId>
>          <groupId>org.eclipse.jetty.orbit</groupId>
>       </exclusion>
>    </exclusions>
> </dependency>
> <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 -->
> <dependency>
>    <groupId>org.apache.spark</groupId>
>    <artifactId>spark-sql_2.11</artifactId>
>    <version>1.5.2</version>
> </dependency>
> <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-yarn_2.10 -->
> <dependency>
>    <groupId>org.apache.spark</groupId>
>    <artifactId>spark-yarn_2.11</artifactId>
>    <version>1.5.2</version>
> </dependency>
> 
> 
> <!-- https://mvnrepository.com/artifact/org.mongodb.mongo-hadoop/mongo-hadoop-core -->
> <dependency>
>    <groupId>org.mongodb.mongo-hadoop</groupId>
>    <artifactId>mongo-hadoop-core</artifactId>
>    <version>1.5.2</version>
> </dependency>
> <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
> <dependency>
>    <groupId>org.apache.hadoop</groupId>
>    <artifactId>hadoop-common</artifactId>
>    <version>2.6.2</version>
>    <exclusions>
>       <exclusion>
>          <artifactId>servlet-api</artifactId>
>          <groupId>javax.servlet</groupId>
>       </exclusion>
>       <exclusion>
>          <artifactId>jetty-util</artifactId>
>          <groupId>org.mortbay.jetty</groupId>
>       </exclusion>
>       <exclusion>
>          <artifactId>jsp-api</artifactId>
>          <groupId>javax.servlet.jsp</groupId>
>       </exclusion>
>    </exclusions>
> </dependency>
> <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client -->
> <dependency>
>    <groupId>org.apache.hadoop</groupId>
>    <artifactId>hadoop-client</artifactId>
>    <version>2.6.2</version>
>    <exclusions>
>       <exclusion>
>          <artifactId>jetty-util</artifactId>
>          <groupId>org.mortbay.jetty</groupId>
>       </exclusion>
>    </exclusions>
> </dependency>
> <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-core -->
> <dependency>
>    <groupId>org.apache.hadoop</groupId>
>    <artifactId>hadoop-mapreduce-client-core</artifactId>
>    <version>2.6.2</version>
> </dependency>
> <dependency>
>    <groupId>org.mongodb</groupId>
>    <artifactId>mongo-java-driver</artifactId>
>    <version>3.2.1</version>
> </dependency>
> <!-- hadoop dependency -->
> <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core -->
> <dependency>
>    <groupId>org.apache.hadoop</groupId>
>    <artifactId>hadoop-core</artifactId>
>    <version>1.2.1</version>
>    <exclusions>
>       <exclusion>
>          <artifactId>jetty-util</artifactId>
>          <groupId>org.mortbay.jetty</groupId>
>       </exclusion>
>    </exclusions>
> </dependency>
>  
>  
> So far I have searched several blogs. What I understand from reading them is 
> that there is some issue with the Tachyon jar that is embedded in 
> spark-assembly-1.5.2-hadoop2.6.0.jar.
> I tried removing tachyon-0.5.0.jar and tachyon-client-0.5.0.jar from the Oozie 
> shared library (they were present under the spark library), but then I started 
> getting this error:
>  
> Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, org.apache.spark.util.Utils$.DEFAULT_DRIVER_MEM_MB()I
> java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.DEFAULT_DRIVER_MEM_MB()I
>  
> Please help me debug and solve it.
>  
> Thanks,
> Rohit
