Hi, thank you very much!

We are using Ambari 2.1.0.
Our system was upgraded from 2.1 to 2.3; we had not used Ranger before (in
version 2.1).
(We are going to do a minor upgrade soon, from 2.3.0 to 2.3.2.)

Lukas

/etc/hive
Sep  8 15:41 .
Dec 14 15:36 ..
Sep  8 15:41 2.3.0.0-2557
Sep  8 15:41 conf -> /usr/hdp/current/hive-client/conf
Sep  8 15:41 conf.install
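
The failing path in the trace below is
/etc/hive/2.3.0.0-2557/0/xasecure-audit.xml, so one thing worth checking
(a sketch, assuming the standard HDP layout) is whether the '0'
subdirectory and the audit file actually exist:

# Does the versioned conf dir contain the '0' subdirectory at all,
# and is xasecure-audit.xml inside it?
ls -al /etc/hive/2.3.0.0-2557/
ls -l /etc/hive/2.3.0.0-2557/0/xasecure-audit.xml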


/etc/hadoop

Sep  8 15:40 2.3.0.0-2557
Sep  8 15:40 conf -> /usr/hdp/current/hadoop-client/conf
Sep  8 15:39 conf.install



/usr/hdp/current
15:40  accumulo-client -> /usr/hdp/2.3.0.0-2557/accumulo
15:40  accumulo-gc -> /usr/hdp/2.3.0.0-2557/accumulo
15:40  accumulo-master -> /usr/hdp/2.3.0.0-2557/accumulo
15:40  accumulo-monitor -> /usr/hdp/2.3.0.0-2557/accumulo
15:40  accumulo-tablet -> /usr/hdp/2.3.0.0-2557/accumulo
15:40  accumulo-tracer -> /usr/hdp/2.3.0.0-2557/accumulo
15:40  atlas-server -> /usr/hdp/2.3.0.0-2557/atlas
15:40  falcon-client -> /usr/hdp/2.3.0.0-2557/falcon
15:40  falcon-server -> /usr/hdp/2.3.0.0-2557/falcon
15:40  flume-server -> /usr/hdp/2.3.0.0-2557/flume
15:40  hadoop-client -> /usr/hdp/2.3.0.0-2557/hadoop
15:40  hadoop-hdfs-client -> /usr/hdp/2.3.0.0-2557/hadoop-hdfs
15:40  hadoop-hdfs-datanode -> /usr/hdp/2.3.0.0-2557/hadoop-hdfs
15:40  hadoop-hdfs-journalnode -> /usr/hdp/2.3.0.0-2557/hadoop-hdfs
15:40  hadoop-hdfs-namenode -> /usr/hdp/2.3.0.0-2557/hadoop-hdfs
15:40  hadoop-hdfs-nfs3 -> /usr/hdp/2.3.0.0-2557/hadoop-hdfs
15:40  hadoop-hdfs-portmap -> /usr/hdp/2.3.0.0-2557/hadoop-hdfs
15:40  hadoop-hdfs-secondarynamenode -> /usr/hdp/2.3.0.0-2557/hadoop-hdfs
15:40  hadoop-httpfs -> /usr/hdp/2.3.0.0-2557/hadoop-httpfs
15:40  hadoop-mapreduce-client -> /usr/hdp/2.3.0.0-2557/hadoop-mapreduce
15:40  hadoop-mapreduce-historyserver -> /usr/hdp/2.3.0.0-2557/hadoop-mapreduce
15:40  hadoop-yarn-client -> /usr/hdp/2.3.0.0-2557/hadoop-yarn
15:40  hadoop-yarn-nodemanager -> /usr/hdp/2.3.0.0-2557/hadoop-yarn
15:40  hadoop-yarn-resourcemanager -> /usr/hdp/2.3.0.0-2557/hadoop-yarn
15:40  hadoop-yarn-timelineserver -> /usr/hdp/2.3.0.0-2557/hadoop-yarn
15:40  hbase-client -> /usr/hdp/2.3.0.0-2557/hbase
15:40  hbase-master -> /usr/hdp/2.3.0.0-2557/hbase
15:40  hbase-regionserver -> /usr/hdp/2.3.0.0-2557/hbase
15:40  hive-client -> /usr/hdp/2.3.0.0-2557/hive
15:40  hive-metastore -> /usr/hdp/2.3.0.0-2557/hive
15:40  hive-server2 -> /usr/hdp/2.3.0.0-2557/hive
15:40  hive-webhcat -> /usr/hdp/2.3.0.0-2557/hive-hcatalog
15:40  kafka-broker -> /usr/hdp/2.3.0.0-2557/kafka
15:40  knox-server -> /usr/hdp/2.3.0.0-2557/knox
15:40  mahout-client -> /usr/hdp/2.3.0.0-2557/mahout
15:40  oozie-client -> /usr/hdp/2.3.0.0-2557/oozie
15:40  oozie-server -> /usr/hdp/2.3.0.0-2557/oozie
15:40  phoenix-client -> /usr/hdp/2.3.0.0-2557/phoenix
15:40  phoenix-server -> /usr/hdp/2.3.0.0-2557/phoenix
15:40  pig-client -> /usr/hdp/2.3.0.0-2557/pig
15:40  ranger-admin -> /usr/hdp/2.3.0.0-2557/ranger-admin
15:40  ranger-kms -> /usr/hdp/2.3.0.0-2557/ranger-kms
15:40  ranger-usersync -> /usr/hdp/2.3.0.0-2557/ranger-usersync
15:40  slider-client -> /usr/hdp/2.3.0.0-2557/slider
15:40  spark-client -> /usr/hdp/2.3.0.0-2557/spark
15:40  spark-historyserver -> /usr/hdp/2.3.0.0-2557/spark
15:40  sqoop-client -> /usr/hdp/2.3.0.0-2557/sqoop
15:40  sqoop-server -> /usr/hdp/2.3.0.0-2557/sqoop
15:40  storm-client -> /usr/hdp/2.3.0.0-2557/storm
15:40  storm-nimbus -> /usr/hdp/2.3.0.0-2557/storm
15:40  storm-slider-client -> /usr/hdp/2.3.0.0-2557/storm-slider-client
15:40  storm-supervisor -> /usr/hdp/2.3.0.0-2557/storm
15:40  tez-client -> /usr/hdp/2.3.0.0-2557/tez
15:40  zookeeper-client -> /usr/hdp/2.3.0.0-2557/zookeeper
15:40  zookeeper-server -> /usr/hdp/2.3.0.0-2557/zookeeper

hdp-select status ranger-admin
ranger-admin - 2.3.0.0-2557
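
Two more checks that may help (a sketch; hive-client and hive-server2
follow the hdp-select naming above): confirm where the active Hive conf
symlink resolves, since a dangling link would produce exactly this kind
of FileNotFoundException:

# Resolve the active Hive conf dir and list what is really in it.
readlink -f /usr/hdp/current/hive-client/conf
ls -al "$(readlink -f /usr/hdp/current/hive-client/conf)"
hdp-select status hive-server2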



---------- Forwarded message ----------
From: Gautam Borad <[email protected]>
Date: 2015-12-15 12:24 GMT+01:00
Subject: Re: RangerHivePlugin: FileNotFoundException - xasecure-audit.xml
To: [email protected]


Hi Lukas,
    If you provide answers to the questions below, it would help in
debugging the core issue.

   1. What version of Ambari are you using?
   2. It seems the HDP version is 2.3. Is this a fresh installation, or an
   upgrade of HDP from 2.2 -> 2.3?
   3. Can you paste the output of:
      - ls -al /etc/hadoop/
      - ls -al /etc/hive
      - ls -al /etc
   4. Check whether HDP is still pointing to the older version (only if you
   upgraded from 2.2 to 2.3); a combined collection script is sketched
   after this list:
      - ls -l /usr/hdp/current
      - hdp-select status ranger-admin
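
For convenience, a sketch of a script (assuming a standard HDP box; the
log file name is arbitrary) that gathers all of the above in one go:

#!/usr/bin/env bash
# Collect the diagnostics requested above into a single log file.
{
  echo "== ls -al /etc/hadoop/";            ls -al /etc/hadoop/
  echo "== ls -al /etc/hive";               ls -al /etc/hive
  echo "== ls -al /etc";                    ls -al /etc
  echo "== ls -l /usr/hdp/current";         ls -l /usr/hdp/current
  echo "== hdp-select status ranger-admin"; hdp-select status ranger-admin
} > ranger-diagnostics.log 2>&1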



On Tue, Dec 15, 2015 at 3:57 PM, lukas nalezenec <[email protected]>
wrote:

> Hi,
>
> When I install the Hive Ranger plugin, I get a FileNotFoundException
> whenever the plugin is activated.
>
> I am already using the HDFS Ranger plugin and it works well, although it
> may use the same code (that plugin also extends RangerBasePlugin).
>
>
> Is it possible that the bug is similar to
> https://issues.apache.org/jira/browse/AMBARI-13564 ?
>
>
> 15/12/14 13:30:36 [main]: ERROR ql.Driver: FAILED: RuntimeException
> java.io.FileNotFoundException: /etc/hive/2.3.0.0-2557/0/xasecure-audit.xml
> (No such file or directory)
>
> java.lang.RuntimeException: java.io.FileNotFoundException:
> /etc/hive/2.3.0.0-2557/0/xasecure-audit.xml (No such file or directory)
>         at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2639)
>         at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2502)
>         at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2405)
>         at org.apache.ranger.authorization.hadoop.config.RangerConfiguration.initAudit(RangerConfiguration.java:120)
>         at org.apache.ranger.plugin.service.RangerBasePlugin.init(RangerBasePlugin.java:89)
>         at org.apache.ranger.authorization.hive.authorizer.RangerHivePlugin.init(RangerHiveAuthorizer.java:960)
>         at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.<init>(RangerHiveAuthorizer.java:100)
>         at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizerFactory.createHiveAuthorizer(RangerHiveAuthorizerFactory.java:37)
>         at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:734)
>         at org.apache.hadoop.hive.ql.session.SessionState.getAuthorizationMode(SessionState.java:1504)
>         at org.apache.hadoop.hive.ql.session.SessionState.isAuthorizationModeV2(SessionState.java:1515)
>         at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:566)
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:468)
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
>         at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1170)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
>         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
>         at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: java.io.FileNotFoundException:
> /etc/hive/2.3.0.0-2557/0/xasecure-audit.xml (No such file or directory)
>         at java.io.FileInputStream.open0(Native Method)
>         at java.io.FileInputStream.open(FileInputStream.java:195)
>         at java.io.FileInputStream.<init>(FileInputStream.java:138)
>         at java.io.FileInputStream.<init>(FileInputStream.java:93)
>         at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
>         at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
>         at java.net.URL.openStream(URL.java:1038)
>         at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2468)
>         at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2536)
>         ... 29 more
>
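
One more quick check (a sketch; adjust the search roots as needed):
locate every copy of xasecure-audit.xml on the box and compare its
location with the path in the trace above:

# Find where the Ranger audit config actually landed, if anywhere.
find /etc /usr/hdp -name 'xasecure-audit.xml' 2>/dev/null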



-- 
Regards,
Gautam.
