Re: SEVERE logs when installing Kylin-Hbase1.x on a HDP cluster

2016-09-17 Thread Li Yang
The download page now has Kylin 1.5.4 for HBase 1.x. I recommend trying that
version to see if it solves your problem.

http://kylin.apache.org/download/

On Mon, Sep 12, 2016 at 10:45 AM, udana pathirana  wrote:

> Greetings,
>
> I manually merged "1.5.x-HBase1.x" with "master" and built the binary
> package. Now I can log in to the UI. There are some other issues; let me
> create a new thread for those.
>
>
> On Mon, Sep 12, 2016 at 11:07 AM, udana pathirana <
> udana.pathir...@gmail.com> wrote:
>
>> Also, I see HBase 0.98 in the root pom.xml of the "master" branch,
>> so I can't use the "master" branch with HBase 1.x?
>>
>>   <properties>
>>     <!-- General Properties -->
>>     <javaVersion>1.7</javaVersion>
>>     <maven-model.version>3.3.9</maven-model.version>
>>     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>>     <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
>>
>>     <!-- Hadoop versions -->
>>     <hadoop2.version>2.6.0</hadoop2.version>
>>     <yarn.version>2.6.0</yarn.version>
>>
>>     <!-- Hive versions -->
>>     <hive.version>0.14.0</hive.version>
>>     <hive-hcatalog.version>0.14.0</hive-hcatalog.version>
>>
>>     <!-- HBase versions -->
>>     <hbase-hadoop2.version>0.98.8-hadoop2</hbase-hadoop2.version>
>>     <kafka.version>0.8.1</kafka.version>
>>
>> >>>
 Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
 MaxPermSize=128M; support was removed in 8.0
 SLF4J: Class path contains multiple SLF4J bindings.
 SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
 0/hbase-1.1.2.2.3.4.7-4/lib/slf4j-log4j12-1.7.10.jar!/org/sl
 f4j/impl/StaticLoggerBinder.class]
 SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
 0/hadoop-2.7.1.2.4.2.0-258/share/hadoop/common/lib/slf4j-log
 4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
 SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
 0/tez-0.7.0.2.4.2.0-258/lib/slf4j-log4j12-1.7.5.jar!/org/slf
 4j/impl/StaticLoggerBinder.class]
 SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-examples-1.6.1.2.4.
 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLo
 ggerBinder.class]
 SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-assembly-1.6.1.2.4.
 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLo
 ggerBinder.class]
 SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for
 an explanation.
 SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFac
 tory]
 usage: java org.apache.catalina.startup.Catalina [ -config
 {pathname} ] [ -nonaming ]  { -help | start | stop }
 Sep 06, 2016 1:47:29 AM org.apache.catalina.core.AprLifecycleListener
 lifecycleEvent
 INFO: The APR based Apache Tomcat Native library which allows
 optimal performance in production environments was not found on the
 java.library.path: /home/udana/hdp_c5000/hadoop-2
 .7.1.2.4.2.0-258/lib/native
 Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
 INFO: Initializing ProtocolHandler ["http-bio-7070"]
 Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
 INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
 Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.Catalina load
 INFO: Initialization processed in 545 ms
 Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardService
 startInternal
 INFO: Starting service Catalina
 Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardEngine
 startInternal
 INFO: Starting Servlet Engine: Apache Tomcat/7.0.69
 Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.HostConfig
 deployWAR
 INFO: Deploying web application archive
 /home/udana/apache-kylin-1.5.3-HBase1.x-bin/tomcat/webapps/k
 ylin.war
 Sep 06, 2016 1:47:30 AM org.apache.tomcat.util.scan.StandardJarScanner
 scan
 WARNING: Failed to scan [file:/home/udana/hdp_c5000/ha
 doop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar] from
 classloader hierarchy
 java.io.FileNotFoundException: /home/udana/hdp_c5000/hadoop-2
 .7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar (No such file or
 directory)
 at java.util.zip.ZipFile.open(Native Method)
 at java.util.zip.ZipFile.<init>(ZipFile.java:219)
 at java.util.zip.ZipFile.<init>(ZipFile.java:149)
 at java.util.jar.JarFile.<init>(JarFile.java:166)
 at java.util.jar.JarFile.<init>(JarFile.java:103)
 at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
 at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.ja
 va:69)
 at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.j
 ava:99)
 at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConn
 ection.java:122)
 at sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLC
 onnection.java:89)
 at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
 at 

Re: SEVERE logs when installing Kylin-Hbase1.x on a HDP cluster

2016-09-11 Thread udana pathirana
Greetings,

I manually merged "1.5.x-HBase1.x" with "master" and built the binary
package. Now I can log in to the UI. There are some other issues; let me
create a new thread for those.


On Mon, Sep 12, 2016 at 11:07 AM, udana pathirana  wrote:

> Also, I see HBase 0.98 in the root pom.xml of the "master" branch,
> so I can't use the "master" branch with HBase 1.x?
>
>   <properties>
>     <!-- General Properties -->
>     <javaVersion>1.7</javaVersion>
>     <maven-model.version>3.3.9</maven-model.version>
>     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>     <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
>
>     <!-- Hadoop versions -->
>     <hadoop2.version>2.6.0</hadoop2.version>
>     <yarn.version>2.6.0</yarn.version>
>
>     <!-- Hive versions -->
>     <hive.version>0.14.0</hive.version>
>     <hive-hcatalog.version>0.14.0</hive-hcatalog.version>
>
>     <!-- HBase versions -->
>     <hbase-hadoop2.version>0.98.8-hadoop2</hbase-hadoop2.version>
>     <kafka.version>0.8.1</kafka.version>
>
> >>
>>> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
>>> MaxPermSize=128M; support was removed in 8.0
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>>> 0/hbase-1.1.2.2.3.4.7-4/lib/slf4j-log4j12-1.7.10.jar!/org/sl
>>> f4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>>> 0/hadoop-2.7.1.2.4.2.0-258/share/hadoop/common/lib/slf4j-log
>>> 4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>>> 0/tez-0.7.0.2.4.2.0-258/lib/slf4j-log4j12-1.7.5.jar!/org/slf
>>> 4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>>> 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-examples-1.6.1.2.4.
>>> 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLo
>>> ggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>>> 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-assembly-1.6.1.2.4.
>>> 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLo
>>> ggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>> usage: java org.apache.catalina.startup.Catalina [ -config
>>> {pathname} ] [ -nonaming ]  { -help | start | stop }
>>> Sep 06, 2016 1:47:29 AM org.apache.catalina.core.AprLifecycleListener
>>> lifecycleEvent
>>> INFO: The APR based Apache Tomcat Native library which allows
>>> optimal performance in production environments was not found on the
>>> java.library.path: /home/udana/hdp_c5000/hadoop-2
>>> .7.1.2.4.2.0-258/lib/native
>>> Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
>>> INFO: Initializing ProtocolHandler ["http-bio-7070"]
>>> Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
>>> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
>>> Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.Catalina load
>>> INFO: Initialization processed in 545 ms
>>> Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardService
>>> startInternal
>>> INFO: Starting service Catalina
>>> Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardEngine
>>> startInternal
>>> INFO: Starting Servlet Engine: Apache Tomcat/7.0.69
>>> Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.HostConfig
>>> deployWAR
>>> INFO: Deploying web application archive
>>> /home/udana/apache-kylin-1.5.3-HBase1.x-bin/tomcat/webapps/kylin.war
>>> Sep 06, 2016 1:47:30 AM org.apache.tomcat.util.scan.StandardJarScanner
>>> scan
>>> WARNING: Failed to scan [file:/home/udana/hdp_c5000/ha
>>> doop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar] from
>>> classloader hierarchy
>>> java.io.FileNotFoundException: /home/udana/hdp_c5000/hadoop-2
>>> .7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar (No such file or
>>> directory)
>>> at java.util.zip.ZipFile.open(Native Method)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:219)
>>> at java.util.zip.ZipFile.<init>(ZipFile.java:149)
>>> at java.util.jar.JarFile.<init>(JarFile.java:166)
>>> at java.util.jar.JarFile.<init>(JarFile.java:103)
>>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.ja
>>> va:69)
>>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.j
>>> ava:99)
>>> at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConn
>>> ection.java:122)
>>> at sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLC
>>> onnection.java:89)
>>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactor
>>> y.java:34)
>>> at org.apache.catalina.startup.ContextConfig$FragmentJarScanner
>>> Callback.scan(ContextConfig.java:2679)
>>> at org.apache.tomcat.util.scan.StandardJarScanner.process(Stand
>>> ardJarScanner.java:259)
>>> at org.apache.tomcat.util.scan.StandardJarScanner.scan(Standard
>>> JarScanner.java:221)
>>> at 

Re: SEVERE logs when installing Kylin-Hbase1.x on a HDP cluster

2016-09-11 Thread udana pathirana
Also, I see HBase 0.98 in the root pom.xml of the "master" branch,
so I can't use the "master" branch with HBase 1.x?

  

  <properties>
    <!-- General Properties -->
    <javaVersion>1.7</javaVersion>
    <maven-model.version>3.3.9</maven-model.version>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>

    <!-- Hadoop versions -->
    <hadoop2.version>2.6.0</hadoop2.version>
    <yarn.version>2.6.0</yarn.version>

    <!-- Hive versions -->
    <hive.version>0.14.0</hive.version>
    <hive-hcatalog.version>0.14.0</hive-hcatalog.version>

    <!-- HBase versions -->
    <hbase-hadoop2.version>0.98.8-hadoop2</hbase-hadoop2.version>
    <kafka.version>0.8.1</kafka.version>

>
>> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
>> MaxPermSize=128M; support was removed in 8.0
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>> 0/hbase-1.1.2.2.3.4.7-4/lib/slf4j-log4j12-1.7.10.jar!/org/sl
>> f4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>> 0/hadoop-2.7.1.2.4.2.0-258/share/hadoop/common/lib/slf4j-log
>> 4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>> 0/tez-0.7.0.2.4.2.0-258/lib/slf4j-log4j12-1.7.5.jar!/org/slf
>> 4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>> 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-examples-1.6.1.2.4.
>> 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLo
>> ggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>> 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-assembly-1.6.1.2.4.
>> 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLo
>> ggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> usage: java org.apache.catalina.startup.Catalina [ -config
>> {pathname} ] [ -nonaming ]  { -help | start | stop }
>> Sep 06, 2016 1:47:29 AM org.apache.catalina.core.AprLifecycleListener
>> lifecycleEvent
>> INFO: The APR based Apache Tomcat Native library which allows optimal
>> performance in production environments was not found on the
>> java.library.path: /home/udana/hdp_c5000/hadoop-2
>> .7.1.2.4.2.0-258/lib/native
>> Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
>> INFO: Initializing ProtocolHandler ["http-bio-7070"]
>> Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
>> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
>> Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.Catalina load
>> INFO: Initialization processed in 545 ms
>> Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardService
>> startInternal
>> INFO: Starting service Catalina
>> Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardEngine
>> startInternal
>> INFO: Starting Servlet Engine: Apache Tomcat/7.0.69
>> Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.HostConfig
>> deployWAR
>> INFO: Deploying web application archive /home/udana/apache-kylin-1.5.3
>> -HBase1.x-bin/tomcat/webapps/kylin.war
>> Sep 06, 2016 1:47:30 AM org.apache.tomcat.util.scan.StandardJarScanner
>> scan
>> WARNING: Failed to scan [file:/home/udana/hdp_c5000/ha
>> doop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar] from
>> classloader hierarchy
>> java.io.FileNotFoundException: /home/udana/hdp_c5000/hadoop-2
>> .7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar (No such file or
>> directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:219)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:149)
>> at java.util.jar.JarFile.<init>(JarFile.java:166)
>> at java.util.jar.JarFile.<init>(JarFile.java:103)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
>> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.j
>> ava:99)
>> at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConn
>> ection.java:122)
>> at sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLC
>> onnection.java:89)
>> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
>> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactor
>> y.java:34)
>> at org.apache.catalina.startup.ContextConfig$FragmentJarScanner
>> Callback.scan(ContextConfig.java:2679)
>> at org.apache.tomcat.util.scan.StandardJarScanner.process(Stand
>> ardJarScanner.java:259)
>> at org.apache.tomcat.util.scan.StandardJarScanner.scan(Standard
>> JarScanner.java:221)
>> at org.apache.catalina.startup.ContextConfig.processJarsForWebF
>> ragments(ContextConfig.java:1915)
>> at org.apache.catalina.startup.ContextConfig.webConfig(ContextC
>> onfig.java:1270)
>> at org.apache.catalina.startup.ContextConfig.configureStart(Con
>> textConfig.java:887)
>> at org.apache.catalina.startup.ContextConfig.lifecycleEvent(Con
>> textConfig.java:387)
>> at 

Re: SEVERE logs when installing Kylin-Hbase1.x on a HDP cluster

2016-09-11 Thread udana pathirana
Thank you,

When I build the "master" branch, the Spring application fails to initialize,
throwing the following error:

Caused by: java.lang.NoSuchMethodError:
org.apache.hadoop.hbase.client.HBaseAdmin.<init>(Lorg/apache/hadoop/hbase/client/HConnection;)V
at
org.apache.kylin.storage.hbase.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:273)
at
org.apache.kylin.storage.hbase.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:265)
at
org.apache.kylin.rest.security.RealAclHBaseStorage.prepareHBaseTable(RealAclHBaseStorage.java:48)
at
org.apache.kylin.rest.security.MockAclHBaseStorage.prepareHBaseTable(MockAclHBaseStorage.java:53)
at org.apache.kylin.rest.service.AclService.init(AclService.java:121)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)



 In the Hadoop classpath, my hbase-client library is
 "hbase-client-1.1.2.2.3.4.7-4.jar".

 I don't see this error when I build the "1.5.x-HBase1.x" branch.
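
For context, a minimal, hypothetical sketch (not Kylin's own code; the class
and table names are made up) of the two admin APIs involved. The
NoSuchMethodError means the compiled code still calls the 0.98-style
HBaseAdmin(HConnection) constructor, which the error shows is absent from
hbase-client 1.1.x, whereas the HBase 1.x client expects an Admin obtained
from a Connection:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;

    // Sketch only: contrasts the HBase 1.x admin API with the removed
    // 0.98-style constructor named in the NoSuchMethodError above.
    public class HBaseAdminApiSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Admin admin = conn.getAdmin()) {
                // HBase 1.x: the Admin handle comes from the Connection.
                System.out.println(
                    admin.tableExists(TableName.valueOf("kylin_metadata")));

                // HBase 0.98 style (the exact signature in the error); this
                // constructor does not exist in hbase-client 1.1.x:
                //   HBaseAdmin admin098 = new HBaseAdmin(hconnection);
            }
        }
    }

That matches the behaviour described here: a master build compiled against
0.98.8-hadoop2 only fails once it calls the old constructor at runtime against
hbase-client-1.1.2, while the 1.5.x-HBase1.x build does not.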

On Mon, Sep 12, 2016 at 10:08 AM, ShaoFeng Shi 
wrote:

> The "1.5.4-rc1" branch is for HBase 0.98; it won't work with HBase 1.x.
>
> The 1.5.x-HBase1.x branch has been merged with the latest master branch;
> you can manually make a build from it.
>
> 2016-09-12 8:15 GMT+08:00 udana pathirana :
>
>> Thank you for the reply. According to the ticket, the issue has been
>> fixed.
>> I am using the 1.5.x-HBase1.x branch. Will it be fixed in all the branches?
>> Can I use the "1.5.4-rc1" branch with HBase 1.x?
>>
>> On Sun, Sep 11, 2016 at 9:23 PM, Li Yang  wrote:
>>
>>> You are hitting this issue: https://issues.apache.org/jira/browse/KYLIN-1963
>>>
>>> It is not very common and only happens in certain environments (with a
>>> certain version of log4j). It has been fixed and will be released very
>>> soon. Stay tuned.
>>>
>>> Cheers
>>> Yang
>>>
>>>
>>>
>>> On Wed, Sep 7, 2016 at 8:31 AM, udana pathirana <
>>> udana.pathir...@gmail.com> wrote:
>>>
 Greetings,

 Could someone give me some tips on resolving this issue, please?

 1) Do I have to fix the warning about the capacity-scheduler JARs?
 (According to our Hadoop admins, they cannot find this folder on any of the
 nodes. We are running Hortonworks 2.4.2.)
  "WARNING: Failed to process JAR [jar:file:/home/udana/hdp_c500
 0/hadoop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar!/] "

 2) How can I get more detailed logs for these errors
 for further investigation?
 3) I can successfully access the 'hadoop', 'hbase' and 'hive' commands from
 the client node. Do I need any other special settings to install Kylin?
 4) Our cluster is secured. Do I have to set any special Kerberos-related
 settings for Kylin? (I have keytab files and krb5.conf for the Hadoop client
 node.)

 Best regards,


 On Tue, Sep 6, 2016 at 10:53 AM, udana pathirana <
 udana.pathir...@gmail.com> wrote:

> Greetings,
>
> I am trying to install "apache-kylin-1.5.3-HBase1.x-bin" on one of
> the client nodes (an edge node).
> Our Hadoop cluster is HDP 2.4.2.
> The HBase version is 1.x, running on Slider.
> I can successfully access the hive, hbase, and hadoop commands from the
> client node.
>
> When I try to start Kylin I get the following SEVERE log messages, and
> I cannot see any UI when I access port 7070.
> Why am I getting these SEVERE logs?
> How can I see more detailed error logs? (I cannot find useful logs in the
> tomcat/logs folder.)
>
>
>
> --
>
> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
> MaxPermSize=128M; support was removed in 8.0
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
> 0/hbase-1.1.2.2.3.4.7-4/lib/slf4j-log4j12-1.7.10.jar!/org/sl
> f4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
> 0/hadoop-2.7.1.2.4.2.0-258/share/hadoop/common/lib/slf4j-log
> 4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
> 0/tez-0.7.0.2.4.2.0-258/lib/slf4j-log4j12-1.7.5.jar!/org/slf
> 4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
> 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-examples-1.6.1.2.4.
> 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLo
> ggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
> 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-assembly-1.6.1.2.4.
> 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLo
> ggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> usage: java org.apache.catalina.startup.Catalina [ -config {pathname}
> ] [ -nonaming ]  { -help | start | stop }
> Sep 06, 2016 1:47:29 AM 

Re: SEVERE logs when installing Kylin-Hbase1.x on a HDP cluster

2016-09-11 Thread ShaoFeng Shi
The "1.5.4-rc1" branch is for HBase 0.98; it won't work with HBase 1.x.

The 1.5.x-HBase1.x branch has been merged with the latest master branch; you
can manually make a build from it.

2016-09-12 8:15 GMT+08:00 udana pathirana :

> Thank you for the reply. According to the ticket, the issue has been fixed.
> I am using the 1.5.x-HBase1.x branch. Will it be fixed in all the branches?
> Can I use the "1.5.4-rc1" branch with HBase 1.x?
>
> On Sun, Sep 11, 2016 at 9:23 PM, Li Yang  wrote:
>
>> You are hitting this issue: https://issues.apache.org/jira/browse/KYLIN-1963
>>
>> It is not very common and only happens in certain environments (with a
>> certain version of log4j). It has been fixed and will be released very
>> soon. Stay tuned.
>>
>> Cheers
>> Yang
>>
>>
>>
>> On Wed, Sep 7, 2016 at 8:31 AM, udana pathirana <
>> udana.pathir...@gmail.com> wrote:
>>
>>> Greetings,
>>>
>>> Could someone give me some tips on resolving this issue, please?
>>>
>>> 1) Do I have to fix the warning about the capacity-scheduler JARs?
>>> (According to our Hadoop admins, they cannot find this folder on any of the
>>> nodes. We are running Hortonworks 2.4.2.)
>>>  "WARNING: Failed to process JAR [jar:file:/home/udana/hdp_c500
>>> 0/hadoop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar!/] "
>>>
>>> 2) How can I get more detailed logs for these errors
>>> for further investigation?
>>> 3) I can successfully access the 'hadoop', 'hbase' and 'hive' commands from
>>> the client node. Do I need any other special settings to install Kylin?
>>> 4) Our cluster is secured. Do I have to set any special Kerberos-related
>>> settings for Kylin? (I have keytab files and krb5.conf for the Hadoop client
>>> node.)
>>>
>>> Best regards,
>>>
>>>
>>> On Tue, Sep 6, 2016 at 10:53 AM, udana pathirana <
>>> udana.pathir...@gmail.com> wrote:
>>>
 Greetings,

 I am trying to install "apache-kylin-1.5.3-HBase1.x-bin" on one of the
 client nodes (an edge node).
 Our Hadoop cluster is HDP 2.4.2.
 The HBase version is 1.x, running on Slider.
 I can successfully access the hive, hbase, and hadoop commands from the
 client node.

 When I try to start Kylin I get the following SEVERE log messages, and
 I cannot see any UI when I access port 7070.
 Why am I getting these SEVERE logs?
 How can I see more detailed error logs? (I cannot find useful logs in the
 tomcat/logs folder.)



 --

 Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
 MaxPermSize=128M; support was removed in 8.0
 SLF4J: Class path contains multiple SLF4J bindings.
 SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
 0/hbase-1.1.2.2.3.4.7-4/lib/slf4j-log4j12-1.7.10.jar!/org/sl
 f4j/impl/StaticLoggerBinder.class]
 SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
 0/hadoop-2.7.1.2.4.2.0-258/share/hadoop/common/lib/slf4j-log
 4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
 SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
 0/tez-0.7.0.2.4.2.0-258/lib/slf4j-log4j12-1.7.5.jar!/org/slf
 4j/impl/StaticLoggerBinder.class]
 SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-examples-1.6.1.2.4.
 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLo
 ggerBinder.class]
 SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-assembly-1.6.1.2.4.
 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLo
 ggerBinder.class]
 SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
 explanation.
 SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
 usage: java org.apache.catalina.startup.Catalina [ -config {pathname}
 ] [ -nonaming ]  { -help | start | stop }
 Sep 06, 2016 1:47:29 AM org.apache.catalina.core.AprLifecycleListener
 lifecycleEvent
 INFO: The APR based Apache Tomcat Native library which allows optimal
 performance in production environments was not found on the
 java.library.path: /home/udana/hdp_c5000/hadoop-2
 .7.1.2.4.2.0-258/lib/native
 Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
 INFO: Initializing ProtocolHandler ["http-bio-7070"]
 Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
 INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
 Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.Catalina load
 INFO: Initialization processed in 545 ms
 Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardService
 startInternal
 INFO: Starting service Catalina
 Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardEngine
 startInternal
 INFO: Starting Servlet Engine: Apache Tomcat/7.0.69
 Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.HostConfig
 deployWAR
 INFO: Deploying web application archive /home/udana/apache-kylin-1.5.3
 

Re: SEVERE logs when installing Kylin-Hbase1.x on a HDP cluster

2016-09-11 Thread udana pathirana
Thank you for the reply. According to the ticket, the issue has been fixed.
I am using the 1.5.x-HBase1.x branch. Will it be fixed in all the branches?
Can I use the "1.5.4-rc1" branch with HBase 1.x?

On Sun, Sep 11, 2016 at 9:23 PM, Li Yang  wrote:

> You are hitting this issue: https://issues.apache.org/jira/browse/KYLIN-1963
>
> It is not very common and only happens in certain environments (with a
> certain version of log4j). It has been fixed and will be released very
> soon. Stay tuned.
>
> Cheers
> Yang
>
>
>
> On Wed, Sep 7, 2016 at 8:31 AM, udana pathirana  > wrote:
>
>> Greetings,
>>
>> Could someone give me some tips on resolving this issue, please?
>>
>> 1) Do I have to fix the warning about the capacity-scheduler JARs?
>> (According to our Hadoop admins, they cannot find this folder on any of the
>> nodes. We are running Hortonworks 2.4.2.)
>>  "WARNING: Failed to process JAR [jar:file:/home/udana/hdp_c500
>> 0/hadoop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar!/] "
>>
>> 2) How can I get more detailed logs for these errors
>> for further investigation?
>> 3) I can successfully access the 'hadoop', 'hbase' and 'hive' commands from the
>> client node. Do I need any other special settings to install Kylin?
>> 4) Our cluster is secured. Do I have to set any special Kerberos-related
>> settings for Kylin? (I have keytab files and krb5.conf for the Hadoop client
>> node.)
>>
>> Best regards,
>>
>>
>> On Tue, Sep 6, 2016 at 10:53 AM, udana pathirana <
>> udana.pathir...@gmail.com> wrote:
>>
>>> Greetings,
>>>
>>> I am trying to install "apache-kylin-1.5.3-HBase1.x-bin" on one of the
>>> client nodes (an edge node).
>>> Our Hadoop cluster is HDP 2.4.2.
>>> The HBase version is 1.x, running on Slider.
>>> I can successfully access the hive, hbase, and hadoop commands from the client node.
>>>
>>> When I try to start Kylin I get the following SEVERE log messages, and
>>> I cannot see any UI when I access port 7070.
>>> Why am I getting these SEVERE logs?
>>> How can I see more detailed error logs? (I cannot find useful logs in the
>>> tomcat/logs folder.)
>>>
>>>
>>>
>>> --
>>>
>>> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
>>> MaxPermSize=128M; support was removed in 8.0
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>>> 0/hbase-1.1.2.2.3.4.7-4/lib/slf4j-log4j12-1.7.10.jar!/org/sl
>>> f4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>>> 0/hadoop-2.7.1.2.4.2.0-258/share/hadoop/common/lib/slf4j-log
>>> 4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>>> 0/tez-0.7.0.2.4.2.0-258/lib/slf4j-log4j12-1.7.5.jar!/org/slf
>>> 4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>>> 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-examples-1.6.1.2.4.
>>> 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/Static
>>> LoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>>> 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-assembly-1.6.1.2.4.
>>> 2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/Static
>>> LoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>> usage: java org.apache.catalina.startup.Catalina [ -config {pathname} ]
>>> [ -nonaming ]  { -help | start | stop }
>>> Sep 06, 2016 1:47:29 AM org.apache.catalina.core.AprLifecycleListener
>>> lifecycleEvent
>>> INFO: The APR based Apache Tomcat Native library which allows optimal
>>> performance in production environments was not found on the
>>> java.library.path: /home/udana/hdp_c5000/hadoop-2
>>> .7.1.2.4.2.0-258/lib/native
>>> Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
>>> INFO: Initializing ProtocolHandler ["http-bio-7070"]
>>> Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
>>> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
>>> Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.Catalina load
>>> INFO: Initialization processed in 545 ms
>>> Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardService
>>> startInternal
>>> INFO: Starting service Catalina
>>> Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardEngine
>>> startInternal
>>> INFO: Starting Servlet Engine: Apache Tomcat/7.0.69
>>> Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.HostConfig deployWAR
>>> INFO: Deploying web application archive /home/udana/apache-kylin-1.5.3
>>> -HBase1.x-bin/tomcat/webapps/kylin.war
>>> Sep 06, 2016 1:47:30 AM org.apache.tomcat.util.scan.StandardJarScanner
>>> scan
>>> WARNING: Failed to scan [file:/home/udana/hdp_c5000/ha
>>> doop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar] from
>>> classloader hierarchy
>>> java.io.FileNotFoundException: /home/udana/hdp_c5000/hadoop-2
>>> .7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar (No such 

Re: SEVERE logs when installing Kylin-Hbase1.x on a HDP cluster

2016-09-11 Thread Li Yang
You are hitting this issue: https://issues.apache.org/jira/browse/KYLIN-1963

It is not very common and only happens in certain environments (with a certain
version of log4j). It has been fixed and will be released very soon. Stay tuned.

Cheers
Yang



On Wed, Sep 7, 2016 at 8:31 AM, udana pathirana 
wrote:

> Greetings,
>
> Could someone give me some tips on resolving this issue, please?
>
> 1) Do I have to fix the warning about the capacity-scheduler JARs? (According
> to our Hadoop admins, they cannot find this folder on any of the nodes. We
> are running Hortonworks 2.4.2.)
>  "WARNING: Failed to process JAR [jar:file:/home/udana/hdp_c500
> 0/hadoop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar!/] "
>
> 2) How can I get more detailed logs for these errors for further
> investigation?
> 3) I can successfully access the 'hadoop', 'hbase' and 'hive' commands from the
> client node. Do I need any other special settings to install Kylin?
> 4) Our cluster is secured. Do I have to set any special Kerberos-related
> settings for Kylin? (I have keytab files and krb5.conf for the Hadoop client
> node.)
>
> Best regards,
>
>
> On Tue, Sep 6, 2016 at 10:53 AM, udana pathirana <
> udana.pathir...@gmail.com> wrote:
>
>> Greetings,
>>
>> I am trying to install "apache-kylin-1.5.3-HBase1.x-bin" on one of the
>> client nodes (an edge node).
>> Our Hadoop cluster is HDP 2.4.2.
>> The HBase version is 1.x, running on Slider.
>> I can successfully access the hive, hbase, and hadoop commands from the client node.
>>
>> When I try to start Kylin I get the following SEVERE log messages, and
>> I cannot see any UI when I access port 7070.
>> Why am I getting these SEVERE logs?
>> How can I see more detailed error logs? (I cannot find useful logs in the
>> tomcat/logs folder.)
>>
>>
>>
>> --
>>
>> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
>> MaxPermSize=128M; support was removed in 8.0
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>> 0/hbase-1.1.2.2.3.4.7-4/lib/slf4j-log4j12-1.7.10.jar!/org/
>> slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>> 0/hadoop-2.7.1.2.4.2.0-258/share/hadoop/common/lib/slf4j-
>> log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>> 0/tez-0.7.0.2.4.2.0-258/lib/slf4j-log4j12-1.7.5.jar!/org/
>> slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>> 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-examples-1.6.1.
>> 2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/Stat
>> icLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/udana/hdp_c500
>> 0/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-assembly-1.6.1.
>> 2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/Stat
>> icLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> usage: java org.apache.catalina.startup.Catalina [ -config {pathname} ]
>> [ -nonaming ]  { -help | start | stop }
>> Sep 06, 2016 1:47:29 AM org.apache.catalina.core.AprLifecycleListener
>> lifecycleEvent
>> INFO: The APR based Apache Tomcat Native library which allows optimal
>> performance in production environments was not found on the
>> java.library.path: /home/udana/hdp_c5000/hadoop-2
>> .7.1.2.4.2.0-258/lib/native
>> Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
>> INFO: Initializing ProtocolHandler ["http-bio-7070"]
>> Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
>> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
>> Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.Catalina load
>> INFO: Initialization processed in 545 ms
>> Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardService
>> startInternal
>> INFO: Starting service Catalina
>> Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardEngine
>> startInternal
>> INFO: Starting Servlet Engine: Apache Tomcat/7.0.69
>> Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.HostConfig deployWAR
>> INFO: Deploying web application archive /home/udana/apache-kylin-1.5.3
>> -HBase1.x-bin/tomcat/webapps/kylin.war
>> Sep 06, 2016 1:47:30 AM org.apache.tomcat.util.scan.StandardJarScanner
>> scan
>> WARNING: Failed to scan [file:/home/udana/hdp_c5000/ha
>> doop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar] from
>> classloader hierarchy
>> java.io.FileNotFoundException: /home/udana/hdp_c5000/hadoop-2
>> .7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar (No such file or
>> directory)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:219)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:149)
>> at java.util.jar.JarFile.<init>(JarFile.java:166)
>> at java.util.jar.JarFile.<init>(JarFile.java:103)
>> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
>> at 

Re: SEVERE logs when installing Kylin-Hbase1.x on a HDP cluster

2016-09-06 Thread udana pathirana
Greetings,

Could someone give me some tips on resolving this issue, please?

1) Do I have to fix the warning about the capacity-scheduler JARs (see the
sketch after this list)? (According to our Hadoop admins, they cannot find
this folder on any of the nodes. We are running Hortonworks 2.4.2.)
 "WARNING: Failed to process JAR [jar:file:/home/udana/hdp_
c5000/hadoop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar!/] "

2) How can I get more detailed logs for these errors for further investigation?
3) I can successfully access the 'hadoop', 'hbase' and 'hive' commands from the
client node. Do I need any other special settings to install Kylin?
4) Our cluster is secured. Do I have to set any special Kerberos-related
settings for Kylin? (I have keytab files and krb5.conf for the Hadoop client
node.)
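
On question 1, a minimal, hypothetical sketch of what appears to be behind that
warning, assuming (as is common on HDP setups) that the Hadoop scripts leave the
literal, unexpanded glob contrib/capacity-scheduler/*.jar on the classpath;
Tomcat's jar scanner then tries to open that string as a real jar file and logs
the resulting FileNotFoundException. The path is copied from the log above; the
class name is made up:

    import java.io.File;
    import java.io.IOException;
    import java.util.jar.JarFile;

    // Hypothetical reproduction: opening a literal "*.jar" classpath entry
    // as a jar file fails the same way the Tomcat jar scanner reports.
    public class CapacitySchedulerGlobSketch {
        public static void main(String[] args) {
            String entry = "/home/udana/hdp_c5000/hadoop-2.7.1.2.4.2.0-258"
                    + "/contrib/capacity-scheduler/*.jar"; // literal '*', never expanded
            System.out.println("exists on disk: " + new File(entry).exists());
            try (JarFile jar = new JarFile(entry)) {
                System.out.println("entries: " + jar.size());
            } catch (IOException e) {
                // Same FileNotFoundException that appears as the WARNING above.
                e.printStackTrace();
            }
        }
    }

Since the directory does not even exist on the nodes, the scan warning itself
looks harmless; elsewhere in the thread the startup failure is attributed to
KYLIN-1963 rather than to this warning.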

Best regards,


On Tue, Sep 6, 2016 at 10:53 AM, udana pathirana 
wrote:

> Greetings,
>
> I am trying to install "apache-kylin-1.5.3-HBase1.x-bin" on one of the
> client nodes (an edge node).
> Our Hadoop cluster is HDP 2.4.2.
> The HBase version is 1.x, running on Slider.
> I can successfully access the hive, hbase, and hadoop commands from the client node.
>
> When I try to start Kylin I get the following SEVERE log messages, and
> I cannot see any UI when I access port 7070.
> Why am I getting these SEVERE logs?
> How can I see more detailed error logs? (I cannot find useful logs in the
> tomcat/logs folder.)
>
>
>
> --
>
> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
> MaxPermSize=128M; support was removed in 8.0
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/home/udana/hdp_
> c5000/hbase-1.1.2.2.3.4.7-4/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/
> StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/udana/hdp_
> c5000/hadoop-2.7.1.2.4.2.0-258/share/hadoop/common/lib/
> slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/udana/hdp_
> c5000/tez-0.7.0.2.4.2.0-258/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/
> StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/udana/hdp_
> c5000/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-examples-1.6.
> 1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/
> StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/udana/hdp_
> c5000/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-assembly-1.6.
> 1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/
> StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> usage: java org.apache.catalina.startup.Catalina [ -config {pathname} ] [
> -nonaming ]  { -help | start | stop }
> Sep 06, 2016 1:47:29 AM org.apache.catalina.core.AprLifecycleListener
> lifecycleEvent
> INFO: The APR based Apache Tomcat Native library which allows optimal
> performance in production environments was not found on the
> java.library.path: /home/udana/hdp_c5000/hadoop-
> 2.7.1.2.4.2.0-258/lib/native
> Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
> INFO: Initializing ProtocolHandler ["http-bio-7070"]
> Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
> INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
> Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.Catalina load
> INFO: Initialization processed in 545 ms
> Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardService
> startInternal
> INFO: Starting service Catalina
> Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardEngine
> startInternal
> INFO: Starting Servlet Engine: Apache Tomcat/7.0.69
> Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.HostConfig deployWAR
> INFO: Deploying web application archive /home/udana/apache-kylin-1.5.
> 3-HBase1.x-bin/tomcat/webapps/kylin.war
> Sep 06, 2016 1:47:30 AM org.apache.tomcat.util.scan.StandardJarScanner
> scan
> WARNING: Failed to scan [file:/home/udana/hdp_c5000/
> hadoop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar] from
> classloader hierarchy
> java.io.FileNotFoundException: /home/udana/hdp_c5000/hadoop-
> 2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar (No such file or
> directory)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:219)
> at java.util.zip.ZipFile.<init>(ZipFile.java:149)
> at java.util.jar.JarFile.<init>(JarFile.java:166)
> at java.util.jar.JarFile.<init>(JarFile.java:103)
> at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
> at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
> at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
> at sun.net.www.protocol.jar.JarURLConnection.connect(
> JarURLConnection.java:122)
> at sun.net.www.protocol.jar.JarURLConnection.getJarFile(
> JarURLConnection.java:89)
> at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
> at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
> at 

SEVERE logs when installing Kylin-Hbase1.x on a HDP cluster

2016-09-05 Thread udana pathirana
Greetings,

I am trying to install "apache-kylin-1.5.3-HBase1.x-bin" on one of the
client nodes (an edge node).
Our Hadoop cluster is HDP 2.4.2.
The HBase version is 1.x, running on Slider.
I can successfully access the hive, hbase, and hadoop commands from the client node.

When I try to start Kylin I get the following SEVERE log messages, and
I cannot see any UI when I access port 7070.
Why am I getting these SEVERE logs?
How can I see more detailed error logs? (I cannot find useful logs in the
tomcat/logs folder.)



--

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
MaxPermSize=128M; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/home/udana/hdp_c5000/hbase-1.1.2.2.3.4.7-4/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/home/udana/hdp_c5000/hadoop-2.7.1.2.4.2.0-258/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/home/udana/hdp_c5000/tez-0.7.0.2.4.2.0-258/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/home/udana/hdp_c5000/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-examples-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/home/udana/hdp_c5000/spark-2.10-1.6.1.2.4.2.0-258/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
usage: java org.apache.catalina.startup.Catalina [ -config {pathname} ] [
-nonaming ]  { -help | start | stop }
Sep 06, 2016 1:47:29 AM org.apache.catalina.core.AprLifecycleListener
lifecycleEvent
INFO: The APR based Apache Tomcat Native library which allows optimal
performance in production environments was not found on the
java.library.path: /home/udana/hdp_c5000/hadoop-2.7.1.2.4.2.0-258/lib/native
Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["http-bio-7070"]
Sep 06, 2016 1:47:30 AM org.apache.coyote.AbstractProtocol init
INFO: Initializing ProtocolHandler ["ajp-bio-9009"]
Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 545 ms
Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardService
startInternal
INFO: Starting service Catalina
Sep 06, 2016 1:47:30 AM org.apache.catalina.core.StandardEngine
startInternal
INFO: Starting Servlet Engine: Apache Tomcat/7.0.69
Sep 06, 2016 1:47:30 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive
/home/udana/apache-kylin-1.5.3-HBase1.x-bin/tomcat/webapps/kylin.war
Sep 06, 2016 1:47:30 AM org.apache.tomcat.util.scan.StandardJarScanner scan
WARNING: Failed to scan
[file:/home/udana/hdp_c5000/hadoop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar]
from classloader hierarchy
java.io.FileNotFoundException:
/home/udana/hdp_c5000/hadoop-2.7.1.2.4.2.0-258/contrib/capacity-scheduler/*.jar
(No such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:219)
at java.util.zip.ZipFile.<init>(ZipFile.java:149)
at java.util.jar.JarFile.<init>(JarFile.java:166)
at java.util.jar.JarFile.<init>(JarFile.java:103)
at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
at
sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
at
sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
at
org.apache.catalina.startup.ContextConfig$FragmentJarScannerCallback.scan(ContextConfig.java:2679)
at
org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:259)
at
org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:221)
at
org.apache.catalina.startup.ContextConfig.processJarsForWebFragments(ContextConfig.java:1915)
at
org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1270)
at
org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:887)
at
org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:387)
at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at
org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at
org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5472)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:899)
at