Hi Nikhil:
  Thanks for your suggestions. I checked our HBase cluster and it is in a
healthy state.

  I also tried to trace the problem through the exception stack, and I
found that the TimelineAuthenticator.java file was removed from Hadoop as
of 2.6. Reference:
https://jira.apache.org/jira/secure/attachment/12675639/YARN-2676.5.patch

  Our Hadoop cluster is version 2.7.3, and the Kylin release I downloaded
is apache-kylin-1.6.0-hbase1.x-bin.tar.gz, so I don't know why Kylin still
calls TimelineAuthenticator.java.
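  For anyone who wants to check which jar on the classpath still bundles
the removed class, a quick scan like the sketch below should work. The
lib directory used here is an assumption; point it at your own Kylin or
Hadoop installation.

```python
import glob
import os
import zipfile

def find_class_in_jars(lib_dir, class_name):
    """Return the jars under lib_dir that contain an entry for class_name."""
    hits = []
    for jar in sorted(glob.glob(os.path.join(lib_dir, "*.jar"))):
        try:
            with zipfile.ZipFile(jar) as zf:
                if any(entry.endswith(class_name + ".class")
                       for entry in zf.namelist()):
                    hits.append(jar)
        except zipfile.BadZipFile:
            pass  # skip corrupt or non-jar files
    return hits

if __name__ == "__main__":
    # Hypothetical install path -- adjust to where your Kylin libs live.
    for jar in find_class_in_jars("/opt/kylin/lib", "TimelineAuthenticator"):
        print(jar, "bundles TimelineAuthenticator")
```

If the scan reports a hit, that jar is shipping YARN client classes older
than your cluster's Hadoop 2.7.3, which would explain the mismatch.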

  Finally, we upgraded Kylin to apache-kylin-2.4.1-bin-hbase1x.tar.gz and
it works fine.


Nikhil Jain <[email protected]> 于2019年8月30日周五 下午1:05写道:

> Hello Alex,
>
>
>
> Please check below link, this might be helpful.
>
>
> http://apache-kylin.74782.x6.nabble.com/Extract-Fact-Table-Distinct-Columns-error-td10092.html
>
>
>
> Let me know if it helps!
>
>
>
> Best Regards,
>
> *Nikhil Jain*
>
>
>
>
>
> *From: *Alex Wang <[email protected]>
> *Reply-To: *"[email protected]" <[email protected]>
> *Date: *Friday, August 30, 2019 at 12:39 PM
> *To: *"[email protected]" <[email protected]>, "
> [email protected]" <[email protected]>
> *Subject: *Kylin turned on Kerberos, Error execute MapReduceExecutable
>
>
>
> Hi ALL:
>
> Environment: kylin 1.6
> Jdk: 1.8.0_74
> Hadoop: 2.7.3
> Hbase : 1.2.4
>
> I turned on Kerberos and set kylin.job.status.with.kerberos=true.
> klist info:
>
>
>
> Ticket cache: FILE:/tmp/krb5cc_500
> Default principal: kylin/[email protected]
>
> Valid starting       Expires              Service principal
> 08/30/2019 11:30:53  08/31/2019 11:30:53  krbtgt/[email protected]
> renew until 09/05/2019 16:09:51
>
>
>
> Then I start Kylin with /bin/kylin.sh start.
>
> When I build a cube, Kylin reports the following error. Can someone give
> me some advice?
>
>
>
> 2019-08-29 17:32:57,332 ERROR [pool-9-thread-2] common.MapReduceExecutable:127 : error execute MapReduceExecutable{id=7bf99a0e-8308-49e1-84ed-123bc9c49833-02, name=Extract Fact Table Distinct Columns, state=RUNNING}
> org.codehaus.jackson.map.exc.UnrecognizedPropertyException: Unrecognized field "Token" (Class org.apache.hadoop.yarn.api.records.timeline.TimelineDelegationTokenResponse), not marked as ignorable
>  at [Source: N/A; line: -1, column: -1] (through reference chain: org.apache.hadoop.yarn.api.records.timeline.TimelineDelegationTokenResponse["Token"])
> at org.codehaus.jackson.map.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:53)
> at org.codehaus.jackson.map.deser.StdDeserializationContext.unknownFieldException(StdDeserializationContext.java:267)
> at org.codehaus.jackson.map.deser.std.StdDeserializer.reportUnknownProperty(StdDeserializer.java:673)
> at org.codehaus.jackson.map.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:659)
> at org.codehaus.jackson.map.deser.BeanDeserializer.handleUnknownProperty(BeanDeserializer.java:1365)
> at org.codehaus.jackson.map.deser.BeanDeserializer._handleUnknown(BeanDeserializer.java:725)
> at org.codehaus.jackson.map.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:703)
> at org.codehaus.jackson.map.deser.BeanDeserializer.deserialize(BeanDeserializer.java:580)
> at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2704)
> at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1999)
> at org.apache.hadoop.yarn.client.api.impl.TimelineAuthenticator.validateAndParseResponse(TimelineAuthenticator.java:222)
> at org.apache.hadoop.yarn.client.api.impl.TimelineAuthenticator.getDelegationToken(TimelineAuthenticator.java:114)
> at org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl.getDelegationToken(TimelineClientImpl.java:167)
> at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.addTimelineDelegationToken(YarnClientImpl.java:275)
> at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.submitApplication(YarnClientImpl.java:221)
> at org.apache.hadoop.mapred.ResourceMgrDelegate.submitApplication(ResourceMgrDelegate.java:282)
> at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:289)
> at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:432)
> at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
> at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
> at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
> at org.apache.kylin.engine.mr.common.AbstractHadoopJob.waitForCompletion(AbstractHadoopJob.java:149)
> at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:108)
> at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:92)
> at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:120)
> at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:113)
> at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:57)
> at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:113)
> at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:136)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
>
>
>
>
>
> --
>
> Best
>


-- 
Best
