[jira] [Updated] (KYLIN-3407) java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/**/hive/lib/hive-catalog-core not found

2018-06-13 Thread Rahul Midha (JIRA)


[ https://issues.apache.org/jira/browse/KYLIN-3407?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Rahul Midha updated KYLIN-3407:
---
Description: 
While creating a cube in Kylin I am getting the error

"java.io.FileNotFoundException: File does not exist:
hdfs://localhost:9000/**/hive/lib/hive-hcatalog-core.jar", although the file
is present at that path. Also, when I remove that file from the path, the same
error is reported for some other jar file.

My versions are Hadoop 2.7.3, Hive 2.3.3, HBase 1.1.1, Kylin 2.3.1.
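
To see where the job actually resolves that path, here is a minimal diagnostic sketch (not Kylin code; the class name is just for illustration, and the jar path is the one from the stack trace below). It qualifies the scheme-less path against the configured default file system: with fs.defaultFS set to hdfs://localhost:9000, a path given without a scheme is looked up on HDFS rather than on the local disk, which would explain the error if the jar is only present locally.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WhereIsTheJar {
    public static void main(String[] args) throws Exception {
        // Jar path as it appears in the error, without a scheme.
        Path jar = new Path("/home/dir/hive/lib/hive-hcatalog-core.jar");

        // Picks up core-site.xml, including fs.defaultFS (hdfs://localhost:9000 here).
        Configuration conf = new Configuration();

        FileSystem defaultFs = FileSystem.get(conf);     // the default (HDFS) file system
        FileSystem localFs = FileSystem.getLocal(conf);  // the local file system

        // A scheme-less path is qualified against the default file system.
        System.out.println("Resolved as:     " + defaultFs.makeQualified(jar));
        System.out.println("Exists on HDFS:  " + defaultFs.exists(jar));
        System.out.println("Exists locally:  " + localFs.exists(jar));
    }
}

If the local check prints true and the HDFS check prints false, the jar exists only on the local disk, which would match the stack trace below.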

 

 The error is 

java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/home/dir/hive/lib/hive-hcatalog-core.jar
 at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1072)
 at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
 at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
 at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
 at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
 at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
 at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:99)
 at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
 at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
 at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
 at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
 at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
 at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:422)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
 at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
 at org.apache.kylin.engine.mr.common.AbstractHadoopJob.waitForCompletion(AbstractHadoopJob.java:175)
 at org.apache.kylin.storage.hbase.steps.CubeHFileJob.run(CubeHFileJob.java:110)
 at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:130)
 at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:162)
 at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:67)
 at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:162)
 at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:300)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
 at java.lang.Thread.run(Thread.java:748)
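
In the trace, JobSubmitter.copyAndConfigureFiles has ClientDistributedCacheManager call getFileStatus on every file configured for the distributed cache (the jars Kylin adds for the MR job), so each of those jars has to be readable at its resolved URI before the job can be submitted. One possible workaround, sketched under the assumption that the resolved URI from the trace is the one your setup uses, is to copy the local Hive jar to that same location on HDFS (paths are the ones from this report; adjust them to your installation):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class UploadHiveJar {
    public static void main(String[] args) throws Exception {
        // fs.defaultFS = hdfs://localhost:9000 is picked up from core-site.xml.
        Configuration conf = new Configuration();
        FileSystem hdfs = FileSystem.get(conf);

        // Local jar and the HDFS location the job resolves it to.
        Path local = new Path("/home/dir/hive/lib/hive-hcatalog-core.jar");
        Path remote = new Path("hdfs://localhost:9000/home/dir/hive/lib/hive-hcatalog-core.jar");

        hdfs.mkdirs(remote.getParent());        // create the target directory
        hdfs.copyFromLocalFile(local, remote);  // upload the jar
        System.out.println("Exists on HDFS after upload: " + hdfs.exists(remote));
    }
}

The same copy can of course be done with the hdfs dfs command line; the point is only that every jar on the job's distributed-cache list must exist at the URI it resolves to.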


  was:
While creating a cube in Kylin I am getting the error

"java.io.FileNotFoundException: File does not exist:
hdfs://localhost:9000/**/hive/lib/hive-hcatalog-core.jar", although the file
is present at that path. Also, when I remove that file from the path, the same
error is reported for some other jar file.

My versions are Hadoop 2.7.3, Hive 2.3.3, HBase 1.1.1, Kylin 2.3.1.

> java.io.FileNotFoundException: File does not exist: 
> hdfs://localhost:9000/**/hive/lib/hive-catalog-core not found
> -
>
> Key: KYLIN-3407
> URL: https://issues.apache.org/jira/browse/KYLIN-3407
> Project: Kylin
>  Issue Type: Bug
>  Components: Job Engine
>Reporter: Rahul Midha
>Priority: Major
>
> While creating a cube in Kylin I am getting the error 
> "java.io.FileNotFoundException: File does not exist: 
> hdfs://localhost:9000/**/hive/lib/hive-hcatalog-core.jar", although the file 
> is present at that path. Also, when I remove that file from the path, the same 
> error is reported for some other jar file. 
> My versions are Hadoop 2.7.3, Hive 2.3.3, HBase 1.1.1, Kylin 2.3.1.
>  
>  The error is 
> java.io.FileNotFoundException: File does not exist: 
> hdfs://localhost:9000/home/dir/hive/lib/hive-hcatalog-core.jar
>  at 
> org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1072)
>  at 
> org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
>  at 
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>  at 
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
>  at 
> 

[jira] [Created] (KYLIN-3407) java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/**/hive/lib/hive-catalog-core not found

2018-06-12 Thread Rahul Midha (JIRA)
Rahul Midha created KYLIN-3407:
--

 Summary: java.io.FileNotFoundException: File does not exist: 
hdfs://localhost:9000/**/hive/lib/hive-catalog-core not found
 Key: KYLIN-3407
 URL: https://issues.apache.org/jira/browse/KYLIN-3407
 Project: Kylin
  Issue Type: Bug
  Components: Job Engine
Reporter: Rahul Midha


While creating a cube in Kylin I am getting the error

"java.io.FileNotFoundException: File does not exist:
hdfs://localhost:9000/**/hive/lib/hive-hcatalog-core.jar", although the file
is present at that path. Also, when I remove that file from the path, the same
error is reported for some other jar file.

My versions are Hadoop 2.7.3, Hive 2.3.3, HBase 1.1.1, Kylin 2.3.1.


--
This message was sent by Atlassian JIRA
(v7.6.3#76005)