Jerry has created a JIRA ticket for this issue:
https://issues.apache.org/jira/browse/KYLIN-953
It seems this is not an isolated case.
Could you please provide details of your environment?
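Since a jar version mismatch is suspected, one quick way to check your environment is to print which jar a suspect class was actually loaded from on the job server. This is only a sketch (the `WhichJar` helper is my own, not part of Kylin); on a real cluster you would pass the class from the stack trace, e.g. `org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2`:

```java
// Hypothetical helper (not part of Kylin): report where a class was loaded
// from, to spot a mismatched Hadoop/HBase jar on the classpath.
public class WhichJar {

    static String locationOf(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        // JDK bootstrap classes have no code source; everything else reports its jar.
        return src == null ? "(bootstrap classloader)" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // On the Kylin job server, pass the class from the stack trace, e.g.
        //   java WhichJar org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2
        String target = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(target + " loaded from " + locationOf(target));
    }
}
```

If the printed jar path is not the HBase version you expect, the line numbers in the stack trace will not match the source you are reading, which would explain the confusion below.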

Li Yang <[email protected]>于2015年8月11日周二 下午5:35写道:

> What's your Hadoop & HBase version?
>
> The line at
>
> org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(HFileOutputFormat2.java:545)
>
> is
>
> Path partitionsPath = new Path("/tmp", "partitions_" + UUID.randomUUID());
>
> which cannot produce any null value. My feeling is that this is a mismatch
> of software versions.
>
> On Mon, Aug 10, 2015 at 6:14 PM, liam <[email protected]> wrote:
>
> > Hi, all
> >
> > I got an error in [#14 Step Name: Convert Cuboid Data to HFile] when
> > building the "sample" cube.
> > I guess maybe I am missing some configuration for the "MR output path"?
> >
> >
> > --------------------------- Error message ---------------------------
> >
> > [pool-5-thread-2]:[2015-08-10 17:54:34,630][ERROR][org.apache.kylin.job.hadoop.cube.CubeHFileJob.run(CubeHFileJob.java:98)] - error in CubeHFileJob
> >
> > java.lang.IllegalArgumentException: Can not create a Path from a null string
> >     at org.apache.hadoop.fs.Path.checkPathArg(Path.java:123)
> >     at org.apache.hadoop.fs.Path.<init>(Path.java:135)
> >     at org.apache.hadoop.fs.Path.<init>(Path.java:89)
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(HFileOutputFormat2.java:545)
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:394)
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat.configureIncrementalLoad(HFileOutputFormat.java:88)
> >     at org.apache.kylin.job.hadoop.cube.CubeHFileJob.run(CubeHFileJob.java:89)
> >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> >     at org.apache.kylin.job.common.MapReduceExecutable.doWork(MapReduceExecutable.java:112)
> >     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:106)
> >     at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:50)
> >     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:106)
> >     at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:133)
> >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >     at java.lang.Thread.run(Thread.java:745)
> >
> > [pool-5-thread-2]:[2015-08-10 17:54:34,641][ERROR][org.apache.kylin.job.common.MapReduceExecutable.doWork(MapReduceExecutable.java:115)] - error execute MapReduceExecutable{id=01a048d5-c385-4f53-a00a-267e2aeabb49-13, name=Convert Cuboid Data to HFile, state=RUNNING}
> >
> > java.lang.IllegalArgumentException: Can not create a Path from a null string
> >     at org.apache.hadoop.fs.Path.checkPathArg(Path.java:123)
> >     at org.apache.hadoop.fs.Path.<init>(Path.java:135)
> >     at org.apache.hadoop.fs.Path.<init>(Path.java:89)
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(HFileOutputFormat2.java:545)
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:394)
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat.configureIncrementalLoad(HFileOutputFormat.java:88)
> >     at org.apache.kylin.job.hadoop.cube.CubeHFileJob.run(CubeHFileJob.java:89)
> >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> >     at org.apache.kylin.job.common.MapReduceExecutable.doWork(MapReduceExecutable.java:112)
> >     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:106)
> >     at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:50)
> >     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:106)
> >     at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:133)
> >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >     at java.lang.Thread.run(Thread.java:745)
> >
>
-- 
Best Regards
ZhouQianhao