Yang, I currently don't have an environment in which to debug the issue; I
will try to look into it next week. However, when I check the biggest
segment of the cube with the following command:

sudo -u hdfs hadoop fs -cat \
  /tmp/kylin-3c3159c6-012f-497d-826a-65dc9926442e/test/fact_distinct_columns/mydate \
  | sort -nr -k 1 | wc -l

it returns 517, i.e. the number of distinct days. And of course, as
confirmed, the content of mydate is perfectly normal and regular, like:
2015-03-01
2015-02-28
2015-02-27
2015-02-26
2015-02-25
2015-02-24
2015-02-23
2015-02-22
2015-02-21
2015-02-20
2015-02-19
2015-02-18
2015-02-17
2015-02-16
2015-02-15
2015-02-14
2015-02-13
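
The dates above look well-formed, so it is worth bounding the suspicious id
3652427 discussed in the quoted reply below. Here is a minimal sketch,
assuming the dictionary encodes a date string as its day offset from
0000-01-01 in the proleptic Gregorian calendar; this is an assumption, and
the authoritative scheme (plus the names used here, which are hypothetical)
lives in Kylin's DateStrDictionary source:

```python
from datetime import date

# Python's date type starts at year 1; in the proleptic Gregorian calendar
# year 0 exists and is a leap year, contributing 366 extra days.
DAYS_IN_YEAR_ZERO = 366

def assumed_id(d: date) -> int:
    """Hypothetical id: days from 0000-01-01 to d, with 0000-01-01 -> 0."""
    return DAYS_IN_YEAR_ZERO + (d.toordinal() - date(1, 1, 1).toordinal())

max_id = assumed_id(date(9999, 12, 31))
print(max_id)            # largest id reachable by a legal yyyy-MM-dd date
print(3652427 > max_id)  # True: 3652427 lies beyond 9999-12-31
```

Under this assumed encoding the largest legal id is 3652424, so an id of
3652427 can only come from a date past 9999-12-31 or from a malformed date
string, which is consistent with Yang's note that his fix caps the biggest
date at 9999-12-31.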


2015-03-20 10:38 GMT+08:00 Li Yang <[email protected]>:

> Hi Dong, id 3652427 is illegal. After my fix, the biggest date is
> 9999-12-31. Hope your analysis won't go beyond that point of time. :-)
>
> For your merge problem, you still need to dig into why your data
> generates the illegal ID. You can look at DateStrDictionaryTest.java for
> details of what's supported and what's not.
>
> Once the data is fixed, refresh the impacted segment so the dictionary is
> restored to a correct state; then the merge will work.
>
> Cheers
> Yang
>
> On Tue, Mar 17, 2015 at 10:09 AM, dong wang <[email protected]>
> wrote:
>
> > Yang, another thing is that I'm not sure whether the value of
> > "id" (=3652427) is correct or not. If it is legal, then there may be a
> > problem with the function mentioned above; if it is illegal, there may
> > be a problem in the logic that generates the id.
> >
>
