There is a limit on the number of job counters you can have -- 120 is the
default. Are you sure you need more than that?

If so, you can raise the limit with the mapreduce.job.counters.limit setting
(named mapreduce.job.counters.max in Hadoop 2).
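As a sketch (not verified against your cluster -- the property name is
mapreduce.job.counters.max on Hadoop 2/YARN and mapreduce.job.counters.limit
on Hadoop 1, and the value 500 is just an example), you can set it from a Pig
script before any jobs are launched:

```
-- Raise the per-job counter limit; must take effect before the job starts.
-- Property name assumed for Hadoop 2 (mapreduce.job.counters.max);
-- older clusters use mapreduce.job.counters.limit instead.
SET mapreduce.job.counters.max 500;
```

Note that the limit may also be enforced on the cluster side (for example by
the job history server reading its own configuration), so a client-side SET
alone may not be enough; you may need to raise it in mapred-site.xml on the
cluster as well.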

On Mon, Apr 27, 2015 at 6:07 AM, 李运田 <[email protected]> wrote:

> I am using RANK to add the IDs now, but I always get the error "
>  FATAL [AsyncDispatcher event handler]
> org.apache.hadoop.yarn.event.AsyncDispatcher: Error in dispatcher thread
> org.apache.hadoop.mapreduce.counters.LimitExceededException: Too many
> counters: 121 max=120
>         at
> org.apache.hadoop.mapreduce.counters.Limits.checkCounters(Limits.java:103)
>         at
> org.apache.hadoop.mapreduce.counters.Limits.incrCounters(Limits.java:110)
>         at
> org.apache.hadoop.mapreduce.counters.AbstractCounterGroup.addCounter(AbstractCounterGroup.java:78)
>         at
> org.apache.hadoop.mapreduce.counters.AbstractCounterGroup.addCounterImpl(AbstractCounterGroup.java:95)
> "
> I have found many suggested solutions on the web, but none of them resolved
> it. Perhaps there is something wrong with my settings for properties like
> mapreduce.job.counters.max,
> mapreduce.job.counters.group.name.max,
> mapreduce.job.counters.counter.name.max, and
> mapreduce.job.counters.groups.max.
> Can you give me some advice?
>
> At 2015-04-24 21:56:20, "Alex Nastetsky" <[email protected]>
> wrote:
> >Have you looked at the RANK function?
> >https://pig.apache.org/docs/r0.11.0/basic.html#rank
> >
> >On Fri, Apr 24, 2015 at 5:15 AM, 李运田 <[email protected]> wrote:
> >
> >> I have a large dataset, about 10 TB. I want to add a row number to every
> >> row, from 1 to COUNT(my data). I used the two approaches described in
> >> http://stackoverflow.com/questions/9288578/how-can-i-add-row-numbers-for-rows-in-pig-or-hive
> >> and http://www.aiuxian.com/article/p-139530.html
> >> but I always get many output files, and the numbering in each file starts
> >> over from 0 -- probably because there are many reducers. I don't know how
> >> to get row numbers from 1 to COUNT(my data).
> >> Can you give me some advice, or point me to some websites that can?
> >> Thank you very much.
