Okay, so our people here looked at this further. In the Hadoop source, that
exception is created and logged every time the Configuration constructor is
called, so by itself it means nothing.

We're in a crunch to get this up and running today and would really
appreciate some help:

The console hangs on any query that triggers a map/reduce job (so "select *
from table" works, but that's it). From what I can tell, any query that tries
to submit a job to Hadoop hangs; everything else works, including creating
tables, dropping them, and importing and exporting data.

A simple query we can try is

*select count(1) from english;*

This hangs, with example output below.

Other very simple queries that trigger a Map/Reduce do the same.

Thanks!

hive> select count(1) from english;
Total MapReduce jobs = 2
Number of reduce tasks not specified. Defaulting to jobconf value of: 16
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
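As an aside, the three knobs mentioned in that output can be set straight from
the CLI before running the query. The values below are purely illustrative,
not what we actually use:

```sql
-- Pin an explicit reducer count (overrides the jobconf default of 16)
set mapred.reduce.tasks=4;

-- Or let Hive derive the count: cap it, and set target input bytes per reducer
set hive.exec.reducers.max=8;
set hive.exec.reducers.bytes.per.reducer=1000000000;

select count(1) from english;
```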
The logs have nothing useful that I can see after that. Instead, here is a
stack dump of the CLI threads, captured with jstack, that may be of interest.
Additionally, the client no longer responds to Ctrl-D for shutdown, and only
stops with the more forceful Ctrl-C.
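For completeness, this is roughly how the dump was captured. The exact
commands here are our own sketch; jps and jstack ship with the JDK, and the
Hive CLI shows up as a RunJar process:

```shell
# The Hive CLI runs via org.apache.hadoop.util.RunJar, so find its PID
# with jps and dump every thread's stack with jstack.
pid=$(jps -l | awk '/RunJar/ {print $1}')
jstack "$pid" > hive-cli-threads.txt
```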

"main" prio=10 tid=0x0000000055e70000 nid=0x4605 in Object.wait() [0x0000000040208000..0x0000000040209ec0]
   java.lang.Thread.State: WAITING (on object monitor)
    at java.lang.Object.wait(Native Method)
    - waiting on <0x00002aaadef20cc8> (a org.apache.hadoop.ipc.Client$Call)
    at java.lang.Object.wait(Object.java:485)
    at org.apache.hadoop.ipc.Client.call(Client.java:710)
    - locked <0x00002aaadef20cc8> (a org.apache.hadoop.ipc.Client$Call)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
    at org.apache.hadoop.mapred.$Proxy1.getNewJobId(Unknown Source)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:725)
    at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:391)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:238)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:306)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
    at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)

   Locked ownable synchronizers:
    - None

"IPC Client (47) connection to /10.3.0.66:55554 from an unknown user" daemon prio=10 tid=0x00002aaaf9038000 nid=0x464a runnable [0x000000004171e000..0x000000004171ea00]
   java.lang.Thread.State: RUNNABLE
    at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
    at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:215)
    at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:65)
    at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:69)
    - locked <0x00002aaab416d6a8> (a sun.nio.ch.Util$1)
    - locked <0x00002aaab416d690> (a java.util.Collections$UnmodifiableSet)
    - locked <0x00002aaab416d300> (a sun.nio.ch.EPollSelectorImpl)
    at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:80)
    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:260)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:150)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:123)
    at java.io.FilterInputStream.read(FilterInputStream.java:116)
    at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:273)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:237)
    - locked <0x00002aaadeec6288> (a java.io.BufferedInputStream)
    at java.io.DataInputStream.readInt(DataInputStream.java:370)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:500)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:442)

   Locked ownable synchronizers:
    - None
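Reading the dump: the main thread is parked in Object.wait() inside
Client.call, i.e. the getNewJobId RPC was sent and the CLI is waiting
indefinitely for the JobTracker's reply, while the IPC reader thread sits idle
in epollWait. Hadoop's Client$Call is essentially a wait/notify handshake; here
is a simplified, self-contained sketch (our own illustration, not the actual
Hadoop source) of why a response that never arrives blocks the caller forever:

```java
// Simplified sketch of the wait/notify pattern behind Client.call: the
// calling thread blocks in wait() until the connection thread delivers a
// response and calls notifyAll(). If no response ever arrives (and no
// timeout fires), the caller waits forever -- matching the hang above.
public class CallDemo {
    static class Call {
        private Object value;
        private boolean done;

        synchronized void setValue(Object v) { // invoked by the reader thread
            value = v;
            done = true;
            notifyAll();
        }

        synchronized Object await() throws InterruptedException {
            while (!done) {
                wait(); // "waiting on <...> (a org.apache.hadoop.ipc.Client$Call)"
            }
            return value;
        }
    }

    public static void main(String[] args) throws Exception {
        Call call = new Call();
        // Simulate the IPC reader thread delivering a response later.
        Thread reader = new Thread(() -> {
            try { Thread.sleep(100); } catch (InterruptedException e) { }
            call.setValue("jobid-42");
        });
        reader.start();
        System.out.println(call.await()); // prints jobid-42
        reader.join();
    }
}
```

One plausible reading, given this pattern: if the JobTracker never answers the
call (a wrong mapred.job.tracker address, or a version mismatch that makes it
drop the request), the loop never exits and the CLI hangs exactly as shown.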




On Mon, Feb 23, 2009 at 9:00 PM, hc busy <[email protected]> wrote:

>
> Okay, great! I got some help from Zheng and got past that point. But now,
> instead of throwing that exception, the Hive CLI hangs with the Hadoop
> cluster idle. We went in and looked at the logs, and here's what they say:
>
> 2009-02-23 20:55:49,706 DEBUG parse.SemanticAnalyzer
> (SemanticAnalyzer.java:genSelectPlan(1120)) - genSelectPlan: input =
> {((tok_function count 1),0: bigint)}
> 2009-02-23 20:55:49,707 DEBUG parse.SemanticAnalyzer
> (SemanticAnalyzer.java:genSelectPlan(1196)) - Created Select Plan for
> clause: insclause-0 row schema: null{(_c0,0: bigint)}
> 2009-02-23 20:55:49,707 DEBUG serde2.MetadataTypedColumnsetSerDe
> (MetadataTypedColumnsetSerDe.java:initialize(121)) -
> org.apache.hadoop.hive.serde2.MetadataTypedColumnsetSerDe: initialized with
> columnNames: [_c0] and separator code=1 lastColumnTakesRest=false
> splitLimit=-1
> 2009-02-23 20:55:49,707 DEBUG parse.SemanticAnalyzer
> (SemanticAnalyzer.java:genFileSinkPlan(2092)) - Created FileSink Plan for
> clause: insclause-0dest_path: /tmp/hive-candiru/490583327.10000 row schema:
> null{(_c0,0: bigint)}
> 2009-02-23 20:55:49,708 DEBUG parse.SemanticAnalyzer
> (SemanticAnalyzer.java:genBodyPlan(2784)) - Created Body Plan for Query
> Block null
> 2009-02-23 20:55:49,708 DEBUG parse.SemanticAnalyzer
> (SemanticAnalyzer.java:genPlan(3072)) - Created Plan for Query Block null
> 2009-02-23 20:55:49,710 INFO  parse.SemanticAnalyzer
> (SemanticAnalyzer.java:analyzeInternal(3266)) - Completed partition pruning
> 2009-02-23 20:55:49,710 INFO  parse.SemanticAnalyzer
> (SemanticAnalyzer.java:analyzeInternal(3270)) - Completed sample pruning
> 2009-02-23 20:55:49,720 DEBUG conf.Configuration
> (Configuration.java:<init>(157)) - java.io.IOException: config()
>         at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:157)
>         at org.apache.hadoop.hive.ql.exec.ExecDriver.<clinit>(ExecDriver.java:128)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at java.lang.Class.newInstance0(Class.java:355)
>         at java.lang.Class.newInstance(Class.java:308)
>         at org.apache.hadoop.hive.ql.exec.TaskFactory.get(TaskFactory.java:101)
>         at org.apache.hadoop.hive.ql.exec.TaskFactory.get(TaskFactory.java:116)
>         at org.apache.hadoop.hive.ql.optimizer.GenMRTableScan1.process(GenMRTableScan1.java:54)
>         at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>         at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>         at org.apache.hadoop.hive.ql.parse.GenMapRedWalker.walk(GenMapRedWalker.java:51)
>         at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>         at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genMapRedTasks(SemanticAnalyzer.java:3202)
>         at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3277)
>         at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:306)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>         at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>
> 2009-02-23 20:55:49,741 DEBUG optimizer.GenMapRedUtils
> (GenMapRedUtils.java:setTaskPlan(209)) - Adding
> hdfs://machine:port/user/hive/warehouse/english of tableenglish
> 2009-02-23 20:55:49,741 DEBUG optimizer.GenMapRedUtils
> (GenMapRedUtils.java:setTaskPlan(216)) - Information added for path
> hdfs://machine:port/user/hive/warehouse/english
> 2009-02-23 20:55:49,742 DEBUG optimizer.GenMapRedUtils
> (GenMapRedUtils.java:setTaskPlan(221)) - Created Map Work for english
> 2009-02-23 20:55:49,743 INFO  parse.SemanticAnalyzer
> (SemanticAnalyzer.java:analyzeInternal(3279)) - Completed plan generation
> 2009-02-23 20:55:49,744 DEBUG conf.Configuration
> (Configuration.java:<init>(171)) - java.io.IOException: config(config)
>         at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:171)
>         at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:130)
>         at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:140)
>         at org.apache.hadoop.hive.ql.exec.ExecDriver.initialize(ExecDriver.java:88)
>         at org.apache.hadoop.hive.ql.exec.Task.initialize(Task.java:84)
>         at org.apache.hadoop.hive.ql.exec.ExecDriver.initialize(ExecDriver.java:87)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:215)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:306)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>         at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>
> On Mon, Feb 23, 2009 at 7:57 PM, hc busy <[email protected]> wrote:
>
>>
>> Also, we tried branch-0.2 and trunk against Hadoop 0.18.2 and 0.18.3. We
>> saw a few earlier notes relating this error to building against the proper
>> Hadoop version, but that didn't seem to help here. What else could be
>> causing this?
>>
>>
>>
>>
>> On Mon, Feb 23, 2009 at 7:48 PM, hc busy <[email protected]> wrote:
>>
>>> *Setting*:
>>> Hadoop 0.18.3, on several nodes that are able to run MR jobs.
>>> Hive trunk built with "ant -Dtarget.dir=/hive -Dhadoop.version='0.18.3'
>>> package", and then deployed by copying build/dist/* to /hive;
>>> $HADOOP_HOME, $HADOOP, and $HIVE_HOME are all configured correctly.
>>>
>>> I imported a list of English words into a table called english. It is a
>>> table with one string column, and 'select * from english;' works fine,
>>> BUT!! the following fails. Can anybody help?
>>>
>>> .
>>> .
>>> .
>>> courtesan
>>> courtesanry
>>> courtesans
>>> courtesanship
>>> courtesied
>>> courtesies
>>> courtesy
>>> courtesy
>>> Time taken: 4.584 seconds
>>> *hive*> *select count(1) from english;*
>>>
>>> Total MapReduce jobs = 2
>>> Number of reduce tasks not specified. Defaulting to jobconf value of: 16
>>> In order to change the average load for a reducer (in bytes):
>>>   set hive.exec.reducers.bytes.per.reducer=<number>
>>> In order to limit the maximum number of reducers:
>>>   set hive.exec.reducers.max=<number>
>>> In order to set a constant number of reducers:
>>>   set mapred.reduce.tasks=<number>
>>> java.lang.AbstractMethodError: org.apache.hadoop.hive.ql.io.HiveInputFormat.validateInput(Lorg/apache/hadoop/mapred/JobConf;)V
>>>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:735)
>>>         at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:391)
>>>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:238)
>>>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>>>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>>>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:306)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>         at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>         at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>
>>>
>>
>
