This is the exception I found in the JobTracker log file.

2007-10-15 11:20:49,378 FATAL org.apache.hadoop.conf.Configuration: error parsing conf file: java.io.FileNotFoundException: /home/jaya/hadoop-0.13.0/conf/hadoop-default.xml (Too many open files)
2007-10-15 11:20:49,379 FATAL org.apache.hadoop.mapred.JobTracker: java.lang.RuntimeException: java.io.FileNotFoundException: /home/jaya/hadoop-0.13.0/conf/hadoop-default.xml (Too many open files)
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:562)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:481)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:462)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:224)
        at org.apache.hadoop.mapred.JobConf.getSystemDir(JobConf.java:127)
        at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:623)
        at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:106)
        at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:1732)
Caused by: java.io.FileNotFoundException: /home/jaya/hadoop-0.13.0/conf/hadoop-default.xml (Too many open files)
        at java.io.FileInputStream.open(Native Method)
        at java.io.FileInputStream.<init>(FileInputStream.java:106)
        at java.io.FileInputStream.<init>(FileInputStream.java:66)
        at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:70)
        at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:161)
        at com.sun.org.apache.xerces.internal.impl.XMLEntityManager.setupCurrentEntity(XMLEntityManager.java:653)
        at com.sun.org.apache.xerces.internal.impl.XMLVersionDetector.determineDocVersion(XMLVersionDetector.java:186)
        at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:771)
        at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
        at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:107)
        at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:225)
        at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:283)
        at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:180)
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:506)
        ... 7 more
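
The "(Too many open files)" part of the trace suggests the JobTracker process has run out of file descriptors, which is why even re-reading hadoop-default.xml fails. A rough way to confirm and work around it (just a sketch, assuming a Linux box and that the daemons run under your user; <jobtracker-pid> is a placeholder for whatever pid jps reports for the JobTracker):

    # current per-process limit on open files (often only 1024 by default)
    $ ulimit -n

    # how many descriptors the JobTracker is actually holding
    # (replace <jobtracker-pid> with the pid shown by jps)
    $ lsof -p <jobtracker-pid> | wc -l

    # raise the limit in the shell that starts the daemons, then restart them
    $ ulimit -n 16384
    $ bin/stop-all.sh
    $ bin/start-all.sh

If the default limit really is the bottleneck, it can be raised permanently in /etc/security/limits.conf; if the limit was already generous, the lsof output should show what is being leaked.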


cpreethi wrote:
> 
> I have been using Hadoop version 0.13.0.
> 
> Thanks
> Preethi.C
> 
> 
> 
> Arun C Murthy wrote:
>> 
>> On Sun, Oct 14, 2007 at 08:26:27PM -0700, cpreethi wrote:
>>>
>>>Hi,
>>>
>>>This is the error that I get. Kindly look into it and suggest how I should
>>>go about this.
>>>Thanks in advance
>>>
>>>[EMAIL PROTECTED] bin]$ ./hadoop jar
>>>/home/jaya/hadoop-0.13.0/hadoop-0.13.0-examples.jar wordcount input output
>>>java.net.SocketTimeoutException: timed out waiting for rpc response
>> 
>> Looks like your JobClient cannot connect to the JobTracker to submit your
>> job.
>> Please check the JobTracker logs (and NameNode logs if the JobTracker is
>> fine) for exceptions etc.
>> 
>> Also, what version of hadoop are you running?
>> 
>> Arun
>> 
>>>        at org.apache.hadoop.ipc.Client.call(Client.java:473)
>>>        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:165)
>>>        at $Proxy1.getProtocolVersion(Unknown Source)
>>>        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:249)
>>>        at org.apache.hadoop.mapred.JobClient.init(JobClient.java:208)
>>>        at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:200)
>>>        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:528)
>>>        at org.apache.hadoop.examples.WordCount.main(WordCount.java:148)
>>>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>        at java.lang.reflect.Method.invoke(Method.java:597)
>>>        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:69)
>>>        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:140)
>>>        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:40)
>>>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>        at java.lang.reflect.Method.invoke(Method.java:597)
>>>        at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>
>>>
>>>Thanks,
>>>Preethi.C
>>>
>>>
>>>
>>>Rajagopal Natarajan-2 wrote:
>>>> 
>>>> On 10/10/07, cpreethi <[EMAIL PROTECTED]> wrote:
>>>>>
>>>>> I have a Hadoop cluster of three machines. When a wordcount example was
>>>>> submitted, it was working fine.
>>>>> Now when I submit a job I get a socket error and the job does not run.
>>>>>
>>>>> Any reason why this is happening?
>>>> 
>>>> 
>>>> Can you paste the error message?
>>>> 
>>>> 
>>>> -- 
>>>> N. Rajagopal,
>>>> Visit me at http://users.kaski-net.net/~raj/
>>>> 
>>>> 
>>>
>> 
>> 
> 
> 
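
For the original SocketTimeoutException further up the thread: once the descriptor limit is sorted out and the daemons have been restarted, it is worth confirming that the JobTracker is actually up and listening before resubmitting the job. A minimal check (again only a sketch; 54311 below is an assumed example port, the real value is whatever host:port mapred.job.tracker points to in conf/hadoop-site.xml):

    # the JobTracker JVM should be listed here
    $ jps

    # something should be listening on the JobTracker RPC port
    # (54311 is only an example; use the port from mapred.job.tracker)
    $ netstat -tln | grep 54311

If nothing is listening, the client-side timeout is expected, and the JobTracker log is the place to look, as suggested above.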

