[ https://issues.apache.org/jira/browse/HIVE-984?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12797890#action_12797890 ]
Namit Jain commented on HIVE-984:
---------------------------------
{noformat}
[junit] at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1161)
[junit] at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1029)
[junit] at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:979)
[junit] at org.apache.hadoop.conf.Configuration.set(Configuration.java:404)
[junit] at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:308)
[junit] at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:318)
[junit] at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:243)
[junit] at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:332)
[junit] at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:312)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.<init>(QTestUtil.java:173)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.setUp(TestCliDriver.java:39)
[junit] at junit.framework.TestCase.runBare(TestCase.java:125)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:118)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:208)
[junit] at junit.framework.TestSuite.run(TestSuite.java:203)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: java.io.FileNotFoundException: /data/users/njain/hive_commit1/hive_commit1/build/hadoopcore/hadoop-0.20.0/conf/core-site.xml (Too many open files)
[junit] at java.io.FileInputStream.open(Native Method)
[junit] at java.io.FileInputStream.<init>(FileInputStream.java:106)
[junit] at java.io.FileInputStream.<init>(FileInputStream.java:66)
[junit] at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:70)
[junit] at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:161)
[junit] at com.sun.org.apache.xerces.internal.impl.XMLEntityManager.setupCurrentEntity(XMLEntityManager.java:653)
[junit] at com.sun.org.apache.xerces.internal.impl.XMLVersionDetector.determineDocVersion(XMLVersionDetector.java:186)
[junit] at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:771)
[junit] at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
[junit] at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:107)
[junit] at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:225)
[junit] at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:283)
[junit] at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:180)
[junit] at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1078)
[junit] ... 20 more
[junit] Exception: java.io.FileNotFoundException: /data/users/njain/hive_commit1/hive_commit1/build/hadoopcore/hadoop-0.20.0/conf/core-site.xml (Too many open files)
{noformat}
I am getting a lot of errors like this one.
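The "Too many open files" message means the test JVM has hit its per-process file descriptor limit while Configuration re-parses core-site.xml for each new HiveConf built in TestCliDriver.setUp. A minimal diagnostic that could be called from the test setup to confirm descriptor exhaustion; FdCheck and logOpenFds are hypothetical names, not part of the Hive code base, and the cast only works on a Sun/Unix JDK:
{noformat}
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

import com.sun.management.UnixOperatingSystemMXBean;

// Hypothetical diagnostic helper: print how many file descriptors the test JVM
// currently holds, and the per-process limit, so a leak shows up as a count
// that climbs from test to test.
public class FdCheck {
  public static void logOpenFds(String label) {
    OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
    if (os instanceof UnixOperatingSystemMXBean) {
      UnixOperatingSystemMXBean unixOs = (UnixOperatingSystemMXBean) os;
      System.err.println(label + ": open fds = "
          + unixOs.getOpenFileDescriptorCount()
          + " / limit = " + unixOs.getMaxFileDescriptorCount());
    }
  }
}
{noformat}
If the count grows with every test, raising the shell's open-file limit (ulimit -n) only postpones the failure; something is holding the configuration or jar streams open across HiveConf instances.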
> Building Hive occasionally fails with Ivy error: hadoop#core;0.20.1!hadoop.tar.gz(source): invalid md5:
> -------------------------------------------------------------------------------------------------------
>
> Key: HIVE-984
> URL: https://issues.apache.org/jira/browse/HIVE-984
> Project: Hadoop Hive
> Issue Type: Bug
> Components: Build Infrastructure
> Reporter: Carl Steinbach
> Assignee: Carl Steinbach
> Attachments: HIVE-984.2.patch, HIVE-984.patch
>
>
> Folks keep running into this problem when building Hive from source:
> {noformat}
> [ivy:retrieve]
> [ivy:retrieve] :: problems summary ::
> [ivy:retrieve] :::: WARNINGS
> [ivy:retrieve] [FAILED ] hadoop#core;0.20.1!hadoop.tar.gz(source): invalid md5: expected=hadoop-0.20.1.tar.gz: computed=719e169b7760c168441b49f405855b72 (138662ms)
> [ivy:retrieve] [FAILED ] hadoop#core;0.20.1!hadoop.tar.gz(source): invalid md5: expected=hadoop-0.20.1.tar.gz: computed=719e169b7760c168441b49f405855b72 (138662ms)
> [ivy:retrieve] ==== hadoop-resolver: tried
> [ivy:retrieve] http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
> [ivy:retrieve] ::::::::::::::::::::::::::::::::::::::::::::::
> [ivy:retrieve] :: FAILED DOWNLOADS ::
> [ivy:retrieve] :: ^ see resolution messages for details ^ ::
> [ivy:retrieve] ::::::::::::::::::::::::::::::::::::::::::::::
> [ivy:retrieve] :: hadoop#core;0.20.1!hadoop.tar.gz(source)
> [ivy:retrieve] ::::::::::::::::::::::::::::::::::::::::::::::
> [ivy:retrieve]
> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
> {noformat}
> The problem appears to lie with (a) the Hive build scripts, (b) Ivy, or (c) archive.apache.org; the checksum sketch after this description can help tell these apart.
> Besides fixing the actual bug, one other option worth considering is to add the Hadoop jars to the Hive source repository.
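One way to narrow down whether archive.apache.org or the local download is at fault is to hash the hadoop-0.20.1.tar.gz that Ivy actually fetched and compare it by hand against the .md5 file published next to the tarball on archive.apache.org. A minimal sketch, assuming the path to the downloaded tarball is passed as the first argument; Md5Check is just an illustrative name and is not part of the Hive build:
{noformat}
import java.io.FileInputStream;
import java.io.InputStream;
import java.security.MessageDigest;

// Hypothetical standalone check: stream the downloaded tarball through an MD5
// digest and print the hex hash for manual comparison against
// http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz.md5
public class Md5Check {
  public static void main(String[] args) throws Exception {
    MessageDigest md5 = MessageDigest.getInstance("MD5");
    InputStream in = new FileInputStream(args[0]);
    try {
      byte[] buf = new byte[8192];
      int n;
      while ((n = in.read(buf)) != -1) {
        md5.update(buf, 0, n);
      }
    } finally {
      in.close();
    }
    StringBuilder hex = new StringBuilder();
    for (byte b : md5.digest()) {
      hex.append(String.format("%02x", b));
    }
    System.out.println(hex);
  }
}
{noformat}
If the computed hash matches the published .md5, the download itself is intact and the problem is in how the build or Ivy verifies it; if it does not match, the archive or a truncated download is to blame.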