Re: User is not allowed to impersonate user1
Will do. Any idea why I get that error, though? I tried it on the Sun JDK and it gives me the same error.

On Thu, Jan 26, 2012 at 11:46 AM, Harsh J ha...@cloudera.com wrote:

Bai,

In that case, to get started quickly, try:

mvn install -DskipTests
mvn eclipse:eclipse

Then import all your required projects in.

On Thu, Jan 26, 2012 at 10:12 PM, Bai Shen baishen.li...@gmail.com wrote:

Yes. I need to do some debugging, so I'm trying to get Hadoop compiled so I can try some changes. Also, I'm still getting the error even when running mvn clean test. I'm using OpenJDK 1.6.0_22 on F16. Is there any other information needed?

On Thu, Jan 26, 2012 at 9:16 AM, Harsh J ha...@cloudera.com wrote:

What are you looking to do exactly? Are you looking to compile and get the project setup ready for Eclipse?

On Thu, Jan 26, 2012 at 7:26 PM, Bai Shen baishen.li...@gmail.com wrote:

Ah, oops. I forgot there was a dev list. Sorry. I ran mvn clean install -Pdist -Dtar -Ptest-patch as directed in the document. I'm not sure what the options do, though. I'll give mvn clean test a try.

On Thu, Jan 26, 2012 at 7:07 AM, Harsh J ha...@cloudera.com wrote:

Moving to common-dev@. I'm able to run all hadoop-hdfs-httpfs tests without failure if I do a mvn clean test under that directory. Can you try to clean and then re-run? It might have been a transient failure. Do you have the test logs? If you can reliably reproduce this test failure, please report it, along with your environment (JVM make/version specifically), to the Apache JIRA under the HDFS project. Note that if you want to build and go ahead with development without tests, pass -DskipTests to your Maven commands.

On Thu, Jan 26, 2012 at 3:02 AM, Bai Shen baishen.li...@gmail.com wrote:

I'm following the instructions in http://wiki.apache.org/hadoop/HowToContribute to build the Hadoop trunk. The build is failing the HttpFS tests with the above error: "User is not allowed to impersonate user1". Googling doesn't seem to provide any useful results. Is there something I'm missing in the build configuration? Thanks.

--
Harsh J
Customer Ops. Engineer, Cloudera
Re: User is not allowed to impersonate user1
I'm not sure, but please report the failure over an HDFS JIRA if you are able to reproduce that test failure reliably in your environment, with as much detail as you can provide, so that we can take a closer look. The whole httpfs set of tests seems to pass for me locally when I try (I ran them on OS X, though).

On Fri, Jan 27, 2012 at 7:15 PM, Bai Shen baishen.li...@gmail.com wrote:

Will do. Any idea why I get that error, though? I tried it on the Sun JDK and it gives me the same error.

--
Harsh J
Customer Ops. Engineer, Cloudera
[jira] [Created] (HADOOP-7998) CheckFileSystem does not correctly honor setVerifyChecksum
CheckFileSystem does not correctly honor setVerifyChecksum
----------------------------------------------------------

Key: HADOOP-7998
URL: https://issues.apache.org/jira/browse/HADOOP-7998
Project: Hadoop Common
Issue Type: Bug
Components: fs
Affects Versions: 0.23.0, 0.24.0
Reporter: Daryn Sharp
Assignee: Daryn Sharp

Regardless of the verify checksum flag, {{ChecksumFileSystem#open}} will instantiate a {{ChecksumFSInputChecker}} instead of a normal stream.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira
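To illustrate the bug report, here is a minimal, hypothetical sketch of what honoring the flag would look like: open() should consult the verifyChecksum setting instead of always constructing a verifying stream. The class and method names echo the real ChecksumFileSystem API, but the bodies and return values below are purely illustrative, not the actual Hadoop code.

```java
// Toy model: open() picks a verifying or raw stream based on the flag.
public class ChecksumOpenSketch {
    private boolean verifyChecksum = true;  // verification is on by default

    public void setVerifyChecksum(boolean verifyChecksum) {
        this.verifyChecksum = verifyChecksum;
    }

    // Returns which kind of stream open() would produce under the current flag.
    public String open(String path) {
        if (verifyChecksum) {
            return "ChecksumFSInputChecker";  // validates data against the .crc sidecar
        }
        return "raw stream";                  // bypasses checksum validation
    }
}
```

The reported bug is equivalent to open() ignoring the flag and always taking the first branch.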
[jira] [Created] (HADOOP-7999) hadoop archive fails with ClassNotFoundException
hadoop archive fails with ClassNotFoundException
------------------------------------------------

Key: HADOOP-7999
URL: https://issues.apache.org/jira/browse/HADOOP-7999
Project: Hadoop Common
Issue Type: Bug
Components: scripts
Affects Versions: 0.24.0, 0.23.1
Reporter: Jason Lowe
Assignee: Jason Lowe

Running hadoop archive from a command prompt results in this error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/tools/HadoopArchives
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.tools.HadoopArchives
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.tools.HadoopArchives. Program will exit.

The hadoop front-end script expects the TOOL_PATH environment variable to be set, but nothing provides a default value for it if it is not set. Since $TOOL_PATH expands to nothing, the hadoop-archives jar under share/hadoop/tools/lib isn't found, and we end up with a ClassNotFound exception.
[jira] [Created] (HADOOP-8000) fetchdt command not available in bin/hadoop
fetchdt command not available in bin/hadoop
-------------------------------------------

Key: HADOOP-8000
URL: https://issues.apache.org/jira/browse/HADOOP-8000
Project: Hadoop Common
Issue Type: Bug
Affects Versions: 0.24.0, 0.23.1
Reporter: Arpit Gupta
Assignee: Arpit Gupta

The fetchdt command needs to be added to bin/hadoop to allow for backwards compatibility.
[jira] [Created] (HADOOP-8001) ChecksumFileSystem's rename doesn't correctly handle checksum files
ChecksumFileSystem's rename doesn't correctly handle checksum files
-------------------------------------------------------------------

Key: HADOOP-8001
URL: https://issues.apache.org/jira/browse/HADOOP-8001
Project: Hadoop Common
Issue Type: Bug
Components: fs
Affects Versions: 0.23.0, 0.24.0
Reporter: Daryn Sharp
Assignee: Daryn Sharp

Rename will move the src file and its crc *if present* to the destination. If the src file has no crc, but the destination already exists with a crc, then the src will be associated with the old file's crc. Subsequent access to the file will fail with checksum errors.
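The scenario in the report can be sketched with a toy map-backed filesystem: when the source has no .crc sidecar but the destination already does, rename must delete the stale destination .crc, or later reads would validate the new data against the old file's checksum. This is a hypothetical model of the fix, not the actual ChecksumFileSystem code; names and paths are illustrative.

```java
import java.util.HashMap;
import java.util.Map;

// Toy filesystem: a map from path to contents, with ".<name>.crc" sidecars.
public class RenameCrcSketch {
    final Map<String, String> files = new HashMap<>();

    static String crcOf(String path) {
        return "." + path + ".crc";  // sidecar naming convention
    }

    public void rename(String src, String dst) {
        files.put(dst, files.remove(src));       // move the data file
        if (files.containsKey(crcOf(src))) {
            // source has a checksum: move it alongside the data
            files.put(crcOf(dst), files.remove(crcOf(src)));
        } else {
            // source has no checksum: drop any stale checksum at the destination
            files.remove(crcOf(dst));
        }
    }
}
```

Without the else branch, the renamed data would silently inherit the destination's old .crc, which is exactly the failure described above.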
[jira] [Created] (HADOOP-8002) SecurityUtil acquired token message should be a debug rather than info
SecurityUtil acquired token message should be a debug rather than info
----------------------------------------------------------------------

Key: HADOOP-8002
URL: https://issues.apache.org/jira/browse/HADOOP-8002
Project: Hadoop Common
Issue Type: Bug
Affects Versions: 0.24.0, 0.23.1
Reporter: Arpit Gupta
Assignee: Arpit Gupta
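The requested change amounts to demoting the log call from info to debug and guarding it so the message is only built when debug logging is on. The sketch below models that guard as a plain method; the method name, message text, and return-value convention are all hypothetical, not the real SecurityUtil code.

```java
// Toy model of an isDebugEnabled() guard around a debug-level log call:
// returns the message that would be emitted, or null when debug is off.
public class TokenLogSketch {
    public static String acquiredTokenMessage(String service, boolean debugEnabled) {
        if (!debugEnabled) {
            return null;  // skip both message construction and emission
        }
        return "Acquired token for service " + service;
    }
}
```

At the default info level the message simply disappears from the logs, which is the point of the issue: routine token acquisition is noise for operators.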