Thanks Eli.

I have resolvers=internal in my $HOME/build.properties file. Is that enough,
or should I also put -Dresolvers=internal on the command line?
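
In other words, is having the property in the file, i.e. something like

  # $HOME/build.properties
  resolvers=internal

equivalent to passing it on every invocation, e.g. (the target name here is
just an example):

  ant -Dresolvers=internal compile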

Thanks,
-Eric

-----Original Message-----
From: Eli Collins [mailto:e...@cloudera.com] 
Sent: Friday, August 12, 2011 12:06 PM
To: Eric Payne
Cc: hdfs-dev@hadoop.apache.org; Tom White
Subject: Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

You need to build hdfs with -Dresolvers=internal after running mvn
install -DskipTests in common.
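
Roughly something like this (<common> and <hdfs> are just placeholders for
wherever your checkouts live, and the ant target is only an example):

  cd <common>                       # your common checkout
  mvn install -DskipTests           # installs the common jar into the local maven repo
  cd <hdfs>                         # your hdfs checkout
  ant -Dresolvers=internal compile  # example target

With resolvers=internal the hdfs build should then pick up the common jar you
just installed locally instead of the last deployed snapshot.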

On Fri, Aug 12, 2011 at 9:51 AM, Eric Payne <er...@yahoo-inc.com> wrote:
> I'm seeing this error when I try to build a fresh checkout.
>
> I can get around it by removing the .m2 directory in my $HOME directory and
> then running 'mvn install -DskipTests' again in the trunk root.
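>
> Roughly what I did (paths abbreviated, <trunk root> is just a placeholder):
>
>   rm -rf $HOME/.m2              # wipe the local maven repo
>   cd <trunk root>               # fresh checkout
>   mvn install -DskipTests       # reinstall the artifacts locally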
>
> However, test-patch still gets the error and fails the 'system test 
> framework' build.
>
> -Eric
>
> -----Original Message-----
> From: Alejandro Abdelnur [mailto:t...@cloudera.com]
> Sent: Friday, August 12, 2011 12:41 AM
> To: Eli Collins
> Cc: hdfs-dev@hadoop.apache.org; Tom White
> Subject: Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing
>
> Eli,
>
> I think you are right, I'm pretty sure it is picking up the latest deployed
> snapshot.
>
> I'll discuss with Tom tomorrow morning how to take care of this (once HDFS
> is Mavenized we can easily build/use the latest bits from all modules, though
> we'll still need some tricks to avoid running every module's tests).
>
> Thxs.
>
> Alejandro
>
> On Thu, Aug 11, 2011 at 10:20 PM, Eli Collins <e...@cloudera.com> wrote:
>
>> Tucu and co - does hdfs build the latest common or does it try to
>> resolve against the latest deployed common artifact?
>> Looks like hudson-test-patch doesn't pick up on the latest common build.
>>
>>
>>
>> On Thu, Aug 11, 2011 at 10:11 PM, Apache Jenkins Server
>> <jenk...@builds.apache.org> wrote:
>> > See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/
>> >
>> >
>> ###################################################################################
>> > ########################## LAST 60 LINES OF THE CONSOLE
>> ###########################
>> > [...truncated 1273 lines...]
>> >     [iajc]                      ^^^^^^^
>> >     [iajc]
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117
>> [error] The method getDecodedPath(HttpServletRequest, String) is undefined
>> for the type ServletUtil
>> >     [iajc] final String path = ServletUtil.getDecodedPath(request,
>> "/data");
>> >     [iajc]                                 ^^^
>> >     [iajc]
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118
>> [error] The method getRawPath(HttpServletRequest, String) is undefined for
>> the type ServletUtil
>> >     [iajc] final String encodedPath = ServletUtil.getRawPath(request,
>> "/data");
>> >     [iajc]
>> >     [iajc]
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90
>> [error] The method getDecodedPath(HttpServletRequest, String) is undefined
>> for the type ServletUtil
>> >     [iajc] final String path = ServletUtil.getDecodedPath(request,
>> "/listPaths");
>> >     [iajc]                                 ^^^^^^^^^
>> >     [iajc]
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138
>> [error] The method getDecodedPath(HttpServletRequest, String) is undefined
>> for the type ServletUtil
>> >     [iajc] final String filePath = ServletUtil.getDecodedPath(request,
>> "/listPaths");
>> >     [iajc]                                     ^^^^^^^^^
>> >     [iajc]
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65
>> [error] The method getDecodedPath(HttpServletRequest, String) is undefined
>> for the type ServletUtil
>> >     [iajc] final String path = ServletUtil.getDecodedPath(request,
>> "/streamFile");
>> >     [iajc]                                 ^^^^^^^^^
>> >     [iajc]
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66
>> [error] The method getRawPath(HttpServletRequest, String) is undefined for
>> the type ServletUtil
>> >     [iajc] final String rawPath = ServletUtil.getRawPath(request,
>> "/streamFile");
>> >     [iajc]                                    ^^^^^
>> >     [iajc]
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67
>> [warning] advice defined in
>> org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied
>> [Xlint:adviceDidNotMatch]
>> >     [iajc]
>> >     [iajc]
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60
>> [warning] advice defined in
>> org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied
>> [Xlint:adviceDidNotMatch]
>> >     [iajc]
>> >     [iajc]
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50
>> [warning] advice defined in
>> org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied
>> [Xlint:adviceDidNotMatch]
>> >     [iajc]
>> >     [iajc]
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43
>> [warning] advice defined in
>> org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied
>> [Xlint:adviceDidNotMatch]
>> >     [iajc]
>> >     [iajc]
>> >     [iajc] 18 errors, 4 warnings
>> >
>> > BUILD FAILED
>> >
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222:
>> The following error occurred while executing this line:
>> >
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203:
>> The following error occurred while executing this line:
>> >
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90:
>> compile errors: 18
>> >
>> > Total time: 55 seconds
>> >
>> >
>> > ======================================================================
>> > ======================================================================
>> > STORE: saving artifacts
>> > ======================================================================
>> > ======================================================================
>> >
>> >
>> > mv: cannot stat `build/*.tar.gz': No such file or directory
>> > mv: cannot stat `build/test/findbugs': No such file or directory
>> > mv: cannot stat `build/docs/api': No such file or directory
>> > Build Failed
>> > [FINDBUGS] Skipping publisher since build result is FAILURE
>> > Archiving artifacts
>> > Publishing Clover coverage report...
>> > No Clover report will be published due to a Build Failure
>> > Recording test results
>> > Publishing Javadoc
>> > Recording fingerprints
>> > Updating HDFS-2235
>> > Email was triggered for: Failure
>> > Sending email for trigger: Failure
>> >
>> >
>> >
>> >
>> ###################################################################################
>> > ############################## FAILED TESTS (if any)
>> ##############################
>> > No tests ran.
>> >
>>
>
