Thanks for your reply, Susam Pal!

I ran ant and hit an error I can't resolve. Here is the output:

debian:~/nutch-0.9# ant
Buildfile: build.xml

init:
    [unjar] Expanding: /root/nutch-0.9/lib/hadoop-0.12.2-core.jar into 
/root/nutch-0.9/build/hadoop
    [untar] Expanding: /root/nutch-0.9/build/hadoop/bin.tgz into 
/root/nutch-0.9/bin
    [unjar] Expanding: /root/nutch-0.9/lib/hadoop-0.12.2-core.jar into 
/root/nutch-0.9/build

compile-core:
    [javac] Compiling 133 source files to /root/nutch-0.9/build/classes
    [javac] /root/nutch-0.9/src/java/org/apache/nutch/crawl/Crawl.java:150: 
cannot find symbol
    [javac] symbol  : variable HadoopFSUtil
    [javac] location: class org.apache.nutch.crawl.Crawl
    [javac]       merger.merge(fs.listPaths(indexes, 
HadoopFSUtil.getPassAllFilter()),
    [javac]                                          ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 1 error

BUILD FAILED
/root/nutch-0.9/build.xml:106: Compile failed; see the compiler error output 
for details.

Total time: 8 seconds

I have already corrected three errors, but I can't fix this one. I don't know 
what HadoopFSUtil is, so I don't know how to correct the error. Can you help me, please?
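For context on the error itself: the missing symbol comes from the NUTCH-601 patch, which (judging from the compile error and Susam's reply below) calls a `HadoopFSUtil.getPassAllFilter()` helper that exists in Nutch trunk but not in Nutch 0.9. A minimal self-contained sketch of that helper's apparent shape follows; the `PathFilter` interface here is a local stand-in (an assumption), not the real `org.apache.hadoop.fs.PathFilter`:

```java
// Sketch only: in Nutch trunk, HadoopFSUtil.getPassAllFilter() appears to
// return a path filter that accepts every path, for use when listing the
// indexes directory before merging. Nutch 0.9 predates this class, which
// is why javac reports "cannot find symbol".
public class HadoopFSUtilSketch {

    // Local stand-in (assumption) for org.apache.hadoop.fs.PathFilter.
    interface PathFilter {
        boolean accept(String path);
    }

    // A filter that lets every path through.
    static PathFilter getPassAllFilter() {
        return new PathFilter() {
            public boolean accept(String path) {
                return true; // accept everything
            }
        };
    }

    public static void main(String[] args) {
        PathFilter f = getPassAllFilter();
        System.out.println(f.accept("indexes/part-00000")); // true
    }
}
```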

Thanks for your help!

Jisay
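(For anyone following the thread: the minus/plus edit workflow Susam describes below can be seen end-to-end on a throwaway file. Everything here is illustrative, not the real Nutch patch.)

```shell
# Demo of applying a unified diff with patch -p0 (illustrative names only).
mkdir -p /tmp/patchdemo && cd /tmp/patchdemo
printf 'old line\n' > hello.txt

# A one-hunk patch: "-" lines are removed, "+" lines are added.
cat > fix.patch <<'EOF'
--- hello.txt
+++ hello.txt
@@ -1 +1 @@
-old line
+new line
EOF

patch -p0 < fix.patch   # -p0 keeps the paths in the patch unchanged
cat hello.txt           # now reads "new line"
```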


>
> The patch was generated for Nutch 1.0 development version which is
> currently in trunk. So, it is unable to patch your older version
> cleanly.
>
> I also see that you are using NUTCH-601v0.3.patch. However,
> NUTCH-601v1.0.patch is the recommended patch. If this patch fails, you
> can make the modifications manually. This patch is extremely simple
> and if you just open the patch using a text editor, you would find
> that 3 lines have been removed from the original source code
> (indicated by leading minus signs) and 11 new lines have been added
> (indicated by plus signs). You have to make these changes manually to
> your Nutch 0.9 source code directory.
>
> Once you make the changes, just build your project again with ant and
> you would be ready for recrawl.
>
> Regards,
> Susam Pal
>
> On Tue, Mar 18, 2008 at 7:12 PM, Jean-Christophe Alleman
>  wrote:
>>
>>
>> Hi, I'm interested in this patch, but I can't apply it. I run into
>> problems when I try to patch.
>>
>> Here is what I do:
>>
>> debian:~/patch# patch -p0 < NUTCH-601v0.3.patch
>> can't find file to patch at input line 5
>> Perhaps you used the wrong -p or --strip option?
>> The text leading up to this was:
>> --------------------------
>> |Index: src/java/org/apache/nutch/crawl/Crawl.java
>> |===================================================================
>> |--- src/java/org/apache/nutch/crawl/Crawl.java (revision 628119)
>> |+++ src/java/org/apache/nutch/crawl/Crawl.java (working copy)
>> --------------------------
>> File to patch: /root/nutch-0.9/src/java/org/apache/nutch/crawl/Crawl.java
>> patching file /root/nutch-0.9/src/java/org/apache/nutch/crawl/Crawl.java
>> Reversed (or previously applied) patch detected! Assume -R? [n] y
>> Hunk #2 FAILED at 100.
>> Hunk #3 FAILED at 131.
>> 2 out of 3 hunks FAILED -- saving rejects to file 
>> /root/nutch-0.9/src/java/org/apache/nutch/crawl/Crawl.java.rej
>>
>> Can you please help me? This is the first time I've tried to apply a patch.
>>
>> Thanks in advance,
>>
>> Jisay
>>
>>
>>
>>>
>>> The recrawl patch in https://issues.apache.org/jira/browse/NUTCH-601
>>> got committed today. So if you check out the latest trunk, you can
>>> recrawl without deleting the crawl directory.
>>>
>>> However, if you are using an older version, you may use the script at:
>>> http://wiki.apache.org/nutch/Crawl
>>>
>>> Regards,
>>> Susam Pal
>>>
>>> On Fri, Mar 14, 2008 at 3:48 AM, Bradford Stephens
>>>  wrote:
>>>> Greetings,
>>>>
>>>> A coworker and I are experimenting with Nutch in anticipation of a
>>>> pretty large rollout at our company. However, we seem to be stuck on
>>>> something -- after the crawler is finished, we can't manually re-crawl
>>>> into the same directory/index! It says "Directory already exists" when
>>>> we try to initiate a new crawl. Any ideas?
>>>>
>>>> Cheers,
>>>> Bradford
>>>>
>>

