Thanks Jeffrey,

since I had been running the pre-compiled version of the Capture-HPC server
(capture-server-2.5.1-389-withLinuxRevert.zip),
I needed to download the capture directory from SVN, with all the Java
classes and the compile_revert_linux.sh script, and after solving some
minor issues the compilation was successful.
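For anyone repeating this, the steps above amount to roughly the following. This is just a dry-run sketch: CAPTURE_SVN is a placeholder, not the real repository address, so point it at the project's actual SVN URL before running the commands for real.

```shell
# Sketch of the build steps described above.  CAPTURE_SVN is a placeholder;
# set it to the project's actual SVN repository before running for real.
CAPTURE_SVN=${CAPTURE_SVN:-"<capture-hpc-svn-url>"}
steps() {
  echo "svn checkout $CAPTURE_SVN/capture capture"
  echo "cd capture"
  echo "sh compile_revert_linux.sh"
}
# Print the commands instead of executing them (dry run).
steps
```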

I have configured the plugin and run it for the first time. I could see the
crawler output working correctly for a URL, but the Capture-HPC server
seems to be stuck in a constant reverting state and never processes the
crawled URLs:

--cut--
..
Depth=2  Crawling http://www.domain.com/resource.asp
Finished Crawling http://www.domain.com
Waiting for input URLs...

[sep 4, 2009 11:27:18 AM-172.21.1.44:902-11546362] Finished processing VM
item: revert
[sep 4, 2009 11:27:39 AM-172.21.1.44:902-11546362] Client inactivity,
reverting VM
[sep 4, 2009 11:27:39 AM-172.21.1.44:902-11546362] VMSetState:
WAITING_TO_BE_REVERTED
[sep 4, 2009 11:27:40 AM-172.21.1.44:902-11546362] VMSetState: REVERTING
Waiting for input URLs...
[sep 4, 2009 11:27:59 AM-172.21.1.44:902-11546362] VMSetState: RUNNING
Reverting same VM...just waiting a bit
[sep 4, 2009 11:28:05 AM-172.21.1.44:902-11546362] Finished processing VM
item: revert
[sep 4, 2009 11:28:45 AM-172.21.1.44:902-11546362] Client inactivity,
reverting VM
[sep 4, 2009 11:28:45 AM-172.21.1.44:902-11546362] VMSetState:
WAITING_TO_BE_REVERTED
[sep 4, 2009 11:28:45 AM-172.21.1.44:902-11546362] VMSetState: REVERTING
Waiting for input URLs...
[sep 4, 2009 11:29:07 AM-172.21.1.44:902-11546362] VMSetState: RUNNING
Reverting same VM...just waiting a bit
[sep 4, 2009 11:29:13 AM-172.21.1.44:902-11546362] Finished processing VM
item: revert
[sep 4, 2009 11:29:53 AM-172.21.1.44:902-11546362] Client inactivity,
reverting VM
[sep 4, 2009 11:29:53 AM-172.21.1.44:902-11546362] VMSetState:
WAITING_TO_BE_REVERTED
[sep 4, 2009 11:29:53 AM-172.21.1.44:902-11546362] VMSetState: REVERTING
Waiting for input URLs...
--end--
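To show this is a steady loop rather than an occasional revert, I counted the "Client inactivity, reverting VM" messages in a saved copy of the server output (server.log is just the name I saved it under; the two sample lines below stand in for the full excerpt above):

```shell
# Count the inactivity-triggered reverts in a saved copy of the server
# output.  The two sample lines stand in for the real log excerpt.
cat > server.log <<'EOF'
[sep 4, 2009 11:27:39 AM-172.21.1.44:902-11546362] Client inactivity, reverting VM
[sep 4, 2009 11:28:45 AM-172.21.1.44:902-11546362] Client inactivity, reverting VM
EOF
grep -c 'Client inactivity, reverting VM' server.log   # prints 2
```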


Thanks
Emilio


2009/9/3 JEFFREY S STEWART <jss1...@esu.edu>

>
> Emilio,
>
> Please reply via the mailing list so that others can find the solution if
> they have the same problem.
>
> The errors lead me to believe that the Preprocessor.java file is not being
> found when you build.  Please check that Preprocessor.java is in the
> capture directory along with the source files from the Crawler.tar I sent.
>
> Thanks,
> Jeff
>
>
>
> -----Original Message-----
> From: Emilio Casbas [mailto:ecasb...@yahoo.es]
> Sent: Thu 9/3/2009 4:11 AM
> To: JEFFREY S STEWART
> Subject: Re: [Capture-HPC] Capture-HPC Crawler Preprocessor
>
> Hi Jeffrey,
>
> Congratulations on your support and excellent work on the capture-hpc
> project.
>
> I am interested in testing this feature, but since I'm not a developer I'm
> having some problems installing it.
>
> Following the instructions, at step 4 I ran the "ant" command, and after
> solving some issues I get this:
>
> compile:
>     [javac] Compiling 3 source files to
> /home/machine/capture-HPC/capture-with-crawl/build
>     [javac]
> /home/machine/capture-HPC/capture-with-crawl/source/Crawler.java:14: cannot
> find symbol
>     [javac] symbol  : class Preprocessor
>     [javac] location: package capture
>     [javac] public class Crawler extends capture.Preprocessor
>     [javac]                                     ^
>     [javac]
> /home/machine/capture-HPC/capture-with-crawl/source/Crawler.java:472: cannot
> find symbol
>     [javac] symbol  : method addUrlToCaptureQueue(java.lang.String)
>     [javac] location: class capture.Crawler
>     [javac]             addUrlToCaptureQueue(url + "::" + program + "::" +
> delay + priority);
>     [javac]             ^
>     [javac] 2 errors
>
> BUILD FAILED
> /home/machine/capture-HPC/capture-with-crawl/build.xml:34: Compile failed;
> see the compiler error output for details.
>
> Total time: 5 seconds
> mach...@pam-inv-03:~/capture-HPC/capture-with-crawl$
>
> Previously I had the Capture-HPC program running successfully, but I
> didn't compile the software; I had installed a pre-configured version.
> Could you point me to a solution?
>
> I could help you with testing and troubleshooting the plugin.
>
> TIA
> Emilio
>
>
>
> >
> >From: JEFFREY S STEWART <jss1...@esu.edu>
> >To: General discussion list for Capture-HPC users <
> capture-hpc@public.honeynet.org>
> >Sent: Monday, August 17, 2009 15:11:41
> >Subject: [Capture-HPC] Capture-HPC Crawler Preprocessor
> >
>
> >
> >
> >All,
> >
> > Attached is a preprocessor that I've made to add web crawler support to
> > Capture-HPC.  It only does HTTP right now.  It works by finding links in
> > the href fields of the HTML of the input pages you specify.  It has a
> > bunch of configuration options that let you control where it crawls; see
> > the Crawler.README for a list of them.
> >
> > One of the features is not really a crawler function, but I decided it
> > fit in nicely with scraping pages.  It queries Google for site:
> > safebrowsing.clients.google.com "the last time suspicious content was
> > found on this site was on" plus yesterday's date.  This has the result of
> > returning all the malicious URLs that Google identified and crawled
> > yesterday.  (Good for when you don't have any malicious URLs to crawl.)
> > Note, this feature doesn't fall foul of Google's TOS.
> >
> > There are some more specific build instructions because the classes that
> > I used have to be built with the project.  Take a look at the enclosed
> > build.README.
> >
> > If there are any questions or feedback, let me know.
> >
> > Thanks
> > Jeff
> >
> >
>
>
>
>
>
> _______________________________________________
> Capture-HPC mailing list
> Capture-HPC@public.honeynet.org
> https://public.honeynet.org/mailman/listinfo/capture-hpc
>
>