Hey Darien, I added the patch files to the post...

I haven't looked at Heritrix yet. I am currently adding a preprocessor
framework to Capture that would allow integration of tools like this
...thanks for the tip.
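
To give a rough idea of what I mean (this is purely a sketch, not the
actual Capture-HPC code; the class names, method names, and crawl-output
format are made up for illustration), a preprocessor would sit in front
of the URL queue and expand or filter it before any visitation happens:

    # Purely illustrative sketch of a preprocessor hook -- not the real
    # Capture-HPC API. Names and the crawl-output format are assumptions.
    class UrlPreprocessor(object):
        """Expands or filters the URL list before Capture visits anything."""
        def process(self, urls):
            raise NotImplementedError

    class ExternalCrawlerPreprocessor(UrlPreprocessor):
        """Appends URLs discovered by an external crawler (e.g. Heritrix),
        assuming its output was exported as plain text, one URL per line."""
        def __init__(self, crawl_output_path):
            self.crawl_output_path = crawl_output_path

        def process(self, urls):
            expanded = list(urls)
            with open(self.crawl_output_path) as crawl_output:
                for line in crawl_output:
                    line = line.strip()
                    if line.startswith("http"):
                        expanded.append(line)
            return expanded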

On OSSEC, do you have some more info on what they are looking for? I
couldn't find anything on their site.

Cheers-
Christian

On Wed, May 7, 2008 at 4:03 PM, Kindlund, Darien F. <[EMAIL PROTECTED]>
wrote:

> Hi Christian,
>
> That's an excellent post regarding caching DNS and HTTP traffic!  Any
> chance you could provide copies of the actual squid and pdnsd .conf
> files used in your described setup (or even .patch files)?  In any
> event, we'll update our setup and let you know whether any additional
> tweaks prove effective.
>
> Also, have you thought about using the Internet Archive's crawler
> (a.k.a. Heritrix)?
> http://crawler.archive.org/
>
> It's the main engine that the Monkey Spider project is apparently
> using:
> http://monkeyspider.sourceforge.net/documentation.html
>
> Also, on an unrelated note, apparently OSSEC is looking for an
> open-source Windows rootkit detection mechanism; Capture-BAT may be
> useful to them.
> http://www.ossec.net/main/
>
> Regards,
> -- Darien
>
> >-----Original Message-----
> >From: [EMAIL PROTECTED] [mailto:capture-hpc-
> >[EMAIL PROTECTED] On Behalf Of Christian Seifert
> >Sent: Wednesday, May 07, 2008 6:25 PM
> >To: General discussion list for Capture-HPC users
> >Subject: [Capture-HPC] Caching of web responses for DAC algorithm
> >of Capture-HPC (and forensic analysis)
> >Importance: Low
> >
> >Folks,
> >
> >Capture-HPC 2.1 uses a divide-and-conquer algorithm that interacts
> >with multiple web pages repeatedly to determine which one is actually
> >the malicious one. Because of anti-forensic techniques used by
> >malicious web pages, such as IP tracking and fast-flux networks, it is
> >crucial to interact with identical web pages on every visit. The way I
> >am accomplishing this is to have the client honeypot connect to the
> >potentially malicious web servers through a caching proxy. This setup
> >is not only required for the divide-and-conquer algorithm to work; it
> >also assists in analyzing the page after it has been identified, using
> >a vulnerable and/or non-vulnerable client. I have added a blog post in
> >which I describe how I configure the cache to accomplish this:
> >http://www.mcs.vuw.ac.nz/~cseifert/blog/pivot/entry.php?id=68#body
> >
> >Christian
> >
> >--
> >----
> >Web: http://www.mcs.vuw.ac.nz/~cseifert
> >
> >PGP key
> >http://www.mcs.vuw.ac.nz/~cseifert/pgpkey.txt
> >Primary key fingerprint: E979 0D9A 9187 D821 F86F B712 C8DB 0583
> >B046 BAEF
> _______________________________________________
> Capture-HPC mailing list
> Capture-HPC@public.honeynet.org
> https://public.honeynet.org/mailman/listinfo/capture-hpc
>
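
P.S. For anyone new to the thread, here is a rough Python sketch of the
divide-and-conquer idea described in the quoted message above. It is an
illustration only (the names are made up, and visit_group() stands in
for driving the client honeypot through the caching proxy and checking
for a state change), not the actual Capture-HPC 2.1 implementation:

    # Illustration only -- not the actual Capture-HPC 2.1 code.
    # visit_group(urls) stands in for visiting a set of URLs with the
    # client honeypot (through the caching proxy) and returning True if
    # a state change was observed.
    def find_malicious(urls, visit_group):
        """Return the subset of URLs that trigger a state change."""
        if not urls or not visit_group(urls):
            return []              # clean group: nothing further to do
        if len(urls) == 1:
            return list(urls)      # single URL isolated as malicious
        mid = len(urls) // 2
        return (find_malicious(urls[:mid], visit_group) +
                find_malicious(urls[mid:], visit_group))

The caching proxy is what makes the repeated visits in the recursion
meaningful: every re-visit has to see exactly the same content as the
first visit, otherwise IP tracking or fast-flux rotation would change
the result between runs.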



-- 
----
Web: http://www.mcs.vuw.ac.nz/~cseifert

PGP key
http://www.mcs.vuw.ac.nz/~cseifert/pgpkey.txt
Primary key fingerprint: E979 0D9A 9187 D821 F86F B712 C8DB 0583 B046 BAEF
_______________________________________________
Capture-HPC mailing list
Capture-HPC@public.honeynet.org
https://public.honeynet.org/mailman/listinfo/capture-hpc
