Absolutely!
We're a bit thin on active committers at the moment, which will
probably limit our ability to take a highly active role in your
development process.  But we do have a pile of code which you might be
able to leverage, and once common functionality is available I think
we'd all prefer to use that rather than home-grown code.

How would you prefer that we proceed?

Karl


On Thu, Jun 2, 2011 at 11:11 AM, Julien Nioche
<[email protected]> wrote:
> Hi guys,
>
> I'd just like to mention Crawler Commons, which is an effort between the
> committers of various crawl-related projects (Nutch, Bixo, Heritrix) to
> put some basic functionality in common. We currently have mostly a
> top-level domain finder and a sitemap parser, but are definitely planning
> to add other things as well, e.g. a robots.txt parser, protocol handlers,
> etc.
>
> Would you like to get involved? There are quite a few things that the
> crawler in Manifold could reuse or contribute to.
>
> Best,
>
> Julien
>
> --
> Open Source Solutions for Text Engineering
>
> http://digitalpebble.blogspot.com/
> http://www.digitalpebble.com
>