> That's cool. couple of q's:
>
> 1. presumably you plan to make it a 3.0.0-style plugin once that's out?
>    it's a lot cleaner that way ;)

Yes. I am planning on making a plugin out of this. That will get rid of all
the patching required to Conf.pm and PerMsgStatus.pm to use the package and
set things up properly.
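For the curious, here is a rough sketch of what the skeleton could end up
looking like under the 3.0.x plugin API. The package name, the rule name,
and the body of the eval test are placeholders I made up for illustration,
not the actual code:

    # Hypothetical SpamAssassin 3.0.x plugin skeleton; names are placeholders.
    package Mail::SpamAssassin::Plugin::SpamCopURI;

    use strict;
    use warnings;
    use Mail::SpamAssassin::Plugin;
    our @ISA = qw(Mail::SpamAssassin::Plugin);

    sub new {
        my ($class, $mailsa) = @_;
        my $self = $class->SUPER::new($mailsa);
        bless $self, $class;

        # Register the eval test so rules in a .cf file can call it,
        # instead of patching Conf.pm / PerMsgStatus.pm by hand.
        $self->register_eval_rule('check_spamcop_uri');
        return $self;
    }

    sub check_spamcop_uri {
        my ($self, $pms) = @_;
        # $pms is the PerMsgStatus object for the message being scanned;
        # the actual lookup against the SpamCop data would go here.
        return 0;
    }

    1;

A rule file would then pull it in with something along these lines (again,
placeholder names):

    loadplugin Mail::SpamAssassin::Plugin::SpamCopURI
    header     SPAMCOP_URI   eval:check_spamcop_uri()
    score      SPAMCOP_URI   2.0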
>
> 2. how do you query SC for the URLs? scrape the page via HTTP?

Yes. I am using LWP to grab the page and HTML::LinkExtor (part of
HTML::Parser) to get everything off of the page (roughly along the lines of
the sketch below). I would hope that if we can generate enough community
support for this approach, we may be able to convince spamcop.net to at
least publish an RSS feed and possibly later a more complete listing of all
the entries.

--eric
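The fetch-and-extract step is basically the following sketch. The page
address here is a placeholder rather than the real SpamCop URL, and any
filtering of the extracted links is left out:

    #!/usr/bin/perl
    use strict;
    use warnings;

    use LWP::UserAgent;
    use HTML::LinkExtor;
    use URI;

    # Placeholder URL; the actual SpamCop page being scraped may differ.
    my $page = 'http://www.spamcop.net/';

    my $ua   = LWP::UserAgent->new(timeout => 30);
    my $resp = $ua->get($page);
    die 'fetch failed: ' . $resp->status_line . "\n" unless $resp->is_success;

    # Pull every <a href> off the page, resolving relative links
    # against the page's base URL.
    my @urls;
    my $extor = HTML::LinkExtor->new(sub {
        my ($tag, %attr) = @_;
        return unless $tag eq 'a' and defined $attr{href};
        push @urls, URI->new_abs($attr{href}, $resp->base)->as_string;
    });
    $extor->parse($resp->content);

    print "$_\n" for @urls;

HTML::LinkExtor hands the callback every link-carrying tag, so in practice
you would narrow the results down to just the anchors you care about before
feeding them to the rules.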
