This distribution has been tested as part of the cpan-testers
effort to test as many new uploads to CPAN as possible.  See
http://testers.cpan.org/

Please cc any replies to [email protected] to keep other
test volunteers informed and to prevent any duplicate effort.

--
Dear Glenn Wood,
    
This is a computer-generated report for Scraper-3.05
on perl-5.8.8, created automatically by CPAN-Reporter-1.02 
and sent to the CPAN Testers mailing list.  

If you have received this email directly, it is because the person testing 
your distribution chose to send a copy to your CPAN email address; there 
may be a delay before the official report is received and processed 
by CPAN Testers.

Thank you for uploading your work to CPAN.  However, it appears that
there were some problems with your distribution.  If these results are 
not what you expect, please consult "Notes for CPAN Authors" on 
the CPAN Testers Wiki: http://cpantest.grango.org

Sections of this report:

    * Tester comments
    * Program output
    * Prerequisites
    * Environment and other context

------------------------------
TESTER COMMENTS
------------------------------

Additional comments from tester: 

[none provided]

------------------------------
PROGRAM OUTPUT
------------------------------

Output from '/usr/bin/make test':

PERL_DL_NONLAZY=1 /home/david/_/_/perl-5.8.8/bin/perl "-MExtUtils::Command::MM" "-e" "test_harness(0, 'blib/lib', 'blib/arch')" t/*.t
t/t......ok
All tests successful.
Files=1, Tests=1,  0 wallclock secs ( 0.00 usr +  0.03 sys =  0.03 CPU)
Result: PASS
PERL_DL_NONLAZY=1 /home/david/_/_/perl-5.8.8/bin/perl "-Iblib/lib" "-Iblib/arch" test.pl
# VERSIONS OF MODULES ON WHICH SCRAPER DEPENDS
#     using HTML::Form(1.054);
#     using HTML::TreeBuilder(3.23);
#     using HTTP::Cookies(1.39);
#     using HTTP::Request(1.40);
#     using HTTP::Response(1.53);
#     using HTTP::Status(1.28);
#     using LWP(5.805);
#     using LWP::RobotUA(1.27);
#     using LWP::UserAgent(2.033);
#     using Storable(2.15);
#     using Text::ParseWords(3.24);
#     using Tie::Persistent(1.00);
#     using URI(1.35);
#     using URI::Escape(3.28);
#     using URI::URL(5.03);
#     using URI::http();
#     using User(1.8);
#     using WWW::Search(2.550);
#     using XML::XPath(1.13);
1..10
ok 1 - 8 Scraper modules listed in MANIFEST (4,0,4)
ok 2 - WWW::Scraper loaded
ok 3 # skip This Scraper engine requires 'HTML Tidy' to scrub HTML before 
parsing.
Get this program from 'http://tidy.sourceforge.net/docs/Overview.html#Download'
Make sure it is in your execution search path.

not ok 4 - CraigsList
#     Failed test (test.pl at line 129)
ok 5 # skip This Scraper engine requires 'HTML Tidy' to scrub HTML before 
parsing.
Get this program from 'http://tidy.sourceforge.net/docs/Overview.html#Download'
Make sure it is in your execution search path.

not ok 6 - Google
#     Failed test (test.pl at line 129)
not ok 7 - Lycos
#     Failed test (test.pl at line 129)
ok 8 # skip NorthernLight's search engine seems to be down these days!?
ok 9 # skip Sherlock.pm is not working today; many changes out on the frontier 
that I haven't caught up with yet
not ok 10 - ZIPplus4
#     Failed test (test.pl at line 129)
# Can't locate object method "zipcode" via package 
"WWW::Scraper::Request::ZIPplus4_" at blib/lib/WWW/Scraper/Request.pm line 207.
# 4 tests had problems. See file 'test.trace' for details.
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
Operating system: linux
Perl version: 5.008008
VERSIONS OF MODULES ON WHICH SCRAPER DEPENDS
    using HTML::Form(1.054);
    using HTML::TreeBuilder(3.23);
    using HTTP::Cookies(1.39);
    using HTTP::Request(1.40);
    using HTTP::Response(1.53);
    using HTTP::Status(1.28);
    using LWP(5.805);
    using LWP::RobotUA(1.27);
    using LWP::UserAgent(2.033);
    using Storable(2.15);
    using Text::ParseWords(3.24);
    using Tie::Persistent(1.00);
    using URI(1.35);
    using URI::Escape(3.28);
    using URI::URL(5.03);
    using URI::http();
    using User(1.8);
    using WWW::Search(2.550);
    using XML::XPath(1.13);
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
LIST SCRAPER SUB-CLASSES, FROM THE MANIFEST
    + Beaucoup(1.07)
    + CraigsList(1.16)
    + Dogpile(1.11)
    - FieldTranslation will not be tested: it is not a Scraper sub-class.
    + Google(1.23)
    - Grub will not be tested: it is not a Scraper sub-class.
    + Lycos(1.00)
    + NorthernLight(1.00)
    + Sherlock(1.00)
    - Response will not be tested: it is not a Scraper sub-class.
    - Request will not be tested: it is not a Scraper sub-class.
    - TidyXML will not be tested: it is not a Scraper sub-class.
    - WSDL will not be tested: it is not a Scraper sub-class.
    + ZIPplus4(1.09)
    - Opcode will not be tested: it is not a Scraper sub-class.
    - ScraperDiscovery will not be tested: it is not a Scraper sub-class.
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
Test #0: CraigsList
Test #1: CraigsList 'bogus' search
Test #2: CraigsList one-page search
 + got 0 results for 'Honda'
 --- got 0 results for CraigsList 'Honda', but expected 50
 --- base URL: http://www.craigslist.org/cgi-bin/search?
 --- first URL: 
http://www.craigslist.org/cgi-bin/search?areaID=1&cat=all&catAbbreviation=car&group=J&max_ask=&min_ask=&new_cat=6&query=Honda&subAreaID=0&type_search=
 --- last URL: 
http://www.craigslist.org/cgi-bin/search?areaID=1&cat=all&catAbbreviation=car&group=J&max_ask=&min_ask=&new_cat=6&query=Honda&subAreaID=0&type_search=
 --- next URL: 
 --- response message: 200 OK
 --- content size (bytes): 2150
 --- ERRNO: 
 --- Extended OS error: 
Scraper engine CraigsList failed once: 
Test #0: CraigsList
Test #1: CraigsList 'bogus' search
Test #2: CraigsList one-page search
 + got 0 results for 'Honda'
 --- got 0 results for CraigsList 'Honda', but expected 50
 --- base URL: http://www.craigslist.org/cgi-bin/search?
 --- first URL: 
http://www.craigslist.org/cgi-bin/search?areaID=1&cat=all&catAbbreviation=car&group=J&max_ask=&min_ask=&new_cat=6&query=Honda&subAreaID=0&type_search=
 --- last URL: 
http://www.craigslist.org/cgi-bin/search?areaID=1&cat=all&catAbbreviation=car&group=J&max_ask=&min_ask=&new_cat=6&query=Honda&subAreaID=0&type_search=
 --- next URL: 
 --- response message: 200 OK
 --- content size (bytes): 2150
 --- ERRNO: 
 --- Extended OS error: 
Scraper engine CraigsList failed twice: 
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
Test #0: Google
Test #1: Google 'bogus' search
Test #2: Google one-page search
 + got 0 results for 'search scraper'
 --- got 0 results for Google 'search scraper', but expected 9
 --- base URL: http://www.google.com/search?
 --- first URL: 
http://www.google.com/search?btnG=Google+Search&hl=en&ie=UTF%2D8&lr=&q=search+scraper&safe=active
 --- last URL: 
http://www.google.com/search?btnG=Google+Search&hl=en&ie=UTF%2D8&lr=&q=search+scraper&safe=active
 --- next URL: 
 --- response message: 200 OK
 --- content size (bytes): 25033
 --- ERRNO: 
 --- Extended OS error: 
Scraper engine Google failed once: 
Test #0: Google
Test #1: Google 'bogus' search
Test #2: Google one-page search
 + got 0 results for 'search scraper'
 --- got 0 results for Google 'search scraper', but expected 9
 --- base URL: http://www.google.com/search?
 --- first URL: 
http://www.google.com/search?btnG=Google+Search&hl=en&ie=UTF%2D8&lr=&q=search+scraper&safe=active
 --- last URL: 
http://www.google.com/search?btnG=Google+Search&hl=en&ie=UTF%2D8&lr=&q=search+scraper&safe=active
 --- next URL: 
 --- response message: 200 OK
 --- content size (bytes): 25091
 --- ERRNO: 
 --- Extended OS error: 
Scraper engine Google failed twice: 
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
Test #0: Lycos
Test #1: Lycos 'bogus' search
Test #2: Lycos one-page search
 + got 0 results for 'turntable'
 --- got 0 results for Lycos 'turntable', but expected 9
 --- base URL: http://search.lycos.com/default.asp?
 --- first URL: 
http://search.lycos.com/default.asp?loc=searchhp&lpv=1&query=turntable&tab=web
 --- last URL: 
http://search.lycos.com/default.asp?loc=searchhp&lpv=1&query=turntable&tab=web
 --- next URL: 
 --- response message: 200 OK
 --- content size (bytes): 53789
 --- ERRNO: 
 --- Extended OS error: 
Scraper engine Lycos failed once: 
Test #0: Lycos
Test #1: Lycos 'bogus' search
Test #2: Lycos one-page search
 + got 0 results for 'turntable'
 --- got 0 results for Lycos 'turntable', but expected 9
 --- base URL: http://search.lycos.com/default.asp?
 --- first URL: 
http://search.lycos.com/default.asp?loc=searchhp&lpv=1&query=turntable&tab=web
 --- last URL: 
http://search.lycos.com/default.asp?loc=searchhp&lpv=1&query=turntable&tab=web
 --- next URL: 
 --- response message: 200 OK
 --- content size (bytes): 54034
 --- ERRNO: 
 --- Extended OS error: 
Scraper engine Lycos failed twice: 
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
##_##_##_##_##_##_##_##_##_##_##_##_##_##_##
Test #0: ZIPplus4
Test #1: ZIPplus4 'bogus' search
Scraper engine ZIPplus4 failed once: Can't locate object method "zipcode" via 
package "WWW::Scraper::Request::ZIPplus4_" at blib/lib/WWW/Scraper/Request.pm 
line 207.

Test #0: ZIPplus4
Test #1: ZIPplus4 'bogus' search
Scraper engine ZIPplus4 failed twice: Can't locate object method "zipcode" via 
package "WWW::Scraper::Request::ZIPplus4_" at blib/lib/WWW/Scraper/Request.pm 
line 207.

# Looks like you failed 4 tests of 10.
make: *** [test_dynamic] Error 4
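
For readers unfamiliar with the diagnostic that sank the ZIPplus4 tests: Perl raises "Can't locate object method ... via package ..." when a method is called on an object whose package neither defines nor inherits that method (and provides no AUTOLOAD fallback). A minimal sketch of that failure mode follows; the package name `My::Request` and the `zipcode` accessor are illustrative stand-ins, not code from the Scraper distribution:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A class that, like the generated ZIPplus4 request package in the
# report, lacks a zipcode() method and has no AUTOLOAD fallback.
package My::Request;
sub new { return bless {}, shift }

package main;
my $req = My::Request->new;

# Calling the missing method dies with the same class of error seen
# above: Can't locate object method "zipcode" via package "My::Request"
eval { $req->zipcode('90210') };
print $@;
```

In the report this surfaces at WWW/Scraper/Request.pm line 207, which suggests the request class for ZIPplus4 was constructed without the expected accessor being defined or inherited.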

------------------------------
PREREQUISITES
------------------------------

Prerequisite modules loaded:

requires:

    Module            Need  Have 
    ----------------- ----- -----
    HTML::Form        0.02  1.054
    HTML::TreeBuilder 0     3.23 
    HTTP::Cookies     0     1.39 
    HTTP::Request     0     1.40 
    HTTP::Response    0     1.53 
    HTTP::Status      0     1.28 
    LWP               5.48  5.805
    LWP::RobotUA      0     1.27 
    LWP::UserAgent    0     2.033
    Storable          0.6   2.15 
    Text::ParseWords  3.2   3.24 
    Tie::Persistent   0.901 1.00 
    URI               0     1.35 
    URI::Escape       0     3.28 
    URI::http         0     0    
    URI::URL          0     5.03 
    User              1.05  1.8  
    WWW::Search       2.35  2.550
    XML::XPath        0     1.13 

------------------------------
ENVIRONMENT AND OTHER CONTEXT
------------------------------

Environment variables:

    LANG = en_GB
    LANGUAGE = en_GB:en_US:en_GB:en
    PATH = /usr/local/bin:/usr/bin:/bin:/usr/bin/X11:/usr/games
    PERL5LIB = 
    PERL5_CPANPLUS_IS_RUNNING = 14468
    PERL5_CPAN_IS_RUNNING = 14468
    SHELL = /bin/bash
    TERM = screen

Perl special variables (and OS-specific diagnostics, for MSWin32):

    $^X = /home/david/_/_/perl-5.8.8/bin/perl
    $UID/$EUID = 1000 / 1000
    $GID = 1000 46 44 29 25 24 20 1000
    $EGID = 1000 46 44 29 25 24 20 1000

Perl module toolchain versions installed:

    Module              Have  
    ------------------- ------
    CPAN                1.9203
    Cwd                 3.25  
    ExtUtils::CBuilder  n/a   
    ExtUtils::Command   1.13  
    ExtUtils::Install   1.41  
    ExtUtils::MakeMaker 6.36  
    ExtUtils::Manifest  1.51  
    ExtUtils::ParseXS   n/a   
    File::Spec          3.12  
    Module::Build       0.2808
    Module::Signature   n/a   
    Test::Harness       3.00  
    Test::More          0.70  
    YAML                0.66  
    YAML::Syck          n/a   
    version             0.7203


--

Summary of my perl5 (revision 5 version 8 subversion 8) configuration:
  Platform:
    osname=linux, osvers=2.4.27-3-686, archname=i686-linux
    uname='linux pigsty 2.4.27-3-686 #1 tue dec 5 21:03:54 utc 2006 i686 
gnulinux '
    config_args='-de -Dprefix=/home/david/cpantesting/perl-5.8.8'
    hint=recommended, useposix=true, d_sigaction=define
    usethreads=undef use5005threads=undef useithreads=undef 
usemultiplicity=undef
    useperlio=define d_sfio=undef uselargefiles=define usesocks=undef
    use64bitint=undef use64bitall=undef uselongdouble=undef
    usemymalloc=n, bincompat5005=undef
  Compiler:
    cc='cc', ccflags ='-fno-strict-aliasing -pipe -I/usr/local/include 
-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64',
    optimize='-O2',
    cppflags='-fno-strict-aliasing -pipe -I/usr/local/include'
    ccversion='', gccversion='3.3.5 (Debian 1:3.3.5-13)', gccosandvers=''
    intsize=4, longsize=4, ptrsize=4, doublesize=8, byteorder=1234
    d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=12
    ivtype='long', ivsize=4, nvtype='double', nvsize=8, Off_t='off_t', 
lseeksize=8
    alignbytes=4, prototype=define
  Linker and Libraries:
    ld='cc', ldflags =' -L/usr/local/lib'
    libpth=/usr/local/lib /lib /usr/lib
    libs=-lnsl -lgdbm -ldl -lm -lcrypt -lutil -lc
    perllibs=-lnsl -ldl -lm -lcrypt -lutil -lc
    libc=/lib/libc-2.3.2.so, so=so, useshrplib=false, libperl=libperl.a
    gnulibc_version='2.3.2'
  Dynamic Linking:
    dlsrc=dl_dlopen.xs, dlext=so, d_dlsymun=undef, ccdlflags='-Wl,-E'
    cccdlflags='-fpic', lddlflags='-shared -L/usr/local/lib'
