This is a weird one that I just ran into this morning, and it was
hard to pin down exactly what was causing it. If I can get someone
else to reproduce this, I'll file it as an official bug.

        I have a directory of HTML files given to me by another
person who makes little "slideshows" of real estate and other
things. There are 97 .html files linking to 121 separate .jpg files.

        The top-level page has 4 links to the 4 "albums" in the
document. Each album has about 23 separate HTML pages in it. Each
page has a 310x225 image and some little navigation JPEGs below it,
basically '<< < [] > >>' (beginning, previous, home, next, last);
a stripped-down page looks like the sketch below. So far so good...
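
        For anyone trying to reproduce this without the original
files, here is a minimal sketch of one album page. The filenames
are made up, but the structure (one large image plus five small
navigation links) matches the real pages:

    <!-- hypothetical album page; real filenames differ -->
    <html><body>
      <img src="photos/house_012.jpg" width="310" height="225">
      <p>
        <a href="page01.html"><img src="nav/first.jpg"></a>
        <a href="page11.html"><img src="nav/prev.jpg"></a>
        <a href="../index.html"><img src="nav/home.jpg"></a>
        <a href="page13.html"><img src="nav/next.jpg"></a>
        <a href="page23.html"><img src="nav/last.jpg"></a>
      </p>
    </body></html>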

        When I pluck this document with --maxdepth=30 and
--maxwidth=300, the crawl gets cut off at a depth of about 20, and
only the first two albums come out with usable links (albums 3 and 4
are marked "offsite" and are not fetched by Plucker).
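
        The invocation is essentially the following; the home URL is
a placeholder, and I'm assuming the usual plucker-build spellings of
these options:

    # the failing run: depth stops around 20, albums 3 and 4 go "offsite"
    plucker-build --bpp=8 --maxdepth=30 --maxwidth=300 \
        --home-url=file:/path/to/slideshows/index.html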

        If I remove --maxwidth=300 and change nothing else, the
document contains all of the target links at the proper depth
(though the right side of each image is cut off, because the images
are 310px wide).
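
        In other words, under the same assumptions as above, this
variant fetches everything:

    # same run minus --maxwidth; all links present at the proper depth
    plucker-build --bpp=8 --maxdepth=30 \
        --home-url=file:/path/to/slideshows/index.html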

        I confirmed this at 16bpp, 8bpp, and below. I can also
reproduce it with maxdepth=100 or higher, which _should_ pick up
every link in this relatively small tree.

        I tried --depth-first as well; it didn't help.

        What the heck is going on here? 


David A. Desrosiers
[EMAIL PROTECTED]
http://gnu-designs.com