> Here is my faulty script:
> I think the problem lies in the line that says:
> append _LINKS LinkCrawler (depth - 1) getlinks url _LINKS

Just some quick comments:

 1.
I didn't find UNIQUE in your code.
If you have a block of strings,
you can use unique to remove duplicates:

        b: ["abc" "abc" "bcd"]
        b: unique b
        ;== ["abc" "bcd"]


 2.
This is a snippet from your code:

        if find/last baseURL "html" [
                ; if it ends with .html then strip off the last part
                baseURL: rejoin [copy/part baseURL find/last baseURL "/"]
        ]

The comment doesn't seem to match the code.
find/last searches backward from the tail of the string
and returns the last occurrence of "html" anywhere in
the string, not just at the end. See this:

        find/last "html-generator.c" "html"
        ;== "html-generator.c"

I don't think you want that.
Furthermore, your comment says it matches ".html",
but the code only matches "html".
So, I think you probably want:

        if all [
                match: find/last baseURL ".html"
                (length? match) = length? ".html"
        ][...]
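
To see why the extra length check matters, compare a URL
that ends in ".html" with ones that merely contain it
(standard REBOL all/find/last semantics):

        all [
                match: find/last "index.html" ".html"
                (length? match) = length? ".html"
        ]
        ;== true

        all [
                match: find/last "page.html.bak" ".html"
                (length? match) = length? ".html"
        ]
        ;== false

        all [
                match: find/last "html-generator.c" ".html"
                (length? match) = length? ".html"
        ]
        ;== none

The length comparison succeeds only when the found ".html"
reaches the tail of the string, i.e. it is a true suffix.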

Anton.