Hi Benoit,

Interesting question - I'm sure you could do something like that with
JWebUnit (through XPath, etc.), but I imagine a tool developed
specifically for crawling a site would serve you better, unless you want
to run unit-test assertions across an entire site?

Software like wget comes to mind.
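If you do go the JWebUnit route, the "get all links from a page" part can be done by hand on the page's HTML. Here's a minimal sketch in plain Java - note that LinkExtractor and its regex are my own illustration, not a JWebUnit API; you'd feed it the page source you get from your tester (I believe the tester exposes the raw HTML, but double-check the method name for your version):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical helper: pull same-domain links out of a page's HTML.
// A regex is crude but fine for a quick spider; an HTML parser is
// more robust if your pages are messy.
public class LinkExtractor {
    private static final Pattern HREF =
        Pattern.compile("href\\s*=\\s*[\"']([^\"']+)[\"']",
                        Pattern.CASE_INSENSITIVE);

    public static List<String> extractLinks(String html, String domain) {
        List<String> links = new ArrayList<>();
        Matcher m = HREF.matcher(html);
        while (m.find()) {
            String url = m.group(1);
            if (url.startsWith("http")) {
                // keep absolute links only if they stay on the target domain
                if (url.contains(domain)) {
                    links.add(url);
                }
            } else if (!url.startsWith("mailto:")) {
                // relative links are same-domain by definition
                links.add(url);
            }
        }
        return links;
    }

    public static void main(String[] args) {
        String html = "<a href=\"http://example.com/a\">A</a>"
                    + "<a href=\"/b\">B</a>"
                    + "<a href=\"http://other.org/c\">C</a>";
        // prints [http://example.com/a, /b]
        System.out.println(extractLinks(html, "example.com"));
    }
}
```

From there a spider is just a work queue: pop a URL, fetch it, extract its links, push the ones you haven't visited yet.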

Cheers
Jevon

On Tue, Apr 20, 2010 at 10:44 PM, Xhenseval, Benoit <
benoit.xhense...@credit-suisse.com> wrote:

> Hi All
>
> I'm new to JWebUnit.
>
> I'm trying to develop the most obvious check... A spider that would
> follow all links within a given domain.
>
> Is there support for such a thing?
>
> If not, how could I get all links from a given page?
>
> I'm not using Selenium.
>
> Thanks a lot
>
> Benoit
>
------------------------------------------------------------------------------
_______________________________________________
JWebUnit-users mailing list
JWebUnit-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/jwebunit-users