From the documentation, I understand that w3af takes a set of URLs as its target and then scans them all.
Can I feed w3af some URLs, let it start scanning, and then keep crawling and feed it more URLs later?
I know I could split the job into many smaller scans and run them one after another (see the rough sketch below), but is there an easier way to crawl and scan at the same time?
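
To make the workaround I have in mind concrete, here is a rough sketch of what I mean by "splitting": a small Python helper that reads my crawler's output, cuts it into batches, and launches one w3af_console scan per batch via a generated script. The batch size, plugin selection, './w3af_console' path and the comma-separated "set target" line are just assumptions for illustration, not something I have verified against every w3af version.

    #!/usr/bin/env python
    # Hypothetical helper: split a big URL list into batches and run one
    # w3af_console scan per batch.  Paths, plugin choices and batch size
    # are assumptions -- adjust them to your own setup.
    import subprocess
    import tempfile

    W3AF_CONSOLE = './w3af_console'   # assumed path to the w3af console
    BATCH_SIZE = 20                   # URLs per scan, arbitrary choice

    SCRIPT_TEMPLATE = """\
    plugins
    crawl web_spider
    audit sqli xss
    back
    target
    set target {targets}
    back
    start
    exit
    """

    def run_batch(urls):
        """Write a temporary w3af script for one batch of URLs and run it."""
        # Comma-separated targets; adjust if your w3af version expects
        # a different format for multiple target URLs.
        script = SCRIPT_TEMPLATE.format(targets=','.join(urls))
        with tempfile.NamedTemporaryFile('w', suffix='.w3af',
                                         delete=False) as fh:
            fh.write(script)
            script_path = fh.name
        # -s runs w3af_console non-interactively with the given script
        subprocess.check_call([W3AF_CONSOLE, '-s', script_path])

    def main():
        with open('urls.txt') as fh:          # one URL per line
            urls = [line.strip() for line in fh if line.strip()]
        for i in range(0, len(urls), BATCH_SIZE):
            run_batch(urls[i:i + BATCH_SIZE])

    if __name__ == '__main__':
        main()

This works, but it starts a fresh w3af process for every batch, which is why I am asking whether there is a built-in way to keep feeding URLs into a single running scan instead.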