Tom,

On Tue, Apr 27, 2010 at 12:28 PM, Tom Ueltschi
<[email protected]> wrote:
> Hi all
>
> I briefly looked at the code (importResults plugin) and it doesn't look like
> it expects XML file input. I also tried WebScarab (besides Burp), and with it
> it seems easier to save the files (conversation directory).
>
> I also found out where the importResults comes from :-)
>
> https://svn.sqlmap.org/sqlmap/trunk/sqlmap/lib/core/option.py

    Yes, we're not fans of reinventing the wheel :)

> I'll try to use the WebScarab files as input. Hope it works :-)

    Ok, let us know how it goes.
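    In case it helps: here is a minimal sketch of turning a plain URL list
(e.g. Burp's clipboard export of GETs) into a CSV file for importResults'
input_csv option. The method,uri,postdata column layout is an assumption on
my part, so verify it against the plugin source linked above before using it:

```python
# Hypothetical helper (not part of w3af): convert a plain URL list,
# one URL per line, into a CSV file for importResults' input_csv
# option. The assumed column layout is HTTP-METHOD,URI,POSTDATA --
# check the importResults plugin source to confirm it.
import csv

def urls_to_csv(url_file, csv_file):
    with open(url_file) as src, open(csv_file, "w", newline="") as dst:
        writer = csv.writer(dst)
        for line in src:
            url = line.strip()
            if url:
                # Burp's clipboard export contains GET requests only,
                # so the method is fixed and there is no POST data.
                writer.writerow(["GET", url, ""])
```

Blank lines in the exported list are skipped, so a raw clipboard paste
should work as-is.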

> Tom
>
>
> On Tue, Apr 27, 2010 at 12:24 PM, Tom Ueltschi <[email protected]>
> wrote:
>>
>> Hi Tiago
>>
>> Thanks for the reply. I missed adding a scope at the beginning, but tried
>> to do it afterwards in the proxy-history tab by selecting the list of URLs.
>>
>> From proxy-history there is also the option "save selected items", which
>> generates an XML file (with items, time, url, request, response, etc. as
>> elements).
>>
>> What's the format expected by importResults' input_burp?
>>
>> Thanks,
>> Tom
>>
>>
>> On Tue, Apr 27, 2010 at 11:54 AM, Tiago Mendo <[email protected]>
>> wrote:
>>>
>>> On 2010/04/27, at 10:33, Tom Ueltschi wrote:
>>>
>>> Hi Andres and list,
>>>
>>> instead of the spiderMan plugin I would like to use another proxy (burp,
>>> webscarab) and import the URL's from a file. This way I just have to do it
>>> once for multiple scans (no interaction required).
>>>
>>> - The latest version from importResults says in its description:
>>>
>>>        Three configurable parameters exist:
>>>            - input_csv
>>>            - input_burp
>>>            - input_webscarab
>>>
>>> I've used Paros proxy extensively, but don't know if I could export a URL
>>> list in the "input_csv" format.
>>>
>>> Has anyone done this with Burp or WebScarab proxy? Which one is easier for
>>> just creating a URL list?
>>>
>>> I know you can easily generate a list of URL GET requests with the free
>>> version of Burp. Just define a scope for your site, access it through the
>>> Burp proxy, and then right-click the site in the history tab (I think it
>>> is the first one). Choose Spider from there (or similar), then right-click
>>> again and choose one of the two export options. One of them will fill the
>>> clipboard with a list of GETs.
>>> I don't recall doing it with webscarab, so I can't give you more
>>> information.
>>>
>>>
>>> Can you do this with the free version of burp?
>>>
>>> Yes.
>>>
>>> Do you know of the right menu entry to save the url file from burp or
>>> webscarab?  (I will try to find it myself with burp first)
>>>
>>> Read above.
>>>
>>> Thanks for any help.
>>>
>>> Cheers,
>>> Tom
>
>



-- 
Andrés Riancho
Founder, Bonsai - Information Security
http://www.bonsai-sec.com/
http://w3af.sf.net/

------------------------------------------------------------------------------

_______________________________________________
W3af-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/w3af-users
