On 10/16/2008 10:19 AM, Matteo Cantoni wrote:
> Hi List,
>
> I've attached a new auxiliary module that downloads an HTTP server's
> robots.txt file and displays all disallowed entries.
> It's very simple but useful in some cases. You are free to modify it or
> send it to /dev/null :)
>
> [*] http://192.168.1.10:80 [Apache/2.0.54 (Fedora)] Disallow : /backup/
> [*] http://192.168.1.10:80 [Apache/2.0.54 (Fedora)] Disallow : /tmp/
Hey Matteo,

I actually sent a module like this to msfdev last year :) I've attached a
verbatim copy of it, in case you want to use any of it in some way. I'd
just appreciate a mention if you do ;)

msf > use scanner/http/robots
msf auxiliary(robots) > set RHOSTS youtube.com
RHOSTS => youtube.com
msf auxiliary(robots) > run

[*] 208.65.153.253's disallowed entries:
/profile
/results
/browse
/t/terms
/t/privacy
/login
/watch_ajax
/watch_queue_ajax
[*] Auxiliary module execution completed

Thanks,
Kris Katterjohn
require 'msf/core'

module Msf

class Auxiliary::Scanner::Http::Robots < Msf::Auxiliary

  # Exploit mixins should be called first
  include Exploit::Remote::HttpClient
  # Scanner mixin should be near last
  include Auxiliary::Scanner

  def initialize
    super(
      'Name'        => 'HTTP robots.txt Displayer',
      'Version'     => '$Revision$',
      'Description' => 'Displays disallowed entries from robots.txt',
      'Author'      => 'Kris Katterjohn <[EMAIL PROTECTED]>',
      'License'     => BSD_LICENSE
    )
  end

  # Displays a server's robots.txt's disallowed entries,
  # similar to Eddie Bell's NSE script
  def run_host(ip)
    target_port = datastore['RPORT']

    begin
      res = send_request_raw({
        'method' => 'GET',
        'uri'    => '/robots.txt'
      }, 8)

      return if not res

      if res.code != 200
        print_status("#{ip} has no robots.txt")
        return
      end

      entries = disallowed(res.body)

      if entries.length > 0
        print_status("#{ip}'s disallowed entries:\n#{entries}")
      else
        print_status("No disallowed entries in #{ip}'s robots.txt")
      end
    rescue ::Rex::ConnectionRefused, ::Rex::HostUnreachable, ::Rex::ConnectionTimeout
    rescue ::Timeout::Error, ::Errno::EPIPE
    end
  end

  # Strips out irrelevant lines and comments
  # Returns newline-delimited list of disallowed entries
  def disallowed(robots)
    b = robots.split("\n")

    b.delete_if do |line|
      line !~ /^Disallow: */
    end

    b.each do |line|
      line.sub!(/^Disallow: */, "")
      line.sub!(/#.*/, "")
      line.strip!
    end

    b.join("\n").strip
  end

end
end
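For anyone who wants to play with the Disallow-extraction logic outside of
msfconsole, here's a minimal standalone sketch of the same idea using plain
Ruby stdlib. The fetch_robots/disallowed_entries names and the example.com
placeholder are just illustrative, not part of the module above:

#!/usr/bin/env ruby
# Standalone sketch: fetch robots.txt and print its Disallow entries
require 'net/http'
require 'uri'

# Fetch /robots.txt from the given host; returns the body or nil
def fetch_robots(host, port = 80)
  res = Net::HTTP.get_response(URI("http://#{host}:#{port}/robots.txt"))
  res.is_a?(Net::HTTPOK) ? res.body : nil
rescue StandardError
  nil
end

# Same idea as the module's disallowed() helper: keep only Disallow lines,
# strip the prefix, trailing comments and whitespace
def disallowed_entries(robots)
  robots.split("\n").
    select { |line| line =~ /^Disallow: */ }.
    map    { |line| line.sub(/^Disallow: */, '').sub(/#.*/, '').strip }.
    reject(&:empty?)
end

if __FILE__ == $0
  host = ARGV[0] || 'example.com'
  body = fetch_robots(host)
  if body
    puts "Disallowed entries for #{host}:"
    puts disallowed_entries(body)
  else
    puts "No robots.txt found on #{host}"
  end
end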