On 09/07/2013 18:25, "Andres Riancho" <[email protected]> wrote:
>
> Fabio,
>
> On Tue, Jul 9, 2013 at 2:15 PM, Fábio Rodrigues <[email protected]> wrote:
> > Hello all,
> > I have an issue with w3af and need some help. When I try to run w3af
> > against an instance of the site I'm developing, the audit is very slow;
> > for example, I get the following after 8 hours of execution:
> >
> > |----------------------------------------------------------------------------------------------------|
> > | Crawling http://seaamz.alice/index/newsletter/ | Method: POST | Parameters:                        |
> > | (YII_CSRF_TOKEN="a275b6f26f...", NewsletterSignupForm[gender]="female",                            |
> > | NewsletterSignupForm[gender]="male", NewsletterSignupForm[email]="") using crawl.phpinfo           |
> > | Auditing http://seaamz.alice/index/newsletter/ | Method: POST | Parameters:                        |
> > | (YII_CSRF_TOKEN="a275b6f26f...", NewsletterSignupForm[gender]="female",                            |
> > | NewsletterSignupForm[gender]="male", NewsletterSignupForm[email]="") using audit.eval              |
> > | Crawl phase: In (0.01 URLs/min) Out (0.01 URLs/min) Pending (0 URLs) ETA (None)                    |
> > | Audit phase: In (0.01 URLs/min) Out (0.01 URLs/min) Pending (0 URLs) ETA (None)                    |
> > | Requests per minute: 7                                                                             |
> > |----------------------------------------------------------------------------------------------------|
> >
> > After some hours it even drops to 0 requests per minute. Has anyone ever
> > seen this behaviour in w3af?
>
> Well, that's interesting... never seen that before.
>
This has happened 3 times in a row.
> > ----------- More info ------------
> > local machine running:
> > Ubuntu 12.04
> > Memory: 8Gb Ram
> > Processor : I5 4 core @ 2.5Ghz
> > net interface: using localhost interface
> > server type: nginx 1.1.19 with php-fpm
>
> Should be more than enough for running a scan.
>
> > Profile used: custom one
> >
> > [grep.get_emails]
> >
> > [grep.meta_tags]
> >
> > [grep.error_pages]
> >
> > [grep.strange_reason]
> >
> > [grep.strange_parameters]
> >
> > [grep.strange_http_codes]
> >
> > [grep.strange_headers]
> >
> > [grep.credit_cards]
> >
> > [grep.error_500]
> >
> > [grep.csp]
> >
> > [grep.code_disclosure]
> >
> > [grep.analyze_cookies]
> >
> > [crawl.robots_txt]
> >
> > [crawl.web_spider]
> > only_forward = False
> > follow_regex = .*
> > ignore_regex =
> >
> > [crawl.phpinfo]
> >
> > [crawl.sitemap_xml]
> >
> > [output.html_file]
> >
> > [output.text_file]
> > verbose = True
> > output_file = ~/output.txt
> > http_output_file = ~/output-http.txt
>
> And what does the output file show? Do you see any error messages? I
> suspect an error at the TCP/HTTP level, like connection refused, http
> library errors, etc.
>
When I babysit the execution I don't see any connection refused; it crawls
the site fine. Is there any way to get a connection log dumped to a file?
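
For what it's worth, the [output.text_file] section in the profile above
already points http_output_file at ~/output-http.txt, so the raw HTTP
traffic should end up there, and the verbose text log goes to ~/output.txt.
One rough way to look for connection-level problems is to scan those files
for error keywords; the sketch below is only an illustration (the keyword
list is a guess, not w3af's exact error wording):

    #!/usr/bin/env python
    # Rough helper: scan the w3af text logs for connection-level errors.
    # The log paths come from the profile above; the keywords are guesses
    # and may not match w3af's exact messages.
    import os

    LOG_FILES = [os.path.expanduser('~/output.txt'),
                 os.path.expanduser('~/output-http.txt')]
    KEYWORDS = ('error', 'refused', 'timeout', 'timed out', 'reset')

    for path in LOG_FILES:
        if not os.path.exists(path):
            continue
        with open(path) as log:
            for line_no, line in enumerate(log, 1):
                if any(word in line.lower() for word in KEYWORDS):
                    print('%s:%d: %s' % (path, line_no, line.rstrip()))

Narrowing the keyword list to whatever actually appears in the verbose log
would cut down the noise.
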
> > [output.console]
> > verbose = True
> >
> > [audit.xpath]
> >
> > [audit.xss]
> > persistent_xss = True
> >
> > [audit.generic]
> >
> > [audit.un_ssl]
> >
> > [audit.format_string]
> >
> > [audit.preg_replace]
> >
> > [audit.sqli]
> >
> > [audit.eval]
> >
> > [infrastructure.find_vhosts]
> >
> > [infrastructure.dns_wildcard]
> >
> > [infrastructure.server_status]
> >
> > [infrastructure.hmap]
> >
> > [infrastructure.fingerprint_os]
> >
> > [target]
> > target =
> >
> > [misc-settings]
> > fuzz_cookies = False
> > fuzz_form_files = True
> > fuzz_url_filenames = False
> > fuzz_url_parts = False
> > fuzzed_files_extension = gif
> > fuzzable_headers =
> > form_fuzzing_mode = tmb
> > stop_on_first_exception = False
> > max_discovery_time = 120
> > interface = eth0
> > local_ip_address = 192.168.32.94
> > non_targets =
> > msf_location = /opt/metasploit3/bin/
> >
> > [http-settings]
> > timeout = 15
> > headers_file =
> > basic_auth_user =
> > basic_auth_passwd =
> > basic_auth_domain =
> > ntlm_auth_domain =
> > ntlm_auth_user =
> > ntlm_auth_passwd =
> > ntlm_auth_url =
> > cookie_jar_file =
> > ignore_session_cookies = False
> > proxy_port = 8080
> > proxy_address =
> > user_agent = w3af.org
> > max_file_size = 400000
> > max_http_retries = 2
> > always_404 =
> > never_404 =
> > string_match_404 =
> > url_parameter =
> >
>
> --
> Andrés Riancho
> Project Leader at w3af - http://w3af.org/
> Web Application Attack and Audit Framework
> Twitter: @w3af
> GPG: 0x93C344F3