Hi Miroslav, I still believe there is a problem with Unicode characters:

root@fr:~/sqlmap# ./sqlmap.py -u "http://www.XXdomainXX" --crawl

    sqlmap/1.0-dev (r4143) - automatic SQL injection and database takeover tool
    http://sqlmap.sourceforge.net

[!] legal disclaimer: usage of sqlmap for attacking targets without prior
mutual consent is illegal. It is the end user's responsibility to obey all
applicable local, state and federal laws. Authors assume no liability and
are not responsible for any misuse or damage caused by this program

[*] starting at 22:14:48

[22:14:48] [INFO] setting crawling options
please enter maximum depth [Enter for 1 (default)] 3
[22:14:51] [INFO] starting crawler
[22:14:51] [INFO] searching for links with depth 1
[22:14:53] [INFO] searching for links with depth 2

[22:14:59] [CRITICAL] unhandled exception in sqlmap/1.0-dev (r4143), retry
your run with the latest development version from the Subversion repository.
If the exception persists, please send by e-mail to
sqlmap-users@lists.sourceforge.net the following text and any information
required to reproduce the bug. The developers will try to reproduce the bug,
fix it accordingly and get back to you.
sqlmap version: 1.0-dev (r4143)
Python version: 2.6.5
Operating system: posix
Command line: ./sqlmap.py -u http://www.XXdomainXX --crawl
Technique: None
Back-end DBMS: None (identified)
Traceback (most recent call last):
  File "./sqlmap.py", line 77, in main
    init(cmdLineOptions)
  File "/root/sqlmap/lib/core/option.py", line 1827, in init
    __setCrawler()
  File "/root/sqlmap/lib/core/option.py", line 407, in __setCrawler
    crawler.getTargetUrls(depth)
  File "/root/sqlmap/lib/utils/crawler.py", line 96, in getTargetUrls
    runThreads(numThreads, crawlThread)
  File "/root/sqlmap/lib/core/threads.py", line 97, in runThreads
    threadFunction()
  File "/root/sqlmap/lib/utils/crawler.py", line 59, in crawlThread
    soup = BeautifulSoup(content)
  File "/root/sqlmap/extra/beautifulsoup/beautifulsoup.py", line 1522, in
__init__
    BeautifulStoneSoup.__init__(self, *args, **kwargs)
  File "/root/sqlmap/extra/beautifulsoup/beautifulsoup.py", line 1147, in
__init__
    self._feed(isHTML=isHTML)
  File "/root/sqlmap/extra/beautifulsoup/beautifulsoup.py", line 1189, in
_feed
    SGMLParser.feed(self, markup)
  File "/usr/lib/python2.6/sgmllib.py", line 104, in feed
    self.goahead(0)
  File "/usr/lib/python2.6/sgmllib.py", line 143, in goahead
    k = self.parse_endtag(i)
  File "/usr/lib/python2.6/sgmllib.py", line 320, in parse_endtag
    self.finish_endtag(tag)
  File "/usr/lib/python2.6/sgmllib.py", line 358, in finish_endtag
    method = getattr(self, 'end_' + tag)
UnicodeEncodeError: 'ascii' codec can't encode character u'\xae' in position 4: ordinal not in range(128)

[*] shutting down at 22:14:59
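
For what it's worth, the traceback suggests sgmllib is failing while looking up an end-tag handler for a tag name that contains a non-ASCII character (u'\xae'). Below is a minimal sketch of what I think the failure mode is; the Dummy class and the tag value are only my own guesses to illustrate it, not taken from sqlmap or the crawled page:

# Minimal sketch of the failure mode as I understand it (Python 2.6).
# sgmllib.finish_endtag builds the handler name as 'end_' + tag; when the
# parsed tag name is a unicode string containing a non-ASCII character,
# getattr() has to encode that name to ASCII and raises UnicodeEncodeError.

class Dummy(object):
    pass

tag = u'\xaetag'                    # hypothetical tag name scraped from the page
try:
    getattr(Dummy(), 'end_' + tag)  # same call shape as sgmllib.finish_endtag
except UnicodeEncodeError as exc:
    print exc                       # 'ascii' codec can't encode character u'\xae' in position 4: ...

If that is right, perhaps guarding the BeautifulSoup call in the crawler against UnicodeEncodeError (or sanitising tag names before parsing) would at least keep one bad page from aborting the whole crawl, but you know the code far better than I do.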

Thanks again for your fast fixes; I believe sqlmap is unique in this regard.

Regards,
Nicolas

On Mon, Jun 20, 2011 at 4:20 PM, Miroslav Stampar <miroslav.stam...@gmail.com> wrote:

> Hi Nicolas,
>
> It should be fixed in the latest commit.
>
> kr
>
> On Mon, Jun 20, 2011 at 2:29 PM, Nicolas Krassas <kr...@deventum.com> wrote:
> > Greetings,
> >
> >   I encountered the problem below while testing the new "crawl" feature.
> > Also, on a different website, sqlmap with the crawl option enabled and no
> > other tuning parameters effectively DoSed the Apache service, driving the
> > load above 100.
> >
> > root@fr:~/sqlmap# ./sqlmap.py -u "http://www.XXdomainXX.com" --crawl
> >
> >     sqlmap/1.0-dev (r4137) - automatic SQL injection and database takeover tool
> >     http://sqlmap.sourceforge.net
> >
> > [!] legal disclaimer: usage of sqlmap for attacking targets without prior
> > mutual consent is illegal. It is the end user's responsibility to obey all
> > applicable local, state and federal laws. Authors assume no liability and
> > are not responsible for any misuse or damage caused by this program
> >
> > [*] starting at 15:24:16
> >
> > [15:24:16] [INFO] setting crawling options
> > please enter maximum depth [Enter for 1 (default)] 3
> > [15:24:23] [INFO] starting crawler
> >
> > [15:24:58] [CRITICAL] unhandled exception in sqlmap/1.0-dev (r4137), retry
> > your run with the latest development version from the Subversion repository.
> > If the exception persists, please send by e-mail to
> > sqlmap-users@lists.sourceforge.net the following text and any information
> > required to reproduce the bug. The developers will try to reproduce the bug,
> > fix it accordingly and get back to you.
> > sqlmap version: 1.0-dev (r4137)
> > Python version: 2.6.5
> > Operating system: posix
> > Command line: ./sqlmap.py -u http://www.XXdomainXX.com --crawl
> > Technique: None
> > Back-end DBMS: None (identified)
> > Traceback (most recent call last):
> >   File "./sqlmap.py", line 77, in main
> >     init(cmdLineOptions)
> >   File "/root/sqlmap/lib/core/option.py", line 1823, in init
> >     __setCrawler()
> >   File "/root/sqlmap/lib/core/option.py", line 407, in __setCrawler
> >     crawler.getTargetUrls(depth)
> >   File "/root/sqlmap/lib/utils/crawler.py", line 78, in getTargetUrls
> >     runThreads(numThreads, crawlThread)
> >   File "/root/sqlmap/lib/core/threads.py", line 97, in runThreads
> >     threadFunction()
> >   File "/root/sqlmap/lib/utils/crawler.py", line 57, in crawlThread
> >     soup = BeautifulSoup(content)
> >   File "/root/sqlmap/extra/beautifulsoup/beautifulsoup.py", line 1519, in
> > __init__
> >     BeautifulStoneSoup.__init__(self, *args, **kwargs)
> >   File "/root/sqlmap/extra/beautifulsoup/beautifulsoup.py", line 1144, in
> > __init__
> >     self._feed(isHTML=isHTML)
> >   File "/root/sqlmap/extra/beautifulsoup/beautifulsoup.py", line 1186, in
> > _feed
> >     SGMLParser.feed(self, markup)
> >   File "/usr/lib/python2.6/sgmllib.py", line 104, in feed
> >     self.goahead(0)
> >   File "/usr/lib/python2.6/sgmllib.py", line 143, in goahead
> >     k = self.parse_endtag(i)
> >   File "/usr/lib/python2.6/sgmllib.py", line 320, in parse_endtag
> >     self.finish_endtag(tag)
> >   File "/usr/lib/python2.6/sgmllib.py", line 358, in finish_endtag
> >     method = getattr(self, 'end_' + tag)
> > UnicodeEncodeError: 'ascii' codec can't encode characters in position 4-5:
> > ordinal not in range(128)
> >
> > [*] shutting down at 15:24:58
> >
> > Regards,
> > Nicolas
> >
>
> --
> Miroslav Stampar
>
> E-mail: miroslav.stampar (at) gmail.com
> PGP Key ID: 0xB5397B1B
>