Below is the code snippet which I tried to use to lint a few hundred
Python files.

I encountered a small problem.

On 08/12/2011 01:40 AM, Gelonida N wrote:
> Finally I found a solution, which is good enough for me.
> 
> On 08/11/2011 12:26 PM, Gelonida N wrote:
> 
> See code below:
> #!/usr/bin/env python
> from pylint import lint
> from pylint.reporters.text import TextReporter
> from cStringIO import StringIO
> 
> filenames = [ __file__ , 'anotherfile.py' ]
> 
> for filename in filenames:
>     args = [ filename ] # GOOD
>     my_output = StringIO()
>     reporter = TextReporter(output=my_output)
>     lint.Run(args, reporter=reporter, exit=False)
>     output_str = my_output.getvalue()
>     print "Got %d characters" % len(output_str)
>     # do_something_with(output_str)

After a certain number of files I get error messages about 'too many
open file handles'.
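To see whether handles really do pile up between lint runs, one can watch the process's file-descriptor count from inside the loop. A minimal sketch, assuming Linux (it reads /proc/self/fd, which is Linux-specific); the temporary file here just stands in for whatever is being leaked:

```python
import os
import tempfile

def open_fd_count():
    # Number of file descriptors currently open in this process.
    # /proc/self/fd is Linux-specific; other platforms would need
    # something like psutil's Process.num_fds() instead.
    return len(os.listdir('/proc/self/fd'))

before = open_fd_count()
tmp = tempfile.NamedTemporaryFile(delete=False)  # deliberately hold one fd open
during = open_fd_count()
tmp.close()
os.unlink(tmp.name)
after = open_fd_count()

print(during - before)  # the held descriptor shows up in the count
print(after - before)   # back to the baseline once it is closed
```

Calling open_fd_count() before and after each lint.Run() would show immediately whether the leak is per-file or accumulating.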


The reason very probably lies in the source code that I try to lint, or
in the function do_something_with().

I will investigate further whenever I have time to look at this issue again.
For now I have fallen back to running one pylint process per file, which is
horribly slow (~40 minutes) but works, as I have to finish some other
tasks urgently and the run time is not the biggest problem at the moment.

What I wanted to know in general is the following:
Does pylint 'only' analyze all files statically, or does it really import
the code being analyzed?
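As far as I understand, pylint parses the source into a syntax tree (via logilab's AST library) rather than importing it, so module-level statements should not run during analysis. The stdlib ast module illustrates the same principle; a small sketch (the sample source string is made up):

```python
import ast

# Source with a top-level side effect; parsing it must NOT execute it.
# If this were imported, the open() below would raise an IOError/OSError.
src = "import os\nopen('/nonexistent/path/x', 'w')\n"

tree = ast.parse(src)  # builds a syntax tree; nothing is executed

# Collect call expressions purely statically, by walking the tree.
calls = [node for node in ast.walk(tree) if isinstance(node, ast.Call)]
print(len(calls))  # the open() call was *seen*, not *run*
```

If pylint worked this way throughout, unguarded top-level code in the linted files would not by itself open any handles.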

The reason I'm asking is that I want to know whether I should look out
for commands which are not protected with

if __name__ == '__main__': statements
(this might be one reason for too many open file handles).
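Scanning for such unprotected top-level calls can itself be done statically with the stdlib ast module. A rough sketch; it only catches bare expression-statement calls at module level (not, e.g., calls buried inside assignments), and the sample source is invented:

```python
import ast

def unguarded_toplevel_calls(source):
    """Return line numbers of module-level expression-statement calls,
    i.e. calls not wrapped in an `if __name__ == '__main__':` block."""
    tree = ast.parse(source)
    flagged = []
    for node in tree.body:  # only direct module-level statements
        if isinstance(node, ast.Expr) and isinstance(node.value, ast.Call):
            flagged.append(node.lineno)
    return flagged

src = """\
import sys
f = open('data.txt')      # call inside an assignment: not caught here
print('hello')
if __name__ == '__main__':
    main()
"""
print(unguarded_toplevel_calls(src))  # flags the bare print() on line 3
```

Running this over the few hundred files first would narrow down which of them do real work at import time.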

_______________________________________________
Python-Projects mailing list
[email protected]
http://lists.logilab.org/mailman/listinfo/python-projects
