On 07 September 11:22, Gelonida N wrote:
> On 09/07/2011 10:48 AM, Sylvain Thénault wrote:
> > On 13 August 04:02, Gelonida N wrote:
> >> After a certain number of files I get error messages about 'too many
> >> open file handles'
> >  
> > Several people have reported this problem, generally on Windows.
> 
> Yes, I was running on Windows, as the code to be analyzed contains some
> Windows-specific imports.
> An analysis under Linux would be incomplete.
> 
> 
> > * pylint should drop/close the file handle once processing of a module is over
> That would be great. :-)

yes, and even greater if someone other than me would give it a try :p
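
For whoever does pick it up: the fix is essentially to make sure the
source stream is closed as soon as the module's tree has been built,
instead of letting it live as long as the cached module object. A
minimal illustration of the pattern (illustrative only, not the actual
astng parsing code):

    # illustrative sketch only, not pylint/astng's real parsing code
    import ast

    def parse_module(path):
        # read eagerly and let the context manager release the handle
        # before parsing, even if reading raises
        with open(path) as stream:
            source = stream.read()
        return ast.parse(source, filename=path)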
  
> >> I will investigate further whenever I have time again to look at this
> >> issue.
> >> Currently I have fallen back to running one pylint process for each
> >> file, which is horribly slow (~40 minutes) but works; I have to finish
> >> some other tasks urgently and the run time is not the biggest problem
> >> at the moment.
> > 
> > PyLint has not built its reputation by being quick ;)
> 
> For small files it's not really pylint's analysis of the code that
> takes the time, but loading Python and pylint itself, which accounts
> for about 70 to 95% of the run time.
> 
> That's why I tried to accelerate my runs by not loading Python /
> pylint anew for every file: the speedup would be considerable.

yes, in such a case this is definitely a problem.
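
For reference, the single-process driver you want looks roughly like
this (a sketch; I'm assuming Run's 'exit' keyword here, check your
pylint version). Note that it is exactly this setup which eventually
trips the handle leak, hence the need for the real fix above:

    # sketch: lint many files from one interpreter, so that Python
    # and pylint are only loaded once
    from pylint import lint

    def lint_files(paths):
        for path in paths:
            # exit=False (keyword assumed present) keeps Run from
            # calling sys.exit() after the first file
            lint.Run(['--reports=n', path], exit=False)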
 
> >> The reason I'm asking is whether I should look out for commands which
> >> are not protected by an
> >>
> >>     if __name__ == '__main__':
> >>
> >> guard. (This might be one reason for the 'too many open file handles'
> >> errors.)
> > 
> > As I said above, this is probably not the problem.
> >  
> Thanks again for your detailed answer.
> 
> So it seems I am stuck with having to start pylint for each file, or
> with a system where I create a new process for every N files to be
> analyzed.
> 
> The question is whether I can find a reasonable N.
> Not knowing the internals, I am afraid that the number of files that
> can be linted before failure will depend on the contents of the Python
> code and the number of submodules to be analyzed.

It's better to tackle the targeted problem than to spend time on this :)
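
(That said, if you need a stopgap in the meantime: a driver that
respawns pylint every N files is only a few lines. N is a guess you
would have to tune, as you say. A sketch, assuming the 'pylint' script
is on your PATH:)

    # stopgap sketch: run a fresh pylint process per batch of files,
    # so leaked handles are reclaimed when each child process exits
    import subprocess

    CHUNK = 50  # arbitrary guess; lower it if you still hit the limit

    def lint_in_batches(paths, chunk=CHUNK):
        for start in range(0, len(paths), chunk):
            batch = paths[start:start + chunk]
            # one child per batch; pylint accepts several modules at once
            subprocess.call(['pylint', '--reports=n'] + batch)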

-- 
Sylvain Thénault                               LOGILAB, Paris (France)
Python, Debian, Agile Methods training:  http://www.logilab.fr/formations
Custom software development:             http://www.logilab.fr/services
CubicWeb, the semantic web framework:    http://www.cubicweb.org
