Re: Multi-threaded FTP Question

2006-07-12 Thread Jeremy Jones

Dennis Lee Bieber wrote:
 On 11 Jul 2006 06:45:42 -0700, [EMAIL PROTECTED] declaimed the
 following in comp.lang.python:

   Could it be that the SERVER is limiting things to 5
 concurrent/parallel connections from any single IP?

   I know I've encountered sites that only allowed two FTP downloads at
 a time...

This is what I was starting to think as well.  The only thing that
looked funky with the OP's code was that it looked like he was writing
everything to a filename of  (unless he's intentionally modified his
code to not show where he's setting that).

- Jeremy M. Jones

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Editing File

2006-07-12 Thread Jeremy Jones

D wrote:
 Hi, I currently have a Python app with a Tkinter GUI frontend that I
use for system administration.  Every time it launches, it reads a text
 file which contains info about each host I wish to monitor - each field
 (such as IP, hostname, etc.) is delimited by !!.  Now, I want to be
 able to edit host information from within the GUI - what would  be the
 best way to go about this?  Basically I just need to either edit the
 original host line, or write a new host line and delete the
 original..thanks!

I would create a data structure of the contents of the file and let the
application reference that data structure.  Sounds like it's going to
be a list of lists or a list of dicts.  Each line of the file is going
to be an element of the main list.  Each element of the list is going
to be a dict or a list of the details of that particular host.  Make it
so that if your app changes the datastructure, you re-serialize it back
to the file.  This should work the same with adding a new host to
monitor.
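Something like this rough sketch, for instance (completely untested, and
the field names and their order are made up since I don't know your
actual layout):

```python
def load_hosts(path):
    # Each line becomes a dict; fields are delimited by "!!".
    # The field names here (hostname, ip) are just an assumed example.
    hosts = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            hostname, ip = line.split("!!")[:2]
            hosts.append({"hostname": hostname, "ip": ip})
    return hosts

def save_hosts(hosts, path):
    # Re-serialize the whole structure back out after any edit.
    with open(path, "w") as f:
        for h in hosts:
            f.write("%s!!%s\n" % (h["hostname"], h["ip"]))
```

Your GUI edits the list of dicts in memory, then calls save_hosts() to
push the change back to disk.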

It might be easier to use something like Yaml.  I'm doing something
similar with a little podcast grabber I'm working on.  Here's some old
code where I first incorporate using Yaml (down at the bottom of the
page): http://jeremymjones.com/articles/simple-podcast-grabber-python/
The version I have in SVN right now creates a configgish object off of
the Yaml and on re-assignment of either of the two main attributes, it
automatically reserializes it.

Anyway, hope this helps.

- Jeremy M. Jones

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Data access from multiple code modules

2006-07-12 Thread Jeremy Jones

[EMAIL PROTECTED] wrote:
 Lets say that I have an application consisting of 3 files. A main.py
 file, gui.py and a data.py which handles persistent data storage.
 Suppose data.py defines a class 'MyDB' which reads in data from a
 database, and main.py creates an instance of this object. How does code
 in gui.py access this object? Here's simplified pseudocode:

 MAIN.PY
 import gui, data
 DataObject = data.MyDB(blah)

 How do I write code in gui.py that can access DataObject? Is this
 entirely the wrong way to approach this sort of problem?

 Actualy the problem is more complex because the GUI consists of a main
 GUI form, and panels defined as seperate objects in seperate files.
 Various panels will contain controlls for manipulating data in the
 DataObject, or wherever data storage end up.

What does main.py do?  Are you creating an instance of the gui thingy?
If so, you could just pass DataObject into your gui thingy either into
the constructor or to a setter once you create an instance of it.  If
the gui needs any database stuff at instantiation time, you probably
need to pass it into the constructor.  However, a main.py, gui.py, and
db.py smells a little like your standard MVC, in which case, you would
get your controller to pass in the data pieces as the GUI needs them.
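Roughly like this (all names made up for illustration; the methods are
just placeholders):

```python
# data.py equivalent: stand-in for the poster's MyDB class
class MyDB:
    def __init__(self, conn_info):
        self.conn_info = conn_info
    def get_hosts(self):
        return ["web1", "web2"]   # placeholder for a real query

# gui.py equivalent: the GUI takes the data object in its constructor
class MainGui:
    def __init__(self, db):
        self.db = db              # every panel can now reach self.db
    def refresh(self):
        return self.db.get_hosts()

# main.py equivalent: wire the two together
db = MyDB("blah")
gui = MainGui(db)
```

The panels never import data.py themselves; they just use whatever
object main.py handed them.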

- Jeremy M. Jones

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Editing File

2006-07-12 Thread Jeremy Jones

D wrote:
 Thanks, guys.  So overall, would it just be easier (and not too rigged)
 if any changes were made by just editing the text file?  I want to do

snip

 [EMAIL PROTECTED] wrote:

snip

  Might be overkill - but pickle the data member that contains the
  information.  If you use text instead of binary pickling it should
  still be editable by hand.  For a single line of text it may be a bit
  much - but it's still probably quicker than opening a file, parsing
  etc.

Look at pickle, but I'd recommend against it if you're anticipating
needing to edit the file by hand.  It's just a little on the ugly side.
 Glance at Yaml (I think it's the pyyaml project in the cheeseshop) as
well.  Here's the code needed to parse in a .yaml file:

config = yaml.load(open(self.config_file, "r"))

Here's the code needed to serialize it back in a pretty format:

yaml.dump(config, config_file_obj, default_flow_style=False)

And here's a piece of a .yaml file itself:

feeds:
  http://leo.am/podcasts/floss:
    name: FLOSS Weekly
    mode: dl
  http://revision3.com/diggnation/feed/high.mp3.xml:
    name: Revision3 - Diggnation w/Kevin Rose & Alex Albrecht
    mode: dl
  http://geekmuse.net/podcast/:
    name: Geek Muse
    mode: dl
  http://www.itconversations.com/rss/category-rss.php?k=achange2005&e=1:
    name: Accelerating Change 2005
    mode: dl

Nice and clean.

- Jeremy M. Jones

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Data access from multiple code modules

2006-07-12 Thread Jeremy Jones

[EMAIL PROTECTED] wrote:

snip

 Doh! How simple. Why didn't I think of that? I'm too used to procedural
 scripts where you'd just put everything in a global data structure. I
 know this is bad, but it's hard to get out of that mentality.

Sounds like you got it.  Just pass it on down as needed.

- jmj

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Multi-threaded FTP Question

2006-07-11 Thread Jeremy Jones
On 11 Jul 2006 06:45:42 -0700, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
I'm trying to use ftp in python in a multi-threaded way on a windows
box - python version 2.4.3.  Problem is that it appears that it's only
possible to have five instances/threads at one point in time.  Errors
look like:

  File "C:\Python24\lib\ftplib.py", line 107, in __init__
    self.connect(host)
  File "C:\Python24\lib\ftplib.py", line 132, in connect
    self.welcome = self.getresp()
  File "C:\Python24\lib\ftplib.py", line 208, in getresp
    resp = self.getmultiline()
  File "C:\Python24\lib\ftplib.py", line 194, in getmultiline
    line = self.getline()
  File "C:\Python24\lib\ftplib.py", line 184, in getline
    if not line: raise EOFError
  EOFError

Is it possible to have more than five simultaneous ftp connections?

Would you mind posting your code?  Are you trying to pass the same FTP
connection object to all 5 threads?

--
Jeremy M. Jones
http://jeremymjones.com
 
-- 
http://mail.python.org/mailman/listinfo/python-list

Re: Multi-threaded FTP Question

2006-07-11 Thread Jeremy Jones

[EMAIL PROTECTED] wrote:
 I'm trying to use ftp in python in a multi-threaded way on a windows
 box - python version 2.4.3.  Problem is that it appears that it's only
 possible to have five instances/threads at one point in time.  Errors
 look like:

   File "C:\Python24\lib\ftplib.py", line 107, in __init__
 self.connect(host)
   File "C:\Python24\lib\ftplib.py", line 132, in connect
 self.welcome = self.getresp()
   File "C:\Python24\lib\ftplib.py", line 208, in getresp
 resp = self.getmultiline()
   File "C:\Python24\lib\ftplib.py", line 194, in getmultiline
 line = self.getline()
   File "C:\Python24\lib\ftplib.py", line 184, in getline
 if not line: raise EOFError
 EOFError

 Is it possible to have more than five simultaneous ftp connections?

 Thanks.

 Derek

I replied to this about 4 hours ago from my gmail email account (not my
google groups account associated with the same email address), but
haven't seen it show up, so I apologize if this is a dupe.

Would you mind posting your code?  Are you trying to pass the same FTP
connection object to all 5 threads?
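If that turns out to be the problem, the fix is to give each thread its
own connection.  A rough, untested sketch (the host and paths are made
up, and the ftp_factory parameter is only there so the connection can be
stubbed out; real code would pass credentials to login()):

```python
import ftplib
import threading

def download(host, path, dest, ftp_factory=ftplib.FTP):
    # Each worker builds its OWN connection object -- sharing a single
    # ftplib.FTP instance across threads tangles the protocol state.
    ftp = ftp_factory(host)
    ftp.login()                      # anonymous login here
    with open(dest, "wb") as f:
        ftp.retrbinary("RETR " + path, f.write)
    ftp.quit()

def fetch_all(host, paths, ftp_factory=ftplib.FTP):
    threads = []
    for p in paths:
        dest = p.replace("/", "_")   # crude local filename
        t = threading.Thread(target=download,
                             args=(host, p, dest, ftp_factory))
        threads.append(t)
        t.start()
    for t in threads:
        t.join()
```

Note that even with one connection per thread, a server that caps
connections per client IP will still refuse the extras.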

- Jeremy M. Jones

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Why my modification of source file doesn't take effect when debugging?

2005-12-02 Thread Jeremy Jones
sandorf wrote:

I'm using the Windows version of Python and IDLE. When I debug my .py
file, my modification to the .py file does not seem to take effect
unless I restart IDLE. Saving the file and re-importing it doesn't help

either. Where's the problem? 

Thanks.

  

No problem.  Just reload() it.
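For instance, something like this (reload() is a builtin in the 2.x
line; the importlib.reload spelling below is the modern equivalent, and
the throwaway module is only there to make the example self-contained):

```python
import importlib
import os
import sys
import tempfile

sys.dont_write_bytecode = True       # keep the demo from caching a .pyc

# Create a throwaway module on disk (stands in for your .py file).
moddir = tempfile.mkdtemp()
with open(os.path.join(moddir, "mymod.py"), "w") as f:
    f.write("VALUE = 1\n")
sys.path.insert(0, moddir)

import mymod
first = mymod.VALUE

# Edit the source, as you would in IDLE...
with open(os.path.join(moddir, "mymod.py"), "w") as f:
    f.write("VALUE = 2\n")

# ...a plain re-import is a no-op, but reload() picks up the change.
importlib.reload(mymod)
second = mymod.VALUE
```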


- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: How could I ask Thread B to call B().Method() from inside Thread A's run?

2005-11-30 Thread Jeremy Jones
could ildg wrote:

 I have 2 thead instances,
 A and B,
 In A's run method, if I call B.Method(), it will be executed in thead A,
 but I want B.Method() to be executed in B's thread.
 That's to say, I want to tell Thead B to do B's stuff in B's thread,
 kinda like PostMessage in win32.
 Can I do it in python?
 How?
 Thank you in advance.

Here is a really simple, stupid example of what I think you're trying to 
do.  You probably want to use a Queue between your threads.

##
#!/usr/bin/python


import threading
import Queue


class A(threading.Thread):
    def __init__(self, q):
        self.q = q
        threading.Thread.__init__(self)
    def run(self):
        for i in range(10):
            print "Thread A putting \"something\" onto the queue"
            self.q.put("something")
        self.q.put("stop")

class B(threading.Thread):
    def __init__(self, q):
        self.q = q
        self.proceed = True
        threading.Thread.__init__(self)
    def do_something(self):
        print "Thread B doing something"
    def do_stop(self):
        print "Thread B should stop soon"
        self.proceed = False
    def run(self):
        while self.proceed:
            print "Thread B pulling something off of the queue"
            item = self.q.get()
            print "Thread B got %s" % item
            getattr(self, "do_" + str(item))()

if __name__ == "__main__":
    q = Queue.Queue()
    a = A(q)
    a.start()
    b = B(q)
    b.start()
    a.join()
    b.join()
##

HTH,

- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: help with using temporary files

2005-11-22 Thread Jeremy Jones
Gerard Flanagan wrote:

Hello

 I'm sure its basic but I'm confused about the error I get with the
following code.  Any help on basic tempfile usage?


ActivePython 2.4.1 Build 247 (ActiveState Corp.) based on
Python 2.4.1 (#65, Jun 20 2005, 17:01:55) [MSC v.1310 32 bit (Intel)]
on win32
Type "help", "copyright", "credits" or "license" for more information.
  

>>> from tempfile import NamedTemporaryFile
>>> tmp = NamedTemporaryFile()
>>> tmp.write("Hello")
>>> tmp.close()
>>> print tmp.name
c:\docume~1\gerard\locals~1\temp\tmpxqn4yl
>>> f = open(tmp.name)


Traceback (most recent call last):
  File "<stdin>", line 1, in ?
IOError: [Errno 2] No such file or directory:
'c:\\docume~1\\gerard\\locals~1\\temp\\tmpxqn4yl'


Thanks

Gerard

  


It gets created:

In [24]: import tempfile 
In [25]: t = tempfile.NamedTemporaryFile() 
In [26]: t.name
Out[26]: '/tmp/tmp9bmhap'
In [27]: ls -l /tmp/tmp*
-rw---  1 jmjones jmjones 0 Nov 22 11:15 /tmp/tmp9bmhap

In [28]: t.write("123")

In [29]: t.flush()

In [30]: ls -l /tmp/tmp*
-rw---  1 jmjones jmjones 3 Nov 22 11:15 /tmp/tmp9bmhap

In [31]: t.close()

In [32]: ls -l /tmp/tmp*
ls: /tmp/tmp*: No such file or directory



 From the docstring, it gets automatically deleted on close:

def NamedTemporaryFile(mode='w+b', bufsize=-1, suffix="",
                       prefix=template, dir=None):
    """Create and return a temporary file.
    Arguments:
    'prefix', 'suffix', 'dir' -- as for mkstemp.
    'mode' -- the mode argument to os.fdopen (default "w+b").
    'bufsize' -- the buffer size argument to os.fdopen (default -1).
    The file is created as mkstemp() would do it.

    Returns an object with a file-like interface; the name of the file
    is accessible as file.name.  The file will be automatically deleted
    when it is closed.
    """


HTH,

- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


publicity for Python-related projects

2005-11-03 Thread Jeremy Jones
If anyone has a project written in Python or usable to Python
programmers, I'd like to blog about new releases (or other news) of your
project.  I'd really like to focus on open source projects, but I would
love to mention non-open source projects if I feel there is sufficient
benefit to the community.  This is something I'm planning on an ongoing
basis and not just a one-time thing.  Whenever you have news (a release,
corporate funding, new developer joining the team, etc.) just email me
at zanesdad at bellsouth dot net with the name of the project, a link to
the project's homepage, the release (or news) information, and what the
project is about.  I'm planning on posting nightly at around 5PM GMT +5
(10PM Eastern time in the states), so email that I receive before that
time should be posted the same day.

Jeremy Jones

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Web presentation layer/framework for python - recommendations?

2005-10-27 Thread Jeremy Jones
[EMAIL PROTECTED] wrote:

Hi,
I am a python newbie and need some advice.
I have been charged with redeveloping a web application with a front end 
written in python that has a backend of XML files.
Currently it doesn't adequately separate out the presentation code from the 
content code.
Frankly it’s a mess (think bowl of spaghetti).

Does anyone have any recommendations for python toolkits or templating systems 
that would simplify the cleanup and make the code more maintainable in the 
future? I am a newbie, but not afraid to have a go. I also haven't done any 
real application programming for a while - bit of perl and java stuff a few 
years back. Just do perl/python/shell scripting these days.

All comments welcome :-)

Regards Anthony.

 Mr Anthony Hornby
Library Systems and Technology Coordinator
Charles Darwin University (CRICOS 300K)
Phone: +61 8 8946 6011
Email: [EMAIL PROTECTED]
(remove the .no-spam)
  

I've personally been pretty fond of TurboGears lately.  It encourages a 
really clean MVC with SQLObject as the default model layer.  I haven't 
checked, but I don't think SQLObject supports XML as a data source, but 
you could definitely write your own XML data access layer and use 
TurboGears.

- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list

Re: socket.error: (32, 'Broken pipe'): need help

2005-10-27 Thread Jeremy Jones
Junhua Deng (AL/EAB) wrote:

Hi,
I have a simple server-client application with threading. It works fine when 
both server and client on the same machine, but I get the following error 
message if the server is on another machine:

... ...
self.socket.send(outgoingMsg)
socket.error: (32, 'Broken pipe')

I do not know where to start with? 

Thanks
Junhua
  

Can you tell if the recipient actually got any of the data?  At what 
point do you get this error?  Is the client able to connect to the 
server?  Could you extract some more code (connection made, the sender's 
sending code, receiver's receiving code, etc.) so we can see what you're 
doing?  Did you happen to tell the server to bind localhost or 127.0.0.1?

- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


OT: Re: Windows vs Linux [was: p2exe using wine/cxoffice]

2005-10-26 Thread Jeremy Jones
Tim Golden wrote:
snip

As it happens, (and I suspect I'll have to don my flameproof suit here),
I prefer the Windows command line to bash/readline for day-to-day use, 
including in Python. Why? Because it does what I can't for the life of 
me get readline to do: you can type the first few letters of a 
previously-entered command and press F8. This brings up (going backwards

with further presses) the last command which starts like that. And
*then* 
you can just down-arrow to retrieve the commands which followed it. 
If someone can tell me how to do this with bash/readline I will be 
indebted to them and it will increase my chances of switching to Linux 
a bit!


Others have recommended ctrl-r in bash.  People's tastes vary, but that 
has an awkward feel for me.  In zsh, you just type in as many letters as 
you care to match and hit the up arrow.  But, honestly, this is a bit 
annoying to me.  If I've begun to type in a command, realize it's not 
what I want, but instead, I want to go back in my history a couple of 
commands, I hit the up arrow to find to my irritation that it's 
pattern matching rather than going directly back in my history.  I guess 
it comes in handy at times, though
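For what it's worth, GNU readline can be coaxed into exactly that F8
behavior.  Assuming a typical ANSI terminal (the escape sequences below
may differ on yours), these lines in ~/.inputrc make the arrow keys
search history by whatever prefix you've already typed:

```
# ~/.inputrc -- after typing a few letters, Up/Down walk only the
# history entries that begin with that prefix
"\e[A": history-search-backward
"\e[B": history-search-forward
```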

- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Problem ... with threads in Python

2005-10-26 Thread Jeremy Jones
Negoescu Constantin wrote:

 Hello.
  
 I know that Python is */not fully threadsafe/*. Unlike Java, where 
 threading was considered to be so important that it is a part of the 
 syntax, in Python threads were laid down at the altar of Portability. 
 But, i really have to finish a project  which uses multiple threads in 
 Python, and i shouldn't use the time.sleep() function.
 Is there any posibility to use multiple threads without using the 
 time.sleep() function ? And if so, what that way should be ?
  
 Best regards,
 _Cosmin

You might want to give us a little more detail on what you're trying to 
accomplish and how you're using sleep() so we'll be able to help you 
better.  Are you trying to use sleep() as a synchronization mechanism 
between threads?
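If it does turn out you're using sleep() to wait on another thread, a
threading.Event is usually the right replacement.  A minimal, untested
sketch (the names are made up):

```python
import threading

results = []
data_ready = threading.Event()

def worker():
    # Block until the main thread signals, instead of polling in a
    # time.sleep() loop.
    data_ready.wait()
    results.append("worker ran")

t = threading.Thread(target=worker)
t.start()
results.append("main set up")
data_ready.set()      # wake the worker exactly when we're ready
t.join()
```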

- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: write a loopin one line; process file paths

2005-10-19 Thread Jeremy Jones
Xah Lee wrote:

Peter Hansen wrote:
  

Xah Lee wrote:


If you think i have a point, ...
  

You have neither that, nor a clue.



Dear Peter Hansen,

My messages speak themselfs. You and your cohorts's stamping of it does
not change its nature. And if this is done with repetitiousness, it
gives away your nature.

It is not necessary to shout against me. But if you must refute (and
that is reasonable), try to put content into your posts.
(see Philosophies of Netiquette at
http://xahlee.org/UnixResource_dir/writ/phil_netiquette.html)
  

Xah,

Thanks for the comic relief of this link.  The first item of comedy came 
from the following two sentences:

'''
Then at the other extreme is the relatively rare Victorian propensity 
where each post is a gem of literature carefully crafted and researched 
for an entire century of readers to appreciate and archive. Xah, Erik 
Naggum, and [censored] posts are exemplary of this style, to name a few 
acquaintances like myself.
'''

I really don't know which is funnier, that you stated these sentences at 
all, or that you probably believe them.  Several things disqualify you 
from gaining my classification of scholarly (not that you give a fart 
what I think):

- poor spelling
- poor grammar
- rambling style with lack of cohesive thought
- non-interesting, non-original ideas in your posts
- invalid or incorrect points in your discourse

The next piece of humor came from these sentences:

'''
Go to a newsgroup archive such as dejanews.com and search for your 
favorite poster.  If you find a huge quantity of terse posts that is 
tiring, boring, has little content, and in general requires you to 
carefully follow the entire thread to understand it, then you know 
you've bumped into a conversationalist.
'''

By your definition, you mostly fit into the conversationalist 
category.  The only thing that may keep you out of that category is that 
your ramblings are typically lengthy.  So, what you provide is a large 
number of lengthy, tiring, boring, content-less, non-cohesive posts.  
Funny that you bash the conversationalists when you have so much in 
common with them.

The third point of humor in this link was the paypal link at the top of 
the page:

'''
If you spend more than 30 minutes on this site, please send $1 to me. Go 
to http://paypal.com/ and make a payment to [EMAIL PROTECTED] Or send to: 
P. O. Box 390595, Mountain View, CA 94042-0290, USA.
'''

It's humorous to think of anyone spending more than 30 minutes on your 
site (apart from the obvious stunned amazement at the content, quite 
like the "can't stop watching the train wreck" phenomenon).  It's even 
more humorous to think of anyone gaining value from it.  But I wouldn't 
be surprised to hear that some people have actually sent you money.

If you deem fit, create a alt.fan.XahLee, and spare the rest of Python
community of your politics. I appreciate your fandom.

 Xah
 [EMAIL PROTECTED]
∑ http://xahlee.org/

  


sorry-folks-for-feeding-the-troll-ly y'rs,

- jmj

-- 
http://mail.python.org/mailman/listinfo/python-list

Re: global interpreter lock

2005-10-18 Thread Jeremy Jones
[EMAIL PROTECTED] wrote:

I just need confirmation that I think right.

Is the files thread_xxx.h (xxx = nt, os2 or whatever) responsible for
the
global interpreter lock in a multithreaded environment?

I'm currently writing my own thread_VW for VxWorks, thats why I'm
asking. 

//Tommy

  

Someone can correct me if I'm wrong, but the lock actually lives in 
ceval.c, around here:



802 PyThread_release_lock(interpreter_lock);
803
804 /* Other threads may run now */
805
806 PyThread_acquire_lock(interpreter_lock, 1);


This was taken from what appears to be a 2.4.1 release rather than a CVS 
checkout.  It looks like the PyThread_type_lock is defined in the 
thread_xxx.h files, though.

HTH,

- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Yes, this is a python question, and a serious one at that (moving to Win XP)

2005-10-13 Thread Jeremy Jones
Kenneth McDonald wrote:

For unfortunate reasons, I'm considering switching back to Win XP  
(from OS X) as my main system. Windows has so many annoyances that  
I can only compare it to driving in the Bay Area at rush hour (OS X  
is like driving in Portland at rush hour--not as bad, but getting  
there), but there are really only a couple of things that are really,  
absolutely preventing me from making the switch. Number one is the  
lack of a decent command line and command-line environment, and I'm  
wondering (hoping) if perhaps someone has written a Python shell-- 
something that will look like a regular shell, let users type in  
commands, maybe have some of the nice features of bash etc. like tab  
completion, etc, and will then execute an underlying python script  
when the command is entered. I'm not thinking of IDLE, but something  
that is really aimed more at being a system terminal, not a Python- 
specific terminal.
  

ipython -p pysh

IPython rocks as a Python shell.  I use zsh mostly, but IPython's pysh 
looks pretty good.  I hate to help you get back on Windows, though :-)


- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: New Python book

2005-10-07 Thread Jeremy Jones
Maurice LING wrote:

I had the opportunity to glance through the book in Borders yesterday. 
On the whole, I think it is well covered and is very readable. Perhaps I 
was looking for a specific aspect, and I find that threads did not get 
enough attention. Looking at the index pages, the topics on threads 
(about 4-5 pages) is mainly found in the context of GUI programming.

maurice

  

I don't have my hard copy of the book, but from memory and grepping over 
the soft copy, you appear to be correct.  Remember, though, that this is 
a beginning book on Python and *I* would consider threading a more 
advanced topic.  I think putting threading in the context of GUI 
programming is just about right for an intro book.

- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: How to create temp file in memory???

2005-10-05 Thread Jeremy Jones
Wenhua Zhao wrote:

A.T.T

Thanks a lot.
  

If you could elaborate a bit more, it might be helpful.  I'm guessing 
you want something like StringIO or cStringIO.


- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: How to create temp file in memory???

2005-10-05 Thread Jeremy Jones
Wenhua Zhao wrote:

I have a list of lines. I want to feed these lines into a function.
The input of this function is a file.
I want to creat a temp file on disk, and write the list of lines into 
this temp file, then reopen the file and feed it to the function.
Can I create a this temp file on memory???



Jeremy Jones wrote:
  

Wenhua Zhao wrote:



A.T.T

Thanks a lot.
 

  

If you could elaborate a bit more, it might be helpful.  I'm guessing 
you want something like StringIO or cStringIO.


- jmj


If the function takes a file object as an argument, you should be able 
to use StringIO or cStringIO.
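For example (in the 2.x line the import is StringIO or cStringIO; the
io.StringIO spelling below is the modern one, and count_words is just a
stand-in for whatever your function actually is):

```python
from io import StringIO

def count_words(fileobj):
    # Stand-in for any function that expects an open file: it only
    # iterates over lines, so a StringIO works just as well.
    return sum(len(line.split()) for line in fileobj)

lines = ["first line\n", "second line here\n"]
fake_file = StringIO("".join(lines))   # no temp file on disk needed
total = count_words(fake_file)
```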


- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: New Python book

2005-10-05 Thread Jeremy Jones
Dick Moores wrote:

(Sorry, my previous post should not have had Tutor in the subject header.)

Magnus Lie Hetland's new book, _Beginning Python: From Novice to
Professional_ was published by Apress on Sept. 26 (in the U.S.). My copy
arrived in the mail a couple of days ago. Very much worth a look, IMHO.
But what do the experts here think?

http://www.bestbookdeal.com/book/compare/159059519X

Dick Moores
[EMAIL PROTECTED]


  

I don't know what the experts think, but I thought it was excellent.  
I had the pleasure of serving as tech editor/reviewer for this book.  My 
dead tree version hasn't arrived yet, but should be on its way. 

The style is extremely readable, not a hint of dryness in it at all.  
The concepts are clearly and thoroughly presented.  This is an excellent 
resource for someone starting Python, but definitely useful for those 
already familiar with Python.  One thing that kept coming to mind as I 
was reading it, especially toward the end during the projects at the 
end, was that this would probably also be an excellent educational 
resource for teachers in a classroom setting teaching students Python.  
I would be interested to hear some teachers' opinion on that to see if 
that's a correct assessment.

Anyway, I highly recommend this book.

- jmj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: 1 Million users.. I can't Scale!!

2005-09-28 Thread Jeremy Jones


[EMAIL PROTECTED] wrote:

Damjan Is there some python module that provides a multi process Queue?

Not as cleanly encapsulated as Queue, but writing a class that does that
shouldn't be all that difficult using a socket and the pickle module.

Skip

  

What about bsddb?  The example code below creates a multiprocess queue.  
Kick off two instances of it, one in each of two terminal windows.  Do a 
mp_db.consume_wait() in one first, then do a mp_db.append("foo or some
other text here") in the other and you'll see the consumer get the
data.  This keeps the stuff on disk,  which is not what the OP wants, 
but I *think* with flipping the flags or the dbenv, you can just keep 
stuff in memory:

#!/usr/bin/env python

import bsddb
import os

db_base_dir = "/home/jmjones/svn/home/source/misc/python/standard_lib/bsddb"

dbenv = bsddb.db.DBEnv(0)
dbenv.set_shm_key(40)
dbenv.open(os.path.join(db_base_dir, "db_env_dir"),
#bsddb.db.DB_JOINENV |
bsddb.db.DB_INIT_LOCK |
bsddb.db.DB_INIT_LOG |
bsddb.db.DB_INIT_MPOOL |
bsddb.db.DB_INIT_TXN |
#bsddb.db.DB_RECOVER |
bsddb.db.DB_CREATE |
#bsddb.db.DB_SYSTEM_MEM |
bsddb.db.DB_THREAD,
)

db_flags = bsddb.db.DB_CREATE | bsddb.db.DB_THREAD


mp_db = bsddb.db.DB(dbenv)
mp_db.set_re_len(1024)
mp_db.set_re_pad(0)
mp_db_id = mp_db.open(os.path.join(db_base_dir, "mp_db.db"),
                      dbtype=bsddb.db.DB_QUEUE, flags=db_flags)



- JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: 1 Million users.. I can't Scale!!

2005-09-28 Thread Jeremy Jones
[EMAIL PROTECTED] wrote:

Damjan Is there some python module that provides a multi process Queue?

Skip Not as cleanly encapsulated as Queue, but writing a class that
Skip does that shouldn't be all that difficult using a socket and the
Skip pickle module.

Jeremy What about bsddb?  The example code below creates a multiprocess
Jeremy queue.

I tend to think multiple computers when someone says multi-process.  I
realize that's not always the case, but I think you need to consider that
case (it's the only practical way for a multi-process application to scale
beyond a few processors).

Skip
  

Doh!  I'll buy that.  When I hear multi-process, I tend to think of 
folks overcoming the scaling issues that accompany the GIL.  This, of 
course, won't scale across computers without a networking interface.

- JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: File processing

2005-09-23 Thread Jeremy Jones
Gopal wrote:

Hello,

I'm Gopal. I'm looking for a solution to the following problem:

I need to create a text file config.txt having some parameters. I'm
thinking of going with this format by having Param Name - value. Note
that the value is a string/number; something like this:

PROJECT_ID = "E4208506"
SW_VERSION = "18d"
HW_VERSION = "2"

In my script, I need to parse this config file and extract the Values
of the parameters.

I'm very new to python as you can understand from the problem. However,
I've some project dealines. So I need your help in arriving at a simple
and ready-made solution.

Regards,
Gopal.

  

Would this 
(http://www.python.org/doc/current/lib/module-ConfigParser.html) do what 
you need?  It's part of the standard library.


- JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: File processing

2005-09-23 Thread Jeremy Jones
Gopal wrote:

Thanks for the reference. However, I'm not understanding how to use it.
Could you please provide with an example? Like I open the file, read
line and give it to parser?

Please help me.

  

I had thought of recommending what Peter Hansen recommended - just 
importing the text you have as a Python module.  I don't know why I 
recommended ConfigParser over that option.  However, if you don't like 
what Peter said and would still like to look at ConfigParser, here is a 
very simple example.  Here is the config file I created from your email:

[EMAIL PROTECTED]  8:36AM configparser % cat foo.txt
[main]
PROJECT_ID = "E4208506"
SW_VERSION = "18d"
HW_VERSION = "2"


Here is me running ConfigParser from a Python shell:

In [1]:  import ConfigParser

In [2]:  p = ConfigParser.ConfigParser()

In [3]:  p.read("foo.txt")
Out[3]:  ['foo.txt']

In [4]:  p.get("main", "PROJECT_ID")
Out[4]:  '"E4208506"'


Note that the value of ("main", "PROJECT_ID") is a string which contains 
double quotes in it.  If you take Peter's advice, you won't have that 
problem; the config file will preserve your types for you.

HTH,

- JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: batch mkdir using a file list

2005-09-23 Thread Jeremy Jones
DataSmash wrote:

Hello,
I think I've tried everything now and can't figure out how to do it.
I want to read in a text list from the current directory,
and for each line in the list, make a system directory for that name.

My text file would look something like this:
1144
1145
1146
1147

I simply want to create these 4 directories.
It seems like something like the following
code should work, but it doesn't.

import os

file = open("list.txt", "r")
read = file.read()
print "Creating directory " + str(read)
os.mkdir(str(read))

Appreciate any help you can give!
R.D.  Harles

  

Untested code:

import os
for line in open("list.txt", "r"):
    os.mkdir(line.strip())


- JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Perl's documentation come of age

2005-09-21 Thread Jeremy Jones
Ed Hotchkiss wrote:

 I'm new to Python, not programming. I agree with the point regarding 
 the interpreter. what is that? who uses that!? Why are most examples 
 like that, rather than executed as .py files?

I think showing examples at the Python interpreter prompt is *very* 
helpful and IMHO a preferred method in plenty of cases.  If I'm showing 
someone a piece of code that returns some object the type of which 
you're not really that familiar with, would you rather be running it in 
a script, or on a command prompt (or, my preference is to either copy 
and paste the example to a script an run it with ``python -i`` or paste 
it to an edit in IPython)?  With IPython (or vanilla Python interpreter 
with parse-and-bind tab completion turned on), you can inspect the 
object quite easily.  Again, IMHO, much easier than from a script.

  
 Another problem that I have (which does get annoying after awhile), is 
 not using foo and bar. Spam and Eggs sucks. It's not funny, although 
 Monty Python does rock. Why not use silly+walks instead.

Eh.  Life's too short for me to get up in a roar about such as this.  
And Python's too good of a language for me to be overly bothered by 
example naming conventions.  YMMV.

  
 ***/me puts on Monty Python and turns the computer off***
  
 -edward

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Chronological Processing of Files

2005-09-21 Thread Jeremy Jones
yoda wrote:

This feels like a stupid question but I'll ask it anyway.
  

Definitely not a stupid question.

How can I process files chronologically (newest last) when using
os.walk()?

  


Try this:

In [16]:  file_list = [(os.stat(f)[8], f) for f in [os.path.join(i[0], 
j) for i in os.walk("/home/jmjones/public_html") for j in i[2]]]

In [17]:  file_list.sort()

In [18]:  sorted_file_list = [f[1] for f in file_list]


I *think* os.stat()[8] is the modification time element...but this 
should probably work for you.  That first list comprehension looks like 
a booger if you're not familiar with them.  If you have any trouble with 
it, just shoot a message back to the list and I'll decipher it for you.
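
If the one-liner is too dense, the same decorate-sort-undecorate idea can be spelled out step by step (a sketch; os.path.getmtime reads the same modification-time field as os.stat()[8]):

```python
import os

def files_sorted_by_mtime(top):
    """Walk a directory tree and return file paths sorted oldest-first."""
    paths = []
    for dirpath, dirnames, filenames in os.walk(top):
        for name in filenames:
            paths.append(os.path.join(dirpath, name))
    # Decorate with modification time, sort, then undecorate.
    decorated = [(os.path.getmtime(p), p) for p in paths]
    decorated.sort()
    return [p for (mtime, p) in decorated]
```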

- JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Britney Spears nude

2005-09-15 Thread Jeremy Jones
Will McGugan wrote:

Tim Peters wrote:
  

[john basha]



send me the britney nude photos
  

Because they're a new feature, you'll have to wait for Python 2.5 to
be released.



She has just spawned a child process. Give her to Python 2.6 to get back 
in shape.


Will McGugan
  

daemon process, no doubt.



JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: python profiling, hotspot and strange execution time

2005-09-08 Thread Jeremy Jones
[EMAIL PROTECTED] wrote:
snip

I am not sure to understand the big difference between "time spent in
different areas of code" and "how long did this thing take to run?".
Looking at python doc for deterministic profiling, I understand the
implementation difference, and the performance implications, but I
don't see why deterministic profiling would not give me an overall
picture ?
  

I think from below you said you were more clear on this.  Cool.

snip

Well, your example make actually more sense to me :) I understand the
difference between CPU time and time spent in the python code (even if
I was not clear in my previous post about it...). But this case does
not apply to my code, as my code is never idled, takes 100 % of the
cpu, with no other CPU consuming task

  

I would attribute the wall clock and profile time difference to the
overhead of hotshot.  While hotshot is miles better than the regular
profiler, it can still take a little time to profile code.



Well, if hotshot reported a timing which is longer than the execution
time without it, I would have considered that to be normal. Even in C,
using gprof has a non-negligible overhead, most of the time.
  

Actually, I'd expect the opposite, but not as extreme for your case.  I 
would expect it to *report* that a piece of code took less time to 
execute than I *observed* it taking.  Reasons in the snipped area 
above.  Unless you're calling a C extension, in which case, IIRC, it's 
supposed to report the actual execution time of the C call (and I guess 
plus any overhead that hotshot may cause it to incur) in which case you 
would be IMO 100% correct.  I hope you're not calling a C extension, or 
my head's gonna explode.

What I don't understand is why hotshot reports that do_foo is executed
in 2 seconds whereas it effectively takes more than 10 seconds ? Is it
because I don't understand what deterministic profiling is about ?
  

The profiler is supposed to be smart about how it tracks time spent in 
execution so it doesn't get readings that are tainted by other processes 
running or other stuff.  I could easily see a 2-10 second disparity if 
your process were idling somehow.  Now, if you're doing a lot of IO, I 
wonder if the profiler isn't taking into consideration any blocking 
calls that may max out the CPU in IOWAIT...  Are you doing a lot of IO?

David

  

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Manging multiple Python installation

2005-09-08 Thread Jeremy Jones
Roel Schroeven wrote:

Jeremy Jones wrote:

  

Andy Leszczynski wrote:


Is there any way to pass the prefix to the make install? Why make 
depends on that?

A.

  

What does it matter?  If you *could* pass it to make, what does that buy 
you?  I'm not a make guru, but I'm not sure you can do this.  Someone 
else better versed in make will certainly chime in if I'm wrong.  But I 
think make just looks at the Makefile and does what it's going to do.  
If you want different behavior, you edit the Makefile or you get the 
Makefile created differently with configure.



That way you could install to a different directory without having to 
rebuild the whole thing. I don't think that use case happens very 
often, but I've certainly encountered it (not in relation to Python though).

  

I guess I'm still having a hard time understanding "what does it 
matter?".  Even if he reconfigures, he's not going to rebuild the whole 
thing unless he does a make clean.  For example, I just built Python 
twice, once with a prefix of /usr/local/apps/pytest1 and then with a 
prefix of /usr/local/apps/pytest2 and timed the compile:

BUILD 1:

[EMAIL PROTECTED]  7:16AM Python-2.4.1 % cat compile_it.sh
./configure --prefix=/usr/local/apps/pytest1
make
make install

./compile_it.sh  107.50s user 9.00s system 78% cpu 2:28.53 total



BUILD 2:

[EMAIL PROTECTED]  7:18AM Python-2.4.1 % cat compile_it.sh
./configure --prefix=/usr/local/apps/pytest2
make
make install

./compile_it.sh  21.17s user 6.21s system 56% cpu 48.323 total


I *know* a significant portion of the time of BUILD 2 was spent in 
configure.  So if he's really eager to save a few CPU seconds, he can 
edit the Makefile manually and change the prefix section.  Maybe I'm 
just a slow file editor, but I would do configure again just because 
it'd probably be faster for me.  Not to mention potentially less error 
prone.  But he's going to have to build something again.  Or not.  He 
*should* be able to just tar up the whole directory and it should just 
work.  I moved /usr/local/apps/pytest1 to /usr/local/apps/pyfoo and 
imported xml.dom.minidom and it worked.  I'm guessing the python binary 
searches relative to itself first (??).  But if I move the python binary 
to a new location, it doesn't find xml.dom.minidom.

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: determine if os.system() is done

2005-09-07 Thread Jeremy Jones
Thomas Bellman wrote:

Xah Lee [EMAIL PROTECTED] writes:

  

suppose i'm calling two system processes, one to unzip, and one to
tail to get the last line. How can i determine when the first
process is done?



  

Example:



  

subprocess.Popen([r"/sw/bin/gzip", "-d", "access_log.4.gz"]);



  

last_line=subprocess.Popen([r"/usr/bin/tail", "-n 1", "access_log.4"],
stdout=subprocess.PIPE).communicate()[0]



  

of course, i can try workarounds something like os.system("gzip -d
thiss.gz && tail thiss"), but i wish to know if there's non-hack way to
determine when a system process is done.



Have you tried reading the manual for the subprocess module?  You
just *might* find the answer to your question if you look at what
you can do with Popen objects.

Oh, come on.  Don't you know that all Python documentation is rubbish 
and not worth reading, written by IT idiots who throw around useless 
jargon and indulge in extreme forms of self-gratification?  Someone of 
the caliber of Xah Lee would *never* stoop so low as to actually read 
the documentation.  It is beneath him.  Instead, he posts messages to a 
group of IT idiots who throw around useless jargon and indulge in 
extreme forms of self-gratification in posting answers to questions.
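
For the record, the Popen method being hinted at is wait(), which blocks until the child process exits and returns its status. A minimal sketch, using the Python interpreter itself as the child so it runs anywhere (substitute the gzip and tail commands from the original post):

```python
import subprocess
import sys

# First process: start it and block until it finishes.
# wait() returns the exit status, so we know exactly when it is done.
first = subprocess.Popen([sys.executable, "-c", "import sys; sys.exit(0)"])
status = first.wait()

# Second process: run it only after the first has completed,
# capturing its standard output via a pipe.
second = subprocess.Popen(
    [sys.executable, "-c", "import sys; sys.stdout.write('last line')"],
    stdout=subprocess.PIPE,
)
output = second.communicate()[0]
```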

snip

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Manging multiple Python installation

2005-09-07 Thread Jeremy Jones
Andy Leszczynski wrote:

Hi,
I run Mandrake 10.0 with python 2.3 installed by default. I want to keep 
it as it is but need another, very customized Python installation based 
of 2.3 as well. I would prefer to have it the way it is on Windows, one 
folder e.g. /opt/mypython with all the stuff under that. It would be 
unlike that standard installation where everything is scattered across 
/usr /bin/ /.../doc. That way I can easily tar it and distribute to 
whatever machine I want.

How can I achieve that? Please help, Andy
  

Download the source, untar, cd to the new directory, run:

./configure --prefix=/opt/mypython
make
make install

HTH,

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Manging multiple Python installation

2005-09-07 Thread Jeremy Jones
Andy Leszczynski wrote:

Jeremy Jones wrote:
  

Andy Leszczynski wrote:

Download the source, untar, cd to the new directory, run:

./configure --prefix=/opt/mypython
make
make install



Is there any way to pass the prefix to the make install? Why make 
depends on that?

A.

What does it matter?  If you *could* pass it to make, what does that buy 
you?  I'm not a make guru, but I'm not sure you can do this.  Someone 
else better versed in make will certainly chime in if I'm wrong.  But I 
think make just looks at the Makefile and does what it's going to do.  
If you want different behavior, you edit the Makefile or you get the 
Makefile created differently with configure.

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: dual processor

2005-09-06 Thread Jeremy Jones
Michael Sparks wrote:

Jeremy Jones wrote:
  

snip

  

And maybe  
Steve's magical thinking programming language will have a ton of merit.



I see no reason to use such derisory tones, though I'm sure you didn't mean
it that way. (I can see you mean it as extreme skepticism though :-)
  

None of the above, really.  I thought it was a really great idea and 
worthy of pursuit.  In my response back to Steve, the most skeptical 
thing I said was that I think it would be insanely difficult to 
implement.  Maybe it wouldn't be as hard as I think.  And according to a 
follow-up by Steve, it probably wouldn't.

snip

I would almost bet money that the majority of code would 
not be helped by that at all.  



Are you so sure? I suspect this is due to you being used to writing code
that is designed for a single CPU system. 

Not really.  I've got a couple of projects in work that would benefit 
tremendously from the GIL being lifted.  And one of them is actually 
evolving into a funny little hack that will allow easy persistent 
message passing between processes (on the same system) without having to 
mess around with networking.  I'm betting this is the case just because 
of reading this list, the tutor list, and interaction with other Python 
programmers. 

snip

That's my point too. I don't think our opinions really diverge that far :)
  

We don't.  Again (as we have both stated), as systems find themselves 
with more and more CPUs onboard, it becomes more and more absurd to have 
to do little hacks like what I allude to above.  If Python wants to 
maintain its position in the pantheon of programming languages, it 
really needs to 1) find a good clean way to utilize muti-CPU machines 
and 2) come up with a simple, consistent, Pythonic concurrency paradigm.

Best Regards,


Michael.

  

Good discussion.


JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: python profiling, hotspot and strange execution time

2005-09-06 Thread Jeremy Jones
[EMAIL PROTECTED] wrote:

Hi there,

   I have some scientific application written in python. There is a
good deal of list processing, but also some simple computation such
as basic linear algebra involved. I would like to speed things up
implementing some of the functions in C. So I need profiling.

   I first tried to use the default python profiler, but profiling my
application multiplies the execution time by a factor between 10 and
100 ! So I decided to give a try to hotspot.

OK - first of all, as someone else has asked, what platform are you 
running?  I'm assuming it's windows since you're referring to 
time.clock() and then later saying wall clock.

Next, what are you hoping that the profiler will give you?  If you're 
expecting it to give you the big picture of your application's 
performance and give you real runtime numbers, you may be 
disappointed.  It is a deterministic profiler and will give you CPU time 
spent in different areas of code rather than an overall "how long did 
this thing take to run?".

 I just followed the
example of the python library reference, but I have some strange
results concerning cpu time. My profiling script is something like the
following:

def run_foo():
print time.clock()

function_to_profile()

print time.clock()

prof = hotshot.Profile("essai.prof")
benchtime= prof.runcall(run_foo)
prof.close()
stats = hotshot.stats.load("essai.prof")
stats.strip_dirs()
stats.sort_stats('time', 'calls')
stats.print_stats(20)
  


Well, let's just add more confusion to the pot, shall we?  Look at this 
example (a slight hack from yours)::

import time
import hotshot
import hotshot.stats


def run_foo():
print time.clock()
print time.time()

time.sleep(5)

print time.clock()
print time.time()

prof = hotshot.Profile("essai.prof")
benchtime= prof.runcall(run_foo)
prof.close()
stats = hotshot.stats.load("essai.prof")
stats.strip_dirs()
stats.sort_stats('time', 'calls')
stats.print_stats(20)

and the output::

0.24
1126011669.55
0.24
1126011674.55
 1 function calls in 0.000 CPU seconds

   Ordered by: internal time, call count

   ncalls  tottime  percall  cumtime  percall filename:lineno(function)
        1    0.000    0.000    0.000    0.000 tmphQNKbq.py:6(run_foo)
        0    0.000             0.000          profile:0(profiler)



I inserted a time.time() call since I'm on Linux and time.clock() 
returns a process's CPU time and wanted to show the wall clock time as 
it were.  So, the stats show 0 time taken, whereas time.time() shows 5 
seconds.  It's because the time.sleep() took a negligible amount of CPU 
time which is what the profiler looks at.

The goal is to profile the function function_to_profile(). Running this
script gives me a CPU executime time of around 2 seconds, whereas the
difference between the two clock calls is around 10 seconds !

I would attribute the wall clock and profile time difference to the 
overhead of hotshot.  While hotshot is miles better than the regular 
profiler, it can still take a little time to profile code.

 And I
don't run any other cpu consuming tasks at the same time, so this
cannot come from other running processes. Is there something perticular
about hotspot timing I should know ? I am not sure how I can get more
accurate results with hotspot.

I would appreciate any help, 

Thanks

  

HTH,


JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python xml.dom, help reading attribute data

2005-09-06 Thread Jeremy Jones
Thierry Lam wrote:

Let's say I have the following xml tag:

<para role="success">1</para>

I can't figure out what kind of python xml.dom codes I should invoke to
read the data 1? Any help please?

Thanks
Thierry

  

In [20]: import xml.dom.minidom

In [21]: s = '''<para role="success">1</para>'''

In [22]: x = xml.dom.minidom.parseString(s)

In [23]: print x.firstChild.firstChild.data
1


I doubt this really answers what you're really wanting to ask.  And this 
is a really really brittle way of trying to parse XML.  But it answers 
exactly what you're asking.  Hope it gives you the start you need.  Post 
a follow-up when you have more questions.
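
Since the subject line asks about attribute data, here is a slightly fuller sketch pulling both the attribute value and the text content:

```python
import xml.dom.minidom

doc = xml.dom.minidom.parseString('<para role="success">1</para>')
para = doc.documentElement

role = para.getAttribute("role")   # the attribute value: "success"
text = para.firstChild.data        # the text content: "1"
```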

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: dual processor

2005-09-05 Thread Jeremy Jones
Steve Jorgensen wrote:

On Mon, 05 Sep 2005 21:43:07 +0100, Michael Sparks [EMAIL PROTECTED] wrote:

  

Steve Jorgensen wrote:



On 05 Sep 2005 10:29:48 GMT, Nick Craig-Wood [EMAIL PROTECTED] wrote:

  

Jeremy Jones [EMAIL PROTECTED] wrote:


 One Python process will only saturate one CPU (at a time) because
 of the GIL (global interpreter lock).
  

I'm hoping python won't always be like this.


I don't get that.  Python was never designed to be a high performance
language, so why add complexity to its implementation by giving it
high-performance capabilities like SMP? 
  

It depends on personal perspective. If in a few years time we all have
machines with multiple cores (eg the CELL with effective 9 CPUs on a chip,
albeit 8 more specialised ones), would you prefer that your code *could*
utilise your hardware sensibly rather than not.

Or put another way - would you prefer to write your code mainly in a
language like python, or mainly in a language like C or Java? If python,
it's worth worrying about!

If it was python (or similar) you might only have to worry about
concurrency issues. If it's a language like C you might have to worry
about  memory management, typing AND concurrency (oh my!).
(Let alone C++'s TMP :-)

Regards,


Michael



That argument makes some sense, but I'm still not sure I agree.  Rather than
make Python programmers have to deal with concurrency issues in every app to
get it to make good use of the hardware it's on, why not have many of the
common libraries that Python uses to do processing take advantage of SMP when
you use them.  A database server is a good example of a way we can already do
some of that today.  Also, what if things like hash table updates were made
lazy (if they aren't already) and could be processed as background operations
to have the table more likely to be ready when the next hash lookup occurs.
  

Now, *this* is a really interesting line of thought.  I've got a feeling 
that it'd be pretty tough to implement something like this in a 
language, though.  An application like an RDBMS is one thing, an 
application framework another, and a programming language is yet a 
different species altogether.  It'd have to be insanely intelligent 
code, though.  If you had bunches of Python processes, would they all 
start digging into each list or generator or hash to try to predict what 
the code is going to potentially need next?  Is this predictive behavior 
going to chew up more CPU time than it should?  What about memory?  
You've got to store the predictive results somewhere.  Sounds great.  
Has some awesomely beneficial implications.  Sounds hard as anything to 
implement well, though.

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: dual processor

2005-09-04 Thread Jeremy Jones
John Brawley wrote:

Greetings, all.
I have a program I'm trying to speed up by putting it on a new machine.
The new machine is a Compaq W6000 2.0 GHz workstation with dual XEON
processors.
I've gained about 7x speed over my old machine, which was a 300 MHz AMD
K6II, but I think there ought to be an even greater speed gain due to the
two XEONs.
However, the thought occurs that Python (2.4.1) may not have the ability to
take advantage of the dual processors, so my question:
Does it?
  

Sure, but you have to write the program to do it.  One Python process 
will only saturate one CPU (at a time) because of the GIL (global 
interpreter lock).  If you can break up your problem into smaller 
pieces, you can do something like start multiple processes to crunch the 
data and use shared memory (which I haven't tinkered with...yet) to pass 
data around between processes.  Or an idea I've been tinkering with 
lately is to use a BSD DB between processes as a queue just like 
Queue.Queue in the standard library does between threads.  Or you could 
use Pyro between processes.  Or CORBA. 

If not, who knows where there might be info from people trying to make
Python run 64-bit, on multiple processors?
Thanks!

John Brawley


--
peace
JB
[EMAIL PROTECTED]
http://tetrahedraverse.com
NOTE! Charter is not blocking viruses,
Therefore NO ATTACHMENTS, please;
They will not be downloaded from the Charter mail server.
__Prearrange__ any attachments, with me first.


  

HTH,

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: dual processor

2005-09-04 Thread Jeremy Jones
Paul Rubin wrote:

Jeremy Jones [EMAIL PROTECTED] writes:
  

to pass data around between processes.  Or an idea I've been tinkering
with lately is to use a BSD DB between processes as a queue just like
Queue.Queue in the standard library does between threads.  Or you
could use Pyro between processes.  Or CORBA.



I think that doesn't count as using a the multiple processors; it's
just multiple programs that could be on separate boxes.
Multiprocessing means shared memory.
  

I disagree.  My (very general) recommendation implies multiple 
processes, very likely multiple instances (on the consumer side) of the 
same program.  The OP wanted to know how to get Python to take 
advantage of the dual processors.  My recommendation does that.  Not in 
the sense of a single process fully exercising multiple CPUs, but it's 
an option, nonetheless.  So, in that respect, your initial "no" was 
correct.  But,

This module might be of interest:  http://poshmodule.sf.net

  

Yeah - that came to mind.  Never used it.  I need to take a peek at 
that.  This module keeps popping up in discussions like this one.

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python Doc Problem Example: os.system

2005-09-04 Thread Jeremy Jones
Xah Lee wrote:

Python Doc Problem Example: os.system

Xah Lee, 2005-09

today i'm trying to use Python to call shell commands. e.g. in Perl
something like

output=qx(ls)

in Python i quickly located the the function due to its
well-named-ness:

import os
os.system("ls")


however, according to the doc
http://www.python.org/doc/2.4/lib/os-process.html the os.system()
returns some esoteric unix thing, not the command output. 



system(command)

Execute the command (a string) in a subshell. This is implemented by
calling the Standard C function system(), and has the same
limitations. Changes to |posix.environ|, |sys.stdin|, etc. are not
reflected in the environment of the executed command.

On Unix, the return value is the exit status of the process encoded
in the format specified for wait(). Note that POSIX does not specify
the meaning of the return value of the C system() function, so the
return value of the Python function is system-dependent.

On Windows, the return value is that returned by the system shell
after running command, given by the Windows environment variable
COMSPEC: on *command.com* systems (Windows 95, 98 and ME) this is
always |0|; on *cmd.exe* systems (Windows NT, 2000 and XP) this is
the exit status of the command run; on systems using a non-native
shell, consult your shell documentation.

Availability: Unix, Windows.



Yup.  Nothing more esoteric than a process's exit status.  That's one of 
those really tricky jargons that computer scientist idiots like to throw 
around.  You've got to watch out for those.

The doc
doesn't say how to get the output of the command.

by chance someone told me that in python 2.4 the os.system is
supplanted by subprocess.call(), but this isn't mentioned in the doc!
  

I'm presuming you mean in the os.system docs as you mention below that 
you found such documentation.

upon finding the new doc location
http://www.python.org/doc/2.4/lib/module-subprocess.html i'm told that
this module replaces:

os.system
os.spawn*
os.popen*
popen2.*
commands.*


interesting.




  6.8 subprocess -- Subprocess management

New in version 2.4.

The subprocess module allows you to spawn new processes, connect to 
their input/output/error pipes, and obtain their return codes. This 
module intends to replace several other, older modules and functions, 
such as:

os.system
os.spawn*
os.popen*
popen2.*
commands.*


Yeah.  There's a really tricky word up there in the beginning of the 
subprocess doc.  intends.  In this context, it means that it is 
currently the plan of the Python developers to replace said modules with 
the subprocess module, *however*, that has not totally come about now.  
If the doc had said, This module *has replaced* several others, then I 
would have to agree with you that the stated module docs should be 
updated to reflect the fact that they have been deprecated.
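
For what it's worth, the thing the original post was actually after (Perl's qx, i.e. capturing a command's output) is short with the subprocess module; a sketch using the Python interpreter as the command so it is portable:

```python
import subprocess
import sys

# Run a command and capture its standard output -- the rough
# equivalent of Perl's qx// backticks.
proc = subprocess.Popen(
    [sys.executable, "-c", "import sys; sys.stdout.write('hello')"],
    stdout=subprocess.PIPE,
)
output, _ = proc.communicate()
```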

 Since i'm not Python expert

Really?

, i like to look at these. But
fuck, the incompetent doc gives ample gratis links to OpenSource this
or that or author masturbation

OK - I just scanned through the subprocess module docs and I really 
don't see where you're getting this from.  I'll just chalk the former up 
to your bad experience with the regular expression module docs referring 
to the book Mastering Regular Expressions.  And since you're quite the 
linguistic scholar, I'll chalk up the latter to your unique construction 
of the book title I just cited.

 links to remote book i don't really care
about, but here there's no link.

Problem summary:

* does not focus on the task users need to do. Instead, the doc is
oriented towards tech geeking.
  

Are you talking about the subprocess docs?  If so, I'd like to see an 
example of what you're talking about.  Subprocess docs seem really 
straightforward, terse, and to the point.

* does not inform the reader at the right place where a new function is
replacing the old.
  

I would leave it in the hands of the Python doc maintainers what to do 
with this since subprocess hasn't yet totally replaced the other modules.

* does not provide relevant cross-links. (while provding many
irrelevant links because of OpenSource or Tech Geeking fanaticism)
  

I'd really like to see what you're talking about here.  I just went 
through the subprocess docs *again* and I don't see *any* links to any 
other open source anything and I don't see any "tech geeking", to use 
your jargon.  I think you're full of crap.  And I think you don't have 
the balls to reply back to this message and show me what you're talking 
about.  You're just a little boy inside, making a call to a bowling 
alley and asking if they have 15 pound balls and hanging up laughing 
after they reply yes and you reply Then how do you walk!!!  Oh, 
you're so witty.

Solution Suggestion:

* Add examples.
  

Yeah, the




  6.8.3.2 Replacing shell pipe line

output=`dmesg | grep hda`
==
p1 = Popen(["dmesg"], stdout=PIPE)
p2 = 

Re: NYC Opening

2005-09-02 Thread Jeremy Jones
Diez B. Roggisch wrote:

Kevin McGann wrote:
  

-Expert Java or C++



Now why exactly do you post that in c.l.python?

  

THEY ARE LOCATED IN NEW YORK, THIS IS FULL-TIME ONLY, WILL NOT CONSIDER
ANYONE FROM OUTSIDE THE US! THIS TEAM IS AN ELITE TEAM, YOU BETTER BE
GOOD



I'm pretty sure you've some spelling wrong here, you certainly want
ELITE to be written as 733T to get the attention of the proper people.
  

I think you've got a typo here.  I think you meant 1337.  My H4X0R 
interpretation of what you wrote is teet, which *may*, however, have 
the desired effect.


SCNR,

Diez

  

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python and file locking - NFS or MySQL?

2005-09-02 Thread Jeremy Jones
Christopher DeMarco wrote:

Hi all...

...I've got a Python script running on a bunch of boxen sharing some
common NFS-exported space.  I need (not want :) to lock files for
writing, and I need (not want :) to do it safely (i.e. atomically).
I'm doing this in Linux; NFS4 is available.  As I understand it, my
options are:

1.  Python's fcntl() is an interface to the fcntl(2) system call,
which is claimed to work mostly over NFS v = 3.
  

I would go with this one, but test the crap out of it.  This *should* 
work just fine for you on NFS, but again, test the crap out of it.  
Write a script that spawns slightly beyond the number of processes (by 
spawning either threads or processes) you expect to actually occur and 
mercilessly lock, update, unlock the file while checking for the results 
to be consistent with what you think they ought to be.
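
A sketch of option 1 as recommended above: an exclusive lock via fcntl.lockf, which wraps fcntl(2) locking. The function name and the read-modify-write payload here are illustrative, and this is exactly the code path that needs the merciless NFS testing described:

```python
import fcntl
import os

def update_counter(path):
    """Read-modify-write an integer in a file under an exclusive lock."""
    fd = os.open(path, os.O_RDWR | os.O_CREAT)
    f = os.fdopen(fd, "r+")
    try:
        fcntl.lockf(f, fcntl.LOCK_EX)    # blocks until the lock is ours
        data = f.read()
        value = int(data) if data.strip() else 0
        f.seek(0)
        f.truncate()
        f.write(str(value + 1))
        f.flush()
    finally:
        fcntl.lockf(f, fcntl.LOCK_UN)    # release, then close
        f.close()
```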

2.  open(2) is atomic on a local FS, I've read discussions that imply
that link(2) is atomic over NFS (is it?), so I can link from local
lockfile to remote target atomically.  I don't grok this; open(2) +
link(2) + stat(2) == 3 calls on my fingers.  HTH is this supposed to
work?

3.  Atomically update a MySQL database indicating that the file is
locked - MySQL has atomic transactions now, right?  And how good is
the Python MySQL API - IIRC Perl didn't have atomic transactions last
year; will this work in contemporary Python?

I've got a while (several weeks) to chew this problem over (my current
implementation is ``assert(Poof!  File locked)'').

What are my options for safely locking files via NFS?  I don't want to
get involved with NLM as my impression is it's being buggy and
unwieldy.  Thanks in advance!


I was present at an undersea, unexplained mass sponge migration.
  

HTH,

JMJ
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Well, another try Re: while c = f.read(1)

2005-08-20 Thread Jeremy Jones
James Kim wrote:

Robert Kern wrote:
  

http://www.catb.org/~esr/faqs/smart-questions.html



Is it a *smart* way or *necessary* way?
  

Of course it's not *necessary*.  I mean, the world isn't going to come 
to an end if it doesn't happen.  There is no logical contingency making 
it so.  But, if everyone in the group adheres to the ESR smart 
questions guide, what's the difference?

Plus, my question was not for the detail description but for the 
intuitive guide leading the beginner's further study.
  

But, I'll try to answer your question the best I can.  From a 
quasi-sensory intuitive level, ``iter`` is red - kinda warm - and smells 
a little like cinnamon, but not too strong.  ``lambda`` on the other 
hand is blue-green, sometimes grey, cooler, almost cold, has a damp feel 
to it, and tastes like pork - not chicken, mind you - that's the ``for`` 
statement.

I understand that too many repeated talks make cyberian tired. However, 
over and over discussions of basic concepts is also very important for 
technology enhancements. 

Here's the deal.  If you have a general question about something, ask 
it.  But ask smartly.  For example, "What is the benefit of using 
``iter`` as opposed to something else?  What are the alternatives to 
using ``iter``?"  Asking questions like "What are the meanings of 
Commands 'iter' and 'lambda'" will not fly well here - and you may find 
less so elsewhere.  The reason is, it smells of laziness (I'm not saying 
you *are* lazy - that's just the impression it leaves) and this group is 
full of people who have reached for the docs, wrestled with them, and 
have come away from it better informed programmers. 

Thus, Commands 'iter' and 'lambda' should be 
discussed over and over about their necessity and convenience 

This is different from what you were asking.  I quoted your exact words 
above and it's different from what you're asking here.  And I'm not so 
sure I would put a *should* on your statement.  I think usage 
discussions of different functions, standard library modules, practices, 
etc. *will* arise perpetually.  But I don't think we *need* to 
constantly bat around the necessity of X keyword or Y function or Z 
module.  Convenience - probably.  Necessity - no.

in the 
news-group as long as they are the principle keywords distinguished from 
the conventional languages like c/c++, pascal, etc.

-James
  

So, if you have a question that's in line with Robert's advice, please 
post it and it will have a much higher chance of getting answered.  I 
sincerely hope this helps.


Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: file handling in a server (.py) file using xmlrpc

2005-07-08 Thread Jeremy Jones
uwb wrote:

I've got a call to glob in a .py file sitting in an apache cgi-bin directory
which refuses to work while the exact same code works from a python console
session.

I'm guessing that in order to read or write files from any sort of a script
file sitting in the cgi-bin directory on a server, something has to be set
to allow such activity.  I'd appreciate it if anybody with as clue as to
what that was could tell me about it.  



  

So, what do you mean "refuses to work"?  Is the cgi script not executing 
at all?  Spitting out an error?  If so, what error?  (And is it an error 
to the browser calling the cgi script, or in your apache logs?)

Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: file handling in a server (.py) file using xmlrpc

2005-07-08 Thread Jeremy Jones
uwb wrote:

Jeremy Jones wrote:

  

uwb wrote:



I've got a call to glob in a .py file sitting in an apache cgi-bin
directory which refuses to work while the exact same code works from a
python console session.

I'm guessing that in order to read or write files from any sort of a
script file sitting in the cgi-bin directory on a server, something has to
be set
to allow such activity.  I'd appreciate it if anybody with as clue as to
what that was could tell me about it.



 

  

So, what do you mean refuses to work?  Is the cgi script not executing
at all?  Spitting out an error?  If so, what error?  (And is it an error
to the browser calling the cgi script, or in your apache logs?)

Jeremy Jones




The script executes, no error messages, but the glob call turns up nothing
while the identical call running from a console does in fact turn up files
names as expected.
  

Wild guess, but I'm thinking your webserver process doesn't have 
permissions to look in your directory. 

Following is alternating root shell and IPython shell:

[EMAIL PROTECTED]:~ # chmod 777 /bam
[EMAIL PROTECTED]:~ # ls -ld /bam
drwxrwxrwx  2 root root 96 Jul  8 14:53 /bam

In [4]: glob.glob("/bam/*txt")
Out[4]: ['/bam/foo.txt', '/bam/bar.txt']

[EMAIL PROTECTED]:~ # chmod 000 /bam
[EMAIL PROTECTED]:~ # ls -ld /bam
d-  2 root root 96 Jul  8 14:53 /bam

In [5]: glob.glob("/bam/*txt")
Out[5]: []


HTH,

Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Legacy data parsing

2005-07-08 Thread Jeremy Jones
gov wrote:

Hi,

  

snip

If anyone could give me suggestions as to methods in sorting this type
of data, it would be appreciated.

  

Maybe it's overkill, but I'd *highly* recommend David Mertz's excellent 
book "Text Processing in Python": http://gnosis.cx/TPiP/  I don't know 
what all you're needing to do, but that small snip smells like it needs 
a state machine, and the book has an excellent, simple example of one 
in (I think) chapter 4. 
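To give a flavor of what a line-oriented state machine looks like (a generic sketch of mine, not code from the book, and the BEGIN/END markers are made up):

```python
def parse(lines):
    """Collect the lines between BEGIN/END markers into records."""
    state = "OUTSIDE"
    records, current = [], []
    for line in lines:
        if state == "OUTSIDE" and line.startswith("BEGIN"):
            state, current = "INSIDE", []      # transition: start a record
        elif state == "INSIDE" and line.startswith("END"):
            records.append(current)            # transition: close the record
            state = "OUTSIDE"
        elif state == "INSIDE":
            current.append(line)               # accumulate while inside
    return records
```

Each branch is a (state, input) pair; handling legacy-format quirks usually means adding states rather than nesting more conditionals.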

Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: inheriting file object

2005-07-06 Thread Jeremy Jones
Jeremy wrote:

Hello all,
   I am trying to inherit the file object and don't know how to do it.  I 
need to open a file and perform operations on it in the class I am 
writing.  I know the simple syntax is:

class MyClass(file):
   ...

but I don't know how to make it open the file for reading/writing.  Can 
anyone help me out with this?
Thanks,
Jeremy

  

Something like this?  I put the following code in test_file.py:

class MyFile(file):
    def doing_something(self):
        print "in my own method"


And used it like this:

In [1]: import test_file

In [2]: f = test_file.MyFile("foobar.file", "w")

In [3]: f.write("foo\n")

In [4]: f.doing_something()
in my own method


But do you really need to subclass file, or can you just use a file 
instance in your class?
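For comparison, here is a sketch of the composition route - holding a file object as an attribute instead of subclassing file (the class and attribute names are made up for illustration):

```python
class LoggedFile(object):
    """Wraps a file object rather than inheriting from file."""
    def __init__(self, path, mode="w"):
        self._f = open(path, mode)
        self.writes = 0              # the extra behavior lives on the wrapper

    def write(self, text):
        self.writes += 1
        self._f.write(text)

    def close(self):
        self._f.close()
```

You only delegate the methods you care about, and you aren't tied to file's constructor signature.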


Jeremy Jones  
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: I have a question.

2005-06-30 Thread Jeremy Jones
Nathan Pinno wrote:

  Hi all,

  Does Python have a random function? If so, can you show me an example
using it?

  Thanks,
  Nathan Pinno
  http://www.npinnowebsite.ca/



  

Take your pick:

In [5]: import random

In [6]: random.choice(range(10))
Out[6]: 2

In [7]: random.choice(range(10))
Out[7]: 7

In [8]: random.choice(range(10))
Out[8]: 8

In [9]: random.choice(range(10))
Out[9]: 8


In [14]: random.random()
Out[14]: 0.56386154889489271

In [15]: random.random()
Out[15]: 0.47322827346926843

In [16]: random.random()
Out[16]: 0.39921336622176518

In [17]: random.random()
Out[17]: 0.65521407248459007

In [18]: random.random()
Out[18]: 0.74525381787627598
In [20]: r = range(10)

In [21]: random.shuffle(r)

In [22]: r
Out[22]: [6, 4, 9, 7, 2, 0, 8, 3, 5, 1]


Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: tree functions daily exercise: Table

2005-06-21 Thread Jeremy Jones
Xah Lee wrote:

Very very nice! I don't know scheme well... but oh the macros, such a
wonderful facility...
  

Macros suck.  They created by moron so-called computer scientists and IT 
puntits in order opress the programming masses.  But I say we must bring 
freedom to all programmers.  In order abolish this criminality against 
humanity and even programmers, I recommend creating a flurry of 
irrelevant messages cross-posting to noncommonalitous groups.  If you 
need dictionary to find what that word means, you a moron!  F*ck 
dictionaries!  F*ck macros!  F*ck ignoramous computer scientists!  F*ck 
so-called netiquette!  Power to programmers!

Only when we have created sufficient confusion and frustration among all 
programming entities, then they will beg us to shut up.  Only then they 
be willing remove macros from all languages if we willing shut up.  Then 
we shut up and see macros removed.  But only for a little while.  Then 
we campaign to get floating point roman numerals implemented.

Functional lang never let me down.
  

Functional languages evil.  Any language support functions come from 
ivory tower brain lacking idiots who think they know something.  All 
other languages should take a lesson from Python which does not support 
functions.  If Python supports functions, I have not read that informations.

I haven't worked on a Java version yet... but i wonder what pain i'll
have to endure for a lang that lacks eval. Since i'm not Java expert...
i wonder if i can even do it in a few days.

 Xah
 [EMAIL PROTECTED]
∑ http://xahlee.org/
  


Power to programmers!

JJ
-- 
http://mail.python.org/mailman/listinfo/python-list

Re: references/addrresses in imperative languages

2005-06-19 Thread Jeremy Jones
 17 PRINT_NEWLINE
 18 LOAD_CONST   0 (None)
 21 RETURN_VALUE

In [8]: dis.dis(plus_equals)
  2   0 LOAD_FAST0 (lst)
  3 LOAD_FAST1 (item)
  6 INPLACE_ADD
  7 STORE_FAST   0 (lst)

  3  10 LOAD_FAST0 (lst)
 13 PRINT_ITEM
 14 PRINT_NEWLINE
 15 LOAD_CONST   0 (None)
 18 RETURN_VALUE

Figure it out.

-
References:

for a analysis of the same situation in Java, see
http://xahlee.org/java-a-day/assign_array_to_list.html

How to write a tutorial
http://xahlee.org/Periodic_dosage_dir/t2/xlali_skami_cukta.html

 Xah
 [EMAIL PROTECTED]
 http://xahlee.org/

  

It's really bad enough that you waste the time of the folks on 
comp.lang.python.  Why cross post like you are?  I really fail to see 
the point.


Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list

Re: collect data using threads

2005-06-14 Thread Jeremy Jones
Qiangning Hong wrote:

A class Collector, it spawns several threads to read from serial port.
Collector.get_data() will get all the data they have read since last
call.  Who can tell me whether my implementation correct?

class Collector(object):
    def __init__(self):
        self.data = []
        spawn_work_bees(callback=self.on_received)

    def on_received(self, a_piece_of_data):
        """This callback is executed in work bee threads!"""
        self.data.append(a_piece_of_data)

    def get_data(self):
        x = self.data
        self.data = []
        return x

I am not very sure about the get_data() method.  Will it cause data lose
if there is a thread is appending data to self.data at the same time?

Is there a more pythonic/standard recipe to collect thread data?

  

This looks a little scary.  If a thread is putting something in 
self.data (in the on_received() method) when someone else is getting 
something out (in the get_data() method), the data that is put into 
self.data could conceivably be lost because you are pointing self.data 
to an empty list each time get_data() is called and the list that 
self.data was pointing to when on_received() was called may just be 
dangling.  Why not use the Queue class from the Queue module?  You can 
push stuff in from one side (and have as many threads pushing stuff onto 
it as you like) and pull stuff off from the other side (again, you can 
have as many consumers as you'd like) in a thread safe manner.
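A minimal producer/consumer sketch of that suggestion (using the module's modern spelling, queue; it was named Queue when this was written):

```python
import queue
import threading

q = queue.Queue()                      # thread-safe; nothing gets lost

def work_bee(n):
    for i in range(5):
        q.put((n, i))                  # any number of producers may put()

bees = [threading.Thread(target=work_bee, args=(n,)) for n in range(3)]
for t in bees:
    t.start()
for t in bees:
    t.join()

collected = []
while True:
    try:
        collected.append(q.get_nowait())   # consumer drains the queue
    except queue.Empty:
        break
```

All fifteen items arrive regardless of how the three producer threads interleave, which is exactly the guarantee the hand-rolled list swap lacks.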

HTH,

Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list


couple of new python articles on onlamp

2005-06-03 Thread Jeremy Jones
I've got a couple of new articles on ONLamp:

Writing Google Desktop Search Plugins
http://www.onlamp.com/pub/a/python/2005/06/01/kongulo.html

and

Python Standard Logging
http://www.onlamp.com/pub/a/python/2005/06/02/logging.html


Comments, criticisms, flames all welcome.


Jeremy Jones


-- 
http://mail.python.org/mailman/listinfo/python-list


file corruption on windows - possible bug

2005-05-09 Thread Jeremy Jones
I've written a piece of code that iterates through a list of items and
determines the filename to write some piece of data to based on
something in the item itself.  Here is a small example piece of code to
show the type of thing I'm doing::

#
file_dict = {}

a_list = [("a", "a%s" % i) for i in range(2500)]
b_list = [("b", "b%s" % i) for i in range(2500)]
c_list = [("c", "c%s" % i) for i in range(2500)]
d_list = [("d", "d%s" % i) for i in range(2500)]


joined_list = a_list + b_list + c_list + d_list

for key, value in joined_list:
    outfile = file_dict.setdefault(key, open("%s.txt" % key, "w"))
    outfile.write("%s\n" % value)

for f in file_dict.values():
    f.close()
#

Problem is, when I run this on Windows, I get 14,520 null (\x00)
characters at the front of the file and each file is 16,390 bytes long. 
When I run this script on Linux, each file is 13,890 bytes and contains
no \x00 characters.  This piece of code::

#
import cStringIO

file_dict = {}

a_list = [("a", "a%s" % i) for i in range(2500)]
b_list = [("b", "b%s" % i) for i in range(2500)]
c_list = [("c", "c%s" % i) for i in range(2500)]
d_list = [("d", "d%s" % i) for i in range(2500)]


joined_list = a_list + b_list + c_list + d_list

for key, value in joined_list:
    #outfile = file_dict.setdefault(key, open("%s.txt" % key, "w"))
    outfile = file_dict.setdefault(key, cStringIO.StringIO())
    outfile.write("%s\n" % value)

for key, io_string in file_dict.items():
    outfile = open("%s.txt" % key, "w")
    io_string.seek(0)
    outfile.write(io_string.read())
    outfile.close()
#

results in files containing 16,390 bytes and no \x00 characters on
Windows and 13,890 bytes on Linux and no \x00 characters (file size
difference on Windows and Linux is due to line ending).  I'm still doing
a setdefault on the dictionary to create an object if the key doesn't
exist, but I'm using a cStringIO object rather than a file object.  So,
I'm treating this just like it was a file and writing it out later.

Does anyone have any idea as to why this is writing over 14,000 \x00
characters to my file to start off with where printable characters
should go and then writing the remainder of the file correctly? 


Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: file corruption on windows - possible bug

2005-05-09 Thread Jeremy Jones




Fredrik Lundh wrote:

  Jeremy Jones wrote:

  
  
#
file_dict = {}

a_list = [("a", "a%s" % i) for i in range(2500)]
b_list = [("b", "b%s" % i) for i in range(2500)]
c_list = [("c", "c%s" % i) for i in range(2500)]
d_list = [("d", "d%s" % i) for i in range(2500)]


joined_list = a_list + b_list + c_list + d_list

for key, value in joined_list:
   outfile = file_dict.setdefault(key, open("%s.txt" % key, "w"))

  
  
you do realize that this opens the file again every time, so you end up having
4x2500 file handles pointing to 4 physical files.  that's a bad idea.
  

That *is* a bad idea, and no, I didn't realize that would be the
result. From the "mapping types" page:


  

    a.setdefault(k[, x])  --  a[k] if k in a, else x (also setting it)  (5)

    (5) setdefault() is like get(), except that if k is missing, x is 
    both returned and inserted into the dictionary as the value of k.  
    x defaults to None.


I took this to mean that setdefault was a short-circuit and only
created the default object once if the dictionary didn't contain the
specified key. But I guess it's *me* who is passing setdefault a new
file handle a few thousand times :-) Ouch.
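The gotcha is easy to demonstrate: the second argument to setdefault is evaluated before the call is made, every time, whether or not the key is missing. A small sketch:

```python
calls = []

def make_default(key):
    calls.append(key)        # record every time the "default" is built
    return []

d = {}
for key in ["a", "a", "a"]:
    d.setdefault(key, make_default(key)).append(1)

# make_default ran three times, even though only the first call inserted
assert len(calls) == 3
assert d == {"a": [1, 1, 1]}
```

With an open() call in that argument position, every iteration opens a fresh file handle, just as Fredrik describes.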

  
if you replace

outfile = file_dict.setdefault(key, open("%s.txt" % key, "w"))

with

outfile = file_dict.get(key)
if outfile is None:
file_dict[key] = outfile = open("%s.txt" % key, "w")

or, if you prefer,

try:
outfile = file_dict[key]
except KeyError:
file_dict[key] = outfile = open("%s.txt" % key, "w")

your code won't depend on any undefined behaviour, and will work properly
on all platforms.

/F 



  

Thanks (and thanks to you, Duncan, for your reply as well),


Jeremy


-- 
http://mail.python.org/mailman/listinfo/python-list

Re: SimpleXMLRPCServer - disable output

2005-04-14 Thread Jeremy Jones
codecraig wrote:

Hi,
  I thought I posted this, but its been about 10min and hasnt shown up
on the group.
  Basically I created a SimpleXMLRPCServer and when one of its methods
gets called and it returns a response to the client, the server prints
some info out to the console, such as,

localhost - - [14/Apr/2005 16:06:28] "POST /RPC2 HTTP/1.0" 200 -

Anyhow, is there a way I can surpress that so its not printed to the
console? I looked at SimpleXMLRPCServer.py ...it doesn't explicitly
print that, I think perhaps std is...but not sure.   Any ideas??

thanks.

  

Here's the entire SimpleXMLRPCServer class from SimpleXMLRPCServer.py:


class SimpleXMLRPCServer(SocketServer.TCPServer,
                         SimpleXMLRPCDispatcher):
    """Simple XML-RPC server.

    Simple XML-RPC server that allows functions and a single instance
    to be installed to handle requests. The default implementation
    attempts to dispatch XML-RPC calls to the functions or instance
    installed in the server. Override the _dispatch method inhereted
    from SimpleXMLRPCDispatcher to change this behavior.
    """

    def __init__(self, addr, requestHandler=SimpleXMLRPCRequestHandler,
                 logRequests=1):
        self.logRequests = logRequests

        SimpleXMLRPCDispatcher.__init__(self)
        SocketServer.TCPServer.__init__(self, addr, requestHandler)

You should be able to change logRequests to 0 and that should fix it.  I just 
tested it at a prompt and it worked just fine.


Jeremy Jones

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: SimpleXMLRPCServer - disable output

2005-04-14 Thread Jeremy Jones




codecraig wrote:

  Jeremy Jones wrote:
  
  
codecraig wrote:



Hi,
  I thought I posted this, but its been about 10min and hasnt shown up
on the group.
  Basically I created a SimpleXMLRPCServer and when one of its methods
gets called and it returns a response to the client, the server prints
some info out to the console, such as,

localhost - - [14/Apr/2005 16:06:28] "POST /RPC2 HTTP/1.0" 200 -

Anyhow, is there a way I can surpress that so its not printed to the
console? I looked at SimpleXMLRPCServer.py ...it doesn't explicitly
print that, I think perhaps std is...but not sure.   Any ideas??

thanks.



  

Here's the entire SimpleMLRPCServer class from SimpleXMLRPCServer.py:


class SimpleXMLRPCServer(SocketServer.TCPServer,
 SimpleXMLRPCDispatcher):
"""Simple XML-RPC server.

Simple XML-RPC server that allows functions and a single instance
to be installed to handle requests. The default implementation
attempts to dispatch XML-RPC calls to the functions or instance
installed in the server. Override the _dispatch method inhereted
from SimpleXMLRPCDispatcher to change this behavior.
"""

    def __init__(self, addr, requestHandler=SimpleXMLRPCRequestHandler,
                 logRequests=1):
        self.logRequests = logRequests

        SimpleXMLRPCDispatcher.__init__(self)
        SocketServer.TCPServer.__init__(self, addr, requestHandler)

You should be able to change logRequests to 0 and that should fix it.
I just tested it at a prompt and it worked just fine.

Jeremy Jones

  
  
Jeremy,
  So can you explain what I can do to set logRequests = 0?  Do i just
do..

server = SimpleXMLRPCServer(0)  ???

I am sorta new to python thanks.

  

You've got a couple of options. You can either set it in the
constructor (server = SimpleXMLRPCServer(addr,
requestHandler=somehandler, logRequests=0)) or you can set it after you
have an instance of it (create an instance named foo;
foo.logRequests = 0).
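Both options, sketched with the modern module name xmlrpc.server (the class name and the logRequests flag are unchanged; port 0 here simply asks the OS for any free port):

```python
from xmlrpc.server import SimpleXMLRPCServer

# Option 1: pass the flag to the constructor
server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
server.register_function(pow)
# server.serve_forever() would now handle requests with no console lines

# Option 2: flip it on an existing instance
server.logRequests = 0

server.server_close()
```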

HTH,

Jeremy Jones


-- 
http://mail.python.org/mailman/listinfo/python-list

Re: newbie: dictionary - howto get key value

2005-03-10 Thread Jeremy Jones
G. Völkl wrote:
Hello,
I use a dictionary:
phone = {'mike':10,'sue':8,'john':3}
phone['mike']   -- 10
I want to know who has number 3?
3 --  'john'
How to get it in the python way ?
Thanks
  Gerhard
 

How 'bout a list comprehension:
In [1]:phone = {'mike':10,'sue':8,'john':3, 'billy':3}
In [2]:phone.items()
Out[2]:[('billy', 3), ('mike', 10), ('john', 3), ('sue', 8)]
In [3]:[i[0] for i in phone.items() if i[1] == 3]
Out[3]:['billy', 'john']
I added an additional person named billy with a number of 3 since 
values in a dictionary don't have to be unique.
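If you need many reverse lookups, it can be cheaper to invert the dictionary once rather than scan items() each time; a sketch:

```python
phone = {'mike': 10, 'sue': 8, 'john': 3, 'billy': 3}

by_number = {}
for name, number in phone.items():
    by_number.setdefault(number, []).append(name)  # non-unique values -> lists

assert sorted(by_number[3]) == ['billy', 'john']
assert by_number[10] == ['mike']
```

Build it once, then every lookup is a plain dictionary access instead of a full scan.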

Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list

Re: logging addLevelName

2005-03-09 Thread Jeremy Jones
[EMAIL PROTECTED] wrote:
I am trying to add a new logging level.
logging.config.fileConfig("bengineLog.cfg")
logging.CLIENT = logging.INFO + 1
logging.addLevelName( logging.CLIENT, 'CLIENT' )
logging.root.setLevel( [logging.INFO, logging.CLIENT, logging.WARNING,
logging.ERROR, logging.CRITICAL] )
logger = logging.getLogger(None)
logging.Logger.client('test')
I get error:
AttributeError: class Logger has no attribute 'client'
Any help?
 

Looks like what you want is logger.log().  Here is an example that takes 
your addLevelName code and logs at levels info to critical:

#!/usr/bin/env python
import logging
#create new log level
logging.CLIENT = logging.INFO + 1
logging.addLevelName(logging.CLIENT, "CLIENT")
logging.root.setLevel([logging.INFO, logging.CLIENT, logging.WARNING, 
logging.ERROR, logging.CRITICAL])

#create logger with "mylogger"
logger = logging.getLogger("mylogger")
logger.setLevel(logging.INFO)
#create file handler and set level to debug
fh = logging.FileHandler("test.log")
fh.setLevel(logging.DEBUG)
#create console handler and set level to debug
ch = logging.StreamHandler()
ch.setLevel(logging.DEBUG)
#create formatter
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
#add formatter to handlers
fh.setFormatter(formatter)
ch.setFormatter(formatter)
#add handlers to logger
logger.addHandler(fh)
logger.addHandler(ch)

logger.debug("this is debug")
logger.info("this is info")
logger.log(logging.CLIENT, "this is client")
logger.warning("this is warning")
logger.error("this is error")
logger.critical("this is critical")
It produces this output:
2005-03-09 12:28:33,399 - mylogger - INFO - this is info
2005-03-09 12:28:33,401 - mylogger - CLIENT - this is client
2005-03-09 12:28:33,458 - mylogger - WARNING - this is warning
2005-03-09 12:28:33,460 - mylogger - ERROR - this is error
2005-03-09 12:28:33,518 - mylogger - CRITICAL - this is critical
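If you really want logger.client('test') to work, you can graft a method onto Logger yourself. This leans on the internal _log method - a common but unofficial idiom - so treat it as a sketch, not blessed API:

```python
import logging

CLIENT = logging.INFO + 1
logging.addLevelName(CLIENT, "CLIENT")

def client(self, msg, *args, **kwargs):
    # mirror how Logger.info/warning are written internally
    if self.isEnabledFor(CLIENT):
        self._log(CLIENT, msg, args, **kwargs)

logging.Logger.client = client

logger = logging.getLogger("mylogger")
logger.setLevel(logging.INFO)
logger.client("test")        # no AttributeError now
```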
HTH,
Jeremy Jones
--
http://mail.python.org/mailman/listinfo/python-list


Re: [N00B] What's %?

2005-02-10 Thread Jeremy Jones
administrata wrote:
Hi! it's been about a week learning python!
I've read 'python programming for the absolute begginer'
I don't understand about % like...
107 % 4 = 3
7 % 3 = 1
I'm confused with division :/
Please help me...
thx 4 reading.
 

% is the remainder operator (I think it's also called modulus).
107 % 4 == 3
because
107 / 4 == 26 R3
and 7 % 3 == 1
because 7 / 3 == 2 R1
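You can see the relationship between division and remainder at the interpreter; divmod returns both at once:

```python
q, r = divmod(107, 4)        # quotient and remainder in one call
assert (q, r) == (26, 3)
assert 107 % 4 == 3
assert 7 % 3 == 1
# the identity that ties them together: a == b * (a // b) + (a % b)
assert 107 == 4 * (107 // 4) + (107 % 4)
```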
HTH,
Jeremy Jones
--
http://mail.python.org/mailman/listinfo/python-list


Re: multi threading in multi processor (computer)

2005-02-09 Thread Jeremy Jones
Pierre Barbier de Reuille wrote:
[EMAIL PROTECTED] a écrit :
Hello,
Is anyone has experiance in running python code to run multi thread
parallel in multi processor. Is it possible ?
Can python manage which cpu shoud do every thread?
Sincerely Yours,
Pujo
There's just no way you can use Python in a multi-processor environment,
This isn't exactly correct.  There is no way with plain, out of the box 
Python (and writing plain Python code) to take full advantage of 
multiple processors using a single process.  A single plain vanilla 
Python process will saturate at most one CPU.  I think that's what you 
were trying to express, but I thought it would be best to clarify.  The 
machine I'm running on right now is a dual-CPU machine.  I can 
*definitely* run Python on it.  I haven't tried threading just yet since 
it's a new-to-me machine.  But if I were to watch gkrellm, I would 
expect to see a CPU intensive (multithreaded) process flip back and 
forth between the two CPUs, taking its turn totally saturating both of 
them, one at a time.
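For what it's worth, the multi-process route can be sketched with the multiprocessing module (which entered the standard library in Python 2.6, after this exchange); each worker process gets its own interpreter and therefore its own GIL:

```python
import multiprocessing

def burn(n):
    # CPU-bound work that threads in one process could not parallelize
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    with multiprocessing.Pool(2) as pool:    # roughly one worker per CPU
        results = pool.map(burn, [100000, 100000])
    assert results == [burn(100000)] * 2
```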


because the GIL (Global Interpreter Lock) will prevent two threads 
from running concurrently. When I saw this discussed, the Python 
developper were more into multi-process systems when it comes to 
multi-processors.
I think I even heard some discussion about efficient inter-process 
messaging system, but I can't remember where :o)

Hope it'll help.
Pierre
Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list

Re: Loop in list.

2005-02-08 Thread Jeremy Jones
Jim wrote:
Where did this type of structure come from:
mat = ['a' for i in range(3)]?
This will produce a list of three elements but
I don't see reference for it in any of the books.
 

It's called a list comprehension and it appeared in Python 2.0.
http://www.amk.ca/python/2.0/index.html#SECTION00060
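A list comprehension is shorthand for an accumulating for loop:

```python
mat = ['a' for i in range(3)]

# the equivalent spelled-out loop
mat2 = []
for i in range(3):
    mat2.append('a')

assert mat == mat2 == ['a', 'a', 'a']
```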

Jeremy Jones
--
http://mail.python.org/mailman/listinfo/python-list


IPython Article on O'Reilly's ONLamp site

2005-01-27 Thread Jeremy Jones
I've written an article on IPython which is now live on O'Reilly's 
ONLamp site at 
http://www.onlamp.com/pub/a/python/2005/01/27/ipython.html.  All 
feedback is welcome.  Regardless of what you may think of the article, I 
hope it encourages everyone to at least try out IPython.  IPython has 
become an indispensible tool in my toolbox.  I cannot say enough great 
things about it.

Jeremy Jones
--
http://mail.python.org/mailman/listinfo/python-list


when self-absorbed narcissists discover usenet [ was: how to write a tutorial ]

2005-01-26 Thread Jeremy Jones
Brian van den Broek wrote:
Terry Reedy said unto the world upon 2005-01-26 14:08:
Xah the arrogant wrote, among other things,

SNIP
However, there are several errors in the above that would mislead a 
Python learner.  I advise any such to ignore Xah's writings.

Terry J. Reedy

Hi all,
here's a thought:
There isn't any doubt that these 'tutorials' are generally unwelcome 
and unhelpful. Numerous people have kindly taken the time to flag some 
of the problems. So much so that any competent google of the archives 
would quickly reveal the group consensus on their instructional merit.

I submit that continued corrections and advice of this sort are 
counter-productive. I understand the good intentions behind the 
corrections. (Indeed, my own level of Python-fu is such that it is 
possible that I might have been mislead the 'tutorials' without these 
corrections; I thus appreciate the correctors' efforts.) But, such 
corrections are troll-food and make it unlikely that the 'game' of 
posting such tutorials will soon loose its magical power to amuse the 
OP. They all but ensure that there will be more such 'tutorials' to 
correct.
snip
I couldn't agree with you more.  *However*, when the person posting is 
self-absorbed to the extent that he doesn't realize that others exist 
and don't give a crap about their wishes or discomforts, it puts you in 
a damned if you do, damned if you don't situation.  I honestly think 
that we're stuck with the inane ramblings of Xah Lee regardless of 
whether we feed his trolling or ignore him.  But I do think that 
responding to him in order to preach some sense into him is futile.  He 
is right about everything and can't be swayed by the likes of us mere 
mortals.  So, ignore him, post responses for the benefit of others out 
there, entertain yourself by pointing out to yourself and others his 
folly, but don't waste your time replying back to him and trying to talk 
sense.  Like I said, we're stuck with him.

Jeremy
--
http://mail.python.org/mailman/listinfo/python-list


Re: DevX: Processing EDI Documents into XML with Python

2005-01-21 Thread Jeremy Jones
Claudio Grondi wrote:
You don't have to rely on expensive and proprietary EDI conversion software
to parse, validate, and translate EDI X12 data to and from XML; you can
build your own translator with any modern programming language, such as
Python.
 by Jeremy Jones
 http://www.devx.com/enterprise/Article/26854
 Excerpt:
 Python is an object-oriented, byte-compiled language with a clean
syntax, clear and consistent philosophy, and a strong user community. These
attributes (both of the language and the community) make it possible to
quickly write working, maintainable code, which in turn makes Python an
excellent choice for nearly any programming task. Processing any flavor of
EDI is no exception.
Hi,
just wanted to share with you, that the last issue
of the DevX newsletter comes with a Python related
article as first item in the list of subjects.
Claudio
 

Anyone interested in processing EDI with Python will probably be 
interested in giving it a read.  Please feel free to scrutinize the code 
mercilessly.  I plan on creating a project on Sourceforge with the code 
that is attached to that article (and hopefully with modifications 
coming from user input in the ensuing months).  Comments are greatly 
appreciated.

Thanks for posting this, Claudio.
Jeremy Jones
--
http://mail.python.org/mailman/listinfo/python-list


Re: ftplib with unknown file names

2005-01-10 Thread Jeremy Jones
rbt wrote:
How can I use ftplib to retrieve files when I do not know their names? 
I can do this to get a listing of the directory's contents:

ftp_server.retrlines('LIST')
The output from this goes to the console and I can't figure out how to 
turn that into something I can use to actually get the files (like a 
list of file names). I read a bit about the callback function that can 
be passed to retrlines but I couldn't figure out how to use it.

Any help is appreciated.
Thanks!
The FTP object's nlst() method will return a list of file names.  Here 
are the docs for the nlst command:

http://www.python.org/doc/current/lib/ftp-objects.html
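A sketch of putting nlst() to work; the helper name and the host/credentials are placeholders of mine, not from the original post:

```python
from ftplib import FTP

def fetch_all(host, user, password):
    """Download every file nlst() reports in the remote working directory."""
    ftp = FTP(host)
    ftp.login(user, password)
    names = ftp.nlst()                  # a plain list, not console output
    for name in names:
        with open(name, "wb") as out:
            ftp.retrbinary("RETR " + name, out.write)
    ftp.quit()
    return names
```

Unlike retrlines('LIST'), nlst() hands the names back as data you can loop over.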
HTH,
Jeremy Jones
--
http://mail.python.org/mailman/listinfo/python-list


Re: collaborative editing

2004-12-10 Thread Jeremy Jones
Robert Kern wrote:
Michele Simionato wrote:
Suppose I want to write a book with many authors via the Web. The 
book has a hierarchical structure with chapter, sections, 
subsections, subsubsections, etc. At each moment it must be possible 
to print the current version of the book in PDF format. There must be 
automatic generation of the table of contents,
indices, etc. Conversions to many formats (Latex, DocBook, etc.) 
would be
welcome. Does something like that already exists? Alternatively, I 
would need
some hierarchical Wiki with the ability of printing its contents in an
structured way. 

Does it have to be via the Web? Why doesn't LaTeX/DocBook with a 
central CVS/Subversion repository work for what you want to do?

Personally, I loathe writing at any length inside a Web browser and 
prefer to use a real editor at all times.

Actually, in a sense, if you put up a SVN repository under Apache, the 
repository is available via the Web just as the OP requires.  He'll 
have to specify further, but I'm not so sure this is a requirement to be 
able to edit via a browser over the web, or just have the repository on 
the web somewhere that multiple people can access it.

But SVN will work pretty well, I think, for what you're doing.  
Especially if the authors are editing different sections of the file.  
If two (or more) authors try to edit the same section of a file, the 
first one to commit the change will win, and subsequent commits will be 
rejected until the other authors resolve the conflict.

Jeremy Jones
--
http://mail.python.org/mailman/listinfo/python-list


Re: Trying to understand a little python

2004-12-06 Thread Jeremy Jones




McCarty, Greg wrote:

  
  
  
  
Ok, I'm new to python, and I'm trying to come to grips with a few
things.  Got lots of years of experience with Java and asp/aspx, etc.
Trying to relate Python's behavior to what I already know.

Here's the python code (line #'s added for my question) -

01 class Tester:
02     def __init__(self):
03         print "I'm initializing Tester"
04
05 def test(klass=Tester):
06     klass.stuff = "setting stuff"
07     print "I'm in test: " + klass.stuff
08
09 test()              # results 1: I'm in test: setting stuff
10 a=Tester()          # results 2: I'm initializing Tester
11 a.stuff             # results 3: 'setting stuff'
12 b=Tester()          # results 4: I'm initializing Tester
13 b.stuff             # results 5: 'setting stuff'
14 a.stuff="changed!"
15 b.stuff             # results 6: 'setting stuff'
16 a.stuff             # results 7: 'changed!'

And here's my questions -

Line 09 - I expected the default argument assignment of line 05 to
create an object of type Tester and assign it to the var klass.  Thus I
expected Tester.__init__ to fire, which it didn't.  What does
'klass=Tester' actually do on line 05?
  
  

klass=Tester sets the default value of the variable klass to the
*class* Tester. This isn't creating an instance of Tester. Only
pointing klass to the class itself.

  
Line 10 - Seems that the syntax 'Tester()' actually causes the
__init__ method to fire.  Is this the only case?
  

Mostly. You can use getattr if you like. And probably eval or exec.
But I would try to stick with the Tester() syntax.

  
  
  
Line 12 - At this point, I was thinking of Tester.stuff as a static
variable of the Tester class.

When you set a.stuff to "changed", you are setting an instance
attribute on "a" to "changed". "b" is still pointing to the "static
variable" on Tester. Look at this:

In [12]: a = Tester()
I'm initializing Tester

In [13]: b = Tester()
I'm initializing Tester

In [14]: Tester.stuff
Out[14]: 'setting stuff'

In [15]: Tester.stuff = "FOOBAR"

In [16]: a.stuff
Out[16]: 'FOOBAR'

In [17]: b.stuff
Out[17]: 'FOOBAR'

In [18]: a.stuff = "A.STUFF"

In [19]: a.stuff
Out[19]: 'A.STUFF'

In [20]: b.stuff
Out[20]: 'FOOBAR'

I re-set Tester.stuff to "FOOBAR".  The "stuff" attributes of "a" and
"b" were pointing to that for a moment.  Then I pointed the attribute
"stuff" on "a" to "A.STUFF", so "b" still sees the class attribute.

  
  
  
Line 15 - Well, I guess stuff isn't a static variable!  What is the
explanation here?
  
  Thanks for any help.
  
  Greg McCarty
Senior Technical Advisor / ManTech IST
ph: 410-480-9000 x2804 or 703-674-2804  fx: 410-480-0916
  
  

HTH.

Jeremy


-- 
http://mail.python.org/mailman/listinfo/python-list

Re: file descriptors fdopen

2004-12-06 Thread Jeremy Jones
Scott Frankel wrote:
Why does os.fdopen('foo.txt', 'w') require an integer?
Ultimately, I want to create a new file on disk.
Funny, I can't seem to suss-out how to create a new file without 
resorting
to os.system('touch foo.txt').  ... Or maybe not so funny ...

 foo = os.fdopen('foo.txt', 'w')
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
TypeError: an integer is required
Thanks in advance!
Scott

os.fdopen() expects an already-open, OS-level file descriptor (an 
integer), not a file name.  If you just want to create a file for 
writing to, you probably want:
foo = open('foo.txt', 'w')
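For completeness, a sketch of what os.fdopen is actually for: wrapping a numeric, OS-level descriptor such as one returned by os.open:

```python
import os

fd = os.open("foo.txt", os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
f = os.fdopen(fd, "w")       # wrap the integer descriptor in a file object
f.write("hello\n")
f.close()
```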
Jeremy Jones
--
http://mail.python.org/mailman/listinfo/python-list


Re: A little threading problem

2004-12-02 Thread Jeremy Jones
Alban Hertroys wrote:
Jeremy Jones wrote:
(not waiting, because it already did happen).  What is it exactly 
that you are trying to accomplish?  I'm sure there is a better approach.

I think I saw at least a bit of the light, reading up on readers and 
writers (A colleague showed up with a book called Operating system 
concepts that has a chapter on process synchronization).
It looks like I should be writing and reading 3 Queues instead of 
trying to halt and pause the threads explicitly. That looks a lot 
easier...

Thanks for pointing out the problem area.
That's actually along the lines of what I was going to recommend after 
getting more detail on what you are doing.  A couple of things that may 
(or may not) help you are:

* the Queue class in the Python standard library has a maxsize 
parameter.  When you create a queue, you can specify how large you want 
it to grow.  You can have your three threads busily parsing XML and 
extracting data from it and putting it into a queue and when there are a 
total of maxsize items in the queue, the next put() call (to put data 
into the queue) will block until the consumer thread has reduced the 
number of items in the queue.  I've never used 
xml.parsers.xmlproc.xmlproc.Application, but looking at the data, it 
seems to resemble a SAX parser, so you should have no problem putting 
(potentially blocking) calls to the queue into your handler.  The only 
thing this really buys you is that you won't have to read the whole XML 
file into memory.
* the get method on a queue object has a block flag.  You can 
effectively poll your queues something like this:

#untested code
#a_done, b_done and c_done are just checks to see if that particular
#document is done
while not (a_done and b_done and c_done):
    got_a, got_b, got_c = False, False, False
    item_a, item_b, item_c = None, None, None
    while (not a_done) and (not got_a):
        try:
            item_a = queue_a.get(0) #the 0 says don't block and raise an
                                    #Empty exception if there's nothing there
            got_a = True
        except Queue.Empty:
            time.sleep(.3)
    while (not b_done) and (not got_b):
        try:
            item_b = queue_b.get(0)
            got_b = True
        except Queue.Empty:
            time.sleep(.3)
    while (not c_done) and (not got_c):
        try:
            item_c = queue_c.get(0)
            got_c = True
        except Queue.Empty:
            time.sleep(.3)
    put_into_database_or_whatever(item_a, item_b, item_c)

This will allow you to deal with one item at a time, and if the xml files 
are different sizes, it should still work - you'll just pass None to 
put_into_database_or_whatever for that particular file.
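The maxsize blocking behavior described above can be seen in a small
stand-alone sketch (the names and timings here are illustrative, not from
the original code):

```python
import queue      # this was the Queue module in the Python 2 of this thread
import threading
import time

q = queue.Queue(maxsize=2)   # put() blocks once 2 items are waiting

def producer():
    for i in range(4):
        q.put(i)             # blocks while the queue already holds maxsize items

t = threading.Thread(target=producer)
t.start()

time.sleep(0.1)              # give the producer time to fill up and block
consumed = []
while len(consumed) < 4:
    consumed.append(q.get()) # each get() frees a slot, unblocking the producer

t.join()
print(consumed)              # [0, 1, 2, 3] -- FIFO order
```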

HTH.
Jeremy Jones
--
http://mail.python.org/mailman/listinfo/python-list


Re: Semaphore or what should I use?

2004-12-01 Thread Jeremy Jones
Bastian Hammer wrote:
Hi
I'm wondering why there are so few examples with Semaphore.
Is it obsolete?
I've got a Class Data.
It offers 2 Threads methods for updating, editing, .. a private
dictionary.
Now I have to make sure that both threads are synchronized:
1 thread edits something and the other is blocked until the first
thread is ready.

Isn't it a good idea to do this with a semaphore?
And if I should use a Semaphore here, could anybody give me an example
of how it should look?
Everything that I test throws errors :(
Thank you :)
Bye, Bastian
 

Sure, you can use a Semaphore.  But it sounds like what you really want is 
an exclusive lock.  A Semaphore can do that for you - in fact, exclusion is 
its default behavior - though you could also just use a regular old Lock.  
Semaphores are locking counters.  You set the counter at initialization to 
some number (the default is 1).  When you enter a semaphored area of code 
by calling the .acquire() method, the counter is decremented if it is 
greater than zero; if it is already zero, .acquire() blocks until some 
other thread calls .release().  Upon exiting the semaphored code, calling 
the .release() method increments the counter.  An example would look like 
this:

import threading

class locking_writer:
    def __init__(self, some_file):
        self.sema = threading.Semaphore()
        self.f = open(some_file, 'w')

    def write(self, content):
        self.sema.acquire()
        self.f.write(content)
        self.sema.release()
and used like this:
In [16]: l = locking_writer('/tmp/foo')
In [17]: l.write('test')
I haven't tested this with multiple threads, so I'll leave that up to 
you if you want to use it.

Now, with all that said, the preferred way of synchronizing between 
threads is to use a Queue (import Queue\nq = Queue.Queue()).  If you 
have a file that more than one thread needs to update, you probably want 
to create a thread just to update that file and have the threads 
responsible for getting information to update it with pass that 
information into a queue.  You may have reasons for not wanting to do 
that, but it's worth looking into and considering.
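A sketch of that single-writer pattern (the names and the sentinel
convention are assumptions for illustration, not from the original post):
one thread owns the file, and every other thread just puts lines on a
queue.

```python
import queue      # the Queue module mentioned above, renamed in Python 3
import threading

log_q = queue.Queue()
DONE = object()   # sentinel telling the writer thread to shut down

def writer(path):
    # The only thread that ever touches the file.
    f = open(path, 'w')
    while True:
        item = log_q.get()
        if item is DONE:
            break
        f.write(item)
    f.close()

w = threading.Thread(target=writer, args=('out.txt',))
w.start()

# Any number of threads can safely do this concurrently:
def worker(n):
    log_q.put('worker %d was here\n' % n)

workers = [threading.Thread(target=worker, args=(i,)) for i in range(3)]
for t in workers:
    t.start()
for t in workers:
    t.join()

log_q.put(DONE)   # all producers are done; tell the writer to finish
w.join()
```

Because only the writer thread touches the file, no locking around the
writes is needed at all; the Queue's own internal locking does the
synchronization.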

Jeremy Jones
-- 
http://mail.python.org/mailman/listinfo/python-list

Re: A little threading problem

2004-12-01 Thread Jeremy Jones
Alban Hertroys wrote:
Hello all,

I need your wisdom again. I'm working on a multi-threaded application
that handles multiple data sources in small batches each time. The idea
is that there are 3 threads that run simultaneously, each read a fixed
number of records, and then they wait for each other. After that the
main thread does some processing, and the threads are allowed to
continue reading data.

I summarized this part of the application in the attached python
script, which locks up rather early, for reasons that I don't
understand (I don't have a computer science education), and I'm pretty
sure the problem is related to what I'm trying to fix in my
application. Can anybody explain what's happening (Or maybe even show
me a better way of doing this)?

Regards,

Alban Hertroys,
MAG Productions.

import sys
import threading

class AThread(threading.Thread):
	def __init__(self, name, mainCond, allowedCond):
		self.counter	= 0
		self.name		= name
		self.mainCond	= mainCond
		self.condAllowed = allowedCond
		self.waitUntilRunning = threading.Condition()

		threading.Thread.__init__(self, None, None, name, [])

	def start(self):
		threading.Thread.start(self)

		# Let the main thread wait until this thread is ready to accept Notify
		# events.
		self.waitUntilRunning.acquire()
		self.waitUntilRunning.wait()
		self.waitUntilRunning.release()

	def run(self):
		threading.Thread.run(self)

		# Print numbers 1 - 25
		while self.counter < 25:
			self.condAllowed.acquire()

			# Tell the main thread that we're ready to receive Notifies
			self.waitUntilRunning.acquire()
			self.waitUntilRunning.notify()
			print "Running"
			self.waitUntilRunning.release()

			# Wait for a Notify from the main thread
			print "Wait"
			self.condAllowed.wait()
			self.condAllowed.release()

			self.counter += 1

			print "Thread %s: counter = %d" % (self.name, self.counter)


			# Tell the main thread that a thread has reached the end of the loop
			self.mainCond.acquire()
			self.mainCond.notify()
			self.mainCond.release()

class Main(object):
	def __init__(self):
		self.condWait = threading.Condition()
		self.condAllowed = threading.Condition()

		self.threads = [
			AThread('A', self.condWait, self.condAllowed),
			AThread('B', self.condWait, self.condAllowed),
			AThread('C', self.condWait, self.condAllowed),
		]

		# Start the threads
		for thread in self.threads:
			thread.start()

		while True:
			# Allow the threads to run another iteration
			self.condAllowed.acquire()
			print "Notify"
			self.condAllowed.notifyAll()
			self.condAllowed.release()

			# Wait until all threads reached the end of their loop
			for thread in self.threads:
self.condWait.acquire()
self.condWait.wait()
self.condWait.release()


main = Main()

  

You've got a deadlock. I modified your script to add a print "T-%s" %
self.name before an acquire and after a release in the threads you spun
off (not in the main thread). Here is the output:

[EMAIL PROTECTED] threading]$ python tt.py
T-A: acquiring condAllowed
T-A: acquiring waitUntilRunning
T-A: Running
T-A: released waitUntilRunning
T-A: Wait
T-B: acquiring condAllowed
T-B: acquiring waitUntilRunning
T-B: Running
T-B: released waitUntilRunning
T-B: Wait
T-C: acquiring condAllowed
T-C: acquiring waitUntilRunning
T-C: Running
T-C: released waitUntilRunning
T-C: Wait
Notify
T-A: released condAllowed
T-A: counter = 1
T-A: acquiring mainCond
T-A: released mainCond
T-A: acquiring condAllowed
T-A: acquiring waitUntilRunning
T-A: Running
T-A: released waitUntilRunning
T-A: Wait
T-C: released condAllowed
T-C: counter = 1
T-C: acquiring mainCond
T-C: released mainCond
T-C: acquiring condAllowed
T-C: acquiring waitUntilRunning
T-C: Running
T-C: released waitUntilRunning
T-C: Wait
T-B: released condAllowed
T-B: counter = 1
T-B: acquiring mainCond
T-B: released mainCond
T-B: acquiring condAllowed
Notify  <- Here is your problem
T-A: released condAllowed
T-A: counter = 2
T-A: acquiring mainCond
T-A: released mainCond
T-A: acquiring condAllowed
T-A: acquiring waitUntilRunning
T-A: Running
T-A: released waitUntilRunning
T-A: Wait
T-B: acquiring waitUntilRunning
T-B: Running
T-B: released waitUntilRunning
T-B: Wait
T-C: released condAllowed
T-C: counter = 2
T-C: acquiring mainCond
T-C: released mainCond
T-C: acquiring condAllowed
T-C: acquiring waitUntilRunning
T-C: Running
T-C: released waitUntilRunning
T-C: Wait

Notify is called before thread B (in this case) hits the
condAllowed.wait() piece of code. So, it sits at that wait() for
forever (because it doesn't get notified, because the notification
already happened), waiting to be notified from the main thread, and the
main thread is waiting on thread B (again, in this case) to call
mainCond.notify(). This approach is a deadlock just wanting to happen
(not waiting, because it already did happen). What is it exactly that
you are trying to accomplish? I'm sure there is a better approach.
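The lost wakeup described above is the classic reason a Condition is
normally paired with a predicate: the waiter re-checks shared state in a
loop and only calls wait() while there is genuinely nothing to do, so a
notify() that arrives early is never lost. A minimal sketch of that
pattern (names are illustrative, not taken from the original script):

```python
import threading

cond = threading.Condition()
allowed = 0  # predicate: how many iterations the worker may still run

def worker():
    global allowed
    with cond:
        # Re-check the predicate in a loop; if notify() already happened,
        # 'allowed' is > 0 and we never block, so the wakeup can't be lost.
        while allowed == 0:
            cond.wait()
        allowed -= 1

def main_thread():
    global allowed
    with cond:
        allowed += 1   # record the event *before* notifying
        cond.notify()

t = threading.Thread(target=worker)
t.start()
main_thread()   # safe even if it runs before the worker reaches wait()
t.join()
print("no deadlock")
```

The key difference from the original script is that the event is recorded
in shared state ('allowed') under the lock, so the ordering of notify()
and wait() no longer matters.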