Hi Guys, 

Could not make the main meeting tonight :(

But here is a puzzler. 


On Server 1
I have a simple HTML form
that invokes a PHP program that
writes the values entered into the form
into a plain text file and then
redirects to that text file, thus displaying it.
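
Roughly, in Python terms, the PHP program does something
like this (just a sketch; field names and the file name are
invented for illustration, my real script differs in details):

```python
# Sketch of the server-side step: take the submitted form
# fields, write them into a plain text file, one per line.
# The browser is then redirected to that file so it displays.
# Field names below are made up for illustration.

def save_form_values(fields, path):
    # Write each field as "name = value", one per line,
    # sorted so the output order is predictable.
    with open(path, "w") as f:
        for name, value in sorted(fields.items()):
            f.write("%s = %s\n" % (name, value))

save_form_values({"first": "Bob", "last": "LQ"}, "test.txt")
print(open("test.txt").read())
```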

This works fine. I surf in from Client 1,
fill in some values, hit Submit, and see
them displayed.

So far so good.

But I actually want to pull those values down on
another box, let's call it Client 2, for use
in another program. If I use wget to grab
the file written by the previous process,
all is well. I can go back to my form on
Client 1 and repeat the entire sequence.

Now, though, I want to replace wget with
a simple Python program. So I write
pget.py:

#!/usr/bin/python

from urllib import urlopen

# Fetch the text file the PHP program wrote and print it.
call_url = "http://some_server/some_dir/test.txt"
u = urlopen(call_url)
doc = u.read()
u.close()
print doc

Now this little pget.py works fine and grabs the
file just like wget. Except there is a problem.
Now when I go back to Client 1 and try to
enter new values, the client hangs for a long
time upon hitting Submit, i.e. maybe a minute
or more (I have not actually timed it).

Color me puzzled. I am not even sure how
to debug it. 

Both Client 1 and Client 2 are behind a
Linksys router and access Server 1
through a cable modem and the Internet.
I don't think that is the problem, though;
after all, wget does not cause this issue.

Anybody have a clue? Or a suggestion as to 
how to proceed? I really don't want to be 
reduced to packet sniffing. 

BobLQ