This is half a design question, half a technical question.  I'm hoping
someone in this community may have experience to share.

I need to feed data to a proprietary system over a TCP connection
based on information entered into a web form.  Currently I do this
in django: the view opens the connection, sends the data, and closes
the connection again.  This has worked fine at relatively low volumes,
but setting up and tearing down a TCP connection on every request is a
lot of overhead, and our transaction volume is about to increase
dramatically.

The design question:

If we want to maintain a persistent TCP connection fed by a data
queue, is django the best place (or even a possible place) for this
connection layer to live?  I see these options:

1) Write a separate TCP daemon app which accepts the input from django
and maintains a permanent connection to the remote proprietary
service.  (Which IPC mechanism would work best is another question:
using a database table as a queue?  Using a unix socket and a memory
queue on the daemon?)
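Roughly what I have in mind for the daemon side, as a minimal sketch
(the host, port, and requeue policy are placeholders, not a real
implementation):

```python
import queue
import socket
import threading

class UpstreamSender(threading.Thread):
    """Drains an in-memory queue over one long-lived TCP connection,
    instead of opening and closing a connection per message."""

    def __init__(self, host, port, out_queue):
        super().__init__(daemon=True)
        self.host = host
        self.port = port
        self.queue = out_queue
        self.conn = None  # lazily opened persistent connection

    def _connect(self):
        if self.conn is None:
            self.conn = socket.create_connection((self.host, self.port))

    def run(self):
        while True:
            payload = self.queue.get()  # blocks until django hands us work
            try:
                self._connect()
                self.conn.sendall(payload)
            except OSError:
                # Connection died: drop it and naively requeue the message.
                # A real daemon would add backoff and a retry limit.
                self.conn = None
                self.queue.put(payload)

# The IPC front end (a unix socket accept loop, or rows polled from a
# database table) would simply call work.put(b"...").
work = queue.Queue()
sender = UpstreamSender("upstream.example", 9000, work)  # placeholder address
```

The point is that reconnection lives in one place and the rest of the
system only ever touches the queue.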

2) Find somewhere in django where I can create a persistent queue,
have the http transaction write to the queue and have a django service
persistently checking the queue while maintaining the outbound TCP
connection.  This is where I am especially short on technical
knowledge.  Is there any way to have code running persistently in the
django environment that starts when the web server is started, or is
everything entirely event driven?
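For the record, one way I could imagine option 2 working (a sketch
only; whether the web server's process model makes this safe is
exactly what I'm unsure about): a module-level queue plus a worker
thread started at import time, so it lives as long as the server
process.  Note that each worker process would get its own private
queue, and `send` here stands in for whatever callable owns the
persistent outbound connection:

```python
import queue
import threading

# One queue per web-server worker process, shared by its request threads.
outbound = queue.Queue()

def enqueue(data):
    """Called from the django view instead of opening a TCP connection."""
    outbound.put(data)

def start_worker(send):
    """Start a daemon thread that drains the queue for the life of the
    process.  `send` is whatever callable owns the persistent connection."""
    def drain():
        while True:
            send(outbound.get())
    t = threading.Thread(target=drain, daemon=True)
    t.start()
    return t
```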

If anyone in the community has any ideas or comments, I'd love to hear
them!

Thanks,
Peter


--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en
-~----------~----~----~----~------~----~------~--~---