Is there a way for two (or more) completely different and unrelated applications to exchange persistent information (i.e., data, objects) via some channel or server?

I've thought of using shared memory, but it breaks down when the applications run on different machines connected by a network.

Raw socket programming is too OS-specific and too much of a burden for an application to incorporate directly.

Databases are too heavyweight for systems that only need to exchange small data objects.

Currently, I am working on xmlgos -- the XML Generic Object Server -- which would act as a repository of data objects offering user authentication (and maybe later SSL ;)) and a simple storage/retrieval system a la FTP. The project is still in its alpha stages; I am still working out the details of how I will implement the commands and the XML wrapping I will employ (via libxml).
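For illustration, a stored object might be wrapped in XML roughly like this -- this is only a tentative sketch on my part; the element and attribute names are invented and not a settled xmlgos format:

```
<object name="counter" owner="mikhail" type="integer">
  <data>42</data>
</object>
```

The idea is that the server only needs to store and hand back these small wrapped chunks, without caring what the payload means to the applications.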

Right now, I'm working on the server and the protocol. I'll test it with a simple client at first, then provide an API (header files and shared libraries) for accessing it from other applications -- something to the effect of the syslog() call, except that it goes to my server instead. (Hope that made sense.)

Am I duplicating something that already exists? This is just a hobby project, and I'd like to solicit advice this early in the development process.

-- 
-=[mikhail]=-

aka Dean Michael C. Berris
mobile +63 917 8901959
work +63 49 5680024
http://free.net.ph/Members/mikhailberis
pgp key ID = 0xF9501761

_
Philippine Linux Users Group. Web site and archives at http://plug.linux.org.ph
To leave: send "unsubscribe" in the body to [EMAIL PROTECTED]

Fully Searchable Archives With Friendly Web Interface at http://marc.free.net.ph

To subscribe to the Linux Newbies' List: send "subscribe" in the body to [EMAIL PROTECTED]
