Hi all,
I have created a desktop database front end to handle some of our company's
workflow processes. We are now hoping to modify the app so individual users
can run the app on their local machines (both online and offline) and sync data
to a central server.
The server will most likely be a Linux box using PHP as middleware and MySQL
as the database. The system will have approximately 20-30 unique users with
different levels of access (controlled by the database).
In planning for this upgrade I have identified two potential issues:
1) Determining the best way to handle local-to-network data sync.
I will be creating a UUID for each record, so I'm not worried about duplicate
record IDs. What does concern me is how to determine which records need to be
synced. The only approach I can think of is to give each record a "last
modified" date field (for server queries) and a "to update" Boolean field.
When the sync command is called, the desktop app would get a recordset based
on the "to update" flag and generate the SQL to send to the server. The app
would then request all records updated after the last sync time (stored as a
local property). Does this make sense, or is there a better way to deal with
synchronizing data? (A rough sketch of the flow I have in mind follows after
item 2.)
2) What would be the best way to handle SQL over the network? Send the whole
SQL string? Or send just the parameters plus a query identifier and have the
PHP middleware assemble the SQL? (See the second sketch below.) I don't think
either approach would be that complicated, but I am worried about security.
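
To make item 1 concrete, here is roughly the flow I'm imagining, sketched in
Python just to show the logic (the real client won't be Python, and the
endpoint name "sync.php", the table name "jobs", and the column names are all
placeholders I made up):

import json
import sqlite3
import urllib.parse
import urllib.request
from datetime import datetime, timezone

DB_PATH = "local.db"                          # local copy of the data
SERVER_URL = "https://example.com/sync.php"   # placeholder PHP endpoint

def get_last_sync(conn):
    # assumes a one-row table: CREATE TABLE sync_state (last_sync TEXT)
    row = conn.execute("SELECT last_sync FROM sync_state").fetchone()
    return row[0] if row else "1970-01-01T00:00:00"

def push_local_changes(conn):
    # Send every record flagged "to update" to the server, then clear the flag.
    rows = conn.execute(
        "SELECT uuid, name, status, last_modified FROM jobs WHERE to_update = 1"
    ).fetchall()
    payload = json.dumps(
        [dict(zip(("uuid", "name", "status", "last_modified"), r)) for r in rows]
    ).encode("utf-8")
    req = urllib.request.Request(SERVER_URL + "?action=push", data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        if resp.status == 200:
            conn.execute("UPDATE jobs SET to_update = 0 WHERE to_update = 1")
            conn.commit()

def pull_server_changes(conn, last_sync):
    # Ask for everything modified on the server since our last successful sync.
    url = SERVER_URL + "?action=pull&since=" + urllib.parse.quote(last_sync)
    with urllib.request.urlopen(url) as resp:
        for rec in json.loads(resp.read()):
            # uuid is assumed to be the primary key, so this acts as an upsert
            conn.execute(
                """INSERT INTO jobs (uuid, name, status, last_modified, to_update)
                   VALUES (?, ?, ?, ?, 0)
                   ON CONFLICT(uuid) DO UPDATE SET
                       name = excluded.name,
                       status = excluded.status,
                       last_modified = excluded.last_modified""",
                (rec["uuid"], rec["name"], rec["status"], rec["last_modified"]),
            )
    conn.commit()

def sync():
    conn = sqlite3.connect(DB_PATH)
    last_sync = get_last_sync(conn)
    push_local_changes(conn)              # local edits go up first
    pull_server_changes(conn, last_sync)  # then newer server records come down
    conn.execute("UPDATE sync_state SET last_sync = ?",
                 (datetime.now(timezone.utc).isoformat(),))
    conn.commit()

The idea is that the "to update" flag drives the push and the stored last-sync
timestamp drives the pull, so a machine that has been offline just runs the
same two steps when it reconnects.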
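
And for item 2, this is the kind of thing I mean by sending an identifier plus
parameters instead of raw SQL. The real middleware would be PHP (presumably
using prepared statements against MySQL); the Python/SQLite below is only a
stand-in so the example is self-contained, and the query names and columns are
made up:

import json
import sqlite3

# Server-side whitelist: only these statements can ever reach the database.
QUERIES = {
    "update_job_status": "UPDATE jobs SET status = ?, last_modified = ? WHERE uuid = ?",
    "insert_job": "INSERT INTO jobs (uuid, name, status, last_modified) VALUES (?, ?, ?, ?)",
}

def handle_request(conn, body):
    # body is the JSON the client POSTed: {"query": <identifier>, "params": [...]}
    req = json.loads(body)
    sql = QUERIES.get(req["query"])
    if sql is None:
        return {"ok": False, "error": "unknown query"}  # unrecognised requests never run
    conn.execute(sql, req["params"])  # parameters are bound, never concatenated
    conn.commit()
    return {"ok": True}

# The client only ever builds a small JSON payload like this:
payload = json.dumps({"query": "update_job_status",
                      "params": ["done", "2000-01-02T03:04:05", "some-uuid"]})

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (uuid TEXT PRIMARY KEY, name TEXT, status TEXT, last_modified TEXT)")
conn.execute("INSERT INTO jobs VALUES ('some-uuid', 'Example job', 'open', '2000-01-01T00:00:00')")
print(handle_request(conn, payload))

My thinking is that a whitelist of queries plus bound parameters limits what a
hostile client could do compared with accepting arbitrary SQL strings, but I'd
welcome opinions on whether that is enough.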
Thanks,
Rob