I had to do something similar a couple of years ago (between
several waste water plants and the control center) and ended up using
an approach similar to what nick name suggested:
- In the control center I used MySQL
- In the waste water plants I used a SQLite database per day
(initializing the
I set up a Dropbox shared folder on both machines (there is a headless
dropbox.py client for Linux), plus a script which copies the .sqlite
files into the Dropbox folder.
On the city machine a scheduled task reads the records in the SQLite
files and imports the new ones directly into MySQL.
You are going to need a 'signal'
Hi,
Well, I need to work with that, so I will try to develop some features
to support this!
Regards,
Alfonso de la Guarda
Twitter: @alfonsodg
Social networks: alfonsodg
Phone: 991935157
1024D/B23B24A4
5469 ED92 75A3 BBDB FD6B 58A5 54A1 851D B23B 24A4
On Wednesday, July 11, 2012 6:26:00 PM UTC-4, Massimo Di Pierro wrote:
I am planning to improve this functionality but it would help to know if
it works for you as it is and what problems you encounter with it.
I originally used the export-to-csv, but a few months ago I switched
to just
There are two issues: 1) the protocol for transferring the data;
2) exporting from and importing into the database.
RabbitMQ etc. only address 1, and you do not need any of them: web2py
already has a web server and many RPC systems you can use.
The real issue is 2. If your tables have a uuid field, db.export_to_csv_file
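For context on the uuid-based approach: web2py's DAL can dump a database with db.export_to_csv_file(stream) and re-import it with db.import_from_csv_file(stream, unique='uuid'), so a record whose uuid is already present updates the existing row instead of creating a duplicate. A stdlib-only sketch of that merge rule (the column names and data here are made up, and this is only an illustration of the idea, not the DAL's implementation):

```python
import csv
import io

def merge_csv_by_uuid(rows, csv_text, key="uuid"):
    """Merge a CSV export into `rows` (a dict keyed by uuid), mimicking
    an import with unique='uuid': an incoming record whose uuid already
    exists replaces the stored one rather than adding a duplicate."""
    for record in csv.DictReader(io.StringIO(csv_text)):
        rows[record[key]] = record
    return rows
```

For example, merging two exports where one record (`u2`) was re-exported with an updated value leaves three records, with `u2` holding the newer value.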