Hi, we are developing an application with some presentation functionality. The presentation has to run without an internet connection (on a local wifi cell) on a trainer's laptop. Students connect to a server running on the trainer's laptop for the interactive parts. There are multiple trainers, each with their own Django instance running on their own laptop, and each able to create/update/delete presentation material. The question is how to keep this presentation material in sync between the different Django instances.
The current idea is:

- Have a central server which hosts database dumps.
- When an internet connection is available, sync the database dumps (update local / update remote presentation material).
- To update the local presentation material, do something like a 3-way merge of selected database tables:
  * Keep a common ancestor of the remote and local database versions.
  * Load the remote and ancestor db dumps.
  * Map database entries by a custom primary key (probably a GUID) so that records created independently in the remote and local versions cannot end up with the same primary key.
  * Use timestamped models to detect updates to existing records.
  * Determine creation/update/deletion of an entry from primary-key existence and the 'modified' timestamp when comparing the local/ancestor and remote/ancestor versions.
  * In a first step, only warn that local changes will be lost by updating to the remote presentation material; later, merge automatically where no conflict exists and ideally let the user resolve merge conflicts.

Does anyone have experience with a similar scenario? Or are there tools that might help implement this? I am grateful for advice!
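For concreteness, here is how I imagine the classification step could work, as a plain-Python sketch (no Django required): table snapshots are {guid: record} dicts, each record carries a 'modified' timestamp, and a change is created/updated/deleted depending on primary-key existence and the timestamp comparison against the common ancestor. The function name and data layout are mine, not an established API.

```python
def three_way_diff(ancestor, local, remote):
    """Classify each GUID as created/updated/deleted on each side,
    relative to the common ancestor, and flag GUIDs touched on both
    sides as potential merge conflicts.

    ancestor, local, remote: dicts mapping guid -> record dict,
    where each record has a comparable 'modified' timestamp.
    """
    changes = {"local": {}, "remote": {}, "conflicts": []}
    for side, snapshot in (("local", local), ("remote", remote)):
        for guid, rec in snapshot.items():
            base = ancestor.get(guid)
            if base is None:
                # Key absent in the ancestor: new record on this side.
                changes[side][guid] = "created"
            elif rec["modified"] > base["modified"]:
                # Key exists but was saved after the ancestor snapshot.
                changes[side][guid] = "updated"
        for guid in ancestor:
            if guid not in snapshot:
                # Key present in the ancestor but gone here: deletion.
                changes[side][guid] = "deleted"
    # Same record changed on both sides: needs warning/manual merge.
    for guid in set(changes["local"]) & set(changes["remote"]):
        changes["conflicts"].append(guid)
    return changes
```

Non-conflicting changes from changes["remote"] could then be applied to the local database automatically, while changes["conflicts"] triggers the warning (or, later, the interactive merge).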

