>
> So, we need some sort of data architecture that can
> * figure out which layers need updating after a user profile changes
I suggest we hook up a post_save[1] signal handler for the Profile
model and take ``self.user.layer_set.all()`` as the list of layers to
be updated. If we can do bulk updates to GeoNetwork, it would be great
to add that as a LayerManager method.
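
Something like the sketch below; the import paths and the bulk
``update_geonetwork()`` manager method are made-up names, and
``instance`` here plays the role of ``self`` above:

    from django.db.models.signals import post_save

    from geonode.profiles.models import Profile  # hypothetical path
    from geonode.maps.models import Layer        # hypothetical path


    def profile_post_save(sender, instance, **kwargs):
        # Every layer owned by this profile's user needs its GeoNetwork
        # metadata record regenerated.
        layers = instance.user.layer_set.all()
        # Hypothetical bulk method on LayerManager; it could fall back
        # to one GeoNetwork call per layer if batch updates turn out to
        # be impossible.
        Layer.objects.update_geonetwork(layers)


    post_save.connect(profile_post_save, sender=Profile)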

Which makes me think: is this update operation expected to be
expensive (in terms of time)? If it is, then we'd better take it out
of the request/response cycle, for example by creating an
``update_geonetwork`` management command that runs every minute and
checks for pending updates (against a PendingUpdates table or
similar).
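
Roughly, with a hypothetical ``PendingUpdate`` model pointing at the
layer to refresh (this file would live at
``management/commands/update_geonetwork.py`` inside the app):

    from django.core.management.base import BaseCommand

    from geonode.maps.models import Layer, PendingUpdate  # hypothetical


    class Command(BaseCommand):
        help = "Push pending metadata updates to GeoNetwork."

        def handle(self, *args, **options):
            pending = PendingUpdate.objects.select_related('layer')
            # Reuse the bulk manager method from the signal sketch above.
            Layer.objects.update_geonetwork([p.layer for p in pending])
            # Clear the queue only once GeoNetwork has taken the batch.
            pending.delete()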

[1] 
http://docs.djangoproject.com/en/dev/ref/signals/#django.db.models.signals.post_save


> * update just the fields corresponding to that user profile (actually, GN is
> basically storing the metadata documents as blobs so we will have to
> overwrite everything... but we need to make sure that we don't clobber the
> fields that aren't being modified)
From your comments I gather that this is not feasible; is that correct?
Which one is supposed to be the authoritative data source for
metadata, our GeoNode or GeoNetwork? Can we safely assume that every
GeoNode instance starts off with a fresh GeoNetwork?

> One possible implementation would be to have a more relational model in
> GeoNode and use the typical "WHERE owner.uid = updated_profile.uid" kind of
> query to figure out what documents to update, and then just generate entire
> new metadata documents to clobber the pre-existing ones.  To preserve the
> fields that aren't coming from GeoNetwork, we'd probably want to store
> everything in the layer's Django representation.
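
In ORM terms that query would be roughly (a one-line sketch, with a
hypothetical ``owner`` foreign key from Layer to User):

    # Equivalent of "WHERE owner.uid = updated_profile.uid"
    layers_to_update = Layer.objects.filter(owner=profile.user)
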
BTW, if we are going to replicate a lot of the GeoNetwork metadata in
the Django db, I wonder whether we still need GeoNetwork at all. Would
it only be there for searching?

Ariel
