On Thu, 2009-02-05 at 09:44 +0100, Matteo Pelucco wrote:
> Jan Haderka ha scritto:
> > How many public instances do you have? 
> 
> Hi, Jan, we are working on 1 author and 2 public (but in future maybe 3 
> or 4...) so I think the XA activation must be kept turned on...

Sure, in this case turning off XA is not an option.

> 
> > Another option would be to change the extraction rule and activate
> > content in bigger chunks rather than piece by piece ... 
> 
> Can it be done programmatically? How?

Yes, the activation command has a setRule() method that allows you to
specify a custom rule for content collection. The default rule collects
all properties, metadata and paragraphs for a given node, but ignores
its children, so you effectively activate one piece of content after
another even if they are in the same hierarchy. You can change the rule
to also include the content type of the node itself, in which case all
the content downstream is collected in one go.
Limitations:
- you can only collect all content within the same hierarchy this way
- transferring big trees increases the amount of memory the server
needs at runtime, since import/export are memory-intensive operations
- once you extract all the content in the hierarchy, you still need to
send it over to your public instance, and sending big chunks of data
makes a failure during transfer more likely than transferring smaller
files.

You can see how the default rule is constructed in the RuleBasedCommand
implementation.
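As a minimal illustration of the idea (not Magnolia's actual API; the class and method names below are invented for this sketch, see RuleBasedCommand for the real thing), a rule can be modeled as the set of node types the collector is allowed to descend into:

```java
import java.util.*;

// Self-contained model of rule-based content collection.
// Names are illustrative, NOT Magnolia's API.
public class RuleCollectDemo {
    static class Node {
        final String name, type;
        final List<Node> children = new ArrayList<Node>();
        Node(String name, String type) { this.name = name; this.type = type; }
        Node add(Node child) { children.add(child); return this; }
    }

    // The "rule" is simply the set of node types to descend into and collect.
    static void collect(Node node, Set<String> rule, List<String> out) {
        out.add(node.name);
        for (Node child : node.children) {
            if (rule.contains(child.type)) collect(child, rule, out);
        }
    }

    public static void main(String[] args) {
        Node root = new Node("/data/items", "dataItem")
            .add(new Node("/data/items/a", "dataItem")
                .add(new Node("/data/items/a/meta", "metaData")))
            .add(new Node("/data/items/b", "dataItem"));

        // Default-style rule: the node's own content type is excluded,
        // so children are skipped and each item must be activated piece by piece.
        List<String> shallow = new ArrayList<String>();
        collect(root, new HashSet<String>(Arrays.asList("metaData")), shallow);

        // Extended rule: the content type itself is included,
        // so the whole subtree is collected in one go.
        List<String> deep = new ArrayList<String>();
        collect(root, new HashSet<String>(Arrays.asList("metaData", "dataItem")), deep);

        System.out.println(shallow.size()); // 1 (root only)
        System.out.println(deep.size());    // 4 (root, a, a/meta, b)
    }
}
```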

> > OTOH if you rely on observation to start your activation it might
> > not be possible for you (since you get notifications one by one for
> > each piece of content).
> 
> This is my case: one notification for each added node...
> 
> > Possibly unrelated, but I'd like to know why you activate nodes from
> > the data module by observing the changes and then reacting on that,
> > rather than setting "activateImport" of your import handler to "true"? 
> 
> We have an external data source that periodically syndicates to our 
> author instance. Every night an http request is done by an external 
> agent and an update procedure is started within a listener, parsing the 
> request and adding nodes. It is not done in standard SV XML, because we 
> don't have access to the request caller, so we must keep the original 
> XML format.

Well, with an import handler you can turn this process around. You
write your own import handler that initiates an HTTP (or other) request
to the external data provider, then parses the data in the response and
updates the content. Magnolia sets up the context for you, so you can
access the repository from the handler. Since you have full control
over parsing the response data, it can be in any proprietary format and
doesn't need to conform to anything, as long as your code can
understand it.
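The parsing step described above can be sketched as follows. This is a minimal illustration with a hardcoded payload standing in for the HTTP response body, and an invented "key=value;key=value" line format; in a real handler you would fetch the response from your provider and write each parsed record into the repository:

```java
import java.util.*;

// Illustrative sketch of the parsing step inside a custom import
// handler. Names and format are hypothetical, not Magnolia's API.
public class ProprietaryFormatParser {

    // Parse a "key=value;key=value" line format into one map per record.
    static List<Map<String, String>> parse(String payload) {
        List<Map<String, String>> records = new ArrayList<Map<String, String>>();
        for (String line : payload.split("\n")) {
            if (line.length() == 0) continue;
            Map<String, String> record = new LinkedHashMap<String, String>();
            for (String pair : line.split(";")) {
                String[] kv = pair.split("=", 2);
                record.put(kv[0], kv.length > 1 ? kv[1] : "");
            }
            records.add(record);
        }
        return records;
    }

    public static void main(String[] args) {
        // Stand-in for the body of the HTTP response from the provider.
        String response = "id=1;title=First item\nid=2;title=Second item";
        for (Map<String, String> record : parse(response)) {
            // In the real handler, create/update a node in the data
            // repository for each record here.
            System.out.println(record.get("id") + " -> " + record.get("title"));
        }
    }
}
```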
The data module ships with a sample import handler implementation (see
info.magnolia.module.data.samples.DateTimeImportHandler for details).
The import handler is then configured
at /modules/data/config/importers/example
There you can configure the automated execution (when and how often the
import handler should run), whether the imported data should be
automatically activated, and whether you want to keep the old data or
flush it on each new import (useful when your import handler always
takes a full snapshot of the external data rather than incremental
updates).
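As a rough sketch of what that configuration node might look like: "activateImport" is the property mentioned earlier in this thread, while the other entries are placeholders whose exact names may differ between data-module versions, so check your own /modules/data/config/importers tree:

```
/modules/data/config/importers/example
  ├── activateImport = true   (push imported data to public automatically)
  ├── <scheduling>            (when and how often the handler runs)
  └── <cleanup/handler>       (flush old data vs. keep, handler class, ...)
```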

Cheers,
Jan

> 
> In this
> > case it would be import handler activating your content once the import
> > is done and since it knows of all the content that needs to be pushed to
> > public, you might be able to push it more effectively from there.
> 
> I need more info on that, I've not worked with this object before... but 
> yes, I agree with you: "on paper" it sounds better than my previous choice.
> 
> > 
> > Yet another option that comes to my mind is that you can run the import
> > on both, author and public instance so you don't need to activate data
> > nodes at all, but that depends on what you are actually doing with those
> > data nodes and whether they are referenced by uuids or by path only.
> 
> They are referenced by uuid, so I must keep them related.
> 
> > 
> > Cheers,
> > Jan
> > 
> 
> Thanks a lot, Jan!
> If you have time, can you give me more info on how to import data 
> programmatically? Only one thing: I cannot reproduce (for now, but if 
> the next time budget lets me, I will!) the SV XML generation.
> 
> Matteo
> 
> 
> ----------------------------------------------------------------
> For list details see
> http://www.magnolia-cms.com/home/community/mailing-lists.html
> To unsubscribe, E-mail to: <[email protected]>
> ----------------------------------------------------------------
-- 
Best regards,

Jan Haderka
Magnolia International Ltd.

----------------------------------------------------------------------
[email protected]                http://www.magnolia-cms.com
Magnolia®  - Simple Open Source Content Management
----------------------------------------------------------------------

