Ray Allis wrote:
(...)
"One key feature of Lenya (...) is that all content is stored in XML
files. (...) Another advantage of this approach is that all content is
human and machine readable, so there is no possibility of being
tied to
a closed binary format: Lenya data should be free data forever."
Moving to storing information in Jackrabbit means a binary format; not a
closed one of course, but still... is this a concern? (Or is it just me? ;-)
I don't understand why Jackrabbit == binary. JSR-170 just specifies an
interface, and says "It should not be tied to any particular underlying
architecture, data source or protocol."
Right, JSR-170 specifies the API, so it's not _closed_, but Jackrabbit
(an implementation of that API) persists the content in a form only
Jackrabbit will be able to read, IIUC, so it is _binary_.
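For what it's worth, here is a minimal sketch of what code written purely
against the JSR-170 API looks like. The Repository instance could be
Jackrabbit or any other compliant implementation, and the node and
property names ("content", "title") are made up for the example; nothing
here is existing Lenya code:

import javax.jcr.Node;
import javax.jcr.Repository;
import javax.jcr.Session;
import javax.jcr.SimpleCredentials;

public class JcrSketch {
    // Stores one document using only JSR-170 interfaces; nothing here
    // depends on how the repository persists its data on disk.
    public static void storeDocument(Repository repository) throws Exception {
        Session session = repository.login(
                new SimpleCredentials("admin", "admin".toCharArray()));
        try {
            Node root = session.getRootNode();
            Node doc = root.addNode("content");      // illustrative node name
            doc.setProperty("title", "Hello, JCR");  // illustrative property
            session.save();
        } finally {
            session.logout();
        }
    }
}

The API is storage-agnostic, as you say; my point is only about what the
concrete Jackrabbit implementation writes underneath those calls.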
My concern is what happens when something goes wrong, e.g. the site
structure somehow gets screwed up. Up to now, I can check the sitetree XML
on the server, and if I see a mistake (due to a bug or some unforeseen
usage of Lenya) I fix it. If everything is stored in Jackrabbit, that will
be more difficult.
I know, the sitetree should never get screwed up, nor should any other
source; if it happens, I should look for the bug that caused it (assuming
I can reproduce it in the first place). But what do you do when you have a
user in front of you complaining "X doesn't show like it should, and I
need it right now, people are waiting for it?"
This is one of the reasons I am wondering whether the way to go is to make
Lenya's content handling JCR-compliant while keeping the current atomic
files as the backend; whoever wants to switch a publication to Jackrabbit
can then do that. To be honest, I don't know how realistic that is, but
from a functionality perspective I think it would be best.
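To make that concrete, here is a rough sketch of how such a split could
look. The names ContentStore, FileContentStore and JcrContentStore are
invented for this example and don't exist in Lenya; the point is only that
the same contract could sit on top of plain files today and on a JCR
session for publications that want it:

interface ContentStore {
    String read(String path) throws Exception;
    void write(String path, String xml) throws Exception;
}

// Today's behaviour: one atomic, human-readable XML file per document.
class FileContentStore implements ContentStore {
    private final java.io.File baseDir;
    FileContentStore(java.io.File baseDir) { this.baseDir = baseDir; }

    public String read(String path) throws Exception {
        java.io.File file = new java.io.File(baseDir, path);
        return new String(java.nio.file.Files.readAllBytes(file.toPath()), "UTF-8");
    }

    public void write(String path, String xml) throws Exception {
        java.io.File file = new java.io.File(baseDir, path);
        file.getParentFile().mkdirs();
        java.nio.file.Files.write(file.toPath(), xml.getBytes("UTF-8"));
    }
}

// The same contract on top of a JCR session (e.g. Jackrabbit).
// Assumes flat, single-level paths for brevity.
class JcrContentStore implements ContentStore {
    private final javax.jcr.Session session;
    JcrContentStore(javax.jcr.Session session) { this.session = session; }

    public String read(String path) throws Exception {
        return session.getRootNode().getNode(path).getProperty("xml").getString();
    }

    public void write(String path, String xml) throws Exception {
        javax.jcr.Node root = session.getRootNode();
        javax.jcr.Node node = root.hasNode(path) ? root.getNode(path) : root.addNode(path);
        node.setProperty("xml", xml);
        session.save();
    }
}

Whether the persistence is a directory tree I can inspect by hand or a
Jackrabbit workspace would then be per-publication configuration rather
than a hard dependency.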
--
Wolfgang