This one's for the hard-core XWiki gurus...

My group is interested in going heavily into using XWiki to manage 
(semi-)structured information that will be partially shared with 
other applications (non-wikis, possibly desktop apps).  We figured 
that since XWiki has an extensible object model and accompanying 
persistence layer, we might let it be the authoritative keeper of 
all data in the distributed system.

After playing with XWiki for a couple of weeks and prototyping some 
pages/objects plus a component to package and exchange data with another 
Java application, we're starting to wonder if there couldn't be a more 
efficient way to access/distribute object data, and even to define a 
common data model across applications and wiki:

 - Regarding access/distribution: As an example, we're prototyping a 
graphical tool that needs to fetch a whole graph of objects at 
start-up to build a node/link picture of some situation (where each 
node or link would also be a page/object in the wiki).  Given the 
existing DB schema, we're struggling to imagine a single DB query 
that would pull all relevant object data for a given graph in one 
go, and even if we could pull the data efficiently, we're not sure 
how best to package it up for transmission.
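To make that concrete, here's roughly the shape of what we have in 
mind, as a self-contained Java sketch (all names here are invented 
for illustration; Node stands in for whatever record we'd actually 
map XWiki objects or EMF POJOs into, and the XML output stands in 
for EMF's XMI serialization):

```java
import java.util.*;

// Hypothetical sketch: collect a whole graph of node/link records
// (each of which would be a page/object in the wiki) so it can be
// shipped to the graphical tool as a single payload.
public class GraphPacker {

    // Invented stand-in for the wiki-side objects; real code would
    // map XWiki BaseObjects (or EMF POJOs) into records like this.
    public static class Node {
        public final String id;
        public final List<Node> neighbors = new ArrayList<>();
        public Node(String id) { this.id = id; }
    }

    // Walk the graph from a root and return every reachable node,
    // so one traversal yields the complete picture at start-up.
    public static List<Node> collect(Node root) {
        List<Node> out = new ArrayList<>();
        Set<String> seen = new HashSet<>();
        Deque<Node> work = new ArrayDeque<>();
        work.push(root);
        while (!work.isEmpty()) {
            Node n = work.pop();
            if (!seen.add(n.id)) continue; // skip already-visited nodes
            out.add(n);
            for (Node m : n.neighbors) work.push(m);
        }
        return out;
    }

    // Package the collected nodes as a minimal XML document for
    // transmission; EMF's object-graph-to-XML support would replace
    // this hand-rolled version.
    public static String toXml(List<Node> nodes) {
        StringBuilder sb = new StringBuilder("<graph>");
        for (Node n : nodes) {
            sb.append("<node id=\"").append(n.id).append("\"/>");
        }
        return sb.append("</graph>").toString();
    }

    public static void main(String[] args) {
        Node a = new Node("A"), b = new Node("B"), c = new Node("C");
        a.neighbors.add(b);
        b.neighbors.add(c);
        c.neighbors.add(a); // cycle: the seen-set prevents looping
        System.out.println(toXml(collect(a)));
    }
}
```

The open question for us is whether the collect step can be pushed 
down into one efficient DB query rather than N separate page/object 
fetches against the XWiki schema.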

 - Regarding more efficient object class definition: We're imagining 
a system with maybe 25 classes (and growing), and it would be nice 
to define them once and have them exist both as Java POJOs (for the 
other application bits) and as XWiki classes (for XWiki to generate 
page views).  We've been using the Eclipse Modeling Framework (EMF) 
quite a bit lately to do that kind of thing, e.g. to define a model 
in a UML-ish way and generate different artifacts, including Java 
class code.  I drag EMF into this because I like the idea of having 
one modeling point for all parts of the system, and because its 
native generated POJO implementations are pretty efficient.  It also 
provides freebie object-graph serialization to XML, which we've used 
successfully for the "distribution" issue above in other systems, 
and it even allows for dynamic runtime class definitions.

One thought we had was to create modified EMF generator templates to 
generate code that would invoke XWiki APIs to build the needed XWiki 
object class structures, as well as packer/unpacker routines to 
convert between XWiki objects and EMF-POJO objects.  This approach 
has some challenges: (1) I don't know whether there's an exposed 
XWiki API for defining object classes; (2) if there is, I don't know 
how tricky it might be to use correctly; and (3) hacking the EMF 
generator would be a pretty daunting task in its own right.
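For what it's worth, the packer/unpacker half seems conceptually 
simple; here's a hand-written Java sketch of the kind of code we'd 
want the templates to generate (Task is an invented sample class, 
and a plain Map<String, Object> stands in for an XWiki object's 
property set, since we don't yet know the real API):

```java
import java.util.*;

// Hypothetical packer/unpacker sketch.  A real version would be
// generated per-class from the EMF model; here a Map<String,Object>
// stands in for an XWiki BaseObject's properties, and Task is a
// sample of the kind of POJO EMF would generate.
public class TaskConverter {

    public static class Task {
        public String title;
        public int priority;
    }

    // "Pack" a POJO into the property-map form the wiki side stores.
    public static Map<String, Object> pack(Task t) {
        Map<String, Object> props = new HashMap<>();
        props.put("title", t.title);
        props.put("priority", t.priority);
        return props;
    }

    // "Unpack" a property map back into a POJO for the other
    // application.
    public static Task unpack(Map<String, Object> props) {
        Task t = new Task();
        t.title = (String) props.get("title");
        t.priority = (Integer) props.get("priority");
        return t;
    }

    public static void main(String[] args) {
        Task t = new Task();
        t.title = "Write schema";
        t.priority = 2;
        Task roundTrip = unpack(pack(t));
        System.out.println(roundTrip.title + " / " + roundTrip.priority);
    }
}
```

The hard part is presumably not this round-trip code but driving 
XWiki to create the matching class definitions in the first place.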

Another thought we had was to replace the underlying XWiki object 
model entirely.  This approach is likely even more challenging, but 
could produce a better (more efficient) overall implementation.  We 
checked out the XWiki source from the public SVN, and at first 
glance com/xpn/xwiki/objects and com/xpn/xwiki/objects/classes 
appear to be the packages to look at.  It seems like it might be 
possible to use EMF-POJOs as our data model, by either wrapping or 
extending them to satisfy ObjectInterface and ClassInterface.  If we 
could do that, it might also make sense to replace the persistence 
layer (com/xpn/xwiki/store ?) by using EMF's ability to generate 
DB persistence through Teneo and Hibernate; if we do our own 
persistence, we can perhaps set up schema/indexing that better 
supports key high-volume queries in our application.
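To illustrate the wrapping idea we're picturing, here's a minimal 
adapter sketch (with an invented, simplified property-access 
interface; XWiki's actual ObjectInterface and ClassInterface 
contracts are surely richer, which is exactly what we'd need to 
learn):

```java
// Sketch of wrapping an EMF-style POJO behind a generic property
// interface without changing the POJO itself.  PropertyAccessor is
// an invented stand-in for whatever contract the XWiki core really
// requires of its objects.
public class PojoWrapperDemo {

    public interface PropertyAccessor {
        Object get(String field);
        void set(String field, Object value);
    }

    // A sample of the kind of POJO EMF would generate for us.
    public static class Person {
        private String name;
        public String getName() { return name; }
        public void setName(String n) { name = n; }
    }

    // Adapter: exposes the POJO through the generic interface, the
    // way we imagine satisfying ObjectInterface by wrapping.
    public static class PersonAdapter implements PropertyAccessor {
        private final Person pojo;
        public PersonAdapter(Person pojo) { this.pojo = pojo; }
        public Object get(String field) {
            if ("name".equals(field)) return pojo.getName();
            throw new IllegalArgumentException("unknown field: " + field);
        }
        public void set(String field, Object value) {
            if ("name".equals(field)) pojo.setName((String) value);
            else throw new IllegalArgumentException("unknown field: " + field);
        }
    }

    public static void main(String[] args) {
        Person p = new Person();
        PropertyAccessor acc = new PersonAdapter(p);
        acc.set("name", "Ada");
        System.out.println(acc.get("name"));
    }
}
```

Generated-per-class adapters like this (rather than reflection) are 
part of why we find the EMF code-generation route attractive.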

So here's the question part of this post:

1- Do the overall goals of using XWiki in this kind of 
distributed/multi-application system make sense and fit within the 
envisioned envelope of what XWiki is reasonably intended to support?

2- Has anyone worked on pulling large sets of objects from the XWiki DB 
in one go, and packaging them for transmission over the wire to some 
other Java application?

3- Is there a programmatic/API way to drive XWiki to define new object 
classes, and if so, is there any documentation on how to use it?

4- Is replacing the data and/or persistence models underlying XWiki 
something that is supposed to be doable?  How hard/complicated should we 
expect it to be?  Is that the kind of thing we could get help with on 
this list?

5- Do you have any other comments or advice on how to think about this 
kind of problem?

Hope this strikes someone as interesting and worthy of comment.  Thanks 
for any guidance you can offer.

--Eric

=========================================================================
Eric Domeshek                         Phone: 617-902-2223
AI Project Manager                      Fax: 617-902-2225
Stottler Henke Associates, Inc.       EMail: domes...@stottlerhenke.com
86 Sherman St., Cambridge, MA 02140     Web: www.stottlerhenke.com
=========================================================================

_______________________________________________
users mailing list
users@xwiki.org
http://lists.xwiki.org/mailman/listinfo/users
