Well, this is a very interesting topic of discussion, and the XWiki SAS Research team
and I would like to do whatever we can to help XWiki evolve (but don't let us
get in the way!)


On 04/11/16 14:34, Vincent Massol wrote:
Hi devs,

We’ve been developing XWiki for over 10 years now. I’d like to start a discussion
about major architecture changes we may want to make in the coming 5 years.

With this discussion I have 2 goals in mind:
* Prepare XWiki for the challenges ahead (prevent it from becoming obsolete, have
an architecture that can adapt to change, etc)
* Make XWiki an interesting project to develop on for its developers 
(interesting technologies, interesting intellectual challenges, etc)


The futurist in me sees the ongoing struggle between hot new technologies and
good working systems as a reflection of the struggle between engineers who love
to play and learn vs. managers who have to balance the budget. Of course we want
something of a middle road where we pluck out the good *ideas* from modern
technologies without being swept up in every fad.



Let me start with a few large architecture items I can think of:

* Polyglotism for the front end. This is the ability to code the XWiki UI in
any language (PHP, Node.js, JavaScript, native applications, etc). Right now
XWiki UIs are coded mostly using Velocity (and a bit of JavaScript). It would
be nice to be more agnostic and to attract non-Java developers. The core
(server part) could still be in Java, but it would be nice if UIs and
extensions could be developed in any language. One architecture that could
achieve this would be to have a Core Server exposing all operations as REST
APIs. This would decouple the Server part from the Client part. It also means
having all XWiki API modules offer REST APIs and making it extra easy to add
new REST APIs.
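
To make the decoupling concrete, here is a minimal, hypothetical sketch of what
exposing one core operation over REST could look like, using standard JAX-RS
annotations (the resource path, class and field names are illustrative, not an
existing xwiki-platform API):

    // Hypothetical sketch: names are illustrative, not an existing API.
    import java.util.HashMap;
    import java.util.Map;

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    @Path("/wikis/{wikiName}/pages/{pageName}")
    public class PageResource
    {
        @GET
        @Produces(MediaType.APPLICATION_JSON)
        public Map<String, String> getPage(@PathParam("wikiName") String wiki,
            @PathParam("pageName") String page)
        {
            // A real implementation would delegate to the server-side Java
            // API; any client (JavaScript, PHP, a native app) then consumes
            // the JSON representation over plain HTTP.
            Map<String, String> result = new HashMap<>();
            result.put("wiki", wiki);
            result.put("name", page);
            result.put("content", "");
            return result;
        }
    }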


I agree with the thinking, because I feel Velocity has lived out its useful
life: it drops all of its variables into the global scope and has no functions,
which makes debugging terrible, and it is positively alien to almost everyone
who comes to XWiki.

That said, I want to caution against the lure of anything-anywhere polyglotism.
Each language has its own ecosystem. The Vert.x project built a system where
they advertised that you could program in Java, JavaScript, Groovy, Ruby or
Ceylon. As far as I know it has not gotten any significant traction, and I
think this is because their JavaScript is still alien to the Node.js community,
their Ruby is still alien to the Rails community, etc.

My opinion is that we must choose between being a small piece of a bigger
ecosystem (Node, PHP, etc.) and building our own ecosystem (what we have been
doing so far). That said, I still think we should begin phasing out Velocity.

My recommendation is that we pick a language which works reasonably well for
our needs yet is widely familiar, make it the lingua franca of XWiki, and begin
porting our Velocity code and documentation over to it.



* Greatly simplify development of XWiki Extensions. There are several 
directions that I can think of:
** Direction 1: Expose the resources making up an Extension on the file system
and let users use their favorite tools to write the content (web IDEs, text
editors, etc) (a possible on-disk layout is sketched after this list)
** Direction 2: Develop an IDE in the cloud. Again there are 2 options here:
*** Take ownership of the XWiki Web IDE application and push it much further
*** Go in the direction of integrating with a well-known cloud IDE such as
Eclipse Che/Codenvy (with pre-built workspaces, Docker containers and one-click
deploy to test your extension, direct deployment of extensions to e.x.o, etc).
** Other: In general, make it faster to develop extensions. The advantage of
the Cloud IDE or Web IDE is that it removes the burden of setting up the tools
on your local machine (Maven, Java, etc), so someone new to XWiki should be
operational in under 5 minutes and able to run the first tutorial.
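
For Direction 1, the on-disk representation could follow what is already inside
a XAR archive, for example (a purely hypothetical layout; only package.xml is
an existing convention):

    my-extension/
        package.xml          <- extension descriptor, as in today's XAR format
        MySpace/
            WebHome.xml      <- wiki pages as plain files, editable anywhere
            MyMacro.xml
        resources/
            my-extension.js  <- static JS/CSS served as-is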


From observing the patterns engineers choose to follow when not pushed, I find
that they are generally more comfortable with direction 1. Obviously we in the
Research team would be thrilled to see the WebIDE take on a new life as an
XWiki product, but I find that engineers will go to war for their text editor
and prefer a process which is more in line with what they know from Node.js,
PHP or Ruby on Rails.

The issues which I have observed blocking people who come to the platform from
outside are:
1. APIs are complex and undiscoverable. It's fine to put them on the website
(PHP and Node.js do this), but they need to be the first Google result when you
search for "XWiki API".
2. There is no clear connection between development and GitHub.
3. The release process is crazy: it requires Maven flags, settings.xml, PGP
keys and specific Java versions. Compared to `npm publish`, or Bower where you
only need a git tag and you're done, this is the stone age.
4. Fragility caused by layers of legacy code: for people used to throwing
things together with Node, Ruby or PHP, the level of fault handling required to
make a stable XWiki app is like working on an OS kernel.

Issues 1-3 are resolvable, but #4 is usually only fixed by a ground-up rewrite
of the application, something which will become easier and easier as time goes
on thanks to the ever-maturing tools available in different ecosystems. This is
both an opportunity and a risk: the opportunity is that we can take our body of
knowledge and use it to build something new which is cheaper to maintain and
more exciting for others to use; the risk is that others will be increasingly
able to do the same.

A solution to all of this is to not think of XWiki so much as a programming
platform and to just try to make a very good KM (knowledge management) tool;
then we only fight #4 internally.



* Promote our SOLR Query Language as the main query language for XWiki and
deprecate XWQL/HQL. Goal: make querying independent of the stores (right now we
have a single store implemented on an RDBMS, but we can imagine moving in the
future to another type of store, or even to multiple stores). A rough sketch of
how calling code could look follows this list.
** Make it easy for extensions to contribute new SOLR indexes for their needs.
** Once we use only SOLR QL we can then more easily switch to a different
database model. We would also need the next-generation XAR format to be able to
fully export an XWiki instance (or some part of it) into a XAR and to reimport
it (right now a lot of data is not in the XAR, such as data found in the
permanent directory).
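
Here is that sketch, assuming the existing QueryManager component and a "solr"
query language hint registered by the Solr search module (the Solr field name
used below is illustrative):

    // Rough sketch only; component wiring and the exact Solr schema fields
    // would need to be checked against the real modules.
    import java.util.List;

    import javax.inject.Inject;

    import org.xwiki.query.Query;
    import org.xwiki.query.QueryException;
    import org.xwiki.query.QueryManager;

    public class BlogPostFinder
    {
        @Inject
        private QueryManager queryManager;

        public List<Object> findBlogPosts() throws QueryException
        {
            Query query = this.queryManager.createQuery(
                "object:Blog.BlogPostClass", "solr");
            query.setLimit(10);
            // The caller never sees HQL or the underlying RDBMS, so the
            // store behind the query language can change independently.
            return query.execute();
        }
    }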


Personally I would run in the other direction: limit the database types in
order to limit the scope of problems we might have to solve, so that we can
concentrate on the things which create customer value. One way to do this is to
use an internal database (absolute control) paired with a "replication" store
(absolute flexibility).



* Move towards a container-based approach to install/deploy XWiki (Docker,
etc). The idea is to start being microservices-friendly. Right now we already
have 2 such services in XWiki: external SOLR + office imports through an office
server. We need to be able to include those in distributions and add more. It
should be easy to develop a new microservice and set it up in XWiki for
redistribution.
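
As a purely illustrative sketch of what a container-based distribution could
look like (service and image names below are hypothetical, not existing
official images or their configuration), a Compose file could wire the wiki,
its database and the two existing microservices together:

    # Hypothetical sketch; image names and wiring are illustrative only.
    version: '2'
    services:
      xwiki:
        image: xwiki-core          # the Java server part
        ports:
          - "8080:8080"
        depends_on:
          - db
          - solr
          - office
      db:
        image: postgres:9.5        # the main store
      solr:
        image: solr:6              # external SOLR microservice
      office:
        image: office-server       # office import microservice

Each additional microservice an extension brings along would then just be one
more service entry in such a description.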


Docker images make problems go away for a tester, so I agree with the idea.
Beware of microservices for microservices' sake, though, and word on the street
is that nobody uses Docker in production, so once the testing phase is over
people still need a way to install for real.


** It may be interesting to investigate infrastructure frameworks such as
Kubernetes, Apache Karaf and the like. The goal is to build systems that are
more responsive, resilient, auto-adaptive and elastic (see the reactive
manifesto at http://www.reactivemanifesto.org/).

Buzzwords... please identify the pain-point first.

I suppose this is a good place to say that right now everybody in the tech
space runs the risk of jumping from one dead technology to another, because
when an organization identifies that a particular technology in their stack is
dead, they are often led to adopt the one making the most buzz, in the hope of
being a long way away from the (assumed inevitable) death of that technology.

However, we must not lose sight of the Darwinian nature of technology 
lifecycles.

Moving from a dead technology such as PrototypeJS to a "hot" technology such as
AngularJS runs a great risk of picking a loser: when its parent company
abandons it in favor of Angular 2 and the community abandons both in favor of
ReactJS, we're hosed. If instead we move to a well-established yet still
reasonably modern technology such as jQuery, we have very little risk of it
being uprooted.


Globally I would recommend identifying what it is you want to make, then
ensuring there is a way to build a business around it (otherwise the project
will go the way of GNU/everything), and only then starting to look for the
technologies/ecosystems which strike the best happy medium between getting you
where you want to be, innovating enough that they will still be with you in the
future, and being stable and unlikely to go away.


Thanks,
Caleb


** We could imagine packaging our office import microservice (i.e. including an
office server) as an extension that can be installed and deployed in the XWiki
system automatically.

Ok that’s already a good start.

Let me know if you feel excited by some of those and feel free to add more. 
Note that the idea here is to brainstorm about large architecture changes.

Thanks
-Vincent








_______________________________________________
devs mailing list
[email protected]
http://lists.xwiki.org/mailman/listinfo/devs
