The idea is to split the work between a generic message integration
platform - a standard API - for module developers, and the actual sync
implementation - which might be too context dependent. Dukai, for
instance, implemented the openerp-to-openerp integration using the
generic platform. O2O integration is very good, but for me, the ability
to easily integrate with a company's existing software infrastructure
is very powerful.
Overriding create/write is one way to trigger message sending. I did
it because I wanted to sync non-workflow objects, like res_partner
creates and updates.
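A minimal sketch of that override pattern, with stand-in classes (MessageBus and PartnerModel are hypothetical stand-ins for the real mbi helper and osv model, which aren't shown in this thread):

```python
class MessageBus(object):
    """Stand-in for the mbi helper; records what would be sent to the broker."""
    def __init__(self):
        self.sent = []

    def send(self, headers, data):
        self.sent.append((headers, data))


class PartnerModel(object):
    """Simplified res_partner-like model that publishes on create/write."""
    def __init__(self, mbi):
        self.mbi = mbi
        self._store = {}
        self._next_id = 1

    def create(self, vals):
        new_id = self._next_id
        self._next_id += 1
        self._store[new_id] = dict(vals)
        # after the normal create, publish a message describing it
        self.mbi.send({'action': 'create', 'model': 'res.partner'}, dict(vals))
        return new_id

    def write(self, ids, vals):
        for record_id in ids:
            self._store[record_id].update(vals)
        # one message per write call -- see the discussion below about
        # methods that call write several times on the same record
        self.mbi.send({'action': 'write', 'model': 'res.partner',
                       'ids': list(ids)}, dict(vals))
        return True
```

Creating a record and then writing to it would publish two messages, one with action 'create' and one with action 'write'.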
Ideally it would be part of the workflow:
def action_send_new_sales_order(...):
    data = ...  # serialized payload
    headers = {'action': 'new_sales_order'}
    self.mbi.send(headers, data)
It could be a nice idea to implement message sending in a transaction
that sends on commit, not for performance but for integrity. BTW,
STOMP supports transactions.
def action_post_sales_order(...):
    data = ...  # serialized payload
    headers = {'action': 'post'}
    self.mbi.send_on_commit(headers, data)
and then override cursor.commit() to send the queue.
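A minimal sketch of what that send-on-commit buffer could look like, assuming only that the transport has a send(headers, data) method (CommitQueuedBus is a hypothetical name, not part of the actual module):

```python
class CommitQueuedBus(object):
    """Buffers messages and releases them only when commit() is called,
    dropping them on rollback -- the integrity property discussed above."""
    def __init__(self, transport):
        self.transport = transport  # anything with a send(headers, data) method
        self._pending = []

    def send_on_commit(self, headers, data):
        # queue the message instead of sending it immediately
        self._pending.append((headers, data))

    def commit(self):
        # would be called from an overridden cursor.commit()
        for headers, data in self._pending:
            self.transport.send(headers, data)
        self._pending = []

    def rollback(self):
        # a rolled-back transaction must not leak messages
        self._pending = []
```

With this shape, a rolled-back transaction sends nothing, and a committed one flushes its queue exactly once.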
However, I don't want the module to do unexpected things like not
sending a message when the developer intends to. For example:
def action_do_something(...):
    # Get the data
    data = ...  # serialized payload
    self.mbi.send(data)
    # Do something that changes the data
    new_data = modify_data(data)
    self.mbi.send(new_data)
    # Restore the original data
    restore_old_values(data)
    self.mbi.send(data)
If I used a hash set, on commit it would send only two messages and
the receiver would end up in an inconsistent state. It's better to
leave it up to the developer to decide when to send each message.
ETL is a very nice technology, and Kettle especially has been a life
saver on former projects. "Enterprise Integration Patterns"
(http://www.eaipatterns.com) is a great book on integration and talks
extensively about when ETL, messaging, RPC, or database sharing should
be used.
Raphaël Valyi wrote:
Cloves, I saw your STOMP stuff of course; I was actually
thinking Dukai had built something new on top of it.
When you say you have overridden the create and write methods, I think
this can be made compatible with the SARTRE rule engine by Smile, which
only fires if a set of conditions is matched. I mean, it's
technically possible; I'm just not sure how useful that would be.
Now, overriding create and write made me wonder a lot. Indeed, if you
look at the audittrails module, while it has obvious implementation
limitations (it doesn't catch workflow changes, for instance), it's not
done this way; it intercepts methods at the osv layer.
I'm not sure that's the reason, but complex methods, especially when
overridden by other modules, seem very often to call write several
times on the same object instance, meaning you are likely to fire way
too many messages to your MQ.
I wonder if the audittrails method is the right approach, or whether
your approach could, say, accumulate the write params in a hash each
time write is called and fire the message to the MQ only at the end of
the action (at the same moment as the commit, if you like).
Did you run into that issue and work around it?
Now, I like that approach to streamline OpenERP-to-OpenERP
communication, because even if Postgres rocks, SQL systems are
intrinsically unable to scale past a certain size (that's why GMail,
Facebook, Twitter... barely use SQL; it's easier for those who are not
ERPs, of course). For instance, I would never recommend using the same
OpenERP for two holdings of the same trans-national company located on
two different continents: the network latency would totally kill the
speed, and no real-time multi-master SQL replication at this scale would
be efficient either, not to speak of the insupportable coupling the
two companies would have functionally and operationally.
Hope you can answer my question and that my pointers might help some
(better doc coming soon).
NB: tomorrow, OOOR-1.2.7 will ship with an API making it
straightforward to find resources by their ir_model_data ids or create
absolute ids along with the resources.
On Mon, Feb 22, 2010 at 10:02 PM, Cloves Almeida <[email protected]> wrote:
For the details of the implementation, we used the STOMP protocol to
communicate with the ActiveMQ server. I embedded the stomppy client
into the module for easier distribution - it's pure Python and very
light. ActiveMQ does the STOMP-JMS translation, meaning any application
that can understand JMS and XML/YAML can produce/receive OpenERP
messages.
Why not XML-RPC? Because it's both synchronous and point-to-point. I
could build an async, multi-point layer over XML-RPC, but then the end
result would look like a crappy, non-standards-compliant MQ broker :).
To trigger the sending of messages, I overloaded the "write" and
"create" methods. They call helper methods to serialize the message and
send it to a preconfigured "queue" in a local MQ. There's no need
for timed triggers; the actual sending of messages through the
network is the MQ's responsibility - there are a number of methods for
reliable MQ-to-MQ bridging. To receive messages, one subscribes
to the queue on an MQ server (the same or a bridged one, local or not)
and registers listener methods that are called on message arrival.
Again, no need for timers, since this happens as soon as the message
is available to the server - message persistence and delivery are the
MQ's responsibility. A simple "if" condition on message headers or an
XPath query serves as a "filter".
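The listener-plus-header-filter idea can be sketched broker-free (Dispatcher is a hypothetical stand-in; in the real setup the callback would be invoked by the STOMP client on message arrival):

```python
class Dispatcher(object):
    """Minimal header-based routing, mimicking the 'if' filter described."""
    def __init__(self):
        self._listeners = []

    def subscribe(self, predicate, callback):
        # predicate: headers dict -> bool; callback: (headers, body) -> None
        self._listeners.append((predicate, callback))

    def on_message(self, headers, body):
        # called by the STOMP client when a frame arrives
        for predicate, callback in self._listeners:
            if predicate(headers):
                callback(headers, body)
```

For example, a listener subscribed with the predicate `lambda h: h.get('action') == 'new_sales_order'` sees only new-sales-order messages; everything else on the queue passes it by.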
Exception handling, enterprise quality monitoring and administration,
and a powerful routing engine (Apache Camel) are also features of
ActiveMQ - even if I didn't use them.
I started to develop the module to integrate with our existing POS
system, but that system was so bad we decided to replace it entirely.
It meant I had to pause developing the module to work on installing the
new POS system. But in a few months I'll roll out an OpenERP instance
for our accounting dept., and I'll definitely need to integrate it with
a number of existing systems, so I intend to use the module then.
-
On 22-02-2010 19:33, Raphaël Valyi wrote:
Hello,
maturing OpenERP to use an async message bus is interesting
indeed, especially for multi-company setups and/or larger companies
that are used to J2EE ESBs.
I didn't look into the details of what you and Cloves did,
but let me tell you that Tiny made a parallel effort with the Twisted
Python async messaging system. Here is the work of Stephane Wirtel:
In any case, I hope you can join efforts rather than duplicate
work on those issues.
I think this is also a place where a total or partial (e.g.
in redundancy with a CPython instance) usage of OpenERP on Jython would
shine, as the Java platform has plenty of valuable and mature ESB
systems, generally built upon the JMS protocol.
Finally, maybe something that could be used as a rule engine to
trigger an async message emission is the "SARTRE" rule module by
Smile, now in extra addons (currently used to trigger alerts and
standard server actions). Then message emission would be a regular
server action triggered by that rule engine upon given conditions. Not
sure, but maybe something to investigate. It would also be nice to have
the reverse: a rule engine that knows how to analyse the message queue
and trigger a given server action if the message matches a given
condition, maybe similar to what the Smile SARTRE module does. Again,
not sure.
Hello!
There are tools already present that could be used to integrate
OpenERPs (and other systems, too - it's based on XML), and it would
work on slow, unreliable connections.
IMHO, to develop something like this would take two workdays:
-Confirming a PO creates an SO at the supplier's side.
-It matches products with bar code and partners with VAT number, for
example. Or it uses a unique identifier for products and in the first
run maps those identifiers to bar codes, and if that doesn't succeed,
uses product.supplierinfo.
-Confirming the SO at the supplier's side validates the PO at the
customer's side.
-Any number of connections can be used, and they are relatively easy
to set up for an administrator.
-It stores the messages if you or the other parties aren't online and
automatically sends them as soon as possible.
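The matching step above could be sketched like this (match_product and its field names are hypothetical; the real mapping would read product.product and product.supplierinfo records):

```python
def match_product(remote, local_products):
    """Try to match a remote product locally: bar code first, then
    fall back to supplier codes (the product.supplierinfo idea above)."""
    barcode = remote.get('barcode')
    if barcode:
        for prod in local_products:
            if prod.get('barcode') == barcode:
                return prod['id']
    # fallback: the code the supplier uses for this product
    supplier_code = remote.get('supplier_code')
    if supplier_code:
        for prod in local_products:
            if supplier_code in prod.get('supplier_codes', ()):
                return prod['id']
    return None  # no match -- would need manual mapping
```

Products that match neither way return None and would be left for an administrator to map by hand.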
This is all based on Cloves Almeida's mbi module, which is the tool to
turn any object into XML / read it from XML and send/receive it using
STOMP (we use it with an ActiveMQ broker).
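The serialization half could look roughly like this (a sketch with stdlib ElementTree, assuming flat string-valued fields; the actual mbi wire format is not shown in this thread):

```python
import xml.etree.ElementTree as ET

def record_to_xml(model, vals):
    """Serialize a flat field dict to an XML payload."""
    root = ET.Element('record', {'model': model})
    for name, value in sorted(vals.items()):
        field = ET.SubElement(root, 'field', {'name': name})
        field.text = str(value)
    return ET.tostring(root)

def xml_to_record(payload):
    """Parse the payload back into (model, field dict)."""
    root = ET.fromstring(payload)
    vals = {f.get('name'): f.text for f in root.findall('field')}
    return root.get('model'), vals
```

A res.partner record round-trips through the pair unchanged, which is the property a CRUD sync layer needs.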
I've created a module, mbi_openerp, that adds easy-to-set-up routing
and easy-to-set-up CRUD synchronization between OpenERPs.
My original intent was to provide even more: to send workflow events
and method calls, too. That way I would have synchronized basic OpenERP
instances with only the stock module installed (used at warehouses)
with the company's main db. Although this approach didn't prove to be
reliable, stripping out the workflow and method-sending code would
result in a small base module suitable for the B2B integration I
mentioned above.