Thanks for your help :D
On Tue, Apr 3, 2018 at 1:29 AM, Francesco Chicchiriccò <[email protected]> wrote:

> Hi Elena,
> my personal congrats, it seems you've got most of the picture :-)
>
> If your HA setup is correct, in fact, writing any data via REST on node A
> and then reading the same data from node B is perfectly fine.
>
> When such data are related to connector configuration (or even resource
> configuration, if you are using overrides), things are effectively a bit
> different, because the Spring bean associated with each connector gets
> automatically refreshed only on the node the REST create / update was
> sent to.
>
> So, if the connector update was sent to node A, provisioning tasks running
> on node B will still use the old configuration, you're right.
>
> Workarounds:
>
> * invoke POST /connectors/reload on node B - this will make all connector
>   Spring beans refresh their configuration from the underlying DB (see the
>   first sketch after this thread)
> * restart the Java EE container on node B
> * disable Quartz job execution on node B [1]
>
> The actual fix would be to implement a custom RemoteCommitListener [2]
> which triggers the connector's Spring bean refresh on node B (see the
> second sketch after this thread).
>
> Let me finally add that normally this issue is not very important, because
> connector configuration tends to be rather stable by the time the
> deployment reaches HA environments (as it was fine-tuned in lower
> environments).
>
> Regards.
>
> [1] http://syncope.apache.org/docs/reference-guide.html#quartz
> [2] http://openjpa.apache.org/builds/2.4.2/apidocs/org/apache/openjpa/event/RemoteCommitListener.html
>
> On 02/04/2018 11:05, Elena Hong wrote:
> > Oops, I made a mistake in my question, sorry.
> > The trouble happens when provisioning runs, not when I call the read API.
> >
> > As you say, the connector read API works well: it loads data from the DB
> > (of course, my A and B servers use the same DB).
> >
> > But during provisioning, Syncope loads the connector from the Spring bean
> > factory, which is kept in each instance's memory (I guess).
> >
> > When a provisioning task runs, it calls the doExecute method of
> > AbstractProvisioningJobDelegate.java, and the connector is loaded via
> > ConnectorFactory.java's getConnector method.
> >
> > I can see the getConnector method in ConnectorManager.java; it loads the
> > connector from the bean factory:
> >
> >     @Override
> >     public Connector getConnector(final ExternalResource resource) {
> >         // Try to re-create connector bean from underlying resource
> >         // (useful for managing failover scenarios)
> >         if (!ApplicationContextProvider.getBeanFactory().
> >                 containsBean(getBeanName(resource))) {
> >
> >             registerConnector(resource);
> >         }
> >
> >         return (Connector) ApplicationContextProvider.getBeanFactory().
> >                 getBean(getBeanName(resource));
> >     }
> >
> > So, after I update a connector via Syncope server A, it updates the DB
> > and its own Spring bean. Then the connector read API works well and I can
> > see the updated connector data in the management console.
> > But server B's bean is not updated yet. In this case, if provisioning
> > runs on server B, B has the 'old' connector data.
> >
> > 2018-04-02 11:03 GMT+09:00 Elena Hong <[email protected]>:
> >> How can the Syncope servers in a high availability environment share a
> >> connector which is saved as a Spring bean in memory?
> >>
> >> * My environment
> >>
> >> I set up high availability with two Syncope servers, called A and B,
> >> and nginx.
> >>
> >> * My problem
> >>
> >> 1. I call the connector update API through nginx.
> >> 2. nginx calls Syncope server A, which updates the connector with the
> >>    'new' data in the DB and in its own Spring bean.
> >> 3. I call the connector read API through nginx.
> >> 4. nginx calls Syncope server B, which returns the 'old' data from its
> >>    Spring bean.
> >>
> >> How can I solve it?
> >> Please give me a tip.
>
> --
> Francesco Chicchiriccò
>
> Tirasa - Open Source Excellence
> http://www.tirasa.net/
>
> Member at The Apache Software Foundation
> Syncope, Cocoon, Olingo, CXF, OpenJPA, PonyMail
> http://home.apache.org/~ilgrosso/
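
Below is a minimal sketch of the first workaround mentioned in the thread: calling POST /connectors/reload directly on node B so that it rebuilds its connector Spring beans from the shared DB. Only the /connectors/reload path comes from the thread; the host, port, context root and credentials are placeholders, and since the deployment sits behind nginx the request has to target node B directly instead of going through the load balancer.

    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    public class ConnectorReloadClient {

        public static void main(final String[] args) throws Exception {
            // Placeholder base URL: point it at node B itself, bypassing nginx,
            // because the reload only refreshes the beans of the node that
            // receives the request.
            String nodeB = "http://node-b.example.com:9080/syncope/rest";

            // Placeholder admin credentials, sent as HTTP Basic auth.
            String basic = Base64.getEncoder().encodeToString(
                    "admin:password".getBytes(StandardCharsets.UTF_8));

            // POST /connectors/reload: asks that node to rebuild all connector
            // Spring beans from the configuration stored in the underlying DB.
            HttpURLConnection conn = (HttpURLConnection)
                    new URL(nodeB + "/connectors/reload").openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Authorization", "Basic " + basic);

            System.out.println("Reload on node B returned HTTP " + conn.getResponseCode());
            conn.disconnect();
        }
    }

The same call could just as well be made with the Syncope client library or plain curl; the point is simply that each node has to be asked to reload its own beans, since the reload only affects the node that receives it.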
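Francesco's "actual fix", a custom RemoteCommitListener, could look roughly like the sketch below. It is only an outline: the class name StaleConnectorListener, the tracked bean-name set and the constructor wiring are hypothetical, and it assumes the connector beans were registered as singletons, so that destroying them makes the containsBean() check in the quoted ConnectorManager code fail and trigger re-registration from the DB. It also assumes an OpenJPA remote commit provider (openjpa.RemoteCommitProvider, e.g. TCP or JMS) is configured, since without one no events arrive from the other node; the listener itself would then be registered with OpenJPA's RemoteCommitEventManager.

    import java.util.Set;
    import java.util.concurrent.ConcurrentHashMap;

    import org.apache.openjpa.event.RemoteCommitEvent;
    import org.apache.openjpa.event.RemoteCommitListener;
    import org.springframework.beans.factory.support.DefaultListableBeanFactory;

    /**
     * Rough sketch only: evicts locally cached connector Spring beans when
     * another node commits changes, so the next getConnector() call
     * re-registers them from the shared DB.
     */
    public class StaleConnectorListener implements RemoteCommitListener {

        // Hypothetical registry of connector bean names created on this node;
        // a real implementation would populate it wherever registerConnector()
        // is invoked (or derive the names from the resources in the DB).
        private final Set<String> connectorBeanNames = ConcurrentHashMap.newKeySet();

        private final DefaultListableBeanFactory beanFactory;

        public StaleConnectorListener(final DefaultListableBeanFactory beanFactory) {
            this.beanFactory = beanFactory;
        }

        public void track(final String beanName) {
            connectorBeanNames.add(beanName);
        }

        @Override
        public void afterCommit(final RemoteCommitEvent event) {
            // Coarse-grained invalidation: drop every tracked connector
            // singleton. A finer-grained version would inspect the event
            // payload (updated type names / object ids) and evict only the
            // connectors whose configuration was actually touched.
            // Note: if registerConnector() registered a bean *definition*
            // rather than a singleton instance, the eviction would have to
            // use removeBeanDefinition() instead.
            for (String beanName : connectorBeanNames) {
                if (beanFactory.containsSingleton(beanName)) {
                    beanFactory.destroySingleton(beanName);
                }
            }
        }

        @Override
        public void close() {
            connectorBeanNames.clear();
        }
    }

Whether this is worth the effort depends on how often connector configuration actually changes once the deployment is in the HA environment, as Francesco notes at the end of his reply.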
