Hi All,
What we need to log as an audit record is:
{Time, SequenceId, UserId, Action, Subject, optional(OldSubject),
optional(NewSubject)}
This will allow one to do an audit trace on the question "Who did what on
which?"
We will not log all the details of the OldSubject or NewSubject. This needs
to be
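A minimal sketch of such an audit record, using the field names from the message above (the dataclass shape and sample values are assumptions, not the actual implementation):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)
class AuditRecord:
    """One audit-log entry: who (user_id) did what (action) on which (subject)."""
    time: datetime
    sequence_id: int
    user_id: str
    action: str
    subject: str
    old_subject: Optional[str] = None   # only present for modifying actions
    new_subject: Optional[str] = None   # only present for modifying actions

record = AuditRecord(
    time=datetime.now(timezone.utc),
    sequence_id=1,
    user_id="admin",
    action="UPDATE",
    subject="queue:orders",
    old_subject="maxDepth=1000",
    new_subject="maxDepth=5000",
)
```

Keeping OldSubject/NewSubject optional means read-only actions carry no extra payload.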
Hi Susinda,
Currently I am evaluating approaches for generating avro schema from XSD
keeping option 1 in mind.
[1] seems to be a reliable option, but the generated Avro schema is not in
a format we can accept.
[1] https://github.com/Nokia/Avro-Schema-Generator
Regards
Awanthika
For DataMapper we have to give an option to start data mapping from an XML
payload (that the user may already have). For this we need to create an
Avro schema from an XML instance. Possible approaches may be:
1. Generate an XML schema (XSD) from the XML and then create an Avro schema from the XSD.
2. Directly
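As a rough illustration of the direct route (XML instance to Avro schema), here is a naive sketch that maps each child element of the root to a string field. Real XSD-driven generation would handle types, nesting, and repetition; everything below is an assumption:

```python
import json
import xml.etree.ElementTree as ET

def avro_schema_from_xml(xml_text, namespace="com.example"):
    """Infer a flat Avro record schema from an XML instance.

    Each direct child element of the root becomes a string field;
    repeated or nested structures are ignored in this sketch.
    """
    root = ET.fromstring(xml_text)
    seen = []
    for child in root:
        if child.tag not in seen:
            seen.append(child.tag)
    return {
        "type": "record",
        "name": root.tag,
        "namespace": namespace,
        "fields": [{"name": tag, "type": "string"} for tag in seen],
    }

schema = avro_schema_from_xml("<person><name>Ann</name><age>30</age></person>")
print(json.dumps(schema, indent=2))
```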
Hi,
We have implemented support for multiple versions of WebApps and Sites in
AppM. Here, we allow users to create multiple versions of a WebApp/Site and
select a particular version as the default. So users can access the default
version of the app with the default URL (e.g. host_name/context) and
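The default-URL behaviour described above could be sketched as a small resolver (the structure and names are assumptions, not the AppM implementation):

```python
# Versions registered per app context, plus which one is the default.
app_versions = {
    "travelbooking": {"versions": ["1.0.0", "2.0.0"], "default": "2.0.0"},
}

def resolve_version(context, version=None):
    """Map a request URL to a concrete app version.

    host_name/context        -> the default version
    host_name/context/<ver>  -> that explicit version
    """
    app = app_versions[context]
    if version is None:
        return app["default"]
    if version not in app["versions"]:
        raise KeyError(f"unknown version {version} for {context}")
    return version
```

Changing the default is then a metadata update; existing versioned URLs keep working.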
Adding Azeez and Sameera.
What are your thoughts on having these REST services versioned?
On Wed, Feb 24, 2016 at 7:03 PM, Asitha Nanayakkara wrote:
> Hi Hemika,
>
> Having a version number (which is separate from the product version number)
> would be beneficial in my opinion.
Hi Sanjiva,
In the current approach we also have mediator-level information. Though the
ESB publishes aggregated info to DAS, using a relational provider
implemented in DAS we split the aggregated info into mediator-level info
when retrieving it for analysis. (Basically, when we load the
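The split described above (one aggregated flow event fanned back out into mediator-level rows at retrieval time) could be sketched like this; the event layout is an assumption, not the actual DAS schema:

```python
def split_flow_event(flow_event):
    """Expand one aggregated message-flow event into per-mediator rows."""
    rows = []
    for m in flow_event["mediators"]:
        rows.append({
            "flow_id": flow_event["flow_id"],
            "mediator": m["name"],
            "duration_ms": m["duration_ms"],
        })
    return rows

event = {
    "flow_id": "f-42",
    "mediators": [
        {"name": "LogMediator", "duration_ms": 1},
        {"name": "PayloadFactory", "duration_ms": 4},
    ],
}
```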
Hi Hemika,
Having a version number (which is separate from the product version number)
would be beneficial in my opinion. If there is an API change between
releases, this would prevent a client written for an older API from calling
the new API. In addition, if we have no API changes
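One common way to realise this, sketched here under assumptions (a URL-embedded API version, kept independent of the product release number): the service advertises the API versions it supports and rejects requests built against anything else.

```python
SUPPORTED_API_VERSIONS = {"v1", "v2"}   # independent of the product version

def route(path):
    """Split '/api/v2/queues' into (api_version, resource), rejecting unknown versions."""
    parts = path.strip("/").split("/")
    if len(parts) < 2 or parts[0] != "api":
        raise ValueError(f"not an API path: {path}")
    version = parts[1]
    if version not in SUPPORTED_API_VERSIONS:
        raise ValueError(f"unsupported API version: {version}")
    return version, "/".join(parts[2:])
```

With this shape, an API change just adds "v3" to the set while "v2" clients keep working.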
IMO we're worrying about the wrong thing.
No one will ever WANT to trace 1000 TPS unless it's tracing for audit
purposes. If it's for tuning/debugging, then what we need is APIs that will
let people dynamically turn on tracing VERY selectively.
I'd prefer that we log per mediator, because then we can do
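Dynamically scoped, per-mediator tracing of the kind suggested above might look like this sketch (the mediator names and the registry are assumptions):

```python
# Registry of mediators with tracing switched on; empty set means no tracing work at all.
traced_mediators = set()
trace_log = []

def enable_tracing(mediator_name):
    traced_mediators.add(mediator_name)

def disable_tracing(mediator_name):
    traced_mediators.discard(mediator_name)

def trace(mediator_name, message_id, detail):
    """Record a trace entry only when tracing is enabled for this mediator."""
    if mediator_name in traced_mediators:
        trace_log.append((mediator_name, message_id, detail))

enable_tracing("payload-factory")
trace("payload-factory", "msg-1", "transformed payload")
trace("log-mediator", "msg-1", "dropped: tracing off for this mediator")
```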
Hi All,
We are planning to move all MB admin services, which were exposed as web
services, to REST services.
The proposed REST services will be built on top of the org.wso2.msf4j.feature
feature along with Jersey. The REST service component will reside alongside
the broker in the andes bundle (the
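For illustration only, the old admin-service operations could map onto REST endpoints roughly like the table below. The paths and operation names are assumptions, and the real component would be JAX-RS resources on MSF4J/Jersey rather than this Python sketch:

```python
# Illustrative mapping of admin operations to REST endpoints (all names assumed).
admin_routes = {
    ("GET", "/mb/v1/queues"): "listQueues",
    ("POST", "/mb/v1/queues"): "createQueue",
    ("DELETE", "/mb/v1/queues/{name}"): "deleteQueue",
    ("GET", "/mb/v1/subscriptions"): "listSubscriptions",
}

def dispatch(method, path_template):
    """Look up the admin operation behind a REST route."""
    try:
        return admin_routes[(method, path_template)]
    except KeyError:
        raise LookupError(f"no REST mapping for {method} {path_template}")
```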
+1, looks good. For testing purposes we can also deploy some related
execution plans, streams, and publishers to tap intermediate analytics
results.
Regards
Suho
On Wed, Feb 24, 2016 at 1:44 PM, Sachith Withana wrote:
> Hi all,
>
> We are going to implement the integration tests
Hi Indika,
The JAAS authentication will handle the AMQP URL username/password. It will
have a custom callback handler that extracts the username and password from
the URL and creates an authentication subject to authorize against.
I planned to keep the create queue/topic permissions separate from
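The callback handler's job of pulling credentials out of the AMQP URL can be sketched with standard URL parsing, since the URL follows the usual amqp://user:pass@host/... convention (everything beyond that is an assumption):

```python
from urllib.parse import urlsplit

def credentials_from_amqp_url(url):
    """Extract (username, password) from an AMQP connection URL."""
    parts = urlsplit(url)
    if parts.username is None or parts.password is None:
        raise ValueError("AMQP URL does not carry username/password")
    return parts.username, parts.password
```

In the real broker these values would be handed to the JAAS login module to build the subject.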
Hi Sanjiva,
Yes we are indeed using a stream definition and publishing the events using
Thrift.
But in doing so, there were two approaches we considered:
1. ESB publishing a single event per mediator in a message flow.
2. ESB publishing a single event per message flow (rather than for
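The practical difference between the two approaches is how many events leave the ESB per message flow; a sketch under assumed event shapes:

```python
def events_per_mediator(flow_id, mediator_records):
    """Approach 1: one event published per mediator invocation."""
    return [{"flow_id": flow_id, **rec} for rec in mediator_records]

def event_per_flow(flow_id, mediator_records):
    """Approach 2: a single aggregated event per message flow."""
    return [{"flow_id": flow_id, "mediators": list(mediator_records)}]

records = [{"name": "Log", "duration_ms": 1}, {"name": "Send", "duration_ms": 7}]
```

Approach 2 cuts the event count (and Thrift publishing overhead) at the cost of splitting the aggregate back out on the DAS side.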
Hi Indika,
comments inline.
On Wed, Feb 24, 2016 at 2:32 PM, Indika Sampath wrote:
> Hi Akalanka,
>
> I have few questions.
>
> 1. How do we authorize users who connect to the broker via a JMS client
> program? In that case we only pass the username and password in the AMQP URL.
>
Hi Akalanka,
I have few questions.
1. How do we authorize users who connect to the broker via a JMS client
program? In that case we only pass the username and password in the AMQP
URL. A user may not be assigned to any role at the moment other than the
default role (Internal/everyone).
2. Is subscribe action
Hi all,
We are going to implement the integration tests to cover the alerting
scripts being implemented in the process of developing Analytics for APIM.
The integration tests would be run against the DAS distribution for APIM,
which would contain all the artifacts related to the APIM analytics