Re: [Dev] upgrading to analytics 6.4.0

2018-11-01 Thread Ramindu De Silva
Hi Bernard,

EI analytics 6.3.0 uses the WSO2 DAS
runtime. EI analytics 6.4.0 uses the WSO2 SP
runtime. WSO2 SP is a complete rewrite
of WSO2 DAS, and it includes feature enhancements as well.

On Tue, Oct 23, 2018 at 1:50 PM Bernard Paris 
wrote:

> Hi devs,
>
> we were using only these 4 DBs for analytics aggregating data from
> our ESB until version 6.3.0:
>
> ANALYTICS_CARBON_DB
> ANALYTICS_METRICS_DB
> ANALYTICS_EVENT_STORE_DB
> ANALYTICS_PROCESSED_DATA_STORE_DB
>
We have our own from-scratch data aggregation mechanism
instead of using Apache Spark, which DAS used. That is the
explanation for not having configurations for ANALYTICS_EVENT_STORE_DB and
ANALYTICS_PROCESSED_DATA_STORE_DB.
We still have ANALYTICS_CARBON_DB and ANALYTICS_METRICS_DB.
Please look into the Monitoring Stream Processor documentation in order
to configure the metrics for WSO2 SP.

>
>
> These were postgres databases.
> Now I see there are more than 10 databases preconfigured in the default
> analytics 6.4.0 config (conf/dashboard/deployment.yaml
>  and conf/worker/deployment.yaml).
> Well, …  this raises quite a few questions for me.
>
> First of all, is it still recommended (as it is for the ESB databases)
> to *not* use local H2 databases in a production environment?
> I ask because the 6.4.0 analytics seems to be meant to be used as
> it is, out of the box, with lots of DBs and no documentation for any
> configuration as there was for previous versions.
>
Yes. We still recommend NOT using the embedded H2 databases in production.

>
> If we need to create external DBs for all the stuffs, what exactly are
> each DB for ?
>
Please refer to the Configuring Datasources documentation.
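For reference, datasources in SP live under the `wso2.datasources` section of deployment.yaml. A minimal sketch of pointing one of them at an external PostgreSQL database might look like the following (the host, database name, and credentials are placeholders, not values from this thread):

```yaml
wso2.datasources:
  dataSources:
    # Example: Carbon datasource backed by an external PostgreSQL DB
    - name: WSO2_CARBON_DB
      description: Carbon datasource (placeholder connection values)
      definition:
        type: RDBMS
        configuration:
          jdbcUrl: 'jdbc:postgresql://db-host:5432/wso2_carbon'
          username: wso2
          password: changeme
          driverClassName: org.postgresql.Driver
          maxPoolSize: 10
```

The same pattern repeats for each of the preconfigured datasources; only the `name`, `jdbcUrl`, and credentials change.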


>
> Is there any migration tool and/or documentation about migrating from
> analytics 6.3.0 to 6.4.0 ?  (
> https://docs.wso2.com/display/EI640/Upgrading+from+WSO2+EI+6.3.0 doesn't
> talk about that)
>


> Any matching between former 4 DBs and the 6.4.0 new ones ?
>
ANALYTICS_CARBON_DB - In order to decide whether to migrate this database
from the previous analytics, please consider the following:

It seems you are not using a user-mgt database. Have you added additional
users? If not, it is not necessary to migrate the carbon DB; you can just
move on with a new DB.

ANALYTICS_METRICS_DB - The metrics values stored in this DB depend on the node
being run. But IMO, since this is a new version of the product, there is
no use in migrating the older metrics data into the new one.
ANALYTICS_EVENT_STORE_DB and ANALYTICS_PROCESSED_DATA_STORE_DB - These are
replaced by aggregation tables, and aggregation
will be done via Siddhi.
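As an illustration of what replaces those stores, Siddhi incremental aggregations are defined in the app itself and persisted via a store annotation. The stream, datasource, and attribute names below are made up for the sketch:

```siddhi
define stream TradeStream (symbol string, price double, volume long, ts long);

-- Persist the aggregation in an RDBMS table via a configured datasource
@store(type = 'rdbms', datasource = 'TRADE_DB')
define aggregation TradeAggregation
from TradeStream
select symbol, avg(price) as avgPrice, sum(volume) as totalVolume
group by symbol
aggregate by ts every sec ... year;
```

Siddhi then maintains one table per granularity (seconds up to years), which is what takes over the role of the old event/processed-data stores.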

>
> Is there a way to keep (transfer into 6.4.0) the data we collected with
> the previous analytics version?
>
We are currently looking at several methods in order to migrate the
ANALYTICS_EVENT_STORE_DB and ANALYTICS_PROCESSED_DATA_STORE_DB data, and we
will update you in that regard.

>
> Thanks,
> Bernard
>
> ___
> Dev mailing list
> Dev@wso2.org
> http://wso2.org/cgi-bin/mailman/listinfo/dev
>

Best Regards,
Ramindu.
-- 
*Ramindu De Silva*
Senior Software Engineer
WSO2 Inc.: http://wso2.com
lean.enterprise.middleware

email: ramin...@wso2.com 
mob: +94 719678895


Re: [Dev] [Siddhi] Incorrect results when having two window joins with sums

2018-11-01 Thread Ramindu De Silva
Hi all,

Thanks Tishan for the explanation.
I have used steps 1 and 2 with a "join with window.length(1) of each stream",
and I'm getting the results as expected.
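For anyone following along, a rough sketch of that combined approach (stream definitions borrowed from the tutorial; the app name and intermediate stream names are mine):

```siddhi
@App:name('MaterialThresholdSketchApp')

define stream MaterialConsumptionStream (name string, user string, amount double);
define stream MaterialSupplyStream (name string, supplier string, amount double);

-- 1) running sum over the last hour of the supply stream
from MaterialSupplyStream#window.time(1 hour)
select name, sum(amount) as supplyAmount
group by name
insert into SupplySumStream;

-- 2) running sum over the last hour of the consumption stream
from MaterialConsumptionStream#window.time(1 hour)
select name, sum(amount) as consumptionAmount
group by name
insert into ConsumptionSumStream;

-- 3) join with window.length(1) of each stream, just to compare the latest sums
from SupplySumStream#window.length(1) as s
join ConsumptionSumStream#window.length(1) as c
on s.name == c.name
select s.name, s.supplyAmount, c.consumptionAmount
having s.supplyAmount * 0.95 < c.consumptionAmount
insert into AlertStream;
```

Because the sums are computed per stream before the join, re-joining no longer double-counts the supply amount.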

Best Regards,
Ramindu.


On Sun, Oct 28, 2018 at 11:18 AM Sriskandarajah Suhothayan 
wrote:

>
>
> On Wed, Oct 17, 2018 at 11:59 PM Tishan Dahanayakage 
> wrote:
>
>> Hi Ramindu,
>>
>> Each time a join happens, s.amount is populated as 100 and it is a current
>> event. When that current event reaches the sum attribute aggregator, it will
>> keep on adding. In other words, sum(s.amount) represents the sum of amounts
>> that joined with the consumption stream, not the sum of amounts that came
>> within the last hour.
>>
>> If we are to accurately achieve this requirement, we should take the
>> approach below:
>> 1) A window query to calculate the running sum of the last hour of the supply stream
>> 2) A window query to calculate the running sum of the last hour of the consumption
>> stream
>> 3) A pattern query to compare each supply with the proceeding consumptions
>> that fulfill the conditions.
>>
> +1
> #3 can also be a join with window.length(1) of each stream, just to join
> and compare.
>
>> Thanks,
>> /Tishan
>>
>> On Wed, Oct 17, 2018 at 10:46 PM Ramindu De Silva 
>> wrote:
>>
>>> Hi all,
>>>
>>> The tutorial [1], which we are also using for our labkit, has the Siddhi
>>> app as follows.
>>>
>>> @App:name('MaterialThresholdAlertApp')
>>>
>>> @source(type = 'http', @map(type = 'json'))
>>> define stream MaterialConsumptionStream(name string, user string, amount 
>>> double);
>>>
>>> @source(type = 'http', @map(type = 'json'))
>>> define stream MaterialSupplyStream(name string, supplier string, amount 
>>> double);
>>>
>>> @sink(type='log', prefix='Materials that go beyond sustainability 
>>> threshold:')
>>> define stream MaterialThresholdAlertStream(name string, supplyAmount 
>>> double, consumptionAmount double, user string, supplier string);
>>>
>>> from MaterialConsumptionStream#window.time(1 hour) as c
>>> join MaterialSupplyStream#window.time(1 hour) as s
>>> on c.name == s.name
>>> select s.name, s.amount as supplyAmount, c.amount as consumptionAmount, 
>>> user, supplier
>>> group by s.name
>>> having s.amount * 0.95 < c.amount
>>> insert into MaterialThresholdAlertStream;
>>>
>>>
>>> But in order to check that the total consumed amount is greater than 95% of
>>> the supplied amount, we should have the query as follows.
>>>
>>> select s.name, sum(s.amount) as supplyAmount, sum(c.amount) as 
>>> consumptionAmount, user, supplier
>>>
>>>
>>> But the results get printed as follows when we simulate using the
>>> following steps:
>>>
>>> 1. Simulate "*MaterialSupplyStream*" with "sugar, yyy, 100"
>>> 2. Simulate "*MaterialConsumptionStream*" with "sugar, xxx, 97"
>>>    -> Materials that go beyond sustainability threshold: :
>>>       Event{timestamp=1539794935863, data=[sugar, *100.0*, 97.0, yyy, xxx], isExpired=false}
>>> 3. Simulate "*MaterialConsumptionStream*" with "sugar, xxx, 97"
>>>    -> Materials that go beyond sustainability threshold: :
>>>       Event{timestamp=1539794936733, data=[sugar, *200.0*, 194.0, yyy, xxx], isExpired=false}
>>> 4. Simulate "*MaterialConsumptionStream*" with "sugar, xxx, 97"
>>>    -> Materials that go beyond sustainability threshold: :
>>>       Event{timestamp=1539794937643, data=[sugar, *300.0*, 291.0, yyy, xxx], isExpired=false}
>>>
>>> Even though we don't send an event to the *MaterialSupplyStream,* the
>>> summation adds 100 at each join. Shouldn't the result stay at the
>>> initial 100 that we sent? Please correct me if I'm wrong.
>>>
>>> Best Regards,
>>> Ramindu.
>>>
>>> 1. https://docs.wso2.com/display/SP430/Correlating+Simple+Events
>>>
>>>
>>> --
>>> *Ramindu De Silva*
>>> Senior Software Engineer
>>> WSO2 Inc.: http://wso2.com
>>> lean.enterprise.middleware
>>>
>>> email: ramin...@wso2.com 
>>> mob: +94 719678895
>>>
>>
>>
>> --
>> *Tishan Dahanayakage* | Associate Technical Lead | WSO2 Inc.
>> (m) +94716481328 | (w) +94112145345 | (e) tis...@wso2.com
>> GET INTEGRATION AGILE
>> Integration Agility for Digitally Driven Business
>>
>> Disclaimer: This communication may contain privileged or other
>> confidential information and is intended exclusively for the addressee/s.
>> If you are not the intended recipient/s, or believe that you may have
>> received this communication in error, please reply to the sender indicating
>> that fact and delete the copy you received and in addition, you should not
>> print, copy, re-transmit, disseminate, or otherwise use the information
>> contained in this communication. Internet communications cannot be
>> guaranteed to be timely, secure, error or virus-free. The sender does not
>> accept liability for any errors or omissions.
>>
>
>
> --
> *S. Suhothayan* | Director | WSO2 Inc. 
> (m) (+94) 779 756 757 | (e) s...@wso2.com | (t) @suhothayan
> 
> GET INTEGRATION AGILE
> Integration Agility for Digitally 

[Dev] Validity of access token after OIDC SLO

2018-11-01 Thread gayan gunawardana
Hi Devs,

I followed the exact instructions in IS 5.7.0 [1] and got logout working. However,
the issued access token is still valid even after logout (I have checked with token
introspection). Is that the correct behavior, and if so, what is the justification?

[1] https://docs.wso2.com/display/IS570/Session+Management+with+Playground

Thanks,
Gayan