Hi all,

We've been working on an analytics filter for the APIM 2.5 microgateway. The
following is the design we came up with.

*1. Filtering the event data and writing them to a file.*


From the request stream, the relevant fields will be extracted and a
RequestDTO will be populated using those attributes.
An EventDTO consists of a streamID, a timestamp, metadata, correlation data,
and payload data.
The EventDTO can then be populated from the RequestDTO.

Likewise, when the response filters are available, we can extract the
necessary attributes and populate the ExecutionTimeDTO and ResponseDTO. The
EventDTO can then be populated with the response-related attributes.
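As a rough sketch of the DTO mapping described above (in Python purely for illustration, since the microgateway itself is Ballerina-based; the field names and the stream ID below are assumptions, not the real schema):

```python
from dataclasses import dataclass, field
from typing import Any, List

@dataclass
class RequestDTO:
    # Hypothetical attributes extracted from the request stream.
    api_name: str
    api_version: str
    user_name: str
    request_time: int

@dataclass
class EventDTO:
    # The structure described above: streamID, timestamp, metadata,
    # correlation data, and payload data.
    stream_id: str
    timestamp: int
    meta_data: List[Any] = field(default_factory=list)
    correlation_data: List[Any] = field(default_factory=list)
    payload_data: List[Any] = field(default_factory=list)

def to_event(req: RequestDTO) -> EventDTO:
    # Map the extracted request attributes onto the event structure.
    return EventDTO(
        stream_id="org.wso2.apimgt.statistics.request:1.1.0",  # assumed stream ID
        timestamp=req.request_time,
        meta_data=["mg"],  # e.g. client type (assumed)
        payload_data=[req.api_name, req.api_version, req.user_name],
    )
```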

Once an EventDTO is populated, we publish the event to an event stream.
A method that writes these events to a file is subscribed to the event
stream during the gateway initialization process.
Hence, whenever the event stream receives an event, it writes that event to
the file we define.
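The subscribe-then-write flow could look like the following toy sketch (Python for illustration; `EventStream` and `make_file_writer` are hypothetical stand-ins for the gateway's actual event stream API):

```python
class EventStream:
    """Toy stand-in for the gateway's event stream (illustrative only)."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, fn):
        self._subscribers.append(fn)

    def publish(self, event):
        # Every subscriber sees every published event.
        for fn in self._subscribers:
            fn(event)

def make_file_writer(path):
    # Appends each published event as one line in the .dat file.
    def write_event_to_file(event):
        with open(path, "a") as f:
            f.write(str(event) + "\n")
    return write_event_to_file

# At gateway initialization: subscribe the file writer once.
stream = EventStream()
stream.subscribe(make_file_writer("api-usage-data.dat"))
```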


*2. Event publishing from files to analytics server.*

Files will be written with the .dat extension, for example
"api-usage-data.dat".

As in the APIM v2.2 microgateway, this file would rotate when its size
exceeds 12 MB or after a specific time interval. These values could be made
configurable.
The rolled file will be compressed and named with a timestamp.
eg: *api-usage-data.{timestamp}.zip*
This would reduce a 12 MB file to an (approximately) 4 KB file.
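The rotation step might be sketched as follows (Python for illustration; the 12 MB default and the timestamped-zip naming follow the description above, everything else is assumed):

```python
import os
import time
import zipfile

MAX_SIZE_BYTES = 12 * 1024 * 1024  # 12 MB default; intended to be configurable

def rotate_if_needed(path, max_size=MAX_SIZE_BYTES):
    """Zip and remove the .dat file once it exceeds max_size; return the zip name."""
    if not os.path.exists(path) or os.path.getsize(path) < max_size:
        return None  # nothing to rotate yet
    base = os.path.splitext(os.path.basename(path))[0]
    zip_name = "%s.%d.zip" % (base, int(time.time() * 1000))  # timestamped name
    with zipfile.ZipFile(zip_name, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(path, arcname=os.path.basename(path))
    os.remove(path)  # a fresh .dat file starts on the next write
    return zip_name
```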

These zip files will then be uploaded and persisted in a DB. This will be
done using Ballerina tasks (to make it happen periodically).
Then the relevant entries (files) will be read from the DB, and the events
will be populated by reading those files line by line. Those events will
then be published to the analytics server using the existing data publisher.
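The persist-and-republish cycle above can be sketched like this (Python for illustration; the in-memory dict stands in for the DB table, and `publish` stands in for the existing data publisher):

```python
import io
import zipfile

def store_zip(db, name, zip_bytes):
    # Stand-in for INSERTing the compressed file as a blob.
    db[name] = zip_bytes

def republish(db, name, publish):
    # Read the blob back, unzip it, and replay events line by line.
    with zipfile.ZipFile(io.BytesIO(db[name])) as zf:
        inner = zf.namelist()[0]  # the single rolled .dat file inside
        with zf.open(inner) as f:
            for line in io.TextIOWrapper(f, encoding="utf-8"):
                publish(line.rstrip("\n"))
```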




Thanks,
DinushaD




-- 
Dinusha Dissanayake
Software Engineer
WSO2 Inc
Mobile: +94712939439
<https://wso2.com/signature>
_______________________________________________
Architecture mailing list
Architecture@wso2.org
https://mail.wso2.org/cgi-bin/mailman/listinfo/architecture
