Kettle (PDI) is very useful for building small ETL definitions
quickly, but building a complex data migration tool for Mifos on top
of Kettle might be too much effort, not just because of the learning
curve but also because of its performance limitations.

The same applies to the Mifos REST APIs (which I assume are what is
being referred to as the Import API in this thread).

Data migration is a bulk/batch process and needs a very high level of
optimization, which should be done using the Mifos code (model)
directly. I favor this approach because a separate module using the
code directly can:
 - Avoid depending on another layer being fixed for bulk processing
 - Apply the different optimizations that bulk processing requires
(caching, lazy loading, memory settings)
 - Provide separate/specialized queries/DAOs for bulk processing (see
the sketch after this list)
 - Reuse the existing Mifos code and business rules (DRY)
 - Encourage separating out the Mifos business rules (refactoring) in
the Mifos core so they can be used easily from the migration tool
(not strictly required, but it could help make the Mifos
implementation cleaner)
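
To make the specialized-DAO point concrete, here is a rough sketch of
what a bulk-insert DAO could look like. The class and entity names
(BulkClientDao, ClientEntity) are made up for illustration, not actual
Mifos classes; I am assuming Hibernate as the persistence layer, with
periodic flush/clear so the session cache does not blow up on large
imports:

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;
    import java.util.List;

    // Hypothetical bulk-import DAO; not an existing Mifos class.
    public class BulkClientDao {

        // Should match hibernate.jdbc.batch_size for best effect.
        private static final int BATCH_SIZE = 50;

        private final SessionFactory sessionFactory;

        public BulkClientDao(SessionFactory sessionFactory) {
            this.sessionFactory = sessionFactory;
        }

        // Insert clients in batches, flushing and clearing the session
        // periodically so the first-level cache stays small.
        public void saveAll(List<ClientEntity> clients) {
            Session session = sessionFactory.openSession();
            Transaction tx = session.beginTransaction();
            try {
                int count = 0;
                for (ClientEntity client : clients) {
                    session.save(client);
                    if (++count % BATCH_SIZE == 0) {
                        session.flush();  // push pending inserts as a JDBC batch
                        session.clear();  // release persisted entities from memory
                    }
                }
                tx.commit();
            } catch (RuntimeException e) {
                tx.rollback();
                throw e;
            } finally {
                session.close();
            }
        }
    }

This kind of tuning (batch size, flush/clear cadence, second-level
cache off) is exactly what the normal request-oriented DAOs never need
and the REST layer cannot offer.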

The data to be migrated comes in various formats, so we should
support configurable basic data parsing/migration (SQL, CSV, XLS,
XML, etc.) and translation, and also provide an easy way to plug in
new parser implementations for formats (data types) we have not
anticipated.
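
One possible shape for that pluggability, as a minimal sketch (all of
these type names are hypothetical): a small parser interface plus a
registry keyed by format, so a new format can be added without
touching the core migration code.

    import java.io.InputStream;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Parses an input stream into generic records (column name -> value)
    // that the migration layer then translates into Mifos entities.
    interface RecordParser {
        List<Map<String, String>> parse(InputStream in) throws Exception;
    }

    public class ParserRegistry {

        private final Map<String, RecordParser> parsers =
                new HashMap<String, RecordParser>();

        // Register a parser under a format key such as "csv", "xls", "xml".
        public void register(String format, RecordParser parser) {
            parsers.put(format.toLowerCase(), parser);
        }

        public RecordParser forFormat(String format) {
            RecordParser parser = parsers.get(format.toLowerCase());
            if (parser == null) {
                throw new IllegalArgumentException(
                        "No parser registered for format: " + format);
            }
            return parser;
        }
    }

A CSV parser would then just implement RecordParser and register
itself, and an unknown in-house format at a particular MFI gets
handled the same way.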

Udai
