Pierre - Great insight! It would be awesome if, for each type of import, the dependencies were identified, as I suspect that could be the snake in the grass! Thanks again for the great insight!
James

-------- Original Message --------
Subject: Re: Legacy Data
From: Pierre Smits <[email protected]>
Date: Fri, May 03, 2019 1:21 am
To: [email protected]

In the recent past, and for several customer projects, I have been involved in moving quite a lot of data from the databases of (legacy) solutions into other systems (DWH, etc.) with ETL solutions. One of those projects moved data from a legacy system into the OFBiz databases (both the transactional and the DWH).

Before I delve deeper into the benefits and costs of using such a solution, I want to remark that, no matter what tool is used (Excel, CSV, etc.) to move such legacy data into the transactional and DWH databases of OFBiz, these activities require a lot of planning [1], development [2] and testing [3] before data can be moved into the production environment.

[1] The entity model of OFBiz is far more complex than some may anticipate (based on experience with their legacy solution), as there are a lot of interlocking dependencies between tables. Careful planning of what needs to be loaded in what sequence requires in-depth knowledge of the entity model.
[2] Legacy data may, in my experience, adhere to different paradigms regarding field type definitions. That requires quite some transformation of field values before they can be loaded into the OFBiz database(s).
[3] Before loading into the database(s) in the production environment, the load part of the ETL processes must be tested in a separate environment, to ensure that, at the moment of loading into production, the disruption of operational activities (of users and/or OFBiz services) is minimised; such disruption (and the potential rebuild of the production environment) is quite costly.

Using an ETL solution to migrate data delivers several benefits:

1. Extraction can be done against various sources, whether that be a (live) database, Excel sheets, or data files in CSV, JSON and/or XML format.
2. Transformation of data values, as well as combining data from different sources, can be done (developed) in various ways, and tested step by step.
3. Changing where the ETL solution loads the data into is fast, and the data can go to various targets: a database, Excel sheets, or data files in CSV, JSON and/or XML format.
4. The whole of the ETL processes can be executed both on demand and in off-business hours (scheduled), to cater for both a) "need data now" and b) minimisation of operational disruptions.

The key thing to keep in mind is that migrating legacy data into the OFBiz databases does NOT mean that it is automatically processed by the OFBiz engine. The OFBiz engine has a lot of services that are triggered by other services (consider the GL transaction creation on the state change of an invoice or payment). Loading the legacy data into the transactional DB of OFBiz implies that such state changes need to happen on these kinds of parent objects to get to the desired end result.

Another thing to keep in mind is that OFBiz is, for the most part, intended as a transactional solution. With legacy data one should distinguish between what is needed for operations and what is intended for historical insights through business intelligence. Often, if not always, a lot of the legacy data is intended just for the latter, and one should consider aggregating (some of) that data into the OFBiz OLAP database.

If anyone wants more insights and/or assistance, feel free to contact me directly.
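To make the sequencing point in [1] concrete, here is a minimal Python sketch (not Pierre's actual tooling; the entity and field names follow the standard OFBiz Party data model, the legacy record layout is a made-up assumption) that transforms one legacy customer row into an entity-engine XML fragment, emitting the Party row before the Person and PartyRole rows that depend on it:

```python
from xml.sax.saxutils import quoteattr

def legacy_customer_to_xml(record):
    """Transform one legacy customer row into dependency-ordered
    OFBiz entity-engine XML: Party first, then the rows that
    reference it via partyId (Person, PartyRole)."""
    q = quoteattr  # escapes quotes, <, > and & in attribute values
    party_id = record["legacy_id"]
    lines = [
        # Parent row: must be loaded before any row referencing partyId.
        '<Party partyId=%s partyTypeId="PERSON" statusId="PARTY_ENABLED"/>'
            % q(party_id),
        "<Person partyId=%s firstName=%s lastName=%s/>"
            % (q(party_id), q(record["first"]), q(record["last"])),
        '<PartyRole partyId=%s roleTypeId="CUSTOMER"/>' % q(party_id),
    ]
    return "\n".join(lines)

print("<entity-engine-xml>")
print(legacy_customer_to_xml(
    {"legacy_id": "10001", "first": "James", "last": "O'Brien"}))
print("</entity-engine-xml>")
```

The same ordering discipline applies whatever ETL tool produces the file: parents before children, per the entity model's foreign-key dependencies.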
Best regards,

Pierre Smits

*Apache Trafodion <https://trafodion.apache.org>, Vice President*
*Apache Directory <https://directory.apache.org>, PMC Member*
Apache Incubator <https://incubator.apache.org>, committer
*Apache OFBiz <https://ofbiz.apache.org>, contributor (without privileges) since 2008*
Apache Steve <https://steve.apache.org>, committer


On Thu, May 2, 2019 at 5:48 PM Paul Mandeltort <[email protected]> wrote:

> Using Excel XML features:
> - In Web Tools:
>   - Export the entity you want to work with, one at a time
> - Open it in Excel on Windows
>   - Last I checked, Mac Excel XML support sucks. But Mac Excel sucks in general.
> - Excel will figure out the schema and embed it, so you can save it as XML from there.
>
> I'm going off of memory here as I haven't done it in a while, but I've found at the time it was the easiest way to directly load/update data in the entity engine outside of directly writing data.
>
> It's possible, but not sure if it's advisable, to directly load data into the underlying database via an ETL tool like Pentaho BI. Anyone know the risks of doing this? I haven't had time to research the implications.
>
> -P
>
>
> > On May 2, 2019, at 10:12 AM, Michael Brohl <[email protected]> wrote:
> >
> > You can also easily generate load data with Excel itself:
> >
> > - Create a new sheet and insert a formula concatenating the static parts with the dynamic parts/cells of the data sheet (see example).
> > - Apply the formula to each row by dragging the lower right corner of the cell containing the formula down for the number of rows containing data.
> > - Copy the first column and paste it into the load data XML file.
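As a scripted alternative to the export-edit-reload workflow described above, a small Python sketch (the `Party` entity and its fields are standard OFBiz; the exported values here are illustrative, not from a real export) that parses an exported entity-engine XML fragment, bulk-updates a field, and serialises it again for re-import:

```python
import xml.etree.ElementTree as ET

# An exported entity-engine-xml fragment, as Web Tools' XML data
# export would produce it; the row values are illustrative.
exported = """<entity-engine-xml>
    <Party partyId="10001" partyTypeId="PERSON" statusId="PARTY_DISABLED"/>
    <Party partyId="10002" partyTypeId="PERSON" statusId="PARTY_DISABLED"/>
</entity-engine-xml>"""

root = ET.fromstring(exported)

# Bulk-update a field across all exported rows before re-import.
for party in root.findall("Party"):
    party.set("statusId", "PARTY_ENABLED")

updated = ET.tostring(root, encoding="unicode")
print(updated)
```

Note that, as Pierre points out above, reloading the file only changes the stored rows; it does not trigger the services that a state change through the OFBiz engine would.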
> >
> > Example:
> >
> > =CONCATENATE(
> >   "<Party partyId='"; sheet1!A65; "' partyTypeId='"; sheet1!B65; "' externalId='"; sheet1!C65; "' statusId='"; sheet1!D65; "' />"
> > )
> >
> > Regards,
> >
> > Michael Brohl
> >
> > ecomify GmbH - www.ecomify.de
> >
> >
> > On 01.05.19 at 18:22, [email protected] wrote:
> >> We are trying to upload legacy customers into OFBiz using the data import tool. Is there a spreadsheet template to utilize? I uploaded my file but the "not processed" field still shows 0. Can someone send me the Excel template for uploading?
> >>
> >> Thanks,
> >>
> >> James
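The per-row concatenation in Michael's formula can also be scripted; a hedged Python sketch (the column layout A-D mirrors his example sheet, with the `csv` module standing in for the spreadsheet) that additionally escapes characters such as `&` or quotes, which the raw formula would pass through and which would break the XML on import:

```python
import csv
import io
from xml.sax.saxutils import quoteattr

# Stand-in for sheet1: columns A-D are partyId, partyTypeId,
# externalId, statusId, matching the CONCATENATE formula above.
sheet1 = io.StringIO(
    "10001,PERSON,LEGACY-001,PARTY_ENABLED\n"
    "10002,PERSON,LEGACY-002,PARTY_ENABLED\n"
)

rows = []
for a, b, c, d in csv.reader(sheet1):
    # One <Party/> element per row, with proper attribute escaping.
    rows.append(
        "<Party partyId=%s partyTypeId=%s externalId=%s statusId=%s />"
        % (quoteattr(a), quoteattr(b), quoteattr(c), quoteattr(d))
    )

print("\n".join(rows))
```

The printed lines can then be pasted between `<entity-engine-xml>` tags in the load data file, just as with the copied spreadsheet column.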
