I am not one to define best practices for OFBiz.
Japoco wrote the DataFile import and is the one to converse with about
using it.

What I have done:

To make the import of very large (50 MB to multi-GB) files faster, I have
eliminated all XML-type classes, so I don't pull in or buffer large
amounts of data. I parse by line, not by element, and I use the standard
services to create, update, and delete the data.
The routine that does the import, with stubs in place of the services,
takes about 5 minutes to read in a 500 MB file. I parse a 50 MB file on a
1 GHz CPU in about 17 minutes; most of that time is spent in the services
that put the data into OFBiz.
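
For what it's worth, the shape of that read loop is roughly the sketch
below. This is plain Java and only an illustration: the file name, the
field layout, and the createRecord() stub standing in for the real OFBiz
service call are all made up.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class LineImport {

    public static void main(String[] args) throws IOException {
        // Read line by line so the whole file never has to fit in memory.
        try (BufferedReader in = new BufferedReader(new FileReader("export.csv"))) {
            String line;
            long count = 0;
            while ((line = in.readLine()) != null) {
                String[] fields = line.split(",");   // assumes simple CSV, no quoted commas
                if (fields.length < 2) {
                    continue;                        // skip short or blank lines
                }
                Map<String, Object> context = new HashMap<>();
                context.put("partyId", fields[0]);   // hypothetical field mapping
                context.put("amount", fields[1]);
                createRecord(context);               // stub in place of the real service call
                count++;
            }
            System.out.println("read " + count + " lines");
        }
    }

    // Stub: in the real import this would call the standard create/update/delete
    // service once per parsed line instead of doing nothing.
    private static void createRecord(Map<String, Object> context) {
        // no-op, so timing this loop measures pure read/parse cost only
    }
}

Timing the loop with the stub in place is how you separate the read/parse
cost from the cost of the services that actually store the data.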


Ian Tabangay sent the following on 9/30/2008 1:04 PM:
> Thanks, that was very helpful. Would you suggest some best practices to
> properly handle batch processing and avoid timeouts? I'm currently toying
> with the idea of spreading the work across multiple threads. What do you think?
> 
> Ian
> 
> On Tue, Sep 30, 2008 at 8:13 PM, BJ Freeman <[EMAIL PROTECTED]> wrote:
> 
>> If Service A is not interacting with the entities, you do not have to be
>> concerned with transaction timeouts. However, if you are pulling the
>> CSV from a remote source, there is a timeout on the stream, which is a
>> different thing.
>> I pull 500 MB of data (XML) remotely and have yet to run into a timeout
>> related to the stream.
>> Now, if Service A calls Service B for each line of the CSV, e.g. via
>> runSync, then there is a transaction timeout associated with that call.
>> I have not run into timeouts using runSync.
>> There are services that call other services in OFBiz. Under maximum
>> load, which I have not tested, there may be some timeouts. If that were
>> the case I would look to remedies other than adjusting the timeout.
>> That all said: if you use the Data File or WebTools imports, you are
>> doing one big transaction for the whole file, importing the data
>> directly into the DB rather than through other services, and at that
>> point the transaction timeout is important.
> 
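
To put the per-line runSync part of that in concrete terms, the calling
service can look roughly like the sketch below. This is only an
illustration: the service names, the "parsedLines" parameter, and the
300-second timeout are all invented, and I am assuming the LocalDispatcher
overload of runSync that takes a transaction timeout and a
require-new-transaction flag; if the default timeout works for you there
is no need to touch it.

import java.util.List;
import java.util.Map;

import org.ofbiz.service.DispatchContext;
import org.ofbiz.service.GenericServiceException;
import org.ofbiz.service.LocalDispatcher;
import org.ofbiz.service.ServiceUtil;

public class CsvImportServices {

    // Sketch of "Service A": loops over already-parsed lines and calls
    // "Service B" (createExampleRecord is a made-up name) once per line.
    public static Map<String, Object> importParsedLines(DispatchContext dctx,
            Map<String, ? extends Object> context) {
        LocalDispatcher dispatcher = dctx.getDispatcher();
        @SuppressWarnings("unchecked")
        List<Map<String, Object>> lines =
                (List<Map<String, Object>>) context.get("parsedLines");

        for (Map<String, Object> lineContext : lines) {
            try {
                // Each call gets its own transaction: 300 is an arbitrary
                // per-call timeout in seconds, and "true" requests a new
                // transaction so one bad line does not roll back the rest.
                Map<String, Object> result =
                        dispatcher.runSync("createExampleRecord", lineContext, 300, true);
                if (ServiceUtil.isError(result)) {
                    return result;
                }
            } catch (GenericServiceException e) {
                return ServiceUtil.returnError("import failed: " + e.getMessage());
            }
        }
        return ServiceUtil.returnSuccess();
    }
}

Keeping each line in its own small transaction is the opposite of the
one-big-transaction WebTools import described above, which is why the
per-line approach is much less sensitive to the transaction timeout.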
