I have written an application using camel-jpa to extract batches of 10,000
records from an old table, transform each entity object (25 fields) into
another entity object (15 fields), and persist it to a new table using the
JpaProducer. I have given the application 1 GB of memory at startup.
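
Roughly, the route looks like the sketch below. The entity class names
(OldRecord, NewRecord), the "processed" flag, and the "recordTransformer"
bean name are simplified placeholders, not the real ones:

    import org.apache.camel.builder.RouteBuilder;

    public class MigrationRoute extends RouteBuilder {
        @Override
        public void configure() throws Exception {
            // Poll the old table in batches of up to 10,000 unprocessed rows.
            // With consumeDelete=false the consumed rows are kept; a method on
            // OldRecord annotated with @Consumed can then flag them processed.
            from("jpa:com.example.OldRecord"
                    + "?consumer.query=select o from OldRecord o where o.processed = false"
                    + "&maximumResults=10000"
                    + "&consumeDelete=false")
                // 25-field -> 15-field translation, done by an injected bean
                // registered in the registry as "recordTransformer"
                .to("bean:recordTransformer?method=transform")
                // persist the transformed entity to the new table
                .to("jpa:com.example.NewRecord");
        }
    }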

I have noticed it takes a good 30 minutes to complete 40,000 records, which
works out to only about 22 records processed per second. I expected it to be
much faster, say 60 records per second. Is it reasonable to suspect that 22
records per second is slow? The JPA insert (storing the transformed entity
in the new table) and update (marking each row in the old table as
processed) are what take so long. I have also noticed that the Camel server
(a daemon that constantly picks up records and persists them to the new
table) gets slower after 20,000 records. I am using batch processing, where
10,000 records are committed in one batch. The transformation logic (a bean
injected for the translation) takes essentially no time; it completes in
microseconds.
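
One thing I am wondering about regarding the slowdown after 20,000 records:
with 10,000 records per commit, the persistence context (first-level cache)
keeps growing within each transaction. Below is a minimal plain-JPA sketch
of the usual remedy, flushing and clearing periodically; the BatchInserter
class and the interval of 50 are only illustrative:

    import javax.persistence.EntityManager;

    public class BatchInserter {
        private static final int FLUSH_INTERVAL = 50;

        // Intended to run inside an already-open transaction.
        public void insertAll(EntityManager em, Iterable<?> entities) {
            int count = 0;
            for (Object entity : entities) {
                em.persist(entity);
                if (++count % FLUSH_INTERVAL == 0) {
                    em.flush();  // push the pending inserts to the database
                    em.clear();  // detach everything so the first-level cache stays small
                }
            }
        }
    }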

I have ensured that indexes are in place on both the old and the new table.
There is no need for a second-level cache in this scenario. I use a UUID to
generate the unique key when inserting each new record. Yet the application
still takes 30 minutes for 40,000 records.
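
For reference, the UUID key assignment is roughly the following (NewRecord
is again a simplified placeholder):

    import java.util.UUID;
    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.persistence.PrePersist;

    @Entity
    public class NewRecord {
        @Id
        private String id;

        @PrePersist
        void assignId() {
            // generate the unique key in application code, before the insert
            if (id == null) {
                id = UUID.randomUUID().toString();
            }
        }
    }

One side effect of assigning the key in application code, rather than using
an IDENTITY column, is that Hibernate can batch the inserts at the JDBC
level.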

I have looked at this from a Hibernate JPA performance-tuning perspective
and applied the changes described in the second paragraph above. Are there
any optimizations that I could make in Camel?
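
For context, the Hibernate JDBC batching settings that usually matter most
for this kind of bulk insert are sketched below, passed programmatically;
the persistence-unit name "migration-unit" and the batch size of 50 are
placeholder values:

    import java.util.HashMap;
    import java.util.Map;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;

    public class JpaBootstrap {
        public static EntityManagerFactory create() {
            Map<String, String> props = new HashMap<String, String>();
            // group inserts into JDBC batches instead of one round trip per row
            props.put("hibernate.jdbc.batch_size", "50");
            // order statements by entity type so the batches actually form
            props.put("hibernate.order_inserts", "true");
            props.put("hibernate.order_updates", "true");
            return Persistence.createEntityManagerFactory("migration-unit", props);
        }
    }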


Also, is there JDO component support planned for the next Camel release?
