Team, `sql.withBatch(batchSize, query) {}` does not give a performance benefit: it inserts rows into the database one by one.
Below is the code I used:

```
sql.withBatch(1000, phQuery) { BatchingPreparedStatementWrapper ps ->
    rows.each { ps.addBatch(it) }
}
```

My observations:

1. The time taken to insert 10 lakh (1,000,000) rows is the same as with the individual insert statements I had previously.
2. In the database, the row count increases one by one, not in multiples of the batch size (1000).

I was expecting the `Sql` class to commit the rows once per batch, but it did not happen that way. My expectation may be wrong!

So I changed the code as below:

```
import java.sql.Connection

int count = 0
Connection con = sql.connection
con.setAutoCommit(false)
def stmt = con.prepareStatement(phQuery)
rowValues.each {
    Object[] row = it
    row.eachWithIndex { o, i -> stmt.setObject(i + 1, o) }
    stmt.addBatch()
    count++
    if (count == batchSize) {
        stmt.executeBatch()
        con.commit()
        count = 0
    }
}
// flush and commit the remaining rows
stmt.executeBatch()
con.commit()
```

This code gave me the required performance. But it is the Java way, not the Groovy way!

Is it wrong to expect this level of performance from `sql.withBatch(batchSize, query) {}` itself?

Just a reminder?

--
Sent from: http://groovy.329449.n5.nabble.com/Groovy-Dev-f372993.html
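For what it is worth, the same chunk-and-commit idea can be written in a more Groovy style. This is only an untested sketch, assuming the same `sql`, `phQuery`, `rowValues` (each row an `Object[]`), and `batchSize` variables as above, and that batch-per-transaction commits are acceptable:

```
// Sketch only: split the rows into batch-sized chunks with collate(),
// then run each chunk in its own transaction so it commits as a unit.
rowValues.collate(batchSize).each { chunk ->
    sql.withTransaction {
        sql.withBatch(phQuery) { ps ->
            // addBatch binds one row of parameters to the prepared statement
            chunk.each { row -> ps.addBatch(row) }
        }
    }
}
```

Whether this actually batches at the JDBC level still depends on the driver and on the connection's auto-commit setting, which is exactly the behaviour I am asking about.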