Well I'm not saying that it's the db instance. I just said something isn't closing...
In looking at withBatch/addBatch, you seem to be using it incorrectly. Take
another look at the functionality.

Owen Rubel
415-971-0976
oru...@gmail.com

On Tue, Jul 14, 2015 at 7:20 AM, Daniel Price <danprice...@gmail.com> wrote:

> I continue to use the db instance and close it at the end of the script.
> If that were the issue, would it not occur 100% of the time? Hangs only
> happen on about 1 in 20 runs... I'll experiment with closing at the end of
> each batch...
>
> Thanks,
> D
>
> On Tue, Jul 14, 2015 at 9:54 AM, Owen Rubel <oru...@gmail.com> wrote:
>
>> If hangs are always happening at the end of a batch, you are probably
>> not closing something.
>>
>> Owen Rubel
>> 415-971-0976
>> oru...@gmail.com
>>
>> On Tue, Jul 14, 2015 at 6:20 AM, Daniel Price <danprice...@gmail.com>
>> wrote:
>>
>>> The Groovy script's withBatch() is still hanging...
>>>
>>> Database trace and activity monitor do not indicate any locks, and all
>>> data is inserted/updated as expected, so this doesn't seem to be a DB
>>> issue...
>>>
>>> I'm wondering why I'm not seeing a non-empty return from my batch code:
>>>
>>> myList = [['a','b'],['c','d']]
>>>
>>> def result = sql.withBatch(someInt, "insert into myDB.dbo.myTable
>>>         (column1, column2) values (?,?)") { ps ->
>>>     myList.each {
>>>         ps.addBatch(it)
>>>     }
>>>     ps.executeBatch()
>>> }
>>> println "result: " + result
>>>
>>> prints:
>>> result: []
>>>
>>> Shouldn't result == [1] * myList.size()?
>>>
>>> Hangs always happen upon completion of the batch...
>>>
>>> Regards,
>>> D
>>>
>>> On Mon, Jul 13, 2015 at 4:06 PM, Daniel Price <danprice...@gmail.com>
>>> wrote:
>>>
>>>> Good Afternoon,
>>>>
>>>> I'm experiencing infrequent (about 1 in 20 runs) hangs with a Groovy
>>>> script that uses withBatch() to insert into and update a SQL Server DB
>>>> using stored procedures. In each case (insert or update), the batch
>>>> completes, as indicated by the database content, but the script hangs
>>>> and does not continue until I ctrl-c it. This script is the only user
>>>> of the table, so DB locking isn't involved. The batches are only about
>>>> 200k rows, but I set the withBatch() parameter to 25k to be nice to the
>>>> DB. Has anybody seen this before? Any suggestions? Thanks!
>>>>
>>>> D
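[Editor's note: for reference, a minimal sketch of the usage the withBatch
documentation describes: queue rows with addBatch() and let withBatch()
flush the statement and return the accumulated update counts, rather than
calling executeBatch() inside the closure; then close the Sql instance in a
finally block so the connection is released even on failure. The connection
URL, credentials, driver class, and table name below are placeholders, not
taken from the thread.]

import groovy.sql.Sql

// Hypothetical connection details for illustration only.
def sql = Sql.newInstance(
        'jdbc:sqlserver://localhost;databaseName=myDB',
        'user', 'password',
        'com.microsoft.sqlserver.jdbc.SQLServerDriver')

try {
    def myList = [['a', 'b'], ['c', 'd']]

    // withBatch() flushes the batch itself: it executes every batchSize
    // rows and once more after the closure returns, and it returns the
    // accumulated update counts.
    def result = sql.withBatch(25000,
            'insert into myDB.dbo.myTable (column1, column2) values (?, ?)') { ps ->
        myList.each { row ->
            ps.addBatch(row)   // just queue the rows; no manual executeBatch()
        }
    }
    println "update counts: ${result as List}"
} finally {
    sql.close()   // always release the connection, even if the batch throws
}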