Not out of the box, and it probably doesn't make sense to do so out of the box.
Once you're in that situation, you might prefer using the Loader API, as documented here:
http://www.jooq.org/doc/latest/manual/sql-execution/importing/importing-csv

The Loader API has the advantage that you can fine-tune:

- Bulk size (i.e. the number of rows per statement)
- Batch size (i.e. the number of statements per batch execution)
- Commit size (i.e. the number of batch executions per transaction)

The above metrics are highly database dependent. Usually, you shouldn't pick numbers that are too small or too large for any of the above.

There has also been a similar discussion on this user group recently:
https://groups.google.com/d/msg/jooq-user/PalloixXTr4/4ZCfA_gVGgAJ

In the upcoming jOOQ 3.7, we'll also support loading non-serialized data, i.e. formats other than CSV and JSON, such as arrays or records.

I hope this helps,
Lukas

2015-09-25 9:40 GMT+02:00 jdoe <[email protected]>:

> Hello!
>
> Having lots of VALUES in my resulting INSERT query leads to exceeding the
> max_allowed_packet size.
> I'm wondering if jOOQ offers any "splitting" functionality for laaaaarge
> INSERTs?
>
> --
> You received this message because you are subscribed to the Google Groups
> "jOOQ User Group" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> For more options, visit https://groups.google.com/d/optout.
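For what it's worth, the three tuning knobs from the Loader API could look roughly like this in code. This is only a sketch: the `ctx` (a `DSLContext`), the generated `AUTHOR` table and its columns, the file name, and the concrete sizes are all hypothetical placeholders, not values from this thread:

```java
// Sketch of the jOOQ Loader API with the three tuning options discussed above.
// AUTHOR and its columns are hypothetical generated table references, and the
// sizes below are placeholders that should be tuned per database.
ctx.loadInto(AUTHOR)
   .bulkAfter(64)     // bulk size: rows per INSERT statement
   .batchAfter(8)     // batch size: statements per JDBC batch execution
   .commitAfter(4)    // commit size: batch executions per transaction
   .loadCSV(new File("authors.csv"))
   .fields(AUTHOR.ID, AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME)
   .execute();
```

With settings like these, a very large CSV file never produces a single oversized INSERT, so limits such as MySQL's max_allowed_packet are easier to stay under.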
