I am trying to use the SqlBulkInsert operation to insert records from a 250-million-line CSV file. I set the "BatchSize" parameter to 10,000, expecting the engine to perform a bulk insert every 10,000 records. However, that isn't happening. Looking at the Rhino.Etl source code, it appears that the SqlBulkInsert operation tries to insert all of the rows in a single operation, which won't work when there are as many rows as in my case.
Is there another way to batch out the bulk inserts?

--
You received this message because you are subscribed to the Google Groups "Rhino Tools Dev" group. To view this discussion on the web visit https://groups.google.com/d/msg/rhino-tools-dev/-/9nckkRpPWDAJ.
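One possible workaround, sketched below, is to bypass the operation's all-at-once behavior and drive SqlBulkCopy directly, flushing a fixed-size buffer of rows to the server and clearing it before reading more. This is not Rhino.Etl's own API; the connection string, table name, and two-column schema are hypothetical placeholders, and the row-reading side is assumed to be a streaming IEnumerable (e.g. from a CSV reader), so the full 250 million rows are never held in memory at once.

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Sketch of chunked bulk inserts with plain SqlBulkCopy, assuming rows
// arrive as a lazily-evaluated sequence of dictionaries. All names here
// (table, columns, connection string) are illustrative placeholders.
public static class ChunkedBulkInsert
{
    public static void Insert(IEnumerable<IDictionary<string, object>> rows,
                              string connectionString,
                              string tableName,
                              int batchSize = 10000)
    {
        var buffer = new DataTable();
        buffer.Columns.Add("Id", typeof(int));      // hypothetical schema
        buffer.Columns.Add("Name", typeof(string));

        foreach (var row in rows)
        {
            var dataRow = buffer.NewRow();
            dataRow["Id"] = row["Id"];
            dataRow["Name"] = row["Name"];
            buffer.Rows.Add(dataRow);

            if (buffer.Rows.Count >= batchSize)
            {
                Flush(buffer, connectionString, tableName);
                buffer.Clear(); // release this batch before reading further rows
            }
        }

        if (buffer.Rows.Count > 0)
            Flush(buffer, connectionString, tableName); // final partial batch
    }

    private static void Flush(DataTable batch, string connectionString, string tableName)
    {
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = tableName;
            bulkCopy.WriteToServer(batch);
        }
    }
}
```

The same chunk-and-flush loop could presumably be wrapped inside a custom Rhino.Etl operation (overriding Execute on an AbstractOperation) if you want to keep it in the pipeline, but the core point is that the flush happens every batchSize rows rather than once at the end.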
