I'm looking at the source code right now, and SqlBulkInsertOperation 
doesn't even appear to use the BatchSize parameter. Am I looking at the 
wrong class?
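
For readers following along, here is a minimal sketch of how the settings discussed in this thread would typically be wired up. This is an assumption-laden illustration, not verified against the Rhino.Etl source: the constructor shape, the `UseTransaction` property, and the `PrepareSchema`/`Schema` members are based on how the operation is described in the replies below, and the connection string name and table are placeholders.

```csharp
// Hypothetical sketch of a Rhino.Etl bulk-insert operation with the
// settings suggested in this thread. Member names are assumptions.
public class InsertCsvRows : SqlBulkInsertOperation
{
    public InsertCsvRows()
        : base("MyConnectionString", "dbo.TargetTable") // placeholder names
    {
        BatchSize = 10000;       // intended: flush to SQL Server every 10,000 rows
        UseTransaction = false;  // intended: avoid one giant commit at the end
    }

    protected override void PrepareSchema()
    {
        // Map row keys to target-table column types, e.g.:
        Schema["Id"] = typeof(int);
        Schema["Name"] = typeof(string);
    }
}
```

Whether `BatchSize` actually takes effect is exactly the question raised above: the underlying `SqlBulkCopy` class does expose a `BatchSize` property, so the operation would need to forward the value for it to matter.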

On Tuesday, July 24, 2012 2:57:07 PM UTC-6, Nathan Palmer wrote:
>
> For a file of that size I would also recommend disabling the transaction; 
> otherwise it will still commit all 250 million rows in one go at the end.
>
> BatchSize = 10000, UseTransaction = false
>
> Nathan Palmer
>
> On Tue, Jul 24, 2012 at 3:09 PM, Michael Gates <[email protected]>wrote:
>
>> I am trying to use the SqlBulkInsert macro to insert records from a 
>> 250-million-line CSV. I set the "BatchSize" parameter to 10,000, thinking 
>> that this would make the engine perform a bulk insert operation every 
>> 10,000 records. However, this isn't happening. Looking at the Rhino.Etl 
>> source code, it appears that the SqlBulkInsert operation tries to insert 
>> all the records from the rows value at once, which won't work when there 
>> are as many rows as in my case.
>>
>> Is there another way to batch out the bulk inserts?
>>
>> -- 
>> You received this message because you are subscribed to the Google Groups 
>> "Rhino Tools Dev" group.
>> To view this discussion on the web visit 
>> https://groups.google.com/d/msg/rhino-tools-dev/-/9nckkRpPWDAJ.
>> To post to this group, send email to [email protected].
>> To unsubscribe from this group, send email to 
>> [email protected].
>> For more options, visit this group at 
>> http://groups.google.com/group/rhino-tools-dev?hl=en.
>>
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Rhino Tools Dev" group.
To view this discussion on the web visit 
https://groups.google.com/d/msg/rhino-tools-dev/-/1ZbnKtTn3HQJ.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/rhino-tools-dev?hl=en.
