I was setting it on the operation. I took your suggestion and it still
didn't work. What did work was overriding the Execute method, like so:
    public override IEnumerable<Row> Execute(IEnumerable<Row> rows)
    {
        // Set these here, inside Execute, so they take effect after
        // registration (which would otherwise overwrite them with the
        // process defaults).
        BatchSize = 5000;
        UseTransaction = false;
        LockTable = false;
        return base.Execute(rows);
    }
This was the ONLY way I found to get it to load a table without blocking
other users.
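For context, here is a minimal sketch of what the whole operation class might look like. The connection string name, table name, and column mappings are hypothetical, and the base-class constructor and PrepareSchema/Schema members are as I understand them from the Rhino.Etl source, so treat this as a sketch rather than a drop-in implementation:

```csharp
using System;
using System.Collections.Generic;
using Rhino.Etl.Core;
using Rhino.Etl.Core.Operations;

public class NonBlockingBulkInsert : SqlBulkInsertOperation
{
    public NonBlockingBulkInsert()
        // "MyConnection" and "dbo.MyTable" are placeholders.
        : base("MyConnection", "dbo.MyTable")
    {
    }

    protected override void PrepareSchema()
    {
        // Map pipeline row keys to target table columns (hypothetical columns).
        Schema["Id"] = typeof(int);
        Schema["Name"] = typeof(string);
    }

    public override IEnumerable<Row> Execute(IEnumerable<Row> rows)
    {
        // Applied here, not in the constructor: the process copies its own
        // UseTransaction value (default true) onto the operation at
        // registration time, overwriting anything set earlier.
        BatchSize = 5000;
        UseTransaction = false;
        LockTable = false;
        return base.Execute(rows);
    }
}
```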
On Thursday, September 27, 2012 12:02:59 PM UTC-4, miles wrote:
>
> Hi,
>
> Are you setting UseTransaction=false on the whole pipeline or just the
> operation?
>
> I noticed the other day that when you register the operation, the
> UseTransaction setting is set with the value from the _process_ itself
> (default true), overwriting anything you have applied to the actual
> operation :
>
>
> https://github.com/hibernating-rhinos/rhino-etl/blob/master/Rhino.Etl.Core/EtlProcessBase.cs
>
> (line 55)
>
> Perhaps that is the cause of this behaviour?
>
> Miles
>
>
> On Thu, Sep 27, 2012 at 3:38 PM, oliwa <[email protected]> wrote:
> > I'm having the same issue. I'm reading data in from a data reader and
> > sending it to a SqlBulkInsertOperation. I've set the batch size, turned
> > off transactions, and even set LockTable to false, but RhinoETL is
> > still trying to bulk insert all the rows in one transaction.
> >
> > I watched SQL Profiler, so I know my batch size has taken effect. This
> > is rather frustrating.
> >
> > Any ideas?
> >
> >
> > On Tuesday, July 24, 2012 4:57:07 PM UTC-4, Nathan Palmer wrote:
> >>
> >> For a file of that size I would also recommend disabling the
> >> transaction, as it will otherwise still commit all 250 million rows
> >> at once at the end.
> >>
> >> BatchSize = 10000, UseTransaction = false
> >>
> >> Nathan Palmer
> >>
> >> On Tue, Jul 24, 2012 at 3:09 PM, Michael Gates <[email protected]>
> >> wrote:
> >>>
> >>> I am trying to use the SqlBulkInsert macro to insert records from a
> >>> 250 million line CSV. I set the "BatchSize" parameter to 10,000,
> >>> thinking that this would make the engine perform a bulk insert
> >>> operation every 10,000 records. However, this isn't happening.
> >>> Looking at the Rhino.Etl source code, it appears that the
> >>> SqlBulkInsert operation tries to insert all the records from the rows
> >>> value at once. This won't work when there are a lot of rows, as in my
> >>> case.
> >>>
> >>> Is there another way to batch out the bulk inserts?
> >>>
> >>> --
> >>> You received this message because you are subscribed to the Google
> >>> Groups "Rhino Tools Dev" group.
> >>> To view this discussion on the web visit
> >>> https://groups.google.com/d/msg/rhino-tools-dev/-/9nckkRpPWDAJ.
> >>> To post to this group, send email to [email protected].
> >>> To unsubscribe from this group, send email to
> >>> [email protected].
> >>> For more options, visit this group at
> >>> http://groups.google.com/group/rhino-tools-dev?hl=en.
> >>
> >>
>