Steve, what sebb describes is what I have done, btw.
A Loop Controller with a JDBC Sampler child (containing a single INSERT
statement whose ${variables} are defined in the CSV Data Set Config), which
in turn has a "CSV Data Set Config" child,
works quite well - you just don't get batch performance.
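For anyone wanting to reproduce that, the test plan shape is roughly as
follows - a sketch only, with file, table and column names invented for
illustration:

Thread Group
  Loop Controller (loop count = number of rows to insert)
    JDBC Request, with a query such as:
      INSERT INTO test_table (col1, col2) VALUES ('${col1}', '${col2}')
      CSV Data Set Config (Filename: data.csv; Variable Names: col1,col2)

Each iteration reads the next line of data.csv into ${col1} and ${col2}
and issues one single-row INSERT.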
-Abram
sebb wrote:
On 03/03/2008, Steve Miller <[EMAIL PROTECTED]> wrote:
Thanks Sebb,
I would expect the input data to be defined with either:
- some columns random and some columns fixed for each row
or
- reading the next n CSV or XML rows from a file for each batch, but using
the same file each time it is invoked, thus 'marching through' the file data
over time.
That's how the CSV Dataset works already.
I've now had a look at the docs, and AFAICS the JDBC "batch" feature
is not intended for batch loading; it seems to be intended to process
multiple different SQL statements as a single batch that either
succeeds or fails as a whole.
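For reference, the plain-JDBC batch pattern under discussion looks roughly
like this - a minimal sketch, with the connection URL, credentials, table
and column names all invented for illustration:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsertSketch {
    public static void main(String[] args) throws Exception {
        // URL, user, table and column names are placeholders - substitute your own.
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password");
        con.setAutoCommit(false);
        PreparedStatement ps = con.prepareStatement(
                "INSERT INTO test_table (col1, col2) VALUES (?, ?)");
        for (int i = 0; i < 100; i++) {
            ps.setInt(1, i);               // per-row parameter values
            ps.setString(2, "value" + i);
            ps.addBatch();                 // queue this parameter set
        }
        int[] counts = ps.executeBatch();  // send all queued rows in one go
        con.commit();
        ps.close();
        con.close();
        System.out.println("Inserted " + counts.length + " rows");
    }
}

The point is that executeBatch() sends all the queued parameter sets in a
single round trip, which is what the JMeter sampler currently has no way to
express.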
One convenient approach, at least from the GUI perspective, could be to
add another variable, something like 'repeatCount', to the JDBC Sampler,
at least for inserts. It would default to 1; if set to >1, whatever was
specified for the parameter values (e.g. a mix of random and fixed)
would be repeated for each row, with a separate instantiation of the
'random' values per row/column.
Just enclose the sampler in a Loop Controller, and use a function to
generate the 'random' data.
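For example - table and column names invented here - the sampler's query
could be:

INSERT INTO test_table (id, amount) VALUES (${__Random(1,100000)}, ${__Random(1,999)})

The __Random function is re-evaluated on each iteration of the Loop
Controller, so every inserted row gets fresh values.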
This isn't critical to my testing - I'm just doing more single-row
inserts now as a first approximation - though I need to be cautious about
potential differences in latency of concurrent read operations.
Steve
sebb wrote:
> On 03/03/2008, Steve Miller <[EMAIL PROTECTED]> wrote:
>
>> I've got some JDBC samplers running, and for one of them what I really
>> want to do is a prepared statement that inserts many (from 10 to 100)
>> rows at once. JDBC has the PreparedStatement.addBatch() method, along
>> with PreparedStatement.executeBatch(). Is there a convenient way to do
>> the equivalent with JMeter?
>>
>
> Not at present.
>
> One could put the insert values in a file and loop over them with an
> INSERT statement.
>
> If the batch feature were to be added, how would you expect the input
> data to be defined?
>
>
>> I've seen an Oracle 'kludge' that allows you
>> to do a 'batch insert' with an SQL statement like:
>> INSERT INTO table (column1, column2)
>> select value1a, value2a from dual union all
>> select value1b, value2b from dual union all
>> ...
>> select value1...,value2... from dual;
>>
>> but that would clearly get very awkward at best using the parameter
>> values and parameter type input fields on the JDBC sampler.
>>
>> thanks,
>>
>
> If there are a lot of rows to be added, I would probably do this
> before starting the test using whatever utility is most suitable.
>
>
>> Steve