Hi,

At Logstash we have an enrichment filter that lets a user query a 
remote database (using Sequel) and populate a local Derby database with 
the results.

The existing code is:


  # one multi-row INSERT for the whole result set
  @db[loader.temp_table].multi_insert(records)


With large result sets we hit a Derby bug:
<https://issues.apache.org/jira/browse/DERBY-1735>.

I have tried paginating the insert inside a transaction; it is slow, but 
it works.
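
Roughly, the paginated version looks like this (the batch size of 1000 
is arbitrary, just for illustration):


  # insert in fixed-size batches inside one transaction, so a failure
  # rolls everything back and each individual statement stays small
  @db.transaction do
    records.each_slice(1000) do |batch|
      @db[loader.temp_table].multi_insert(batch)
    end
  end
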

In researching the bug I saw advice to use Derby's data import 
procedure, SYSCS_UTIL.SYSCS_IMPORT_TABLE, which is claimed to be more 
robust and more performant.

Initial tests bear this out; however, I have to persist the data to a 
file before I can import it. To do this I am using:


  ::File.open(import_file, "w") do |fd|
    dataset = @db[loader.table]
    records.each do |hash|
      # let Sequel quote/escape each value as a SQL literal
      array = hash.values.map { |val| dataset.literal(val) }
      fd.puts(array.join(", "))
    end
    fd.fsync # ensure the data is on disk before the import runs
  end


This is OK and works really well.
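
For completeness, the import itself is then a single procedure call, 
something along these lines (the table name is a placeholder, null for 
the schema means the current one, and the character delimiter matches 
Sequel's single-quoted string literals):


  # args: schema, table, file, column delimiter, character delimiter,
  # codeset, replace flag (non-zero empties the table before loading)
  @db.run("CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE(" \
          "null, 'TEMP_TABLE', '#{import_file}', ',', '''', 'UTF-8', 1)")
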

I was wondering: does Sequel offer a more elegant/concise way of 
generating the CSV literal string?

Thanks,

Guy Boertje
Elastic::Logstash.team


