So, I've been fighting a performance issue with one of my services. At 
times it gets really bogged down and becomes very slow to respond. My first 
thought was that maybe my interaction with the DB is slow (DB server is bad, 
indices are not right... whatever).

So I tried to add some metrics around it, like this:

--------
def blah(requested_name, requested_type, requested_start_time, requested_stop_time)
  begin_time = Time.now
  my_data_set = @my_db.db[:my_metadata].where(name: requested_name)
                   .where(type: requested_type)
                   .where { mystoptime >= requested_start_time }
                   .where { mystarttime <= requested_stop_time }
                   .reverse_order(:myreceipttime)

  # this only times building the dataset
  duration = Time.now - begin_time
  log_db_metrics(__method__, duration)

  # copy the rows into an array to return
  ret_array = []
  my_data_set.each do |row|
    ret_array.push(row)
  end
  ret_array
end
----

The value of "duration" was always tiny. However, I now realize that the 
bulk of my time is spent in the creation of "ret_array".

I started trying to remember how this works in Sequel. Is the DB work really 
done at the initial creation of my_data_set, or is the reading from the DB 
actually happening in that my_data_set.each...?
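
One thing I'm planning to try is attaching a logger to the connection so I can 
see exactly when the SQL statement is actually sent. Rough sketch (this assumes 
a plain Sequel connection and a made-up connection string; my real @my_db 
wrapper may differ):

--------
require 'sequel'
require 'logger'

# hypothetical connection string, standing in for whatever @my_db wraps
DB = Sequel.connect('postgres://localhost/mydb')
DB.loggers << Logger.new($stdout)   # every SQL statement gets logged when it runs

ds = DB[:my_metadata].where(name: 'foo')   # does anything get logged here?
rows = ds.all                              # ...or only here, when the rows are fetched?
--------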

The times I am talking about: that calculation I made for "duration" is 
like 0.01 seconds, but the timing around the my_data_set.each is like 30 seconds. 
I know that data set is NOT that large, so I am thinking it must be doing the 
DB work there.
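
To confirm where the 30 seconds actually goes, my next step is to put a second 
timer around the materialization itself, something like this (same query as 
above, just using .all to pull all the rows in one call):

--------
build_start = Time.now
my_data_set = @my_db.db[:my_metadata]
                 .where(name: requested_name)
                 .where(type: requested_type)
                 .where { mystoptime >= requested_start_time }
                 .where { mystarttime <= requested_stop_time }
                 .reverse_order(:myreceipttime)
build_duration = Time.now - build_start      # just building the dataset

fetch_start = Time.now
ret_array = my_data_set.all                  # fetch and materialize every row
fetch_duration = Time.now - fetch_start      # should show where the 30 seconds lives

log_db_metrics("#{__method__}_build", build_duration)
log_db_metrics("#{__method__}_fetch", fetch_duration)
--------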

Thanks. It's been a few years since I looked at this code; I just started 
digging into it again when we started having horrible performance issues.
