I think I answered my own question: yes, when I still have a
"Sequel::Postgres::Dataset", I haven't really "read" the data out. That
doesn't happen until I call ".each" or, as I've changed it to now, ".all".
So now I have this:
--------
def blah(asdf)
  begin_time = Time.now
  my_data_set = ADS.db[:my_metadata].where(name: requested_name)
                                    .where(type: requested_type)
                                    .where { mystoptime >= requested_start_time }
                                    .where { mystarttime <= requested_stop_time }
                                    .reverse_order(:myreceipttime)
  ret_array = my_data_set.all
  duration = Time.now - begin_time
  log_db_metrics(__method__, duration)
  ret_array
end
------
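Sequel datasets are lazy: chaining where / reverse_order only builds up the SQL, and nothing hits the database until a retrieval method like all or each runs. Here's a stdlib-only sketch of that pattern (LazyDataset and its methods are hypothetical stand-ins for illustration, not Sequel's actual API):

```ruby
# Building the "dataset" just records the filters and runs nothing;
# only #all (or #each) actually does the work, which is why timing
# the dataset creation alone always looks instant.
class LazyDataset
  attr_reader :executed

  def initialize(rows)
    @rows = rows
    @filters = []
    @executed = false
  end

  # Chainable, like Sequel's Dataset#where: stores the filter, runs nothing.
  def where(&block)
    @filters << block
    self
  end

  # Only here is the "query" actually evaluated.
  def all
    @executed = true
    @filters.reduce(@rows) { |rows, f| rows.select(&f) }
  end
end

ds = LazyDataset.new([{ name: "a", v: 1 }, { name: "b", v: 2 }])
               .where { |r| r[:v] > 1 }
ds.executed    # => false  (no work done yet)
rows = ds.all
ds.executed    # => true
rows           # => [{ name: "b", v: 2 }]
```

So in the snippet above, the fix of moving the duration calculation to after my_data_set.all means the timer finally covers the real query.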
That makes total sense now, and explains where I was seeing slowdowns. And I
was blaming the GIL in Ruby... (there is a ton of other stuff going on in this
app)
On Sunday, August 15, 2021 at 7:39:02 PM UTC [email protected] wrote:
>
> So, I've been fighting an issue with performance of one of my services. At
> times it gets really bogged down and is very slow to respond. My first
> thought was: maybe my interaction with the DB is slow (DB server is bad,
> indices are not right... whatever).
>
> So I tried to add some metrics around this, like so:
>
> --------
> def blah(asdf)
>   begin_time = Time.now
>   my_data_set = @my_db.db[:my_metadata].where(name: requested_name)
>                                        .where(type: requested_type)
>                                        .where { mystoptime >= requested_start_time }
>                                        .where { mystarttime <= requested_stop_time }
>                                        .reverse_order(:myreceipttime)
>
>   duration = Time.now - begin_time
>   log_db_metrics(__method__, duration)
>
>   ret_array = []
>   my_data_set.each do |row|
>     ret_array.push(row)
>   end
>   ret_array
> end
> ----
>
> The value of "duration" was always tiny. However, I now realize that the
> bulk of the time is spent in the creation of "ret_array".
>
> I started trying to remember how this works in Sequel. Is the DB work
> really done on the initial creation of my_data_set, or does the actual
> reading from the DB happen in that my_data_set.each...?
>
> The times I am talking about: that calculation I made for "duration" is
> like 0.01 seconds, but the timing around my_data_set.each is like 30
> seconds. I know that data set is NOT that large, so I am thinking that
> must be where the DB work happens.
>
> Thanks, it's been a few years since I looked at this code. I just started
> digging into it when we started having horrible performance issues.
>
>
--
You received this message because you are subscribed to the Google Groups
"sequel-talk" group.