Hello,
I ran into a problem comparing two timestamps: updated_at, which is set
automatically, and processed_at, which my app sets when a record is marked
as processed (to use my app's domain language). This query returns too many
records:
Person.where('updated_at > processed_at')
I believe the problem is that I set processed_at to Time.now.utc explicitly
while letting updated_at be set automatically:
person.update_all(processed_at: Time.now.utc)
I expected the timestamps to match and didn't realize I was dealing with
microsecond precision. To take an example record:
updated_at: 2016-08-19 22:23:19.391
processed_at: 2016-08-19 22:23:19.390
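For what it's worth, here's a small sketch (plain Ruby, nothing app-specific)
showing where the fractional seconds live; Time.now.utc carries sub-second
precision even though it doesn't show up in the default string output:

t = Time.now.utc
t.to_s                                # e.g. "2016-08-19 22:23:19 UTC" (no fraction shown)
t.usec                                # e.g. 390712, the fractional part in microseconds
t.strftime('%Y-%m-%d %H:%M:%S.%6N')   # e.g. "2016-08-19 22:23:19.390712"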
I believe the fix would be to compare the timestamps after reducing both to
whole-second precision, like this for Postgres:
Person.where('updated_at::timestamp(0) > processed_at::timestamp(0)')
...or, more portably, with standard CAST syntax:
Person.where('CAST(updated_at AS timestamp(0)) > CAST(processed_at AS timestamp(0))')
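If it helps anyone later, here's a minimal sketch (assuming an
ActiveRecord-style Person model, as in the snippets above) that keeps the
cast in one named scope instead of repeating the raw SQL:

class Person < ActiveRecord::Base
  # Records whose automatic updated_at is newer than processed_at,
  # compared at whole-second precision to ignore sub-second drift.
  scope :updated_since_processed, -> {
    where('CAST(updated_at AS timestamp(0)) > CAST(processed_at AS timestamp(0))')
  }
end

Person.updated_since_processed.count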
Going forward, I believe I can avoid this being an issue by explicitly
setting both timestamps:
now = Time.now.utc
person.update_all(processed_at: now, updated_at: now)
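As a sketch of how I'd package that (mark_processed! is just a hypothetical
helper name), so both columns always get exactly the same value:

def mark_processed!(people)
  now = Time.now.utc
  # Pin both columns to the same Time instance so they compare equal
  # at full precision; update_all writes them in a single statement.
  people.update_all(processed_at: now, updated_at: now)
end

mark_processed!(Person.where(processed_at: nil))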
...but I wanted to check in to make sure I'm getting this right, and to
leave some breadcrumbs for future Googlers who might run into this same
issue :)
Thanks!
- Trevor