Greetings!

 

My test query is:

 

select tag_key, value,
       datetime(value_timestamp, 'localtime') AS localtime,
       datetime(value_timestamp) AS UTCtime
from trend_view
where trend_key = 1
  and value_timestamp >= julianday('2006-07-27 10:08:32.000')
order by value_timestamp desc;

 

My table has records recorded every hour, roughly on the hour.  The
program that generated this query wanted to get the last 24 hours' worth
of data.  It was run at 10:08 on July 28th.
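As an aside, here is a sketch (not the original program) of how a rolling
24-hour cutoff can be computed entirely inside SQLite, using Python's
built-in sqlite3 module, so the calling program never has to build the
timestamp literal itself:

```python
import sqlite3

# julianday('now') is evaluated in UTC; the '-1 day' modifier subtracts
# exactly 24 hours. Both expressions in one statement see the same 'now'.
con = sqlite3.connect(":memory:")
cutoff, now = con.execute(
    "SELECT julianday('now', '-1 day'), julianday('now')"
).fetchone()
print(now - cutoff)  # difference in (Julian) days
```

The real query would then compare value_timestamp >= julianday('now', '-1 day').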

 

The earliest data returned was recorded at 11:01 UTC on July 27th, or
7:01 AM EDT on July 27th.  That's too far back by four hours (the
difference between EDT and UTC).

 

When I changed the query to use julianday('2006-07-27 10:08:32.000',
'utc'), the earliest data returned was recorded at 15:01 UTC on July
27th, which is 11:01 EDT on July 27th.  That is what I want.

 

When I changed the query to use julianday('2006-07-27 10:08:32.000',
'localtime'), the earliest data returned was recorded at 07:01 UTC on
July 27th, which is 03:01 AM EDT on July 27th — far too early.

 

So, I conclude the following:

If no modifier is provided to the julianday() function, the given date
is assumed to be UTC.

If the 'utc' modifier is used, the given date is assumed to be local
time.  It is converted to UTC before the comparison is made.

If the 'localtime' modifier is used, the given date is assumed to be
UTC, and it is converted to local time before the comparison is made.

Are those conclusions correct?  And I presume the datetime() function
operates the same way?
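For what it's worth, the three interpretations can be checked directly
with datetime() from Python's built-in sqlite3 module.  The sketch below
forces the process timezone to US Eastern (an assumption on my part, to
reproduce the EDT behavior above; tzset() is POSIX-only):

```python
import os, sqlite3, time

# Force US Eastern time so SQLite's 'localtime'/'utc' modifiers behave
# as they did for the original query (EDT = UTC-4 in July 2006).
os.environ["TZ"] = "America/New_York"
time.tzset()

con = sqlite3.connect(":memory:")
ts = "2006-07-27 10:08:32"

plain, utc, local = con.execute(
    "SELECT datetime(?), datetime(?, 'utc'), datetime(?, 'localtime')",
    (ts, ts, ts),
).fetchone()

print(plain)  # 2006-07-27 10:08:32 -- no modifier: input treated as UTC
print(utc)    # 2006-07-27 14:08:32 -- input treated as local, shifted +4h
print(local)  # 2006-07-27 06:08:32 -- input treated as UTC, shifted -4h
```

If the conclusions above are right, only the no-modifier form leaves the
timestamp unchanged, which matches what the query comparisons showed.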

 

Thanks again!

 

Rob Richardson

RAD-CON INC.

 
