If we really need millisecond accuracy in timestamps, we should go with
INTEGER and use int(time.time() * 1000) in the Python code.
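For illustration, a minimal sketch of that approach; the helper name is my
own, not anything existing in Zeitgeist:

    import time

    def timestamp_ms():
        # Current Unix time in milliseconds, as an integer, which
        # avoids the precision/comparison issues of a REAL column.
        return int(time.time() * 1000)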
But on the other hand, I don't see why we need this at all. Events which
happen at the same time from the user's point of view will still get different
timestamps. One example: a user opens Firefox with 10 predefined pages loaded
in 10 tabs. For the user, the 'load' event for all these pages happened at the
same time ('at Firefox startup'), but the resulting events will all have
different timestamps.
Or is it up to the clients to decide which events happened at the same time?
Or should the data provider in such situations just send one event with
multiple subjects?
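To make that last option concrete, here is a rough sketch of what 'one event
with multiple subjects' could look like; the Event and Subject classes below
are hypothetical stand-ins, not the actual Zeitgeist API:

    import time

    class Subject:
        def __init__(self, uri):
            self.uri = uri

    class Event:
        def __init__(self, interpretation, timestamp, subjects):
            self.interpretation = interpretation
            self.timestamp = timestamp  # one timestamp for the whole event
            self.subjects = subjects    # e.g. every tab restored at startup

    # All 10 restored tabs share a single 'load' event and one timestamp.
    event = Event(
        interpretation="load",
        timestamp=int(time.time() * 1000),
        subjects=[Subject("http://example.com/page%d" % i) for i in range(10)],
    )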

-- 
Use timestamps with milliseconds granularity (was: use REAL)
https://bugs.launchpad.net/bugs/483603

Status in Zeitgeist Engine: New

Bug description:
Seconds are not fine-grained enough to differentiate events, so we should store 
the timestamps as floating-point numbers.
