I agree it would be useful, but it will be tricky to implement. Moreover,

    row = db().select(db.table.field.date()).first()

would not put the value into row.field but into row[db.table.field.date()].
It would be much easier to just provide functions row.field.asdate() and
row.field.astime().

On Jul 20, 5:03 pm, Angelo Compagnucci <[email protected]> wrote:
> Hello list members,
>
> Sometimes it is useful to extract the date or the time from a Field
> object without traversing the rows returned by a select statement.
>
> Here is an example:
>
>     db.define_table('test', Field('data', 'datetime'))
>
>     def getdates():
>         rows = db(db.test).select(db.test.data.date())
>         return dict(rows=rows)
>
>     def gettimes():
>         rows = db(db.test).select(db.test.data.time())
>         return dict(rows=rows)
>
> This could be useful in all the cases where you have to produce
> statistics based on dates and times; think of "how many accesses did I
> have per day? Or per month?"
>
> An example:
>
>     hsname = 'testhotspot'
>     entry_date = (dbradius.radacct.AcctStartTime.year() |
>                   dbradius.radacct.AcctStartTime.month() |
>                   dbradius.radacct.AcctStartTime.day())
>     rows = dbradius(dbradius.radacct.CalledStationId == hsname).select(
>         count,
>         dbradius.radacct.RadAcctStartTime.date(),
>         groupby=entry_date)
>
> Without the date method on the Field object, I have to traverse the rows
> object and produce another rows object with the datetimes replaced by
> dates. It's not elegant and really time consuming when you have many
> rows. Extracting the date or time directly on the database side is a
> fast operation and it is supported by all major databases. I verified
> this, and I think every database supported by the DAL should have these
> options.
>
> I apologize for reopening this discussion (I previously made another
> request on the mailing list), but Massimo was not here, so he can read
> it this time!
>
> I'm willing to implement this feature if anybody is interested.
>
> I have already added the two methods (date and time), but what I obtain
> is a string, not a date or a time object as I was expecting. Can anyone
> help me?
>
> Thank you
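Regarding the last question (getting a string back instead of a date/time
object): until something like asdate()/astime() exists in the DAL, a small
Python-side conversion can bridge the gap. This is only a sketch under the
assumption that the backend returns ISO-formatted strings ("YYYY-MM-DD" and
"HH:MM:SS", which is what most databases emit for DATE()/TIME()); the helper
names as_date/as_time are mine, not part of the DAL:

```python
from datetime import datetime, date, time

def as_date(value):
    """Convert an ISO 'YYYY-MM-DD' string (as returned by a db-side
    DATE() call) into a datetime.date; pass through if already a date."""
    if isinstance(value, date) and not isinstance(value, datetime):
        return value
    return datetime.strptime(value, '%Y-%m-%d').date()

def as_time(value):
    """Convert an ISO 'HH:MM:SS' string (as returned by a db-side
    TIME() call) into a datetime.time; pass through if already a time."""
    if isinstance(value, time):
        return value
    return datetime.strptime(value, '%H:%M:%S').time()

# Example with the kind of strings select(db.test.data.date()) returns:
print(as_date('2011-07-20'))  # 2011-07-20
print(as_time('17:03:00'))    # 17:03:00
```

A proper fix inside the DAL would do the same conversion in the row parser,
based on the declared result type of the date()/time() expression, so user
code never sees the raw string.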

