Re-splitting the hair in half: I'm comparing db(query).select(cacheable=True) with db(query).select(processor=myprocessor) to find the culprit. parse_value() is evidently the one inflating the times.

Stepping back a little, I can shave off about 1% (from 0.401 to 0.397) by changing line 2221 (https://github.com/web2py/web2py/blob/master/gluon/dal.py#L2221) to fields[j]['type'].

Let's move "forward" to parse_value(), which gets called 1000 * num_of_columns times. It has lots of ifs (lines 2016, 2021, 2023):
- commenting out the first drops the time to 0.389
- commenting out the second as well drops it to 0.318
- commenting out the third as well drops it to 0.299

So what I didn't expect to be eating up time are those 3 ifs, which account for roughly 25% of the total. I guess they are "safety precautions"... but are they really needed?

On Wednesday, November 20, 2013 9:51:17 PM UTC+1, Niphlod wrote:
>
> Splitting the hair in half: db(query)._select() takes roughly 0.0005 s on
> my machine, compared to the "total" of 0.026 s for executesql() with a
> pregenerated query, so we're talking about 2% of it.
> Generating the query is actually much faster than parsing the resultset
> (as expected).
> Let's see if I can break parse() down further to see where it "drops"
> from 0.022 to 0.396.
>

--
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
---
You received this message because you are subscribed to the Google Groups "web2py-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
For more options, visit https://groups.google.com/groups/opt_out.
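Since parse_value() runs once per value (1000 * num_of_columns calls here), even a few ifs per call add up. Below is a minimal sketch, not web2py's actual dal.py code, of the general pattern being discussed: branching on the field type for every value versus resolving a parser once per column before looping over the rows. The data, types, and helper names are invented for illustration.

```python
import timeit

# Hypothetical stand-in for the per-value branching (NOT dal.py's code).
ROWS, COLS = 1000, 10                       # ~1k rows, as in the benchmark
raw = [["42"] * COLS for _ in range(ROWS)]  # drivers often return strings
col_types = ["integer"] * COLS

def parse_value(value, ftype):
    # Branches on every call, i.e. ROWS * COLS times per select.
    if value is None:
        return None
    if ftype == "integer":
        return int(value)
    if ftype == "double":
        return float(value)
    return value

def parse_per_value(rows):
    return [[parse_value(v, col_types[j]) for j, v in enumerate(row)]
            for row in rows]

def make_parser(ftype):
    # The same branches, taken once per column instead of once per value.
    if ftype == "integer":
        return lambda v: None if v is None else int(v)
    if ftype == "double":
        return lambda v: None if v is None else float(v)
    return lambda v: v

def parse_hoisted(rows):
    parsers = [make_parser(t) for t in col_types]
    return [[p(v) for p, v in zip(parsers, row)] for row in rows]

assert parse_per_value(raw) == parse_hoisted(raw)
t_branch = timeit.timeit(lambda: parse_per_value(raw), number=50)
t_hoist = timeit.timeit(lambda: parse_hoisted(raw), number=50)
print("per-value: %.3fs  hoisted: %.3fs" % (t_branch, t_hoist))
```

Whether hoisting actually pays off here depends on whether those three ifs really can be decided per column rather than per value; if they guard per-value conditions (e.g. None handling), they have to stay in some form, as in the wrapped lambdas above.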

