On Tue, Sep 23, 2008 at 12:52 AM, David Cramer <[EMAIL PROTECTED]> wrote:
> For me, personally, it would be great if this could accept callables
> as well. So you could store the username, like so, or you could store
> a choices field like:
>
>    field = models.IntegerField(choices=CHOICES)
>    denorm = models.DenormField('self', 'get_field_display') # which
> would rock if it was .field.display ;)

I think denormalizing with callables is a very different thing from
denormalizing with expressions that the database can evaluate. Both are
worth supporting, but db-level expressions will be far easier and more
efficient to validate and update in bulk: no looping in Python, just
executing SQL. In this case, I think your example would be better
suited to an FK, for denorm purposes.

Callables would be useful for something more complicated like
abstracting auto_now to not be limited to dates, that is allowing a
field value to be set by a callable on save, not just create, for any
field type.
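To make the callable idea concrete, here is a minimal sketch of a field whose value is recomputed by a callable on every save, not just on creation. `ComputedField` and its `pre_save` hook are hypothetical names loosely modeled on Django's `Field.pre_save`, not an existing API:

```python
# Hypothetical sketch: a field whose value is produced by a callable on
# *every* save, generalizing auto_now beyond dates to any field type.

class ComputedField:
    def __init__(self, compute):
        self.compute = compute  # called with the model instance on each save

    def pre_save(self, instance, add):
        # auto_now_add-style fields would check `add` and only act on
        # creation; here we recompute unconditionally, on save and update.
        return self.compute(instance)


class Person:
    def __init__(self, first, last):
        self.first = first
        self.last = last


full_name = ComputedField(lambda p: f"{p.first} {p.last}")
print(full_name.pre_save(Person("Ada", "Lovelace"), add=False))  # Ada Lovelace
```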


On Tue, Sep 23, 2008 at 2:42 AM, Andrew Godwin <[EMAIL PROTECTED]> wrote:
> Still, with an
> AggregateField(Sandwiches.filter(filling="cheese").count()) it's still
> possible to work out that you want to listen on the Sandwiches model,
> and you could then fall back to re-running the count on every Sandwich
> save, even if it ends up not having a cheese filling.

I'm not sure I like the idea of accepting arbitrary QuerySets. It
could just be my point of view, but I see automatic denormalization
and aggregation as intimately tied: a denormalized field is just a
declarative aggregate expression that's optionally cached in a column.
I think this makes it easy to understand and document, since
aggregation queries and calculation fields would support the same
features, and it also lets the implementation share a lot with
aggregates. It's also better in terms of storing and updating a
calculation: you can calculate on reads or writes for N objects in one
query, without N subqueries, though it may involve a lot of joins.
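The "N objects in one query" point can be illustrated with a single correlated-subquery UPDATE that refreshes a cached aggregate column for every row at once. The schema (authors with a denormalized `book_count`) is invented for illustration, using sqlite just to keep the sketch runnable:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE author (id INTEGER PRIMARY KEY, book_count INTEGER DEFAULT 0);
    CREATE TABLE book (id INTEGER PRIMARY KEY, author_id INTEGER);
    INSERT INTO author (id) VALUES (1), (2);
    INSERT INTO book (id, author_id) VALUES (1, 1), (2, 1), (3, 2);
""")

# One statement updates the denormalized column for all N authors:
# no Python loop, no per-object query.
conn.execute("""
    UPDATE author SET book_count =
        (SELECT COUNT(*) FROM book WHERE book.author_id = author.id)
""")

print(conn.execute("SELECT id, book_count FROM author ORDER BY id").fetchall())
# [(1, 2), (2, 1)]
```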

> So, I think the best approach would be one to replicate fields (like my
> current DenormField; perhaps call it CopyField or something) and one to
> cache aggregates (an AggregateField, like above).

I'd also be hesitant to have two separate fields for these cases,
since copying a related field value is just a simple SQL expression. I
think the same calculation field could be used for both, by assuming a
string is a field name:

name = CalculationField('user.name')

or by using F expressions:

name = CalculationField(F('user.name'))
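A sketch of how one field could handle both spellings, assuming a bare string is shorthand for a field reference. `F` and `CalculationField` here are stand-in classes for the purposes of the example, not the real Django implementations:

```python
class F:
    """Placeholder for an F-style field reference."""
    def __init__(self, name):
        self.name = name


class CalculationField:
    def __init__(self, expr):
        # Treat a plain string as a field name and wrap it in F, so
        # CalculationField('user.name') and CalculationField(F('user.name'))
        # are equivalent.
        self.expr = F(expr) if isinstance(expr, str) else expr


print(CalculationField('user.name').expr.name)     # user.name
print(CalculationField(F('user.name')).expr.name)  # user.name
```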


Another approach to calculations, one that doesn't necessarily involve
denormalization, would be to use views. Russell talked to me a bit
about this at DjangoCon, and I think the idea was that if you solve
basing models on views (isn't that actually possible now? maybe we
need read-only fields), and then add view-creation support to the ORM,
then calculation fields can be implemented with views. I realize that
un-stored calculations re-implement views, but I don't know enough
about views to say whether they have performance advantages over
embedding the calculation in a query.
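For reference, the view-based alternative would look roughly like this at the SQL level: the calculation lives in a view column rather than a stored field, and a (read-only) model could sit on top of the view. The schema is invented for illustration, using sqlite to keep the sketch runnable:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE item (id INTEGER PRIMARY KEY, price REAL, qty INTEGER);
    INSERT INTO item VALUES (1, 2.5, 4), (2, 10.0, 1);
    -- The "calculation field" is just a computed column of the view;
    -- nothing is stored, so it can never go stale.
    CREATE VIEW item_totals AS
        SELECT id, price * qty AS total FROM item;
""")

print(conn.execute("SELECT id, total FROM item_totals ORDER BY id").fetchall())
# [(1, 10.0), (2, 10.0)]
```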

-Justin

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django developers" group.
To post to this group, send email to django-developers@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/django-developers?hl=en
-~----------~----~----~----~------~----~------~--~---
