Re: minify static files (css and js)

2011-08-10 Thread Silvio
I feel inclined to mention the tool I built:
https://www.sgawebsites.com/projects/django-aggregator/ . Very easy to
use, and easy to deploy.

-
Silvio

On Aug 9, 1:44 am, Alexander Schepanovski  wrote:
> I prefer webassets. You may also look into it.




Re: Improvements to better support implementing optimistic concurrency control

2011-08-10 Thread Anssi Kääriäinen

On 08/10/2011 03:18 PM, Simon Riggs wrote:

> That adds additional SELECT statements, which then extends a lock
> window between the check statement and the changes. It works, but in
> doing so it changes an optimistic lock into a pessimistic lock.

True. The issue I am trying to solve is guarding against concurrent
modifications to an object without taking a lock when the edit page is loaded.
I did some Googling and see that this is not what is meant by optimistic
locking. Sorry for that.


> IMHO the right way to do this would be to add the OptimisticLockField
> check as an additional item on the WHERE clause of the UPDATE or
> DELETE. If that action returns 0 rows then we know that the lock check
> failed and we can handle that. This keeps the locking optimistic and
> doesn't add any additional SQL statements.
> e.g.
>
> UPDATE foo
> SET col = newvalue, optimistic_lock_field = optimistic_lock_field + 1
> WHERE pkcol = p_key
> AND optimistic_lock_field = p_version
>
> DELETE FROM foo
> WHERE pkcol = p_key
> AND optimistic_lock_field = p_version
The problem with this is that I feel the check should already be done in the
data validation part, not when saving is already under way. But I guess that
is use-case specific. Doing the check while saving will result in situations
where half of the data is saved and the other half is not. model.save() isn't
atomic by itself without a savepoint, due to model inheritance. And if it is
not atomic, then it is easy to do a half-update from the shell, for example.
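
For illustration, a minimal sketch of making the whole save all-or-nothing by
wrapping it in a transaction (this uses the present-day transaction.atomic API
and a hypothetical helper name, not code from the thread):

from django.db import transaction

def atomic_save(obj):
    # Hypothetical helper. With multi-table inheritance, obj.save() writes one
    # row per table in the inheritance chain; the surrounding transaction makes
    # those writes all-or-nothing, so a failure part-way through (for example a
    # concurrency check raising) cannot leave a half-saved object behind.
    with transaction.atomic():
        obj.full_clean()  # validation step, where such a check could also live
        obj.save()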


On the other hand, if a clean implementation is written for this, including it
in Django would be really nice. It doesn't cost much in performance, and gives
a nice guard against concurrent edits. BTW, there is more discussion in ticket
#16549 (https://code.djangoproject.com/ticket/16549).


 - Anssi







Re: Improvements to better support implementing optimistic concurrency control

2011-08-10 Thread Simon Riggs
On Tue, Aug 9, 2011 at 1:33 PM, akaariai  wrote:
> On Aug 9, 1:17 am, Steven Cummings  wrote:
>> I don't think we're talking about new or specific fields as part of the base
>> implementation here. Just enhanced behavior around updates to:
>>
>> 1) Provide more information about the actual rows modified
>> 2) Check preconditions with the actual DB stored values; and
>> 3) Avoid firing post-update/delete signals if nothing was changed
>>
>> From there you could implement fields as you see fit for your app, e.g.,
>> version=IntegerField() that you use in a precondition.
>
> That would be useful. Especially if that can be done without too much
> code duplication.
>
> I had another idea for optimistic locking: why not use the pre_save
> signal for this? There is a proof of concept of how to do this in
> https://github.com/akaariai/django_optimistic_lock
>
> The idea is basically that if you add an OptimisticLockField to your
> model, the pre_save (and pre_delete) signal will check that there have
> been no concurrent modifications. That's it.
>
> The code is really quickly written and downright ugly. It is a proof
> of concept and nothing more. I have tested it quickly using PostgreSQL
> and it seems to work for simple usage. However, it will probably eat
> your data.
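
For context, a minimal sketch of what such a signal-based check might look
like (hypothetical model and field names; this is not the code from the linked
repository):

from django.core.exceptions import ValidationError
from django.db.models.signals import pre_save

def check_version(sender, instance, **kwargs):
    # Runs before every save of the connected model; "version" is a
    # hypothetical integer column used as the lock field.
    if instance.pk is None:
        return  # new object, nothing to compare against
    # The extra SELECT: read the currently stored version and compare it
    # with the version this instance was loaded with.
    stored = sender.objects.values_list('version', flat=True).get(pk=instance.pk)
    if stored != instance.version:
        raise ValidationError('Concurrent modification detected')
    instance.version += 1

# pre_save.connect(check_version, sender=Article)  # Article is hypothetical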

That adds additional SELECT statements, which then extends a lock
window between the check statement and the changes. It works, but in
doing so it changes an optimistic lock into a pessimistic lock.

IMHO the right way to do this would be to add the OptimisticLockField
check as an additional item on the WHERE clause of the UPDATE or
DELETE. If that action returns 0 rows then we know that the lock check
failed and we can handle that. This keeps the locking optimistic and
doesn't add any additional SQL statements.

e.g.

UPDATE foo
SET col = newvalue, optimistic_lock_field = optimistic_lock_field + 1
WHERE pkcol = p_key
AND optimistic_lock_field = p_version

DELETE FROM foo
WHERE pkcol = p_key
AND optimistic_lock_field = p_version
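
In ORM terms this corresponds to a single filtered update whose return value
(the number of rows matched) tells you whether the lock check held. A rough
sketch, assuming a hypothetical integer "version" field on the model:

from django.db.models import F

def update_checked(model_cls, pk, expected_version, **changes):
    # Issues one UPDATE ... WHERE pk = %s AND version = %s, as in the SQL above.
    rows = model_cls.objects.filter(pk=pk, version=expected_version).update(
        version=F('version') + 1, **changes)
    if rows == 0:
        # Either the row was deleted or another transaction bumped the version.
        raise RuntimeError('optimistic lock check failed')

# e.g. update_checked(Foo, pk=some_pk, expected_version=3, col='newvalue')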

-- 
 Simon Riggs   http://www.2ndQuadrant.com/
 PostgreSQL Development, 24x7 Support, Training & Services
