Ed Leafe wrote:

>> - Does Dabo handle concurrent modification of data? (Pessimistic  
>> locking)
> 
>       That is implemented differently on each backend server. So far no one  
> has requested this capability, but we would certainly add it if there  
> is enough interest. We abstract each backend into its own class, so it  
> would be possible to customize for each server's syntax.

I see, thanks for your quick answer :)
What about optimistic concurrency control, then? Versioning wouldn't 
require per-DBMS customization, would it?
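To illustrate what I mean, here is a minimal sketch of version-based 
optimistic locking in plain SQL (SQLite for demonstration; the table, 
column, and function names are made up, and Dabo itself doesn't provide 
this today). The same UPDATE-with-version-check would work unchanged on 
any backend:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, version INTEGER)"
)
conn.execute("INSERT INTO customer VALUES (1, 'Acme', 1)")

def update_name(conn, pk, new_name, expected_version):
    """Apply the update only if nobody else has bumped the version."""
    cur = conn.execute(
        "UPDATE customer SET name = ?, version = version + 1 "
        "WHERE id = ? AND version = ?",
        (new_name, pk, expected_version),
    )
    if cur.rowcount == 0:
        # Someone saved a newer version first: report the conflict
        # instead of silently overwriting their change.
        raise RuntimeError("Record was modified by another user; reload and retry")

update_name(conn, 1, "Acme Ltd", 1)      # succeeds, version becomes 2
try:
    update_name(conn, 1, "Acme Inc", 1)  # stale version -> conflict
except RuntimeError as e:
    print(e)
```

Since the conflict check is a single WHERE clause, it seems like it 
could live in one shared place rather than in each backend class.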


>> - I see that Dabo favors the "classic" development style of "first
>> design the database, then the business objects from that", do you  
>> think it would be overkill to make Dabo work the other way round?
[cut]
> 
>       The tradeoff in the latter approach is that it makes it much, much  
> harder to work with existing databases. I remember testing TurboGears  
> a couple of years back, and its model implementation almost *required*  
> that the database didn't exist; I had to add all sorts of different  
> settings in order to make the underlying SQLObject-based model use the  
> existing database information.
> 
>       If you're doing brand-new development in which no data exists when  
> you start, then yeah, the ORM approach is pretty nice. If you're  
> creating apps that work with existing data, it's much less attractive.

So Dabo doesn't prevent an approach like that, great :) (And I just 
found dAutobizobj...)


>> - I need to add a few patterns to Dabo data handling, such as logical
>> deletion and data logging/archiving (that is, upon updating a record
>> generate a new record instead and mark the old one as such)..
>> Would subclassing dBizobj be enough?
> 
>       Logical deletion is more of a database design than a bizobj design,  
> since the bizobj is concerned with the logic, and thus logical  
> deletion == deletion. To implement logical deletion, you'd need to add  
> that flag field to the database, and then subclass dBizobj to always  
> add 'delflag = 0' or the equivalent to every query.
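Sketching the idea above in plain SQL rather than through real dBizobj 
hooks (whose names I don't know offhand -- the table and flag names 
here are made up): every SELECT gets the flag appended, and "delete" 
becomes an update.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE invoice (id INTEGER PRIMARY KEY, total REAL, "
    "delflag INTEGER DEFAULT 0)"
)
conn.executemany("INSERT INTO invoice (total) VALUES (?)", [(10.0,), (20.0,)])

def select_live(conn, where="1=1"):
    # Always filter out logically deleted rows, whatever else the
    # caller asks for.
    return conn.execute(
        "SELECT id, total FROM invoice WHERE delflag = 0 AND (%s)" % where
    ).fetchall()

def logical_delete(conn, pk):
    # "Delete" just sets the flag; the row stays behind for auditing.
    conn.execute("UPDATE invoice SET delflag = 1 WHERE id = ?", (pk,))

logical_delete(conn, 1)
print(select_live(conn))  # only invoice 2 remains visible
```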

So I'd have to take a look at dBackend too, I see..
Another question, if I may: implementing logical deletion requires the 
persistence layer to "emulate" the DBMS, preventing deletions at the 
end of not-null relations and the like. Does dabo.db provide such 
checks that I could reuse (relation metadata, perhaps)?


>       I'm much more wary of your read-only update system. That would  
> require a lot of code to maintain data integrity, since changing a  
> single column would result in a new PK for that record, and you would  
> somehow have to track all relationships that may involve that PK and  
> update those records, which would then require that any table that  
> references those PKs be updated, etc.

It's true, but fortunately in our business domain this isn't required.. 
In fact, it must be prevented :) For example, if an "Invoice" object 
references a "Customer" object and I modify the Customer, the business 
rules require the Invoice to keep referencing the "old" customer data.
If I did want the Invoice to reference the new one, I'd update the 
Customer and at the same time update its reference inside the Invoice.


> When archiving is required, I've  
> typically created shadow tables in the database that hold the original  
> records, along with timestamp/user info on who made the change. Then  
> an update trigger is created to archive the original values before the  
> update happens. This way relational integrity is not an issue, and you  
> have a complete trail of changes to the data.
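For the archives, the shadow-table pattern described above can be 
demonstrated end to end with SQLite (trigger syntax varies per backend, 
and the table and column names here are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE customer_shadow (
    id INTEGER, name TEXT,
    archived_at TEXT DEFAULT CURRENT_TIMESTAMP
);
-- Archive the original values before every update.
CREATE TRIGGER customer_archive BEFORE UPDATE ON customer
BEGIN
    INSERT INTO customer_shadow (id, name) VALUES (OLD.id, OLD.name);
END;
INSERT INTO customer (name) VALUES ('Acme');
""")

conn.execute("UPDATE customer SET name = 'Acme Ltd' WHERE id = 1")
# The shadow table now holds the pre-update value, timestamped,
# without the application code doing anything special.
print(conn.execute("SELECT id, name FROM customer_shadow").fetchall())
```

Since the trigger fires inside the database, the archive trail stays 
complete even when data is changed outside the application.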

Clever! :) Many thanks for your help!



_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://leafe.com/mailman/listinfo/dabo-users
Searchable Archives: http://leafe.com/archives/search/dabo-users
This message: http://leafe.com/archives/byMID/[EMAIL PROTECTED]
