On Monday 24 December 2007 4:30 am, Ed Leafe wrote:
> On Dec 23, 2007, at 9:56 PM, Adrian Klaver wrote:
> > I sort of understand. Let me see if I can work this out. In my case
> > the bizobj refers to a table (which from the documentation is the
> > 'default' use), so I saw the table as the DataSource, not the bizobj.
>
> I can understand your confusion, especially if you've never developed
> in a client-server manner before.
Actually, I have. My questions are just a way to find out where this
distinction exists in the Dabo framework. I have the general plan:
db --> bizobj --> ui. It's the details that I am trying to sort out.

> > What you are saying is that the bizobj, as the middleware component,
> > caches data in its structure, and update() works on that structure
> > and requery() works on the underlying table. Am I getting close?
>
> It's not so much a cache as it is a separate copy. The purpose of a
> cache is to speed up multiple accesses of the same data, whereas the
> copy in the bizobj is supposed to be distinct from the database, so
> that you can modify it, sort it, do whatever you want to it... and
> then discard those changes without affecting the actual data in the
> database. The only way to affect the database is through save() or
> delete().

That clears things up.

> > If my assumptions above are correct, I can see how this would be a
> > problem. To play the Devil's advocate, I prefer that data is
> > automatically saved when moving from record to record.
>
> While such a setup may sometimes be a good thing, the general rule
> is that anything that is potentially destructive to the data must be
> explicit. Users in general should be able to make a change and only
> save it when they want to.

I would tend to agree, but experience has taught me that not all users
believe in the concept of saving often, and that when data goes missing
they come looking for the app writer. See more below.

> > I have greater faith in a database table holding data than an
> > application.
>
> On a long-term basis, of course. But your app should only be holding
> a very small amount of data at any given time, right?

It is more a quality than quantity issue. Ironically, the app that drove
me to this viewpoint is a Visual FoxPro one. Before I get into the
particulars, let me say I was generally impressed with VFP, and my
limited experience with it led me to this project.
Let me also say I did not develop the app, nor did I have access to the
code. The problem is that GUI apps, especially those running on Windows
(where this app lives), can be fragile. When the inevitable happened and
there was a crash, bad things happened. The biggest problem was the
issue of orphaned records. This could be attributed to the app's
dependence on explicit saves: if they weren't done, the data was left in
an incomplete state. At the same time I was dealing with this app, I was
developing my own app using Postgres and a variety of clients. One of
the things I did was make automatic saves a part of the process. We had
some power issues, so we had unscheduled shutdowns, yet I never ran into
the orphaned record issue. Just my 2 cents worth.

> > My inclination is to make the data hit the database soon and often.
>
> Have you ever developed a client-server app with thousands of users?
> If you took that approach your app would not be very responsive. The
> more efficient approach is to grab just the data you need,
> display/modify it locally, and then save the changes back only when
> you want to. That's what Dabo's data access is designed to do: we
> buffer local data completely, so that even with parent-child
> relationships, the child data is buffered even when you move the
> parent pointer.

No, I have not worked on anything that size. My guess, though, is that I
would hit a connection handle limit first. In this vein, does Dabo
support lazy connections or connection pooling? I agree with the local
copy approach; I just plan to make saving implicit rather than explicit
as much as possible. I am sure there will be a learning curve in finding
the proper balance.

> > From poking around I can see ways to make that happen, I just have
> > to work out the procedure.
> You could hook into the bizobj's beforePointerMove() method, which
> fires (as the name suggests) before the current row pointer moves to
> the new record:
>
>     def beforePointerMove(self):
>         self.save()
>
> -- Ed Leafe
> -- http://leafe.com
> -- http://dabodev.com
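To make the buffered-copy idea from this thread concrete, here is a
small self-contained sketch. It is not Dabo's actual implementation;
the class and method names (BufferedBizobj, move_to, the list-of-dicts
"backend") are invented for illustration. It models the two behaviors
discussed: edits live in a local copy until save(), and an optional
auto-save hook fires before the row pointer moves, in the spirit of
Ed's beforePointerMove() suggestion.

```python
class BufferedBizobj:
    """Toy model of a bizobj holding a separate local copy of table data."""

    def __init__(self, backend_rows, auto_save=False):
        # backend_rows stands in for the database table.
        self.backend = backend_rows
        self.auto_save = auto_save
        self.row_number = 0
        self.requery()

    def requery(self):
        # Re-fetch the local copy from the "database".
        self.buffer = [dict(r) for r in self.backend]

    @property
    def record(self):
        # The current row of the local copy; edits here do NOT
        # touch the backend.
        return self.buffer[self.row_number]

    def save(self):
        # The only write path to the backend (besides delete).
        self.backend[self.row_number] = dict(self.record)

    def cancel(self):
        # Discard local edits to the current row.
        self.buffer[self.row_number] = dict(self.backend[self.row_number])

    def before_pointer_move(self):
        # Hook fired before navigation; auto-save if configured.
        if self.auto_save:
            self.save()

    def move_to(self, n):
        self.before_pointer_move()
        self.row_number = n


rows = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}]

# Explicit-save behavior: edits stay local until save().
biz = BufferedBizobj(rows)
biz.record["name"] = "Annie"
assert rows[0]["name"] == "Ann"      # backend untouched so far
biz.save()
assert rows[0]["name"] == "Annie"    # now persisted

# Auto-save behavior: moving the pointer persists the edit first,
# which is the implicit-save style discussed above.
biz2 = BufferedBizobj(rows, auto_save=True)
biz2.record["name"] = "Anne"
biz2.move_to(1)
assert rows[0]["name"] == "Anne"
```

Whether auto-save belongs in the hook or the save stays explicit is
exactly the trade-off debated in this thread; the hook just makes the
choice a one-line policy decision.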
