We had to decide whether or not to use DataSets when writing our O/R
mapper.

We decided against them - not flexible or performant enough, and way
too high memory use. Note also that you are pretty limited in what you
can do during the updates.
For example, what we do in our layer is:
(a) ordered updates. That means we FIRST insert, THEN update, THEN
delete - across all tables etc. Necessary when you want to save
"arbitrary objects".
(b) delayed updates. For LINKS (fields referencing other objects' PKs)
the link can be a BACKLINK. In this case, the INSERT inserts a NULL;
then, after all inserts but before the original updates, the value is
updated (see the sketch below).

As a result (of this, which we could not easily do with DataSets) we can
automatically store arbitrary object graphs :-)
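
A minimal sketch of what (a) and (b) boil down to - the names
(ChangeSet, BacklinkFixes etc.) are made up for illustration, this is
not our actual code, and transactions/error handling are ignored:

using System;
using System.Collections.Generic;

class ChangeSet
{
        // INSERTs run first; backlink columns are written as NULL for now.
        public List<Action> Inserts = new List<Action>();
        // Then the deferred backlink columns are patched via UPDATE,
        // now that every referenced row exists and its PK is known.
        public List<Action> BacklinkFixes = new List<Action>();
        // Then the ordinary UPDATEs.
        public List<Action> Updates = new List<Action>();
        // DELETEs run last.
        public List<Action> Deletes = new List<Action>();

        public void Flush()
        {
                foreach (Action a in Inserts) a();
                foreach (Action a in BacklinkFixes) a();
                foreach (Action a in Updates) a();
                foreach (Action a in Deletes) a();
        }
}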

No, we decided that DataSets were part of the problem.

Thomas Tomiczek
THONA Consulting Ltd.
(Microsoft MVP C#/.NET)

-----Original Message-----
From: Ben Kloosterman [mailto:[EMAIL PROTECTED]] 
Sent: Wednesday, 9 October 2002 02:36
To: [EMAIL PROTECTED]
Subject: Re: [ADVANCED-DOTNET] Strongly-Typed DataSets vs.
Strongly-Typed Collections


Why not a fully automatic data access and mapping layer that uses
DataSets? I have used these in the past.

DataSet performance is OK, except that they take too much memory, but
that is not really an issue as it's pretty cheap to buy memory.

So with my DataSet-based automatic DataLayer I can straight away write a
powerful UI with no new code, no new bugs etc. IMHO anything that
requires writing less code is a priority - provided it is easy to
maintain and has good functionality.

Ben





-----Original Message-----
From: Moderated discussion of advanced .NET topics.
[mailto:[EMAIL PROTECTED]]On Behalf Of Thomas Tomiczek
Sent: Tuesday, 8 October 2002 10:39 PM
To: [EMAIL PROTECTED]
Subject: Re: [ADVANCED-DOTNET] Strongly-Typed DataSets vs.
Strongly-Typed Collections


IMHO you guys see this totally wrong :-) OK, let's start the flame.

Yes, strongly typed collections ARE evil. But what about strongly typed
CLASSES? The problem you have with flexibility here is that you THINK
you have to change so much logic then. That's just a missing piece here.

You need a fully automatic data access and mapping layer.

Have a look at these two classes (partially snipped):

- (1) >
using System;

using EntityBroker;
using EntityBroker.ObjectQuery;

namespace Daikiri.Cms {
        /// <summary>
        /// Summary description for Container.
        /// </summary>
        [NaturalOrder ("SuperContainerREF", "FriendlyName")]
        [TableMapping ("CmsContainer")]
        [TypeSelector ("ContainerType")]
        public abstract class Container : StandardEntityObject, IEntityObjectAction {

                [Link ("SuperContainer", "SuperContainerREF")]
                protected abstract Container _SuperContainer { get; set; }

                [Field ("SubItemCount", EntityDbType.Int, DefaultValue=0)]
                protected abstract int _SubItemCount { get; set; }

                [Field ("FriendlyName", EntityDbType.VarChar, 48)]
                protected abstract string _FriendlyName { get; set; }

                [Field ("SearchFriendlyName", EntityDbType.VarChar, 48)]
                protected abstract string _SearchFriendlyName { get; set; }

                [Field ("Comment", EntityDbType.VarChar, 512, DefaultValue="")]
                public abstract string _Comment { get; set; }

                [Field ("ContainerType", EntityDbType.Char, 4)]
                public abstract string ContainerType { get; set; }

                [Link ("Particle", "ParticleREF")]
                protected abstract Particle __Particle { get; set; }

                [Link ("Template", "TemplateREF")]
                protected abstract Container _Template { get; set; }

                public Container SuperContainer {
                        get { return _SuperContainer; }
                        set {
                                if (_SuperContainer != null) {
                                        _SuperContainer._SubItemCount--;
                                }
                                _SuperContainer = value;
                                if (_SuperContainer != null) {
                                        _SuperContainer._SubItemCount++;
                                }
                        }
                }

                public int SubItemCount {
                        get { return _SubItemCount; }
                }
        }
}

< (1) -

And

- (2) >
using System;

using EntityBroker;

using Daikiri.Cms.ComponentModel;

namespace Daikiri.Cms {
        /// <summary>
        /// Summary description for File.
        /// </summary>
        [TypeSelection ("FILE")]
        public abstract class FileContainer : Container {
        }
}
< (2) -

In what way is this less flexible than using a DataSet? I mean, when I
really have to make a change in the database, then I change the
database, change the property definition (adding/removing it), and
chances are high that I will get the compiler errors NOW instead of
later. Sure, queries are still an issue (i.e. for queries I have the
problem that I still have field names as strings).
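
(Purely illustrative - not a real query against our API, just the shape
of the remaining weak spot: the field name inside a query filter is only
text, so the compiler cannot catch a typo in it.)

string goodFilter = "ContainerType = 'FILE'";
string badFilter  = "ContainerTyp = 'FILE'";   // typo - compiles fine, fails only at runtime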

BUT I don't see any problem with being less flexible than with a DataSet.

DataSet performance, OTOH, is crappy. They are pretty good for what they
do, but they do way too much if you ask me :-) Memory - especially the
size when transferring over the network to another machine - is bad.
Anyhow, that's something for later :-)

But - IMHO the use of a real data access layer with O/R features has been
totally undervalued here. THAT'S where the fun and SPEED (in terms of
development) really is. BESIDES that, you can have a TON of wonderful
functionality within such a layer that you just don't have with DataSets.
Query the above Container database for Containers, for example, and you
get a collection of different Container subtypes (selected by the
ContainerType field) that you can then just call functionality on - OO
programming at its best.
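
For example, something along these lines (hypothetical call site - the
real query API may look different; Container and ContainerType are the
class and field shown above):

using System;
using System.Collections;
using Daikiri.Cms;

class ContainerLister {
        // 'queryResult' stands in for whatever the broker's query returns.
        public static void List(IEnumerable queryResult) {
                foreach (Container c in queryResult) {
                        // Each element already has its concrete subtype (e.g. FileContainer),
                        // chosen from the ContainerType column - no switch, no cast needed.
                        Console.WriteLine(c.GetType().Name + ": " + c.ContainerType);
                }
        }
}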

That's live code, btw, from a CMS that we are working on right now.

The O/R mapper used was planned to go live yesterday - it will have to
wait a few more days :-)

Thomas Tomiczek
THONA Consulting Ltd.
(Microsoft MVP C#/.NET)

You can read messages from the Advanced DOTNET archive, unsubscribe from
Advanced DOTNET, or subscribe to other DevelopMentor lists at
http://discuss.develop.com.
