People, please, what would you say of the
    -fadvancedAlgebra key proposal
contained in the middle of this letter?
It is new; I have not thought much about it yet and am unsure what
might be wrong with it. It is very short and, I hope, can improve
matters considerably.
------------------
Sergey Mechveliani
[EMAIL PROTECTED]
---------------------------------------------------------------------
Marcin 'Qrczak' Kowalczyk <[EMAIL PROTECTED]> writes
>> basAlgPropos announces that operations like `baseSet' can also
>> be ignored by anyone who is too lazy to consider them.
> "Ignored" means "made returning \"don't know\" or even bottom",
> and this is a bad design.
>
> First, such an instance is useless. The information that something
> is not known gives us nothing.
For some users it gives something; for others it does not.
The same holds for all the operations in Haskell-98 too.
For example, I do not use fromInteger, and no tragedy occurs.
> You should not skip it, unless this is an unfortunate case where
> particular classes do not fit well into what we are defining and
> there is not any good definition of fromInteger.
The situation with my application program is exactly as you describe.
It often recurs, with different users and different operations,
and there is no tragedy about it.
> Second, such instance cannot be improved. It has been defined,
> and there can be only one instance for a given class+type pair in
> a program. If someone needs a more complete instance, he is stuck.
No. In the *standard* instances - say, for (,), Fraction ... -
everything is defined thoroughly, and the user may use it or not.
For user-defined types and instances, it is the user's business how
(and whether at all) to exploit such an operation.
> You should not force other programmers to make a decision: either
> define poor instances or spend time writing instances that will
> probably never be used.
In the worst case, the hacker defines one extra *dummy* Set
instance - in addition to the absolutely necessary
                 Additive, AddSemigroup, Ring
that have to appear anyway. You may relax: the community will accept
them, in my proposal, or maybe in someone's next one.
Perhaps people would agree that all this is not so fundamental.
But I agree that this is a nasty technical detail, and a politically
important one. Here is the preliminary
advancedAlgebra key proposal
-----------------------------
To introduce the standard compilation key  -fadvancedAlgebra.
Without this key, the compiler automatically inserts dummy
definitions for all the necessary instances of the user's types
from a certain fixed list of "advanced" classes - like
                 Set, AddSemigroup, MulMonoid ...
--------------------------------------------------------------
For example, the user program exploits  +, *, divRem  for  C a.
The user defines  Additive, Multiplicative, EuclideanRing  for C
- this is natural in this situation.
But suppose one does not even want to recall the fancy names
                 `Set', `AddGroup' ...
needed to define the dummy super-instances for  C a.
Then, without -fadvancedAlgebra, the compiler easily sees that
their instances are needed for  C a,  adds the dummy ones, and
reports warnings.
I sometimes get warnings about skipped operation definitions in
some standard classes, and find this natural.
Now warnings about dummy instances would also appear.
People, what do you think: can this approach, minimally supported,
satisfy both the hackers and the snobs?
>> > must think about Show instance, and define partial ordering first,
>> > then make it total ordering.
>>
>> What can one do without Show? It is needed everywhere.
> Of course not! Of all the standard library types, in practice Show is
> used almost only for numeric types, except some debugging messages.
> OTOH almost all types are Ord.
> [..]
There is no problem here. In any case, a dummy Show can be
generated automatically by the compiler if the user skipped it.
Let it insert, for example,
                 showsPrec _ _ = ("Dummy Show" ++)
Would you then cry that it occasionally printed something you did not
like to see for T, when you had not bothered to define it for T?
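In current Haskell the same idea can be written out by hand; the instance below is only a sketch of the kind of definition the compiler could generate, and the message text is purely illustrative:

```haskell
newtype T = T Int

-- A compiler-generated dummy instance could look like this;
-- `show' then comes for free from its default method.
instance Show T where
  showsPrec _ _ = ("Dummy Show" ++)
```

Then  show (T 1)  yields "Dummy Show" - harmless, unless one actually relies on printing T, in which case one defines the real instance.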
It amazes me how people like to make problems out of nothing instead
of looking at the really serious questions related to the subject.
> Superclasses are needed only:
> - when our class makes little sense without the superclass, to
> simplify contexts, or
> - when a default method implementation requires that class, and we
> feel that the importance of having such default implementation is
> larger than the inconvenience of having a superclass.
"Only" or not, this is sufficient. Because we have to add:
"
  when our class makes little sense without the superclass
  - in many particular situations that may possibly be created by
  the user's types.
"
This is a good reason to make Show a superclass -
at least, if it can be done in a manner that does not complicate
life for other programmers.
> With your proposal it would be much higher ratio of useless instances
> in programs.
Again, "useless" for you - not for many others.
Besides, we are searching for a way to make this painless for
everyone.
>> Probably, you want to say that the attributes like OrderIsNoether
>> are needless to introduce, because when needed, they can be
>> expressed with the corresponding *classes* and instances.
> Fortunately they are needed in so small percentage of programs,
> that the overall complexity of Haskell programs would be smaller.
The attributes `IsFinite', `cardinality', for example, are
*fortunately* going to be used by millions in the next 10 years.
Cultured people would leave Haskell for more cultured languages and
libraries if this Num-hacker-non-functional fiesta continues.
Haskell B. Curry would not be to blame for this.
I do hope the committee will be a snob.
>> But in practice, the result of such approach will be that we would
>> need to add about 200 classes more to basAlgPropos, and also to
>> mention their *mandatory* intermediate instances in the user
>> program. And the classes will be named like this
>> LeftAssociativeAlgebraOverRing,
>> LeftAssociativeAlgebraWithUnityOverField,
>> LeftAssociativeAlgebraWithUnityWithSuchAndSuchThingOverRing ...
> Not necessarily, because you can put several things in one context.
> This is one of reasons I like Haskell more than C++.
?
For example, do you suggest putting all the Haskell-98 standard
classes into one class, to make a single nice class?
Or am I missing your direction?
>> > I don't know what does it mean that the order is Noether,
>>
>> You can ignore it, and you will remain as if with Haskell-98.
>> No violence against old habits. Advanced features are optional.
> Then another author needs to use the information about whether ordering
> on one of my classes is Noetherian. And he cannot, because I already
> went the easy way and made an instance saying that I don't know.
Probably, you mean
"... whether the ordering on one of my types with a Set instance is
Noetherian".
The answer would then be simple.
The type constructor, say C, was not standard; it was *your* choice
to put  (OrderIsNoether, Unknown)  for it. This means that you intend
the user of your program not to require more information.
Naturally, if your program intends for the user to deal with it, then
of course you would need to study what this Noetherian property means.
You claim not to know what a thing is, and at the same time expect
the user to make essential use of the value of this thing produced by
your program.
For example, you may write  fromInteger _ = 0
for the type T and then complain that the user of your program
always obtains  0 :: T  when calling fromInteger for T,
and say "Where is the justice? I do not know what fromInteger is."
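Written out in full, the complaint amounts to the sketch below. The methods other than fromInteger are my own filler, added only so that the instance is complete:

```haskell
newtype T = T Integer deriving (Eq, Show)

-- A deliberately uninformative fromInteger: whoever writes this has
-- chosen to discard the information, and cannot complain afterwards.
instance Num T where
  fromInteger _ = T 0
  T a + T b     = T (a + b)
  T a * T b     = T (a * b)
  negate (T a)  = T (negate a)
  abs    (T a)  = T (abs a)
  signum (T a)  = T (signum a)
```

Here  (5 :: T)  evaluates to  T 0  - exactly what the instance author asked for.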
>> Generally, it is not correct to expect such a language as
>> Haskell to serve *only* the hackers.
> Similarly incorrect is to make Haskell serve only authors of computer
> algebra systems.
My approach can hardly be characterized in this manner.
At least, the recent discussion aims to help design something
fit for both the hackers and the authors of CA systems
(there are several of the latter on this list).