On Tue, Nov 18, 2008 at 03:42:52PM -0500, Michael Wojcik wrote:
> Andre Poenitz wrote:
> > On Mon, Nov 17, 2008 at 11:07:05AM -0500, Paul A. Rubin wrote:
> >> I wonder if disk manufacturers are paying M$ to do this?  I've got about  
> >> 54MB of crap in %windir%\winsxs, with multiple versions of each set of  
> >> files.  Presumably there's no way for Windoze to know that something  
> >> depending on an older version can use the newer version, so old versions  
> >> never go away. 
> > 
> > In fact that's actually the most sensible behaviour since there are only
> > very few cases where a new version indeed can replace an older one
> > without any existing or imagined problem.
> 
> I respectfully disagree.

No need to show some special respect here. I believe I can stand ordinary
disagreement rather well.

> I've worked on many projects that maintained backward compatibility
> with new releases of the API, and seen a great many more.

Just out of curiosity: which projects, and of what scope?

I am still pretty convinced that "compatibility" and "progress" are
fairly incompatible notions when it comes to the development of _usable_
libraries.

Guaranteeing the behaviour of only a very limited set of properties gives
you the opportunity to change and improve implementations, but reduces
the utility of the library as such. That's the approach taken by, e.g.,
standardized languages like C++

   _or_

you try to provide everything and the kitchen sink, and end up with
design and implementation decisions that need to be re-evaluated from
time to time in the presence of new environments. Java and Python, or
anything including a "GUI", come to mind.

> And in this case, we're talking C and C++ runtimes, which should
> conform to the ISO standard anyway.

Ah... should they conform to the Standard or should they be compatible with
older versions? What is supposed to happen if an existing version does
_not_ conform to the Standard?

> There's no need for them to change every other week.

No. But they do need to change if problems show up. Non-conformance is
one such problem, for instance.

Also: what am I supposed to do in case there is no obvious standard to
adhere to? I have, e.g., a few hundred kLOC of pre-1998 C++ code (written
well before the standard existed) around that's uncompilable with today's
compilers. Who is to blame here? Should g++ have stuck to 2.95's view of
the world?
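To make the point concrete, here is a hypothetical fragment in the kind of
pre-standard dialect g++ 2.95 still accepted (the file names and identifiers
are made up, but each construct is one that old code commonly used):

```cpp
// Accepted by pre-standard compilers; every marked line is rejected
// by a modern g++, although the code was perfectly normal when written.

#include <iostream.h>      // pre-standard header: modern g++ has no <iostream.h>

int main()
{
    for (int i = 0; i < 3; i++)
        cout << i << endl; // unqualified 'cout'/'endl', no std:: namespace
    if (i == 3)            // old for-loop scoping: 'i' leaked out of the loop,
        cout << "done\n";  // which has been ill-formed since C++98
    return 0;
}
```

None of this is "buggy" code in any meaningful sense; the language simply
moved underneath it, which is exactly the compatibility-versus-progress
trade-off above.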

> > In particular that would mean not only source and binary but also
> > behavioural compatibility including keeping buggy behaviour.
> 
> No it doesn't. Undefined behavior is undefined; an application that
> relies on it is broken.

What is an application supposed to do when it lives in an environment
where only buggy libraries are available? 

> And for the rare application that does, there are other Windows
> mechanisms for tying it to the old version of the DLL.

I obviously dispute "rare", otherwise Wikipedia would not know about
"DLL hell", and I have to admit that I am not aware of a lot of "other
Windows mechanisms" that scale from, say, Win 3.11^H95 through Vista.
What exactly are you referring to?

Andre'
