On Sun, Jun 29, 2014 at 4:11 PM, dmccunney <dennis.mccun...@gmail.com> wrote:
> On Sun, Jun 29, 2014 at 4:21 PM, Rugxulo <rugx...@gmail.com> wrote:
>> On Sun, Jun 29, 2014 at 1:19 PM, dmccunney <dennis.mccun...@gmail.com> wrote:
>> I think you're mistaking it for ALGOL 60. Because Pascal always had
>> I/O built-in to the language (and compiler). That part was always
>> there, even before ISO got involved.
> Not in Wirth's original design for the language, it wasn't.  As
> stated, it was intended to be a teaching tool, and there was no
> assumption on Wirth's part that it would actually be implemented on a
> machine.

While I'm no true historian and admittedly only know bits and pieces,
that doesn't sound right at all. Are you going from memory, or can
you cite specific people or books to verify this? (It doesn't matter
much, obviously. I could maybe email Scott Moore since he knows more
than almost anybody.)

AFAIK, Pascal was designed and almost immediately implemented on real
hardware, and soon used to teach students (mathematicians,
physicists, etc.) at ETH Zurich circa 1970 or 1971. It's still
considered part of the ALGOL descendant line because of stylistic
similarities and Wirth's history, esp. with his previous language
ALGOL W. Even ALGOL W was physically implemented at Stanford with
Wirth's help; he worked at Stanford from 1963 to 1967.

I don't know how to go back in time to 1970 to find out for sure. (And
it really doesn't matter much anyway.) But I do have a PDF copy of
the 1973 Report (which admits a few very minor changes to the Pascal
language itself, due to experience).


The very first paragraph of that report mentions "Algol 60", "teaching
programming", and "efficient tool to write large programs", as well as
"efficient implementability" and an existing "one-pass compiler ...
CDC 6000". This was no abstract toy.

However, in fairness, it does seem to mention "read" and "write" as
being standard procedures, "described in a new Chapter 13". So maybe
it really was a new 1973 feature (unlikely, IMHO, considering that it
compiled itself and was strict on static typing and error checking) or
at least just a new addition to the manual. If you read chapter 13,
you'll see that it says "read(c)" is the same as "c := input^;
get(input)" and "write(c)" is the same as "output^ := c; put(output)".
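To make that concrete, here's a minimal sketch of those equivalences
in classic (ISO-style) Pascal; the program name and comments are mine,
but the primitives are straight from the Report:

```pascal
program copychar(input, output);
{ Copy a single character using the Report's file primitives.
  read(c) is defined as: c := input^; get(input)
  write(c) is defined as: output^ := c; put(output) }
var
  c: char;
begin
  c := input^;   { inspect the lookahead buffer variable }
  get(input);    { advance the input file one component }
  output^ := c;  { place the character in the output buffer }
  put(output)    { append the buffer to the output file }
end.
```

So read/write are just convenience wrappers over the buffer-variable
machinery, which is why they could be "standard procedures" rather
than syntax.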

But it's not wrong to say that he meant it to be explainable without a
computer; people like Dijkstra advocated that for ages. But they had a
CDC mainframe at ETH Zurich, so it's not like students couldn't send
off their programs and receive the output (or errors) back. Later,
Wirth wrote the Pascal-S (subset) interpreter to speed up turnaround
time even more. Obviously the formalism of strictly defining the
language grammar came from ALGOL (and Wirth more or less popularized
EBNF, railroad diagrams, T-diagrams, compiler construction, etc.).
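For flavor, the EBNF style Wirth later codified looks like this (a
textbook-style fragment from memory, not quoted from the Report):

```
identifier       = letter { letter | digit } .
unsigned-integer = digit { digit } .
```

Braces mean "zero or more repetitions", which is exactly what the
loops in a railroad diagram express.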

> Pascal also suffered from lack of standards.  There were lots of
> Pascal implementations, all different in various ways.  You could not
> expect to write Pascal code that would build and run, for example, on a
> PC under Turbo Pascal, and a DEC VAX under DEC Pascal.

Not counting Pascal-S (strict subset interpreter), the only other
subsets were the various P[1234] porting tools. It's been said that P2
was incomplete but still used (if not literally, at least as
inspiration) by UCSD Pascal (and later Borland). Even P4 (the last of
the series, circa mid '70s) was incomplete. This was way before any of
the various standardization efforts. It was not full Pascal, in
contrast to the CDC version. (Scott Moore only in recent years added
in the rest of ISO 7185 to what he calls P5.) BTW, standard unextended
Pascal is almost exactly the same as J&W (Jensen and Wirth) Pascal; no
new features were added.

AFAIK, not counting semi-official J&W, work towards a standard first
started circa 1977, and BSI was the original group to push for one.
There were also (later) ANSI/IEEE and ISO. Depending on which version
(of what is basically the same thing, classic / unextended), the
finalization year was something like 1981 or 1983. The later (also
unpopular) Extended Pascal standard came in 1988 or so (and had far
fewer compilers developed for it; obviously Wirth had moved on to
greener pastures by then).

The official BSI test suite is nowadays abandoned, locked up, thrown
away. (Sadly; luckily, Scott Moore has his own public test suite.)
Most of it came from PUG (Pascal User Group) newsletters, I think. But
that was the official way that people tested their implementations.
(See (T)ACK.) Or were supposed to, anyway. For whatever reason, many
implementations either were buggy or incomplete or just didn't care
(ahem, Borland). That doesn't mean it was impossible, just that the
effort was lacking.

(BTW, you can't standardize that which keeps changing. But the reason
people hate standards is that they're too weak or too baroque. Yet an
ignored standard is little better than none.)

Yes, you can write a program that will run (without changes) under
Turbo Pascal as well as "classic" ISO 7185-ish dialect. I've done it!
(Not on a DEC machine, obviously, but with GPC and P5.) Of course,
you're more or less just avoiding any incompatible features directly
(not that huge a list, but not totally trivial either), plus misusing
mixed comments to almost pretend to have a preprocessor.
Yes, I know, using a real preprocessor would be cheating. In that
case, you could combine any languages (see TeX), but my point is
that's all you can do in some cases. Portability isn't free, as you
well know; sometimes all you can do is implement two (or more)
versions and hide the gory details.
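For the curious, the mixed-comment trick relies on ISO 7185 treating
{ and (* (and likewise } and *)) as interchangeable, while Turbo
Pascal pairs each opener strictly with its own closer. A sketch (the
program itself is mine; exact behavior depends on the compiler's
dialect mode, but GPC and P5 follow ISO here and TP/BP do not):

```pascal
program dialect(output);
begin
  writeln('both dialects run this');
  (*} writeln('but only an ISO 7185 compiler runs this'); { *)
end.
```

An ISO compiler closes the (* comment at the first }, compiles the
inner writeln, then treats { ... *) as a trailing comment; Turbo
Pascal scans from (* all the way to *), so the whole line is
commented out.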

> C did get standardization efforts, with the intent that you should be
> able to write C that would build and run on any machine with a
> standards compliant C compiler.  This did not happen automatically,

AFAIK, they didn't start standardization efforts until 1984 (well
after BWK's anti-Pascal diatribe, which was only half valid in the
loosest sense). Old pre-ANSI K&R C (mostly documented in the first
edition of their book, circa 1978) and even C89, my favorite C
variant, are considered obsolete these days.

> and you had to understand what sorts of things *were* portable.
> Netscape's documents on portability for folks working on Mozilla code
> are amusing because of this.  Mozilla uses C++ instead of ANSI C, but
> the same issues apply.  A lot of what is in the Netscape docs reduces
> to "Just because you can do it in MS C++ compiled by Visual C++, don't
> assume it will work elsewhere. It probably won't."

A lot has changed since then. C++98 fixed most peoples' complaints,
AFAIK. Though obviously even more has changed now that C++11 is out.
Keep in mind that even Firefox has changed drastically since 2008.
Nothing stays the same for very long.

> And yes, Algol 60 did not define I/O, resulting in wide variance of
> implementations.

Well, for simple algorithms (you know, "algorithmic language"), you
can get the gist and not worry as much about the details. Only for
really big or complex programs (which were outside the scope of normal
theory) would it matter enough to complain. I'm just saying, it's hard
to fault them in 1960 (or even now) for not knowing or supporting
everything!   ;-)

>> Some compilers are faster (on certain OSes) if you don't shell out to
>> run the assembler for every file. But it makes cross-compilation much
>> harder.
> Cfront didn't shell out for each file. Cc compiled all source files to
> assembler, then passed the list to as for assembly, and that list was
> passed to ld to build the executable, all under the control of make.

You mean the assembler handled all the separate files at once? But you
know what I meant here. Obviously it's faster not to need an external
assembler or linker at all, esp. on systems where shelling out is
expensive (ahem, Cygwin). But GNU prefers (probably wisely) to keep
things separate. However, OpenWatcom will compile to .OBJ directly.
And FreePascal has (for most targets) at least an internal assembler,
and for a few (Windows) an internal linker as well, for better speed.
But FPC doesn't always smartlink by default because it's too slow
(and it's done manually, in piecemeal fashion). If you aren't doing a
final release, size doesn't matter as much for mere testing.

> Compiler technology has advanced far beyond the sort of tricks you had
> to play in resource constrained environments like MS-DOS on 8088 based
> machines.

Yes and no. Most of the common techniques have been around for
decades, typically since the '70s. Though I always get the feeling
that people think GCC is the only compiler in the entire world, which
is more than a bit naive. Granted, almost nothing gets as much use as
GCC. But GCC has always assumed more memory (and hardware) than an
8088 had. Most other compilers aren't as broad-minded. Then again,
even GCC can't support everything.

Freedos-user mailing list
