On Wed, Nov 18, 2015 at 3:20 AM, Keith Medcalf <kmedcalf at dessus.com> wrote:

> A little off topic, this reminds me of a feature of the PL/1 F compiler.
> The PL/1 compiler was a huge monstrous beast that had both syntactical and
> semantic analyzer paths so it could "correct" a number of common
> programming errors, faulty assumptions, and misspelling of keywords if the
> programmer were too lazy to formulate the problem and the source code
> correctly.
>
> Because code was painstakingly (slowly) entered onto decks of 80-column
> Hollerith punch cards that were submitted for execution overnight (or, in
> many cases, over-fortnight), this "feature" made it possible for many
> "sloppy" errors to be automatically corrected, thus avoiding the fortnightly
> correct-submit-result-edit-repeat process.
>

I realize you're putting "feature" and "sloppy" in quotes, which probably
means that you don't actually believe that, but I would like to chime in.
Languages like FORTRAN had implicit variable definition rules, and they
were a real feature of the language. When you were keypunching a stack of
cards, every card cost money. Implicitly defining the type of variables
that start with a given letter saved you the expense of the card that
would have declared one or more variables, and it reduced the time it took
to read the stack of cards into the computer. Given that the machines of
the time were very memory constrained by modern standards, this was all
very valuable. As for PL/I F, from Wikipedia:

PL/I was first implemented by IBM, at its Hursley Laboratories in the United
Kingdom, as part of the development of System/360. The first production PL/I
compiler was the PL/I F compiler for the OS/360 Operating System, built by
John Nash's team at Hursley in the UK: the runtime library team was managed
by I.M. (Nobby) Clarke. The PL/I F compiler was written entirely in
System/360 assembly language. Release 1 shipped in 1966. *OS/360 was a
real-memory environment and the compiler was designed for systems with as
little as 64 kilobytes of real storage - F being 64 kB in S/360 parlance. To
fit a large compiler into the 44 kilobytes of memory available on a
64-kilobyte machine, the compiler consisted of a control phase and a large
number of compiler phases (approaching 100). The phases were brought into
memory from disk, and released, one at a time to handle particular language
features and aspects of compilation.*

So every card saved reduced the price of consumables for the program,
reduced the time it took to read a stack of cards into memory, and probably
reduced the amount of data written out to temporary magnetic storage
between phases, which of course also reduced the amount of data read back
in from that storage. Given that the cost of computer time was, by some
estimates, about $10,000 per hour (almost $2.78 per second!), this was all
a very real savings to the owner/operator of the computer.

Forgetting for a moment about the "lazy programmer" who took a shortcut to
save some keypunching: 23 years ago Steve McConnell wrote Code Complete and
claimed an average of 15 to 50 bugs per KLOC. Regardless of how many bugs
there are on average, there is some non-zero number that is the real
average probability of a bug in a line of code, so reducing the number of
lines in theory reduces the number of bugs, and it reduces the probability
of wasting a computer session and waiting a fortnight for another shot at
it!

I never had to work with punched cards; I missed them by a few years, even
in college. I consider myself fortunate to work in a world with lots of RAM
and huge, sophisticated compilers that can do lots of things quickly in a
"single pass". I like being able to do "RAD"-style programming, though to
me that means I can spend my time typing text into a computer and let the
compiler tell me when I've made a typo. :)

Which brings us back around to SQLite: SQLite isn't always running in that
heavy-duty environment with gigabytes of RAM. It also works on
memory-constrained devices with perhaps less than a megabyte of RAM.

Anyway, I agree it would be nice to have more detailed error messages
available when foreign key conflicts arise, but I can appreciate why the
overhead is not desirable, and why past architectural decisions have made
it a difficult feature to add at this time.
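
For anyone following along who hasn't bumped into it, here is a minimal
sketch (mine, not anything from this thread or the SQLite docs) of what
that generic message looks like through the C API, assuming you have the
sqlite3 development headers available. The deferred constraint only fails
at COMMIT, and the message doesn't tell you which table, column, or row is
at fault:

#include <stdio.h>
#include <sqlite3.h>

int main(void)
{
    sqlite3 *db;
    char *err = NULL;

    if (sqlite3_open(":memory:", &db) != SQLITE_OK)
        return 1;

    /* FK enforcement is off by default and must be enabled per connection. */
    sqlite3_exec(db, "PRAGMA foreign_keys = ON;", 0, 0, &err);

    sqlite3_exec(db,
        "CREATE TABLE parent(id INTEGER PRIMARY KEY);"
        "CREATE TABLE child(id INTEGER PRIMARY KEY,"
        "  pid INTEGER REFERENCES parent(id) DEFERRABLE INITIALLY DEFERRED);",
        0, 0, &err);

    /* The dangling reference is only checked when the transaction commits,
       and the resulting error is deliberately generic. */
    sqlite3_exec(db, "BEGIN;", 0, 0, &err);
    sqlite3_exec(db, "INSERT INTO child VALUES (1, 42);", 0, 0, &err);
    if (sqlite3_exec(db, "COMMIT;", 0, 0, &err) != SQLITE_OK) {
        printf("commit failed: %s\n", err); /* "FOREIGN KEY constraint failed" */
        sqlite3_free(err);
        sqlite3_exec(db, "ROLLBACK;", 0, 0, NULL);
    }

    sqlite3_close(db);
    return 0;
}

Build it with something like "cc fkdemo.c -lsqlite3" (the file name is just
an example). The exact wording of the message has varied a little between
SQLite versions, and if I remember right the closest you can get to detail
is running PRAGMA foreign_key_check afterwards, which at least reports the
table and rowid of each violating row.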


> For example, in "C" you may get an error that some variable is not
> declared and the compilation halts.  In PL/1, the compiler assumes that you
> are simply a forgetful and sloppy programmer and "creates" the necessary
> declaration for you.  This allows the compilation and execution to
> continue.  If the assumption the compiler made turns out to be correct,
> this saved you two weeks of time.  If the compiler was in error, it cost
> you nothing -- you still have to correct the program source and do another
> fortnight turn-around.  Of course, the compiler also had an option to
> "punch" a new card deck of the "corrected" and "properly formatted" source.
>
> The long and the short of this is that it encouraged programmers to write
> sloppy ill-formatted and error-filled code and let the compiler fix it up.
> If it did so correctly, then you ended up with a working program in a much
> shorter period of time and a beautifully formatted deck of source that was
> completely free of compiler detectable syntactical or semantic errors.
> True expert programmers knew exactly how the assumptions that the compiler
> made worked, and instead of wasting time punching the "correct" code,
> punched code in a hugely shorter format in the firm knowledge that the
> compiler would fix it exactly correctly saving much cardboard and much
> time.  The not so skilled often could not tell whether the compiler was
> "doing the right thing" or not.
>

I would say saving money was more often the concern. This was not a time
when any old person could set up a computer in their home and teach
themselves to program. I'm not saying there weren't lazy programmers, but
there is more to programming than just writing elegant code.


>
> To bring this up to modern times, programming now has a cycle time of
> minutes rather than fortnights.  It is now typical to expect the compiler
> or the database to tell you what your syntactical and semantic errors are,
> and to only fix things that are complained about.
>
> So back to topic -- the wish to have SQLite be a "heavy" system in the ilk
> of PL/1 is just a compensation for inadequate wattage being deployed in the
> first instance during the problem analysis stage -- the failure of a
> deferred constraint is a programmer/system analysis error (a wetware
> error).  Where the error is should be obvious since it was the analysis
> stage which led to the constraints being deferred in the first place.
>
> The ultimate "lite" system, in my opinion, was the ancient TRS-80 BASIC.
> It had only three error messages, from which you ought to be able to
> determine the types of errors they represent:
>

The Altair 8800 was much lighter than TRS-80 BASIC. At least the "basic"
machine (not to be confused with BASIC). Lights on a panel!

-- 
Scott Robison
