You are right, of course,
I was not very precise and mixed up things a bit.

I started developing on little machines in the early 1980s,
even before the IBM PC; the first systems were called
CP/M and later MS-DOS. At that time, using different
compilers and linking the results together was common .. but it worked
only if the compilers came from the same manufacturer.

At that time I already had some years of experience with the old German
mainframe, and on the little machines I always missed its support,
for example dumps on abends and diagnostic tools. In the beginning I thought
this was due to size restrictions, but even as the machines grew
larger, a different kind of spirit prevailed.

Today at our site we build portable applications in C, and in the case of
abends or severe problems, the situation differs completely
between the platforms:

- On z/Arch, I am always able to get information about the source of
the problem: LE prints the stack frames at the time of the error, and,
if necessary, I can read the SYSUDUMPs etc. ... I have access to
all the information I need, including registers and machine state,
and this is true even if the abend occurs in the production environment.

- On Windows and Unix, I have debuggers, but only in the test or
development environment. If there is a problem in production,
I always have a very hard time reproducing it in
a test environment, where I can then perhaps activate trace output etc.;
that is simply how it is handled in these environments.

For test runs in the pre-production stage we always use z/Arch,
and we are very happy with this solution and environment. The
rollout to Windows and to the different Unixes is done after that.
If the application works OK on z/Arch, there are normally no
problems left on the other platforms.

BTW, our applications only do computations (insurance math);
no communication with the Windows API, for example.

By "calling conventions" I mean the register usage etc. and
the machine instructions used to transfer program control
from callers to subroutines. Even on z/Arch
there are different, incompatible models (XPLINK and non-XPLINK,
for example). On Intel machines, every compiler builder (in the
1980s) had calling conventions of his own, so you were never
able to call a M$ C subroutine from a Borland Pascal program.

Mixing different languages (at our site: PL/1, C, ASSEMBLER)
is always an adventure, but IMO it is still state of the art, and it
is done in every non-trivial software project.

Kind regards

Bernd




On 22.04.2014 11:08, David Stokes wrote:
I suspect btw the poor experience is 30+ years old. To run under Windows a program must use the Windows API calling conventions (e.g. for Dlls). Any compiler for Windows must be capable of doing this, and in principle be able to interact with the results of other compilers. Of course 'no one' uses ALGOL, FORTRAN, PASCAL, PL/1, COBOL, BCP etc. under Windows in any case and linking stuff from different compilers together is not so much the Windows way. In fact many Windows developers quite likely don't even know there is an object code linker, even in .Net. Mostly it's C++ and .Net Framework nowadays (C#, maybe VB.Net), plus JavaScript and a few more esoteric things in the Web world. And mostly using Visual Studio. Of course there's other possibilities for the adventurous. Most give up at some point though, what with the lethargic Java coded tools, unfixed bugs, abandoned projects etc. I don't even have Java on any of my Windows machines. Never notice its absence, I must say.

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN
