Andrew po-jung Ho wrote:

> Hi Thomas,
>
>   I am most intrigued by your statement on the "rightness" of doing away with IDL.
>
> I thought IDL/CORBA is the most promising approach to have inter-operability between 
>health-related applications.  It seems to me that an "interface definition language 
>(IDL)" would be required no matter what you call it (or how it is defined).

My point of view is probably not fashionable, but if you take a hard look at the 
concept of IDL, it does not achieve its intended aims that well. Its purpose is 
twofold:

a) act as a primary formalism for humans to write interface definitions which can then 
be disseminated and adhered to by other systems.

b) act as a technical specification to be read by the tools of system builders, so 
they can either implement the interfaces or interrogate systems exposing them.

It doesn't really serve either purpose well.

I'll deal with a) first. The problem is that since IDL is a formalism without native 
tools, it cannot be compiled and experimented with in the way that an OOPL can. 
Consider that the process of standardising an interface consists not only of 
discussion and ideas, but also of prototyping with OO tools to determine whether the 
interface is sensible. The latter rarely happens, and interfaces are published without 
the benefit of implementation experience (and so are guaranteed to be lacking in some 
areas). No software person alive will tell you that implementation experience will not 
change anything in an interface definition.

Now, in cases where implementations are done to test the IDL, how are they executed? 
Step #1: translate the IDL to some OOPL which can be compiled and used with tools. We 
have just left IDL behind. Now, imagine a process which takes several months or a year 
to complete
(I am thinking of an OMG cycle). During this whole time, you want the interface 
definition (or object model, for that is what it really is) in a compilable, testable 
form, not in IDL; one you can distribute easily, and which can be used by people with 
different tools.

When the specification is deemed complete, you then want to _generate_ the IDL from 
whatever formalism everyone has actually been using.

The IDL definition then becomes not a primary formalism for human use, but simply a 
technical message to transport an object model between tools.

So the upshot is that for process a), what you want is IDL tools that do OOPL (or UML; 
same thing) -> IDL, not the other way round, as is the standard mentality of the IDL 
tools and process. In other words, IDL is not a useful formalism for the primary 
expression of an
object model.
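To make the OOPL -> IDL direction concrete, here is a minimal sketch in Python (the class and type mapping are hypothetical illustrations, not any real tool's API): the class is the primary, compilable, testable artifact, and an IDL-style declaration is generated from it by introspection rather than written by hand.

```python
import inspect

class PatientRecord:
    """Primary object model: written, compiled and tested in the OOPL."""
    def get_id(self) -> str: ...
    def add_observation(self, code: str, value: float) -> None: ...

# Hypothetical type mapping; a real generator would cover the full IDL type system.
IDL_TYPES = {str: "string", float: "double", int: "long", None: "void"}

def to_idl(cls) -> str:
    """Generate a CORBA-IDL-style interface from a class's method signatures."""
    lines = [f"interface {cls.__name__} {{"]
    for name, fn in inspect.getmembers(cls, inspect.isfunction):
        sig = inspect.signature(fn)
        ret = IDL_TYPES.get(sig.return_annotation, "any")
        params = ", ".join(
            f"in {IDL_TYPES.get(p.annotation, 'any')} {p.name}"
            for p in sig.parameters.values() if p.name != "self"
        )
        lines.append(f"  {ret} {name}({params});")
    lines.append("};")
    return "\n".join(lines)

idl = to_idl(PatientRecord)
print(idl)
```

The point is the direction of the arrow: the IDL here is an output, a "technical message", never the thing people edit and debate.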

Now, for problem b), IDL is also deficient. A vast number of users of IDL do the 
following:
- build a system
- perfect it
- want to export its interfaces to the rest of the enterprise.

To do the last, they also do not want tools which do IDL -> OOPL, but the reverse. It 
is true that recipients of the generated IDL will need an IDL -> OOPL converter, but 
since this is only mandated by the presence of IDL in the first place, we could avoid 
all the pain
and do away with IDL. In its place:

- Use an object model formalism which can be used to build implementations, and which 
is also clean enough for humans to understand and discuss, and develop standards 
around.

- Transmit the interface definitions directly as part of server software or 
components, so that client developer tools can see the definition, and build software 
based on it.

The original rationale for IDL was "develop interfaces before implementations". In the 
big picture this is still right, but I have come to see that the micro-process of 
actually developing interfaces properly means you still have to start with 
implementable, testable object models, and the interface emerges as a result of that 
experimental work.

The .NET approach (proprietary, by the way - so I am not promoting it, just a couple 
of its ideas) embeds the interface definition in the binary component bytecode, so it 
is directly available to development tools. This is a very smart move by MS, and I 
hope to see someone on the open software/systems side of IT replicate it.
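A rough analogue of that embedded-metadata idea, sketched in Python (the service class and `describe` helper are hypothetical; .NET does this with CLR metadata and reflection): the deployed component itself answers interface queries, so no separate IDL file ever needs to exist.

```python
import inspect

# Imagine this class arrived inside a deployed component; the client-side
# development tool has no IDL file, only the loaded component itself.
class LabService:
    def order_test(self, patient_id: str, test_code: str) -> int: ...
    def get_result(self, order_id: int) -> str: ...

def describe(component) -> dict:
    """What a client tool can extract directly from the component's metadata."""
    return {
        name: str(inspect.signature(fn))
        for name, fn in inspect.getmembers(component, inspect.isfunction)
    }

api = describe(LabService)
for name, sig in api.items():
    print(f"{name}{sig}")
```

The client tool builds stubs from what the component reports about itself, which is exactly the "transmit the interface definitions directly as part of server software" idea above.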

Lastly, I should state that I come from an engineering software culture of RPC, DCE 
and CORBA. I wrote a realtime OORPC mechanism for control systems years ago, and am 
fairly familiar with the whole philosophy of the approach. So my comments above come 
from experience, not from an idle detractor. And despite the above, I see great value 
in many of the OMG specs, though I think the CORBAmed ones are good due to the high 
quality of the people involved, not the backwards tools. I also support a 100% open 
process and philosophy, not closed proprietary ones. The OMG's work is important; it 
would just be more accessible if the technical means of doing it were improved.

I hope my comments above will be taken as my opinion, and as part of a healthy debate, 
not as a flame!

- thomas beale
