> I just caught up with this discussion after taking some time off for
> packaging Corel's next opus. I agree with the idea that trivially
> pipelining a preprocessor into wrc will not leave enough information
> for totally accurate error messages.
OK.
> However: If the resource compiler understood the "#line <line #>
> <filename> <visit #>" directives dumped inline by the preprocessor
> then the line numbers would be accurate (or at least as accurate as
> gcc's :)
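For reference, GNU cpp's inline line markers look roughly like the
following. The file names and resource IDs are made up, and the exact
marker flags vary between cpp versions, but the trailing number is the
"visit" flag (1 = entering a file, 2 = returning to it):

    /* dialog.rc (hypothetical input; resource.h defines IDD_ABOUT as 100) */
    #include "resource.h"
    IDD_ABOUT DIALOG 10, 10, 180, 90
    BEGIN
    END

    /* abridged output of running cpp over it */
    # 1 "dialog.rc"
    # 1 "resource.h" 1             /* flag 1: entering resource.h      */
    # 2 "dialog.rc" 2              /* flag 2: back in dialog.rc        */
    100 DIALOG 10, 10, 180, 90     /* still known to be line 2 of      */
    BEGIN                          /* dialog.rc, so an error here can  */
    END                            /* be reported against that line    */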
The fact that the line number is incorrect is not the only problem
preventing correct error reporting, even though it is the most
important one, since in many cases the error is obvious if you look
at the offending line.
> Intra-line locations are always a problem however tightly you
> integrate the preprocessor with the syntax checker/interpreter,
> because the preprocessor _changes_ the line (this is the whole point
> after all :) and an arbitrary decision must be made about which of
> the various generations of the line should be displayed.
I am not sure exactly what you mean. The #line directive changes
the line, true, but the reason the #line directives are generated
is to _kludge_ the error reporting in the case where you "plumb"
tools together. It is a last-ditch attempt to save the day when the
programmer has been "lazy" (short on time) enough not to implement
an integrated preprocessor.
What arbitrary decisions are you talking about?
Note that macros can't expand to preprocessor
directives like #line.
If a macro expands to several lines, and some inner macro in turn
expands to several more lines, and this code contains an error, the
line on which the topmost macro is used should be reported. All
other lines are sublines of that line, which might or might not be
reported as extra information. I don't see the problem.
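A minimal C sketch of that case (the macro names are invented and the
inner macro is deliberately malformed, so this does not compile; the
point is only where the diagnostic ends up):

    #define INNER(x)  (x +)           /* deliberately broken           */
    #define OUTER(x)  (INNER(x) * 2)  /* expands the broken macro      */

    int f(void)
    {
        return OUTER(1);   /* the compiler reports the error on this   */
    }                      /* line, the outermost use; the macro       */
                           /* definitions are at most shown as notes   */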
> The source text is trivially available as source (given the line
> number), and the compiler has the result of the preprocessing. The
> nested macro substitutions are useless in a practical sense and hard
> to store and retrieve anyway. Yes, code analysis tools sometimes do
> track all of these intermediate calculations (usually not to the
> limits I have implied) but we are not writing a code analysis tool
> (yet).
Well, winapi_check is the beginning of one, even though I
must admit that it doesn't handle preprocessor directives
very well currently. Running the external preprocessor
is out of the question; it is much too slow, I have tested it.
Besides, I don't have any use for the content of most include
files either, so there is a lot of wasted effort even in theory
and even more so in practice.
> I too say it's past time to rip out the cpp stuff in wrc to make its
> grammar smaller and much less complicated. However,
> #pragma/#line/#error support must be added as part of this.
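To make that direction concrete, here is a minimal sketch (not wrc's
actual code; handle_line_marker, the globals and the buffer sizes are
all invented) of what consuming a cpp-style line marker could look
like:

    #include <stdio.h>
    #include <string.h>

    static int  cur_line;            /* line number to report next     */
    static char cur_file[256];       /* file name to report against    */

    /* Returns 1 if the text was a '# <num> "<file>" ...' or
       '#line <num> "<file>"' marker and the position was updated.     */
    static int handle_line_marker(const char *text)
    {
        int  line;
        char file[256];

        if (sscanf(text, "# %d \"%255[^\"]\"", &line, file) == 2 ||
            sscanf(text, "#line %d \"%255[^\"]\"", &line, file) == 2)
        {
            cur_line = line;
            strcpy(cur_file, file);
            return 1;
        }
        return 0;                    /* not a marker: normal input     */
    }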
I'm still against it; however, I don't care that much.
It is probably good enough for most people and is much
less work than a full implementation of a preprocessor in wrc.
Ripping it out nevertheless seems wasteful to me.
One day somebody might find time to make a _real_
resource compiler with integrated preprocessing.
> <RANT>
> As part of this work I suggest that we add a --pedantic option to
> whine when the resource compiler notices that it has C or C++ code
> in its input. Declaring prototypes and structures without
> "#ifndef RC_INVOKED" guard lines is sloppy coding and forces the
> tool to be kludged to explicitly ignore such code. We had mondo
> trouble with C++ extensions to the C grammar (and MS extensions to
> that) when we tried native compilation and it could have been
> avoided with a bit of discipline.
> </RANT>
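For illustration, the guard the rant asks for looks like this (the
header name, IDs and declarations are invented; RC_INVOKED is the
symbol the resource compiler predefines when it runs):

    /* resource.h (invented example), shared between cc and rc/wrc */
    #define IDD_ABOUT   100
    #define IDS_HELLO   200

    #ifndef RC_INVOKED               /* the resource compiler defines  */
    typedef struct tagABOUTINFO      /* RC_INVOKED, so it never sees   */
    {                                /* these C-only declarations      */
        int   version;
        char *name;
    } ABOUTINFO;

    ABOUTINFO *get_about_info(void);
    #endif /* RC_INVOKED */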
Agreed.
> Thank you :)
For what? For my patch, for the privilege to speak your mind, or
for me listening? None of those is much to thank me for. :-)