Bas,

Why should that make a difference when it has already been shown that
changing the compiler's optimization level fixes the issue, and that it
is probably a corner case specific to hardened gcc 3.4? I suppose it has
to do with its stack-protection techniques etc.

2010/7/22 Bas Mevissen <ab...@basmevissen.nl>:
> On 07/22/2010 05:22 AM, Victor Duchovni wrote:
>> On Wed, Jul 21, 2010 at 11:16:04PM +0200, Bas Mevissen wrote:
>>
>>
>>> Can you try what happens if you replace at
>>>
>>> typedef struct LOCAL_STATE {
>>>     int level;                /* nesting level, for logging */
>>>     DELIVER_ATTR msg_attr;    /* message/recipient attributes */
>>>     DELIVER_REQUEST *request; /* as from queue manager */
>>> } LOCAL_STATE;
>>>
>>>
>>> the first line with:
>>>
>>> typedef struct local_state {
>>>
>>> in virtual.h and same for DELIVER_ATTR and DELIVER_REQUEST?
>>>
>>> The difference is that the symbol LOCAL_STATE is now only used once.
>>> Most coding standards forbid defining a symbol more than once.
>>>
>> The local(8) and virtual(8) servers are separately compiled
>> programs (the latter being a stripped down version of the former).
>> The header files in question are not used in the same compilation
>> unit, and so this suggestion is not productive.
>>
>>
> Ah, even more fun. Using different typedefs with the same name all over
> the source tree.
> I would never recommend redefining a global symbol within the same
> source tree, even if, as in this case, the users are separate applications.
>
> Anyway, my remark is still valid if you change local.h instead of
> virtual.h.
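
For reference, the change being discussed would look roughly like this
(a sketch only, based on the snippet quoted above; the lower-case tag
local_state is the suggested replacement, and the field types are taken
from the existing Postfix headers):

/* Suggested form: the struct tag is lower-cased, so the symbol
 * LOCAL_STATE appears only once, as the typedef name. */
typedef struct local_state {
    int level;                /* nesting level, for logging */
    DELIVER_ATTR msg_attr;    /* message/recipient attributes */
    DELIVER_REQUEST *request; /* as from queue manager */
} LOCAL_STATE;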

-- 
Regards,
Kai Krakow
http://hurikhan77.wordpress.com/
