OK, to explain ...

*ALL* the "DBCS" (or NODBCS) compiler option does is determine how X'0E' and
X'0F' are treated when they appear WITHIN an alphanumeric literal.  When it is
turned on, they are treated as SHIFT-OUT/SHIFT-IN control characters (and this
may be shifting to "Unicode" *or* to IBM-specific DBCS codes).
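To make that concrete, here is a rough, purely illustrative Python sketch - not
compiler code, just my picture of the behavior - of how a scanner might treat a
literal's bytes with DBCS on versus NODBCS:

```python
# EBCDIC shift-out / shift-in control characters
SO, SI = 0x0E, 0x0F

def scan_literal(data: bytes, dbcs: bool):
    """Split an alphanumeric literal into ('SBCS', ...) / ('DBCS', ...) runs.

    With dbcs=False (NODBCS), X'0E'/X'0F' are just ordinary data bytes.
    With dbcs=True, bytes between SHIFT-OUT and SHIFT-IN are double-byte data.
    """
    if not dbcs:
        return [("SBCS", data)]
    runs, i = [], 0
    while i < len(data):
        if data[i] == SO:
            j = data.index(SI, i + 1)          # find the matching shift-in
            runs.append(("DBCS", data[i + 1:j]))
            i = j + 1
        else:
            j = i
            while j < len(data) and data[j] != SO:
                j += 1
            runs.append(("SBCS", data[i:j]))
            i = j
    return runs
```

With DBCS on, `b"AB\x0eABCD\x0fCD"` splits into an SBCS run, a DBCS run, and
another SBCS run; with NODBCS the whole thing is one ordinary byte string.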

The NSYMBOL compiler option determines:
 - What the "default" USAGE is for PIC N data items (NATIONAL (aka UNICODE)
   or DISPLAY-1 (aka DBCS))
 - Whether N"literals" (and NX"literals") are DBCS or UNICODE format (on
   output)
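In other words (a purely illustrative Python sketch with made-up names - this
is not any compiler interface, just the two things NSYMBOL controls):

```python
# Illustrative only: what NSYMBOL(NATIONAL) vs NSYMBOL(DBCS) selects.
def nsymbol_effects(nsymbol: str) -> dict:
    """Return the default PIC N usage and the N"..."/NX"..." literal format."""
    assert nsymbol in ("NATIONAL", "DBCS")
    if nsymbol == "NATIONAL":
        return {"pic_n_usage": "NATIONAL",     # aka UNICODE (UTF-16)
                "n_literal_format": "UNICODE"}
    return {"pic_n_usage": "DISPLAY-1",        # aka IBM DBCS
            "n_literal_format": "DBCS"}
```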

Therefore, it is "logically" possible to have any combination of NODBCS/DBCS
and NSYMBOL(NATIONAL)/NSYMBOL(DBCS).

IBM, however, at "roughly" the same time they changed the default to DBCS,
decided (for "political reasons" - I believe, not syntactic reasons) NOT to
support the combination of

   NODBCS, NSYMBOL(NATIONAL)
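So the "logical" 2x2 of combinations, minus the one cell IBM chose not to
support, could be sketched like this (illustrative Python, nothing IBM ships):

```python
# Illustrative only: which DBCS/NSYMBOL combinations the compiler accepts,
# per the restriction described above.
def supported(dbcs_option: str, nsymbol: str) -> bool:
    assert dbcs_option in ("DBCS", "NODBCS")
    assert nsymbol in ("NATIONAL", "DBCS")
    # The single disallowed combination: NODBCS with NSYMBOL(NATIONAL)
    return not (dbcs_option == "NODBCS" and nsymbol == "NATIONAL")
```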

Clearer????
-- 
Bill Klein
 wmklein <at> ix.netcom.com
"Steve Comstock" <[EMAIL PROTECTED]> wrote in message 
news:[EMAIL PROTECTED]
> Imbriale, Donald (Exchange) wrote:
>> I think you're confusing the DBCS value of the NSYMBOL option with the
>> DBCS option.
>
> Well, it certainly is confusing. But I tried to make it
> clear what I was saying is choosing the NATIONAL value
> for the NSYMBOL option forces on the DBCS option. And
> it still doesn't make any sense. Of course, it probably
> is not a good practice to have standalone options the
> same as choices for other options.
>
> Kind regards,
>
> -Steve Comstock
>

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html
