On 7 August 2014 21:59, zMan <[email protected]> wrote:
> ASCII vs. EBCDIC?

Almost certainly. Base64 is a way of representing arbitrary byte
values using a set of characters that is less likely than most to be
mangled in transmission. Think of it as a more efficient way of
representing byte values than spelling out hex digits: hex doubles
the required space, whereas base64 multiplies it by only 4/3.
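The 2x-versus-4/3 overhead is easy to check. A quick sketch in Python (illustrative only; the routines under discussion are assembler, but the arithmetic is the same):

```python
import base64
import binascii

data = bytes(range(16))            # 16 arbitrary byte values
hex_form = binascii.hexlify(data)  # 2 characters per byte -> 32 characters
b64_form = base64.b64encode(data)  # 4 characters per 3 bytes -> 24 characters
                                   # (padded with '=' to a multiple of 4)

print(len(data), len(hex_form), len(b64_form))
```

Base64 always rounds the output up to a multiple of 4 characters, which is why short inputs carry a little extra padding overhead.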

Both schemes encode byte values rather than characters, so you must
pass the desired byte values into the algorithm, not characters.
Imagine a "byte to hex" routine out there on the Internet somewhere.
If you pass the string Abcd through a form using your desktop browser,
it will be sent as the ASCII(ish) representation of the characters,
and the returned character string will be 41626364. If you pass the
EBCDIC string Abcd to an assembler "byte to hex" routine, you will
get back the characters C1828384.
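You can see both results from one place using Python's EBCDIC code page 037 codec; this is just an illustration of the byte values involved, not the assembler routine itself:

```python
import binascii

s = "Abcd"
ascii_bytes = s.encode("ascii")    # bytes 41 62 63 64
ebcdic_bytes = s.encode("cp037")   # bytes C1 82 83 84 (EBCDIC code page 037)

# Same four characters, two different sets of byte values:
print(binascii.hexlify(ascii_bytes).decode().upper())   # 41626364
print(binascii.hexlify(ebcdic_bytes).decode().upper())  # C1828384
```

The "byte to hex" routine is doing exactly the same job in both cases; only the bytes handed to it differ.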

It's just the same with base64, but the algorithm is a wee bit more
complex. So the problem almost certainly lies with your input, not the
implementation. If you want interoperability with ASCII-based
conversion programs, then you must present ASCII input to your
assembler program, or EBCDIC input to the ASCII-based routine. If you
simply want to protect your data from corruption during transmission
(e.g. in email), you can just use the assembler encoding program at
one end and the matching decoding routine at the other.
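The same demonstration works for base64: the two encodings of Abcd produce different base64 strings, yet each round-trips perfectly as long as the encoder and decoder agree on which byte values the characters stand for. A Python sketch (again just illustrative, using the cp037 codec for EBCDIC):

```python
import base64

s = "Abcd"
ascii_b64 = base64.b64encode(s.encode("ascii"))   # encodes bytes 41 62 63 64
ebcdic_b64 = base64.b64encode(s.encode("cp037"))  # encodes bytes C1 82 83 84

print(ascii_b64)   # b'QWJjZA=='
print(ebcdic_b64)  # b'wYKDhA==' -- different, because the bytes differ

# Round trip is lossless when both ends agree on the character set:
assert base64.b64decode(ebcdic_b64).decode("cp037") == s
```

So an EBCDIC-only pipeline is internally consistent; it is only when an ASCII-based decoder meets EBCDIC-derived base64 (or vice versa) that the results look "wrong".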

Tony H.

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN
