Hi Steve,

Very clever. I had used ASCII85 encoding before, but your approach produces smaller data statements and the decoder is also smaller.

Ken

On 5/30/18 11:48 AM, Stephen Adolph wrote:
I often want to embed ML into BASIC programs. There are two ways that I use:
1) make a string of the binary and assign it to a BASIC string variable (then use VARPTR to find it)
2) include DATA statements that contain the ML binary, with some encoding, and use a routine to POKE it into memory

Regarding (2), I've seen two methods:
1) encode data as decimal numbers
ex.  Data 125, 34, 56 etc

2) encode data as hex characters
ex. Data C9F501 etc

Neither of these is really optimal, because they expand the size of the code by 2-3x.
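To put a rough number on that, here is a quick illustrative Python sketch (not from the original post; the bytes are made up) comparing the two encodings:

# Illustrative only: how much a block of ML grows once it is wrapped
# in DATA statements using the two classic encodings.
code = bytes([0xC9, 0xF5, 0x01, 0x7D, 0x22, 0x38, 0x0D, 0x20])   # made-up ML bytes

decimal_form = "DATA " + ",".join(str(b) for b in code)      # DATA 201,245,1,...
hex_form     = "DATA " + "".join(f"{b:02X}" for b in code)   # DATA C9F5017D...

print(len(code), "raw bytes")
print(len(decimal_form), "chars as decimal DATA")   # roughly 3-4x the raw size
print(len(hex_form), "chars as hex DATA")           # roughly 2-3x the raw size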

I've come up with an alternative which I'm now using: the raw binary, with a few character codes remapped, can be embedded directly in DATA statements, in quotes.

ex. Data "$%#(Lop" etc

Gotchas:
1) all binary codes <=32 must be mapped to special sequences to avoid problems with BASIC
2) the " character has to be mapped to a different sequence, otherwise you confuse BASIC

I use "/" as a special symbol meaning "special sequence follows", and I map each character <=32, plus " and / itself, by adding 64 (decimal) to it.

Ex. if the code 0Dh is encountered in the binary, it would get transformed to "/" + chr$(13+64).
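To make the mapping concrete, here is a small Python sketch of the escaping rule (illustrative only; the function name is mine):

# Sketch of the escaping rule: bytes <= 32, the quote (34) and the
# escape character "/" (47) become "/" followed by (byte + 64);
# everything else passes through unchanged.
def escape_bytes(data: bytes) -> bytes:
    out = bytearray()
    for b in data:
        if b <= 32 or b in (34, 47):   # control range, '"', '/'
            out.append(47)             # the "/" escape marker
            out.append(b + 64)
        else:
            out.append(b)
    return bytes(out)

# The 0Dh example above: 13 becomes "/" + chr(13 + 64), i.e. "/M"
assert escape_bytes(b"\x0d") == b"/M"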

Decoding is straightforward: scan for /, and when you find one, replace it and the next character with that character minus 64; everything else passes through unchanged.
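And the inverse, again just a Python sketch of the logic (in the actual program this is the job of the minimal BASIC READ/POKE loop mentioned below):

# Inverse of the escaping rule: "/" means "take the next character and
# subtract 64"; anything else is copied through as-is.
def unescape_bytes(data: bytes) -> bytes:
    out = bytearray()
    i = 0
    while i < len(data):
        if data[i] == 47:              # "/"
            out.append(data[i + 1] - 64)
            i += 2
        else:
            out.append(data[i])
            i += 1
    return bytes(out)

assert unescape_bytes(b"/M") == b"\x0d"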

Anyhow, the net result is very compact, with a minimal POKE routine. I compile my machine code into Intel HEX format, run a simple program to encode the data into sets of DATA statements, and copy the resulting text into a .DO file.
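For anyone wanting to replicate that workflow, here is a rough Python sketch of such an encoder (my illustration, not the actual program; it assumes plain type-00 Intel HEX records laid out contiguously, ignores checksums, and doesn't worry about .DO line-ending or EOF conventions):

import sys

def hex_records(path):
    # Pull the data bytes out of each type-00 record in an Intel HEX file.
    # Assumes the records are contiguous and in order; checksums ignored.
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line.startswith(":"):
                continue
            count = int(line[1:3], 16)
            rectype = int(line[7:9], 16)
            if rectype != 0:                     # skip EOF and other record types
                continue
            yield bytes.fromhex(line[9:9 + 2 * count])

def escape_bytes(data):
    # Same escaping rule as sketched above.
    out = bytearray()
    for b in data:
        if b <= 32 or b in (34, 47):
            out += bytes([47, b + 64])
        else:
            out.append(b)
    return bytes(out)

def data_lines(raw, width=40, start=100):
    # Chunk the raw bytes first, then escape each chunk, so an escape
    # pair never gets split across two DATA statements.
    lineno = start
    for i in range(0, len(raw), width):
        chunk = escape_bytes(raw[i:i + width]).decode("latin-1")
        yield f'{lineno} DATA "{chunk}"'
        lineno += 10

if __name__ == "__main__":
    raw = b"".join(hex_records(sys.argv[1]))
    # Latin-1 so bytes above 127 are written out unchanged.
    with open(sys.argv[2], "w", encoding="latin-1") as out:
        for line in data_lines(raw):
            out.write(line + "\n")

Chunking before escaping keeps each DATA string independently decodable, since a "/" and its payload always land in the same string.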



