Hi! I keep running out of space when downloading an ASCII BASIC program and
then using *LOAD* and *SAVE* to tokenize it on the Tandy 200. I'd like to be
able to tokenize on my UNIX host system before downloading it to my T200.

What I've found so far:

   - Robert Pigford wrote a Model 100 Tokenizer that runs in PowerBasic
   
<http://www.club100.org/memfiles/index.php?&direction=0&order=nom&directory=Robert%20Pigford/TOKENIZE>
   but requires Microsoft Windows, which I do not have.
   - Almost 20 years ago, John Hogerhuis was talking about making a Linux
   program that would handle detokenization
   
<https://m100.bitchin100.narkive.com/eP2uSl4J/tokenized-basic-programs-bytes-of-mystery>
   when talking to a PDD. He even released a Forth program to do it, but I
   don't see anything that handles tokenization.
   - The tokenization code in the Model 100 ROM disassembly
   
<http://www.club100.org/memfiles/index.php?action=downloadfile&filename=m100_dis.txt&directory=Ken%20Pettit/M100%20ROM%20Disassembly>
   seems reasonably short.

Questions remaining:

   - Has someone already written a BASIC tokenizer that runs on UNIX
   systems?
   - Is the tokenized BASIC format for the Model 100/102 and the Tandy 200
   identical?
   What about for variants like the NEC PC-8201 and 8300?

I'll probably just write my own BASIC tokenizer since it seems simple
enough. To facilitate that, I've been gathering the information I've
learned on Archive Team's wiki
<http://fileformats.archiveteam.org/wiki/Tandy_200_BASIC_tokenized_file>.
(Please let me know if you see any errors.)
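For what it's worth, here is a rough sketch of how I imagine the tokenizer
would work, based on the linked-line layout described on that wiki page: each
line is a 2-byte little-endian pointer to the next line, a 2-byte line number,
the tokenized text, and a NUL terminator, with a 0x0000 pointer ending the
program. The keyword token bytes and the 0x8001 load address below are
placeholders I have NOT verified against the ROM disassembly, and a real
tokenizer would also need to skip keyword substitution inside strings, REM
comments, and DATA statements:

```python
import re
import struct

# PLACEHOLDER token values -- the real one-byte codes must be taken from
# the keyword table in the M100 ROM disassembly; do not trust these.
KEYWORDS = {"PRINT": 0xA2, "GOTO": 0x88, "FOR": 0x81, "NEXT": 0x82,
            "END": 0x80, "REM": 0x8E}

def tokenize_body(body: str) -> bytes:
    """Replace known keywords with one-byte tokens; copy other text as-is.
    (A real tokenizer must not substitute inside strings or after REM.)"""
    out = bytearray()
    i = 0
    upper = body.upper()
    while i < len(body):
        for kw, tok in KEYWORDS.items():
            if upper.startswith(kw, i):
                out.append(tok)
                i += len(kw)
                break
        else:
            out.append(ord(body[i]))
            i += 1
    return bytes(out)

def tokenize_program(text: str, load_addr: int = 0x8001) -> bytes:
    """Build the linked-line structure: per line, a 2-byte little-endian
    pointer to the next line, a 2-byte line number, the tokenized body,
    and a 0x00 terminator; the program ends with a 0x0000 pointer.
    load_addr (where the first line's pointer field sits in RAM) is an
    assumed value -- verify for the T200."""
    out = bytearray()
    addr = load_addr
    for line in text.splitlines():
        m = re.match(r"\s*(\d+)\s?(.*)", line)
        if not m:
            continue  # ignore lines without a line number
        num, body = int(m.group(1)), tokenize_body(m.group(2))
        addr += 2 + 2 + len(body) + 1  # address where the NEXT line starts
        out += struct.pack("<HH", addr, num) + body + b"\x00"
    out += b"\x00\x00"  # end-of-program marker
    return bytes(out)
```

Running it on a two-line program gives me something with the right shape
(NUL-terminated lines, 0x0000 trailer), which should make it easy to diff
against a real *SAVE*d tokenized file once the token table is filled in.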

—b9
