On Wed, Mar 25, 2020 at 1:19 AM Tom Wilson <wilso...@gmail.com> wrote:

>
> On Wed, Mar 25, 2020 at 1:09 AM John R. Hogerhuis <jho...@pobox.com>
> wrote:
>
>>
>>
>> On Wed, Mar 25, 2020 at 1:05 AM Tom Wilson <wilso...@gmail.com> wrote:
>>
>>> Yeah, I experienced the same thing. At the very least, the de-tokenizer
>>> needs to scan for quotes, set an "inQuote" flag when it hits one, and
>>> export the bytes inside quotes as their ASCII values rather than as their
>>> token codes. It would also be really nice to get some sort of hex escapes,
>>> such as \xFF, so the files would be 8-bit safe and we could construct
>>> BASIC programs on the PC without having to hack in extended ASCII codes
>>> after the fact.
>>>
>> It might be useful, but you wouldn't be able to load such a program back
>> into a Model T, at least not without a well-thought-out escape/quoting
>> mechanism.
>>
>
> Yeah, this would be entirely for exchanging data between VirtualT and a
> PC-based editor.
>
> I keep thinking of how the Commodore community did it... a long time ago,
> the magazine writers standardized a set of {token codes} for all the key
> combos that printed graphic symbols. So when we saw a listing with
> something like {Shift P}, we would hold Shift and press P.
>
> I was actually thinking of writing a program to tokenize and de-tokenize
> stuff using the Graph, Code, and Graph/Code+Shift codes... but I want to
> get my Star Trek port done first. =)
>
>
Well, we did establish a mapping between the Model 100/102 character set and
Unicode (UTF-8):

http://bitchin100.com/wiki/index.php?title=Unicode_Mappings
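
For what it's worth, here is a rough Python sketch of the quote-aware
de-tokenize / \xHH escape idea Tom describes above. It is only a sketch: the
token byte values in the small table are placeholders rather than the real
Model 100 token codes, and a real converter would need the full keyword table
plus a re-tokenizing pass on the way back in.

import re

# Hypothetical subset of the Model 100 BASIC token table; the byte
# values here are placeholders, not the real token codes.
TOKENS = {
    0x81: "FOR",
    0x8F: "PRINT",
    0x9E: "GOTO",
}

def detokenize_line(data: bytes) -> str:
    """Expand one tokenized BASIC line into 8-bit-safe text.

    Inside a quoted string, bytes are never treated as tokens; any byte
    outside printable ASCII is written as a \\xHH escape so the output
    survives a round trip through a PC text editor.
    """
    out = []
    in_quote = False
    for b in data:
        if b == 0x22:                       # '"' toggles the in-quote flag
            in_quote = not in_quote
            out.append('"')
        elif not in_quote and b in TOKENS:  # keyword token -> keyword text
            out.append(TOKENS[b])
        elif 0x20 <= b < 0x7F:              # plain printable ASCII
            out.append(chr(b))
        else:                               # everything else -> \xHH escape
            out.append('\\x%02X' % b)
    return ''.join(out)

ESCAPE = re.compile(r'\\x([0-9A-Fa-f]{2})')

def unescape_line(text: str) -> bytes:
    """Turn the \\xHH escapes back into raw bytes before re-tokenizing."""
    out = bytearray()
    pos = 0
    for m in ESCAPE.finditer(text):
        out += text[pos:m.start()].encode('latin-1')
        out.append(int(m.group(1), 16))
        pos = m.end()
    out += text[pos:].encode('latin-1')
    return bytes(out)

The unescape_line() half is the kind of escape/quoting mechanism I meant:
anything written out as \xHH comes back in as a raw byte before the keywords
are turned back into tokens.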

-- John.
