Is Virtual T no longer being released by the original developers? (Ken Pettit? Stephen Hurd?)
I was thinking about reporting some bugs and want them to go to whatever is the mainline these days. I've been noticing some strange artifacts in the way that NEC PC-8201A binary files are saved to disk, and I think it might be a problem with the tokenizer. Also, the file-loading routines are overly strict and reject .BA files that run fine on real hardware. --b9

P.S. If anyone knows anything about the N82 BASIC token format, I would love to hear from you. I rewrote my optimizing tokenizer for the M100 family <https://github.com/hackerb9/tokenize> (it converts .DO files to .BA on a host computer before downloading) and was able to figure out the file format <http://fileformats.archiveteam.org/wiki/Tandy_200_BASIC_tokenized_file> by using my Tandy 200 and Virtual T. But when I tried to reverse engineer and document the N82 BASIC file format using just Virtual T, I got weird, inconsistent results. (.BA files saved from Virtual T to the host computer and then read back in would come out different. But maybe that's to be expected given the illegal token sequences I'm throwing at it?)
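One guess about the round-trip differences: in the Tandy 200 format documented above, each line begins with an absolute in-memory address of the next line, so the same program saved from two different load addresses will differ byte-for-byte without actually being a different program. If the N82 format is laid out the same way (which is purely an assumption here, not a verified spec), comparing line payloads while skipping the 2-byte link fields might separate real tokenizer bugs from harmless address churn. A minimal Python sketch along those lines:

```python
def split_lines(ba: bytes, ignore_links: bool = True):
    """Split a tokenized BASIC image into per-line byte strings.

    Assumes a Tandy 200-style layout: each line is a 2-byte little-endian
    link (the in-memory address of the next line), a 2-byte line number,
    the token bytes, and a 0x00 terminator; a link of 0x0000 ends the
    program.  Whether the NEC PC-8201A uses this same layout is exactly
    the open question, so treat this as a hypothesis, not a spec.
    """
    lines = []
    pos = 0
    while pos + 2 <= len(ba):
        link = ba[pos] | (ba[pos + 1] << 8)
        if link == 0:            # 0x0000 link marks end of program
            break
        # Scan for the 0x00 line terminator, starting past the link and
        # line-number fields (the line number itself may contain 0x00).
        end = ba.index(0, pos + 4) + 1
        start = pos + 2 if ignore_links else pos
        lines.append(ba[start:end])
        pos = end
    return lines


def same_program(a: bytes, b: bytes) -> bool:
    """True if two images hold the same lines, ignoring link addresses."""
    return split_lines(a) == split_lines(b)
```

If `same_program()` reports a match even though the raw files differ, the discrepancy is probably just relocated link pointers; if it reports a mismatch, the differing line payload points at the actual tokenizer bug.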
