Hi folks,
I guess the following leans more toward explanation (and a bit of ranting)...

Games, games and more games: even for busybodies like us, we need some
time for recreation, and games are one useful form of it. Some of us
are dreaming about our next project - games, mobile apps and so forth.
For those of us thinking about game creation, I'm sorry to tell you
that you cannot really write and execute game code on your BN, for a
VERY important reason: there is no compiler that will run under
BrailleNote (even for the inForm language). We could try JavaScript,
but that won't solve the problem of "native" development that takes
full advantage of the BN's hardware (for instance, we see no way to
access the file system from JS, and the version of JS available
differs from the one in desktop browsers).

As for the claims of "JavaScript being a universal programming
language", I'd like to cordially disagree. From web searches and what
I've learned so far, for a language to be called universal it needs to
fulfill the following requirements: it must run on virtually all
systems, be adaptable to a wide range of problems, and be expressive
enough to state algorithms in many different ways. Because JavaScript
was designed for a specific environment (the web browser) and there is
no notable system software written in it, it does not qualify as a
"universal language". As of 2012 there is no such thing, although we
have close candidates like C++, Java (not JS) and C.

As for the question of why we can't just write game code in English
and run it: computers are, in a sense, just a collection of silicon
switches (or something that emulates them). The only thing a computer
knows is whether each electronic switch should be on or off, so it
uses ones and zeros to represent a switch that is on (1) or off (0).
For instance, the following group of bits (binary digits) could mean
different things depending on the computer (rather, the processor):
01111111100011110000000011111101
To one machine it could mean "read a character stored somewhere in
RAM"; to another, "multiply two numbers together". To a third it could
mean "decode video for playback", while a fourth would happily go
about finding out which USB device was connected when it reads this
instruction.
As this example illustrates, each processor (processor family, in
fact) has its own machine language - the set of binary patterns that
cause it to do something. This is true for both physical and virtual
machines (sometimes called interpreters, VMs, emulators, etc.). As it
turns out, each character on the keyboard has its own bit pattern too
(for example, the uppercase A is represented as ASCII 65, which is
01000001 in binary).
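If you have access to a PC with Python on it, here is a small sketch
(purely illustrative - nothing here runs on the BrailleNote itself) of
the two points above: characters are just bit patterns, and one and
the same 32 bits mean completely different things depending on how the
machine is told to read them.

```python
# Same bits, different meanings: read the 32-bit pattern from above
# first as an unsigned integer, then as a floating-point number.
import struct

bits = "01111111100011110000000011111101"   # the example pattern above
n = int(bits, 2)                            # read as an unsigned integer

print(n)                                    # one interpretation: a big number
print(format(ord("A"), "08b"))              # 'A' is ASCII 65 = 01000001

# Reinterpret the very same 32 bits as an IEEE floating-point number:
as_float = struct.unpack(">f", struct.pack(">I", n))[0]
print(as_float)                             # a totally different "meaning"
```

Neither interpretation is the "right" one - the bits themselves don't
say; only the processor (or program) reading them decides.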
As for compiling: suppose we wish to write a program that displays
some message on the braille display. Wouldn't it be convenient if we
could write something like:
say "my name is something"
in KeyWord (the word processor), and when this program is run by
another program, it would print the message in quotes to the braille
display? In the vast majority of cases that won't happen, simply
because the machine can't understand a series of English characters -
it only knows 0's and 1's. If we do attempt to run this program, it
won't run at all (usually because the OS notices that the correct
startup routine is not present), or, in the old days, it might do
something unexpected - such as deleting the file you were reading -
because the machine misinterpreted the text as instructions.
Now imagine using more than one processor (from different companies),
each with its own machine code. You don't want to think about it - and
in most of our cases it matters, because x64 code (the kind found on
Macs and PCs with Intel and AMD processors) is very different from ARM
machine code (found in various Android devices and the BrailleNote),
and Java bytecode (the code that Java source is compiled to) is
different from, say, code compiled for Windows CE devices written in
C# (C-Sharp). Even with interpreted languages (say, inForm emulator
code), an interpreter for Android devices won't accept inForm
(Z-machine) code and vice versa, unless there is a Z-machine
interpreter written in Java that runs on Android's own interpreter
(technically on ARM code, via a process called "just-in-time
compilation", where interpreter code is translated to ARM or other
machine code as it starts running). A good rule of thumb is that
interpreted code (some people call it "managed code") runs somewhat
slower than machine (or native) code executables.

In our context, inForm source code written on the BrailleNote will not
run unless it is transformed into code that the Z-code interpreter
accepts - hence my answer above: you cannot do game development in
inForm on the BrailleNote alone. To finish your product, you need to
compile it on your PC and test it on your BN, and for this to work you
need KeySoft 7.0 or higher.

As for the rest of the story about compilers/interpreters and how
C++/Java/BASIC/Python (and what not) source code is transformed into
zeros and ones so the machine can work with it, I won't answer that
here (that kind of discussion is, in my opinion, reserved for
programming lists). For now, I think the above should be sufficient;
if you need a more detailed analysis/proof/explanation, please email
me or Alex H (or any of the programmers here) off-list.
Hope this helps.
Cheers,
Joseph

___
Replies to this message will go directly to the sender.
If your reply would be useful to the list, please send a
copy to the list as well.

To leave the BrailleNote list, send a blank message to
[email protected]
To view the list archives or change your preferences, visit
http://list.humanware.com/mailman/listinfo/braillenote