On Sunday, June 30, 2002, at 10:01 , bob ackerman wrote:
[..]
>> My complements on these two references.
>
> speaking of which - hard to break an old habit - eh?
> my 'compliments' to the chef.
[..]

technically I am Ideologically Orthodox - in that I bring the completeness
to the set - in that the packet is sent, and it needs to
be validated as having been received and noted as of value.

you are merely being Semantically Correct!

{ don't you just hate the whole SC Police thing taking over
the country! I mean this has been in place since the whole
effort to impose a Latin Grammar on Anglo-Saxon!!!! }

why, if only the software were smart enough to identify
that something was not only syntactically a token in the language,
but also semantically correct in its usage....

Clearly software applications need to resolve whether
they should 'carp or croak' in these cases....
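
{ for the non-Infestation-Units: Perl's stock Carp module gives us
both verbs - carp() grumbles and carries on, croak() dies and blames
the caller. a minimal sketch, with a purely hypothetical notion of
what makes a token 'dubious': }

        use strict;
        use warnings;
        use Carp;

        # hypothetical policy: croak on nothing, carp on the merely dubious
        sub vet_token {
                my ($token) = @_;
                croak "not a token at all"
                        unless defined $token and length $token;  # fatal, blamed on the caller
                carp "dubious token '$token'"
                        if $token =~ /\d/;                        # warn, then carry on
                return $token;
        }

        vet_token('Hill');     # passes quietly
        vet_token('Hill2');    # carps, but returns anyway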

<rant>
Do I LOOK like a software application???? HUM?????

Do you have any idea how emotionally traumatizing it was
for me when my father learned to use a word processor -
and hence gave up the habit of embedding 'internal
encryption signature elements' that were the stack of
consistently misspelled words.... I mean it took me
nearly a decade to get over the fear that he had been
replaced by some sort of SPACE ALIEN.....
</rant>

There is also the problem embedded in all of this, which
needs to be addressed - namely the use of 'delimiting'
tokens in a 'string'.

eg:
        my $string = '\'string\'';

we have just inserted the "'" delimiters - so should we then ask

        if ( is_valid_english($string) ) {       # hypothetical validator
                use_english($string);
        } else {
                bash_Infestation_Unit($string);  # equally hypothetical
        }

Which is really merely a fancy declaration of the 'ambiguity'
of our notion of 'string' - and of what defines the delimiters.
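
{ for the record: Perl also lets one simply pick delimiters that
can't collide, via the q// family - a minimal illustration, all
three literals being the very same string: }

        my $escaped = '\'string\'';    # backwhack the embedded quotes
        my $doubled = "'string'";      # or swap the outer delimiter
        my $picked  = q{'string'};     # or pick one that can't collide
        print "all three agree\n"
                if $escaped eq $doubled and $escaped eq $picked;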

note also

        my $sentence = "The Cat Ran Up The Hill.";

Here we have a set of 'tokens' that are "white space" delimited,
but we also need to resolve whether that last one should become
"Hill"<delimToken> - or whether we should 'fault' on it because of a non_alpha....
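
a minimal sketch of that fork in the road - split() happily hands
back 'Hill.' with the dot still welded on, and it is purely our
policy call whether to peel it off as its own delimiting token:

        my $sentence = "The Cat Ran Up The Hill.";
        my @tokens   = split /\s+/, $sentence;
        print "last token: '$tokens[-1]'\n";    # 'Hill.' - non_alpha and all

        # one policy: peel trailing non-word chars off as their own token
        my @peeled = map { /^(\w+)(\W+)$/ ? ($1, $2) : $_ } @tokens;
        print "peeled: @peeled\n";              # The Cat Ran Up The Hill .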

All of which also gets us towards the semantic and syntactic
issues we will want to address with tokens like:

        DamnYankee
        BleedingHeartWimpLiberal
        SpawnOfSatan

which in some lexical parsers would fail, because of their limited
understanding of american cultural idiomatics, in which,

        my $correct_answer = "why yes, of course those are really 'one word'.";

and hence would be "legitimate" 'english' - in the 'oral'
tradition of american - but tossed out by the culturally
myopic elements who follow the antiquated approach
with regard to 'word constructors'....
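
to make the myopia concrete - a minimal sketch, with a purely
hypothetical %dictionary standing in for the antiquated word list:

        my %dictionary = map { $_ => 1 } qw( damn yankee spawn of satan );
        for my $token (qw( DamnYankee BleedingHeartWimpLiberal SpawnOfSatan )) {
                my $syntax   = $token =~ /^\w+$/        ? 'a token' : 'a fault';
                my $semantic = $dictionary{ lc $token } ? 'a word'  : 'not-a-word';
                print "$token: syntactically $syntax, semantically $semantic\n";
        }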

{sub_text: should one make sure to escape the embedded single quotes... }
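
{ sub_sub_answer: in Perl, no - single quotes are ordinary characters
inside a double-quoted string; the escaping tax only comes due when
the delimiter matches the quote: }

        my $fine    = "really 'one word'";      # no backwhacks needed
        my $also_ok = 'really \'one word\'';    # same string, escapes required
        print "identical\n" if $fine eq $also_ok;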

which I hope will help explain why it is a DANGEROUS idea to
think that there is such a notion as a 'valid english word' to
begin with - let alone to try to solve any of the technical
issues associated with the process....

ciao
drieux
