On Thu, Oct 3, 2013 at 1:35 PM, Lodewijk andré de la porte 
<l...@odewijk.nl> wrote:

> IMO readability is very hard to measure. Likely things being where you
> expect them to be, with minimal confusing characters but clear "anchoring"
> so you can start reading from anywhere.
>
> If someone could write a generative meta-language we can then ask people
> to do "text comprehension" tasks on the packed data. The relative speeds of
> completing those tasks should provide a measure of readability.
>
> I don't like anyone arguing about differences in readability without such
> empirical data. (it's all pretty similar unless you design against it I
> guess)
>
> XML is actually surprisingly readable. JSON is a lot more minimal. I find
> its restrictions frustrating and prefer using real JAVASCRIPT OBJECT
> NOTATION wherever possible, like INCLUDING FUNCTIONS and INCLUDING 'THIS'
> REFERENCES. Harder on parsers, but why would you write your own anyway? (No,
> your language is not archaic/hipster enough not to have a parser for a
> popular notational format!)
>
What part of the Chomsky hierarchy do you not understand?
What part of running computations on untrusted data that amount to Turing
machines sounds like a good idea? The trivial DoS, or the oh-so-amusing
use as part of a distributed computing service?
What dangers of multipass computation on potentially ambiguous data do you
think are worth the extra convenience?
And let's not forget the bugs that context-sensitive grammars invite.
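The objection can be made concrete with a small sketch (hypothetical payload,
plain Node/browser JavaScript): JSON.parse implements only the restricted JSON
grammar, so a payload that smuggles in code is rejected at parse time, whereas
eval runs the full language and hands the sender arbitrary computation.

```javascript
// Hypothetical untrusted payload written in "real JavaScript Object
// Notation" -- including a function, as the quoted post suggests.
const untrusted =
  '({ user: "alice", check: function () { while (true) {} } })';

// Safe route: JSON.parse accepts only the JSON grammar (a simple,
// non-Turing-complete language), so the payload is rejected outright.
let rejected = false;
try {
  JSON.parse(untrusted);
} catch (e) {
  rejected = e instanceof SyntaxError;
}
console.log("JSON.parse rejected it:", rejected);

// Dangerous route: eval runs the full language. Merely calling the
// smuggled-in function would spin forever -- the trivial DoS above.
// const obj = eval(untrusted); // never do this with untrusted data
// obj.check();                 // would hang the process
```

The asymmetry is the whole point: a JSON parser can bound the work it does by
the size of the input, while an evaluator for the full language cannot.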

>
> I think that's the most useful I have to say on the subject.
>
> _______________________________________________
> The cryptography mailing list
> cryptography@metzdowd.com
> http://www.metzdowd.com/mailman/listinfo/cryptography
>