On Fri, Aug 23, 2013 at 10:13:56PM -0700, Walter Bright wrote:
> On 8/23/2013 9:58 PM, H. S. Teoh wrote:
> >On Fri, Aug 23, 2013 at 08:25:20PM -0700, Walter Bright wrote:
> >>On 8/23/2013 7:10 PM, Jesse Phillips wrote:
> >>>If we decided that 2 lines was how we do formatting,
> >>
> >>In general, I regard a "line of code" as one statement or one
> >>declaration. Comments don't count, nor does cramming 3 statements
> >>into one line make it one LOC.
> >>
> >>Of course, you can still game that, but if you're reasonable then it
> >>is a reasonable measure.
> >
> >I am still skeptical of LOC as a reliable measure of language
> >complexity. How many LOC does the following code have?
>
> Like I said, you can still game it. I think some common sense
> applies, not a literal interpretation.
You conveniently snipped the rest of my post, which postulates a far
better metric that's no harder to apply in practice. :)

Comparing the compressed size of the source code is far more reliable
than LOC. The absolute compressed size approximates the Kolmogorov
complexity of the source, which in turn approximates the expressiveness
of the language, provided you hold the functionality of the programs
being compared fixed. The ratio of uncompressed size to compressed size
approximates the verbosity of the language's syntax.

All it takes is running zip instead of wc -l, and you have a far better
metric for measuring language expressiveness. Insisting on LOC in spite
of this just costs you credibility.
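For concreteness, here's a rough sketch of the idea in Python (zlib
standing in for zip here; the exact compressor and scoring are just
illustrative):

import sys, zlib

# Approximate Kolmogorov complexity by the compressed size of each
# source file, and syntactic verbosity by the raw/compressed ratio.
for path in sys.argv[1:]:
    with open(path, 'rb') as f:
        data = f.read()
    packed = zlib.compress(data, 9)
    print("%s: %d bytes raw, %d bytes compressed, verbosity ratio %.2f"
          % (path, len(data), len(packed), len(data) / float(len(packed))))

Run it over the same program written in each language you want to
compare; roughly speaking, the smaller the compressed size, the more
expressive the language.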
T

--
They say that "guns don't kill people, people kill people." Well I think
the gun helps. If you just stood there and yelled BANG, I don't think
you'd kill too many people. -- Eddie Izzard, Dressed to Kill