On Friday, 5 January 2018 at 13:22:00 UTC, Paolo Invernizzi wrote:

- be quantitative: your download statistics are a good start; try to collect statistics from commercial users about the size of their codebases, their compilation times, and how many are using a given feature (C++ integration, allocators, scope when polished).

- be fact driven: compare your own predictions about those metrics with what you actually measure, and iterate on the next decisions (also) based on that.

/Paolo

Yes, quantitative information is always good for making sense of things ;-)

I like how Linux kernel development does its reports .. not that D should be compared to the scale of the Linux kernel development effort .. but still .. numbers, tables, graphs .. they provide a nice high-level overview that you can't get when you're stuck in the trenches.

btw. interesting fact.. in a 10 year period, the Linux kernel has gone from 8 million lines to almost 25 million lines. If they average 0.5 defects per KLOC (a very conservative estimate), that means that 90 percent of the public cloud workload, 62 percent of the embedded market share, 99 percent of the supercomputer market share, 82 percent of the world’s smartphones, and nine of the top ten public clouds... are all running an operating system kernel with around 12,500+ bugs in it.
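
Just to make that arithmetic explicit, here's a rough back-of-envelope sketch in D (the figures are the assumed ones above, not measured data):

import std.stdio;

void main()
{
    enum lines = 25_000_000;       // assumed current kernel size, ~25 million lines
    enum defectsPerKLOC = 0.5;     // conservative defect-density assumption
    auto latentBugs = lines / 1_000.0 * defectsPerKLOC;
    writefln("Estimated latent defects: ~%.0f", latentBugs); // prints ~12500
}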

Jeepers!
