On Wednesday, 6 January 2016 at 12:13:37 UTC, Guillaume Piolat wrote:
I guess obvious in hindsight.

I think that book is overrated; it was a debate book for the 80s, when startups sold snake oil. Yes, lecturers at universities still hand it to their students for debate because it is a classic, but if you want to dig into actual research on progress, Thomas Kuhn's thoughts on paradigm shifts are more relevant.

Basically, what happens is that certain shifts enable change: electricity, radios, CRTs, transistors, LCDs... Or, right now: cloud computing, SMT solvers, probabilistic programming languages, and other ideas that can trigger shifts.

We've had several advances that affect software development:

1. Algol (60s)
2. OO (late 60s)
3. Structured programming (60s/70s)
4. Structured analysis, entity modelling (60s/70s)
5. RDBMS (80s)
6. OO methodologies (70-90s)
7. Networked programming (80s)
8. Web (90s)
9. Grid/Cloud computing (2000s)
10. Formal provers (2010s)

The point about programming languages stands: their debut brought us order-of-magnitude improvements, and new PLs won't. But I'll take a 50% improvement any day of the week, of course. :)

I'm more like 10-20x as productive in Python as in C++... ;-) Not that I consider Python a significant advancement.

We'll probably see a paradigm shift as a result of a change in computing hardware combined with alternative approaches to software (like SMT solvers, probabilistic computing, actors).
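To make the SMT-solver part concrete, here is a tiny, hypothetical sketch of the declarative style they enable: you state constraints and let the solver find an assignment, instead of writing the search yourself. It assumes the z3-solver Python bindings; the variables and constraints are invented purely for illustration.

# Toy example of declarative constraint solving with an SMT solver.
# Assumes the z3-solver package (pip install z3-solver); constraints are made up.
from z3 import Int, Solver, sat

x, y = Int("x"), Int("y")
s = Solver()
s.add(x + y == 10, x - y == 4, x > 0, y > 0)  # describe *what* we want, not *how*

if s.check() == sat:        # the solver searches for a satisfying assignment
    print(s.model())        # e.g. [x = 7, y = 3]
else:
    print("no solution under these constraints")

The point is only the shape of the workflow: describe the desired properties and let the solver do the searching.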

Our hardware today consists of a lot of packaging, and the computing/storage happens on something the size of a small stamp. That's pitiful compared to the size of our brain.

Increased complexity requires change. We need robust solutions that tolerate human error (hence actors, probabilistic computing, and proof systems). This is already kinda happening with web services.
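To illustrate the error-tolerance angle, here is a minimal, made-up sketch (plain Python threads and a queue, not any particular actor framework): a worker owns a mailbox and a supervisor restarts it when a bad message crashes it, so one human error stays isolated instead of taking the whole system down.

# Actor-like worker plus supervisor; names (Worker, supervise) are illustrative only.
import queue
import threading

class Worker(threading.Thread):
    """Owns a mailbox and processes one message at a time (actor style)."""
    def __init__(self, mailbox):
        super().__init__(daemon=True)
        self.mailbox = mailbox
        self.stopped = False        # set when we shut down on purpose

    def run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:         # poison pill: deliberate shutdown
                self.stopped = True
                return
            # Deliberately fragile handler: bad input ("human error") raises.
            print("100 /", msg, "=", 100 / int(msg))

def supervise(mailbox):
    """Restart the worker whenever it dies; failures stay isolated per message."""
    while True:
        worker = Worker(mailbox)
        worker.start()
        worker.join()               # returns when the worker thread exits
        if worker.stopped:
            return                  # clean shutdown, stop supervising
        print("worker crashed, restarting")

mailbox = queue.Queue()
for msg in ["4", "oops", "5", None]:   # "oops" crashes the first worker
    mailbox.put(msg)
supervise(mailbox)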

Systems are realized in a different way today than in the 80s, for sure.
