Sorry, but it feels like you started this thread from the premise that "FP is 
better, and only myths and misconceptions keep it from becoming more 
mainstream", whereas I think FP itself is the big misconception -- it ignores 
the reality of what computers are used for and what you actually need to do 
when you "program".

For me, programming was mostly solved in the 1960s with ALGOL and the like. It 
was a "general purpose" language, which meant imperative programming, and most 
of the inventions that came afterwards were nice, very welcome additions 
(objects, mutability control, sum types, pattern matching, exceptions, 
macros...) but not essential. Throw away imperative programming and focus on 
OOP, or on FP, or on "declarative programming", or on "actors", and the result 
gets much worse, because you have removed the essence. That doesn't mean that 
OOP or FP or declarative programming are bad; it means they are not as 
important.
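To make concrete what I mean by "additions on an imperative core", here is a
minimal Rust sketch (my own illustration, nothing from this thread): the
essence is still a mutable accumulator and a plain loop, while a sum type and
pattern matching are layered on top as conveniences.

    // Sum type: a later, welcome (but non-essential) addition.
    enum Token {
        Number(i64),
        Plus,
    }

    fn sum_numbers(tokens: &[Token]) -> i64 {
        let mut total = 0; // mutable state: the imperative essence
        for token in tokens { // plain imperative loop
            match token { // pattern matching over the sum type
                Token::Number(n) => total += *n,
                Token::Plus => {} // ignore operators in this toy example
            }
        }
        total
    }

    fn main() {
        let tokens = [Token::Number(1), Token::Plus, Token::Number(2)];
        println!("{}", sum_numbers(&tokens)); // prints 3
    }

Delete the enum and the match and you can still write the function; delete the
mutable state and the loop and you have to reinvent them in another guise.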
