Quote:
Originally Posted by gaming_mouse
yes, and they have been, by very smart people who have spent untold hours on the problem, starting back in the 1950s and even before.
Concepts are not implementations.
Quote:
correct. i never said that nobody needs to worry about these things -- just that it's only appropriate for low-level work.
All work becomes low-level when people build on top of your software, and every successful piece of software tends to become something other people build on top of. Talented people don't want to work on the periphery forever, and untalented people tend neither to make optimal use of advanced concepts nor to work on problems complex enough for those concepts to make a difference.
Quote:
normal programmers who absolutely shouldn't be thinking about this kind of thing but should instead be learning how to use high-level constructs well have been poisoned into a style of programming that is completely inappropriate for their work.
Normal programmers, in my experience, simply don't have the mathematical background to write code where currying and point-free style are common. That style adds significant cognitive overhead, which isn't really necessary given that most of the code they write glues together existing components and iterates as requirements evolve. Most programmers are best off adopting the conventions of the mainstream languages and frameworks, because they are mostly solving an integration problem, and integration projects in the long run require compatibility in conventions as well as in technologies. If you're not creating a platform, following the rules of the platform tends to be the path of least resistance.
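To make the cognitive-overhead point concrete, here is a small sketch in Python (names and the `compose` helper are my own, purely illustrative): the same computation written point-free and written conventionally.

```python
from functools import partial, reduce

def compose(*fns):
    """Right-to-left function composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

# Point-free style: the data being transformed never appears by name.
# Reading this requires holding partial application and composition
# order in your head -- the overhead discussed above.
total_of_evens = compose(
    sum,
    partial(filter, lambda n: n % 2 == 0),
)

# Conventional style: the same logic, spelled out step by step.
def total_of_evens_plain(numbers):
    evens = [n for n in numbers if n % 2 == 0]
    return sum(evens)

print(total_of_evens([1, 2, 3, 4]))        # 6
print(total_of_evens_plain([1, 2, 3, 4]))  # 6
```

Both versions are correct; the question is only which one a typical maintainer can read and modify fastest.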
Looking at this from another angle, the size and complexity of the software matter. For small code bases, not much matters, because there just isn't much complexity: most of the complexity is in other people's code you're using, and that is minimized if you use mainstream technologies in a standard way. For large code bases, all kinds of factors become important: ecosystem, tooling, consistency across developers, readability, traceability, transparency, integration with other systems. These all favor mainstream technologies and keeping the level of abstraction manageable. It doesn't matter whether some piece of functionality was implemented in 20 lines or 5.
Also, high-level design is completely divorced from low-level implementation at that level. You can write a stateful mess that is hard to understand in Haskell, and you can write a software transactional memory system in C. The semantics of your system don't have to resemble the semantics of the language it was written in. Any real-world system these days consists of multiple processes, multiple machines, multiple everything, across which language semantics can't be enforced. Runtime performance and control do matter: often it's precisely because you want your users to have a high-level experience that you have to go further down the stack.
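In the spirit of "you can write a software transactional memory system in C", here is a minimal sketch in Python of transactional semantics layered on plain locks. Everything here (`TVar`, `atomically`, the versioning scheme) is illustrative, not any real STM library's API; the point is that the transactional discipline lives in the system design, not in the host language.

```python
import threading

class TVar:
    """A versioned cell: reads return a snapshot, commits check the version."""
    def __init__(self, value):
        self._lock = threading.Lock()
        self.value = value
        self.version = 0

    def read(self):
        with self._lock:
            return self.value, self.version

    def try_commit(self, expected_version, new_value):
        """Commit only if nobody wrote since we read (optimistic concurrency)."""
        with self._lock:
            if self.version != expected_version:
                return False
            self.value = new_value
            self.version += 1
            return True

def atomically(var, update):
    """Retry the update until it commits against an unchanged snapshot."""
    while True:
        value, version = var.read()
        if var.try_commit(version, update(value)):
            return

balance = TVar(100)
atomically(balance, lambda v: v - 30)
print(balance.read()[0])  # 70
```

An imperative language with no transactions in its semantics still supports a system whose semantics are transactional; the reverse direction (a stateful mess in Haskell via `IORef`) is just as easy.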
Quote:
for the majority of business applications, it's just not an issue.
For the vast majority of business applications, whatever mainstream technology they are teaching in bootcamps works fine.