Many questions and answers on the C/C++ pages directly or indirectly discuss micro-performance issues (such as the overhead of an indirect vs. direct vs. inline function call), or using an O(n²) vs. O(n log n) algorithm on a 100-item list.
I always code with no concern for micro performance, and little concern for macro performance, focusing instead on easy-to-maintain, reliable code, unless or until I know I have a problem.
My question is: why do so many programmers care so much? Is it really an issue for most developers, have I just been lucky enough not to have had to worry about it, or am I a bad programmer?
I think everything on your list is micro-optimization, which should not generally be looked at, except for
using an O(n²) vs. O(n log n) algorithm on a 100-item list
which I think should be looked at. Sure, that list is 100 items right now, and everything is fast for small n, but I'd be willing to bet that soon the same code is going to be reused for a several-million-item list, and it will still have to perform reasonably.
Choosing the right algorithm is never a micro-optimization. You never know what kinds of data the same code will be used on two months or two years later. Unlike the "micro-optimizations", which are easy to apply with the guidance of a profiler, algorithm changes often require significant redesign to make effective use of the new algorithms. (E.g., some algorithms require that the input data be sorted already, which might force you to modify significant portions of your application to ensure the data stays sorted.)
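To make that concrete, here is a minimal sketch (duplicate detection is my own illustrative example, not something from the question) showing an O(n²) approach next to an O(n log n) one, and how the faster version drags in exactly the kind of "data must be sorted" requirement described above:

```cpp
#include <algorithm>
#include <cstddef>
#include <string>
#include <vector>

// O(n^2): compare every pair. Fine for 100 items, painful for millions.
bool hasDuplicateQuadratic(const std::vector<std::string>& items) {
    for (std::size_t i = 0; i < items.size(); ++i)
        for (std::size_t j = i + 1; j < items.size(); ++j)
            if (items[i] == items[j])
                return true;
    return false;
}

// O(n log n): sort, then scan adjacent elements. Note the new requirement
// this introduces: the data must be sorted. Here we sort a local copy, but
// in a real application the caller might instead have to keep the container
// sorted at all times, which ripples through the rest of the code.
bool hasDuplicateSorted(std::vector<std::string> items) {
    std::sort(items.begin(), items.end());
    return std::adjacent_find(items.begin(), items.end()) != items.end();
}
```

On 100 items both versions finish instantly; on millions, the quadratic one becomes the bottleneck, and switching to the sorted version changes the contract around the data rather than just one line of code.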