Premature Pessimization and the like...
October 27, 2014

A nice quote from the developers of SQLite, who have apparently achieved some serious performance improvements in their latest version:

The 50% faster number above is not about better query plans. This is 50% faster at the low-level grunt work of moving bits on and off disk and searching b-trees. We have achieved this by incorporating hundreds of micro-optimizations. Each micro-optimization might improve the performance by as little as 0.05%. If we get one that improves performance by 0.25%, that is considered a huge win. Each of these optimizations is unmeasurable on a real-world system (we have to use cachegrind to get repeatable run-times) but if you do enough of them, they add up.


I have from time to time been accused of being too obsessed with seemingly trivial performance issues when writing everyday code.

There is an orthodoxy based around the concept of premature optimization which seems to encourage some people to believe that performance should be wilfully ignored when writing code, and only dealt with in an isolated step, using tools like Instruments, once all the dust has settled.

Even then, there is a tendency to focus on the low-hanging fruit - the top few methods that show up in a profile - and to ignore the rest, or throw up one’s hands at the prospect of improving them. Small (or not so small) overheads that show up throughout the code, such as those associated with message passing, memory allocation, and that sort of thing, can easily get overlooked. Similarly, the decision to aim for a particular programming style or idiom can sometimes overlook the fact that the choices it imposes have consequences, like lots of dynamic allocation, or lots of memory copying, or lots of synchronisation, or hitting memory in the wrong order and screwing the cache.
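To make that last point concrete, here’s a minimal Swift sketch (the flat grid and its dimensions are invented for the example): both loops compute exactly the same sum, and the only difference between them is the order in which they touch memory.

```swift
// Summing a 2D grid stored row-by-row in a single flat array.
let width = 1024
let height = 1024
let grid = [Double](repeating: 1.0, count: width * height)

// Column-first: each successive read is `width` elements away in memory,
// so the cache is constantly being refilled.
var slowTotal = 0.0
for x in 0..<width {
    for y in 0..<height {
        slowTotal += grid[y * width + x]
    }
}

// Row-first: each successive read is right next to the previous one,
// so the cache works with you rather than against you.
var fastTotal = 0.0
for y in 0..<height {
    for x in 0..<width {
        fastTotal += grid[y * width + x]
    }
}
```

Neither version is cleaner or clearer than the other; they just differ in how kindly they treat the hardware.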

I’m not saying that the basic premise of the premature optimization argument is wrong - far from it. It does make sense not to waste massive amounts of time doing insanely complex optimizations too early, on the wrong code. It does make sense to use tools to guide you, rather than guessing. It also makes sense to write clean code that makes your intent obvious.

Most of the time, in any case, the biggest improvements come from picking the correct algorithms rather than from twiddling individual lines of your code.

What numbers like the ones quoted above show, though, is that a large number of small improvements to performance can have a massive impact in aggregate. You shouldn’t obfuscate your code unnecessarily or obsessively, but if there are two ways to achieve the same aim, both of which are clean and easy to understand, and one of them is obviously more efficient in speed or space, then you’d be a fool not to choose it.
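As a small illustration (the User type and the data here are made up), here are two Swift versions of the same question - "does any user have admin rights?" - that are equally clean and equally obvious; one of them just does a lot more work than it needs to.

```swift
struct User {
    let name: String
    let isAdmin: Bool
}

let users = [
    User(name: "Ana", isAdmin: false),
    User(name: "Ben", isAdmin: true),
]

// Builds an entire throwaway array, only to check whether it is empty.
let hasAdminSlow = !users.filter { $0.isAdmin }.isEmpty

// Stops at the first match and allocates nothing extra.
let hasAdminFast = users.contains { $0.isAdmin }
```

Both read naturally; knowing that the second one does less work is all it takes to pick it.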

You can only make an informed decision about which implementation to choose if you have some basic awareness of performance and the implications of your choices. Aiming for a consistent style (functional, object-oriented, whatever) probably makes a lot of sense if it cleans up your code base and makes the whole thing easier to understand; but only if you acknowledge the impact it has.

It’s not wise to defer even thinking about all this stuff until some mythical optimization phase later on.

It definitely is wise to gain a basic understanding of how the building blocks of your language work, and roughly how the things on which you build are implemented, and to make decisions accordingly.
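For example (a minimal sketch, relying only on the standard behaviour of Swift’s Array, which grows by reallocating and copying its storage): if you already know how many elements you are about to add, a one-line hint removes a whole series of hidden allocations and copies.

```swift
let count = 100_000

var squares: [Int] = []
squares.reserveCapacity(count) // allocate enough storage once, up front

for i in 0..<count {
    squares.append(i * i) // no repeated grow-and-copy steps along the way
}
```

Nothing about the surrounding code has to change, and the intent is arguably clearer, not murkier.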

In case you’re wondering, the title of this post comes from the following quote by Herb Sutter:

Definition: Premature pessimization is when you write code that is slower than it needs to be, usually by asking for unnecessary extra work, when equivalently complex code would be faster and should just naturally flow out of your fingers.
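
A tiny Swift example of the kind of thing Sutter is describing (the numbers are arbitrary): both lines are equally simple to write and to read, but one of them quietly asks for far more work than the question requires.

```swift
let prices = [4.99, 12.50, 0.99, 7.25]

// Sorts the whole array just to read a single element from it.
let cheapestSlow = prices.sorted().first

// A single pass that does exactly the work the question needs.
let cheapestFast = prices.min()
```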

Got a comment on this post? Let me know at @samdeane@mastodon.org.uk.