What is the most ridiculous pessimization you've seen? [closed]

I think the phrase "premature optimization is the root of all evil" is way, way overused. For many projects, it has become an excuse not to take performance into account until late in a project.

This phrase is often a crutch for people to avoid work. I see this phrase used when people should really say "Gee, we really didn't think of that up front and don't have time to deal with it now".

I've seen many more "ridiculous" examples of dumb performance problems than examples of problems introduced due to "pessimization":

  • Reading the same registry key thousands (or tens of thousands) of times during program launch.
  • Loading the same DLL hundreds or thousands of times
  • Wasting megabytes of memory by needlessly keeping full paths to files
  • Not organizing data structures so they take up way more memory than they need
  • Sizing all strings that store file names or paths to MAX_PATH
  • Gratuitous polling for things that have events, callbacks or other notification mechanisms
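Most of these boil down to paying for the same expensive lookup over and over. As a minimal sketch (the `read_setting` function is a made-up stand-in for a registry read or DLL load, not a real API), caching fixes the first two items almost for free:

```python
import functools

# Counter just to show how many times the "real" expensive work happens.
calls = {"count": 0}

def read_setting_uncached(key):
    # Stand-in for an expensive lookup such as a registry read.
    calls["count"] += 1
    return f"value-for-{key}"

@functools.lru_cache(maxsize=None)
def read_setting(key):
    # Same lookup, but each distinct key is only fetched once.
    return read_setting_uncached(key)

# Thousands of lookups of the same key now cost one real read.
for _ in range(10_000):
    read_setting("install_dir")

print(calls["count"])  # prints 1: the underlying read ran only once
```

The point isn't the cache itself; it's that nobody measured where launch time was going in the first place.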

What I think is a better statement is this: "optimization without measuring and understanding isn't optimization at all - it's just random change".

Good performance work is time-consuming - often more so than the development of the feature or component itself.


Databases are pessimization playland.

Favorites include:

  • Split a table into multiples (by date range, alphabetic range, etc.) because it's "too big".
  • Create an archive table for retired records, but continue to UNION it with the production table.
  • Duplicate entire databases by (division/customer/product/etc.)
  • Resist adding columns to an index because it makes it too big.
  • Create lots of summary tables because recalculating from raw data is too slow.
  • Create columns with subfields to save space.
  • Denormalize into fields-as-an-array.

That's off the top of my head.
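To make the first item concrete, here's a hedged sketch (table names and the sqlite3 schema are invented for illustration): once a table is split by date range, every cross-range query has to stitch the pieces back together with UNION ALL, and the application has to know about every shard forever.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# The "too big" table, split by year.
con.execute("CREATE TABLE orders_2022 (id INTEGER, amount REAL)")
con.execute("CREATE TABLE orders_2023 (id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders_2022 VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
con.executemany("INSERT INTO orders_2023 VALUES (?, ?)", [(3, 30.0)])

# Split version: the query must enumerate every shard by hand.
split_total = con.execute(
    "SELECT SUM(amount) FROM (SELECT amount FROM orders_2022 "
    "UNION ALL SELECT amount FROM orders_2023)"
).fetchone()[0]

# Unsplit version: one table, one query, and the optimizer can actually help.
con.execute("CREATE TABLE orders (id INTEGER, amount REAL, yr INTEGER)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 10.0, 2022), (2, 20.0, 2022), (3, 30.0, 2023)])
single_total = con.execute("SELECT SUM(amount) FROM orders").fetchone()[0]

print(split_total == single_total)  # same answer, very different maintenance cost
```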


I think there is no absolute rule: some things are best optimized upfront, and some are not.

For example, I worked in a company where we received data packets from satellites. Each packet cost a lot of money, so all the data was highly optimized (i.e. packed). For example, latitude/longitude was not sent as absolute values (floats), but as offsets relative to the "north-west" corner of a "current" zone. We had to unpack all the data before it could be used. But I think this is not pessimization, it is intelligent optimization to reduce communication costs.
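A rough sketch of the offset-packing idea, in Python - the zone corner, the 0.001-degree resolution, and the field layout here are all invented for illustration, not the actual satellite format:

```python
import struct

ZONE_CORNER = (48.000, 2.000)   # hypothetical "north-west" corner of the zone
RESOLUTION = 0.001              # degrees per unit; invented for this sketch

def pack_position(lat, lon):
    # Store small unsigned offsets from the zone corner instead of doubles.
    dlat = round((lat - ZONE_CORNER[0]) / RESOLUTION)
    dlon = round((lon - ZONE_CORNER[1]) / RESOLUTION)
    return struct.pack("<HH", dlat, dlon)  # 4 bytes on the wire

def unpack_position(payload):
    # Reverse the packing before the data can be used.
    dlat, dlon = struct.unpack("<HH", payload)
    return (ZONE_CORNER[0] + dlat * RESOLUTION,
            ZONE_CORNER[1] + dlon * RESOLUTION)

packed = pack_position(48.8566, 2.3522)          # roughly Paris
absolute = struct.pack("<dd", 48.8566, 2.3522)   # two full doubles
print(len(packed), len(absolute))                # 4 vs 16 bytes
```

Unpacking costs a little CPU on the ground, but every byte saved on the satellite link was worth real money.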

On the other hand, our software architects decided that the unpacked data should be formatted into a very readable XML document, and stored in our database as such (as opposed to having each field stored in a corresponding column). Their idea was that "XML is the future", "disk space is cheap", and "processor is cheap", so there was no need to optimize anything. The result was that our 16-byte packets were turned into 2kB documents stored in one column, and for even simple queries we had to load megabytes of XML documents into memory! We received over 50 packets per second, so you can imagine how horrible the performance became (BTW, the company went bankrupt).
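You can demonstrate the blow-up in a few lines - the field names and XML shape below are invented stand-ins, and even this tiny document is already several times larger than the packed record:

```python
import struct
import xml.etree.ElementTree as ET

# Invented fields standing in for the real packet contents.
fields = {"packet_id": 42, "lat": 48.857, "lon": 2.352, "status": 3}

# The compact wire form: a fixed 16-byte record.
packed = struct.pack("<IffI", fields["packet_id"],
                     fields["lat"], fields["lon"], fields["status"])

# The "readable XML" form stored in a single database column.
root = ET.Element("packet")
for name, value in fields.items():
    ET.SubElement(root, name).text = str(value)
document = ET.tostring(root, encoding="unicode")

print(len(packed), len(document))  # the XML is many times larger
```

Multiply that ratio by 50 packets per second, every second, and "disk is cheap" stops being true rather quickly.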

So again, there is no absolute rule. Yes, sometimes optimizing too early is a mistake. But sometimes the "cpu/disk space/memory is cheap" motto is the real root of all evil.