The term “premature optimization” is often misused. It’s supposed to describe trading away simplicity for unnecessary performance gains. Instead, it’s used as a blanket dismissal of anything unfamiliar. That’s both inaccurate and hostile to good engineering. Throughout my career, I’ve heard every one of these situations referred to as “premature optimization”. None of them are.
It’s not premature optimization when:
- They solved a problem elegantly, in a way that you didn’t think of
- They solved a problem elegantly by deviating from the beaten path
- They designed a clear and fitting code pattern that you didn’t come up with
- They used a fitting data structure that you weren’t aware of (see the sketch after this list)
- They used a fitting algorithm that you weren’t aware of
- They achieved extra performance without sacrificing clarity, with an approach that you didn’t think of
- They configured a piece of infrastructure for the required use case
- They let an appropriate existing system handle the work that it’s good at handling
…and they did that within the allotted time.
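To make the data structure point concrete, here’s a minimal, hypothetical sketch (the names `banned_users` and `is_banned` are invented for illustration): swapping a list for a set turns each membership check from a linear scan into a hash lookup, and the code reads exactly the same.

```python
# Hypothetical example: membership checks against a large collection.

# Before: a list scan; each lookup is O(n).
banned_users_list = ["alice", "bob", "carol"]  # imagine thousands of entries

def is_banned_slow(username: str) -> bool:
    return username in banned_users_list  # linear scan on every call

# After: a set; each lookup is O(1) on average, with no loss of clarity.
banned_users = {"alice", "bob", "carol"}

def is_banned(username: str) -> bool:
    return username in banned_users  # hash lookup

print(is_banned("bob"))   # True
print(is_banned("dave"))  # False
```

Nobody would call the second version premature optimization; it’s just the right tool.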
Correspondingly, it’s not premature optimization when:
- You propose a clean and elegant solution that they didn’t think of
- You challenge an existing practice with a simpler/cleaner alternative
- You introduce a new code pattern going forward, which improves codebase clarity and consistency
- You recommend a minor code change to use a more fitting data structure that they weren’t aware of
- You note that there’s a more fitting algorithm that they may not have seen
- You explain how a small change can make the code more performant without sacrificing its clarity
- You suggest an infrastructure config appropriate for the required use case
- You push for letting an appropriate existing system handle the work that it’s good at (see the sketch after this list)
…and it takes an acceptable amount of time to accomplish.
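And to illustrate leaning on an existing system, here’s a hedged sketch (the table and column names are made up) that pushes filtering and sorting into SQLite instead of loading every row into the application and doing the work by hand.

```python
# Hypothetical example: let the database filter and sort, rather than
# pulling every row into Python and reimplementing that logic.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (status, total) VALUES (?, ?)",
    [("open", 120.0), ("closed", 75.5), ("open", 42.0), ("open", 310.0)],
)

# The database engine is built for this: indexes, query planning, sorting.
rows = conn.execute(
    "SELECT id, total FROM orders WHERE status = ? ORDER BY total DESC LIMIT 2",
    ("open",),
).fetchall()

print(rows)  # e.g. [(4, 310.0), (1, 120.0)]
conn.close()
```

Suggesting a query like this instead of an in-app loop isn’t premature optimization either; it’s using the system for what it’s good at.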
Lumping good engineering in with premature optimization is a sure way to discourage engineers and push codebase quality toward the lowest common denominator. Don’t do that.