A soap company had a problem.
During the packaging process, one out of every hundred soap boxes came off the assembly line empty.
They asked their engineers to solve the problem.
The engineers studied the issue layer by layer and devised an X-ray machine with high-resolution monitors, manned by two operators who watched every soap box that passed through the line. If the X-ray showed an empty box, they picked it out.
Later, a small company faced the same problem, and a rank-and-file employee was asked to solve it.
Being a non-engineer, he came up with another solution.
He bought a strong industrial electric fan and pointed it at the assembly line.
He switched the fan on, and it simply blew the empty boxes out of the line.
This example is an excellent illustration of the Einstellung Effect—a cognitive bias where people stick to familiar solution patterns even when simpler, more effective options exist.
The engineers approached the soap-box problem through their habitual lens: complex problems require complex engineering solutions. So, they designed a high-tech detection system.
The employee, not limited by that mental model, simply thought about the physical reality: an empty box weighs less, so just blow it away.
As the old saying goes: if all you have is a hammer, every problem looks like a nail.
The biggest risk in everything we’ve discussed thus far is that lessons learned become repetitions of existing thinking rather than generators of new insight.
So just how hard is it to unlearn?