Duplication of effort. Is it a bad thing?
IT organizations put a lot of effort into reducing duplicate code and design effort. Architects are tasked with producing standards and choosing technology for the entire organization. Responsibilities are divided carefully, and watchdogs look for opportunities for reuse, increasing conformity. Data is painstakingly normalized so that the same bytes are never repeated.
Is this optimal? Amazon defied it, breaking teams into independent entities with their own hiring practices, requiring backward compatibility, and letting security and authorization be implemented over and over again. How inconsistent! How inefficient! The result? AWS. A surprise, and now a very big business for what used to be a retailer.
The wider economy is another example. It is obscene duplication of effort for the same product to be produced in two factories near each other, with minimal differences in output. The same decisions made again and again, economies of scale forfeited. If what you care about is reducing duplication of effort, central control is clearly superior.
Oh wait – communism doesn’t work. Centrally directed economies fail miserably. Instead, capitalism, with all kinds of random companies making the same decisions in different ways, duplicating effort all over the place but never quite the same, rising and falling unpredictably – all this leads to phenomenally better results. It leads to invention. To surprise.
It is like: say you have some blocks. To cover the most ground, you reduce overlap. Overlap is duplication of effort. But what do you get if you allow overlap? Height!
Perhaps if the architect blocks running around making sure the other blocks don't overlap, flattening them out, were instead set loose along with the others to arrange themselves as they found most fulfilling, the blocks would cover just as much ground and also build a pyramid.
Control reduces potential.