Why does the difference between "standardized" and "proprietary" matter so much in today's datacenters? We're in an age of commodity hardware, with shrinking margins for hardware vendors and a cloud environment where the hardware brand makes little difference. These days, some hardware vendors actually want you to confuse the two terms.
The distinction is so important because standards, in and of themselves, are genuinely valuable. A standard where we agree to use the same terminology to minimize confusion among staffers (such as ITIL) or to avoid networking incompatibility (such as IPv6) saves time and resources.
But an effective standard needs a reason to exist. What do I mean? Standardization could also be described as constraining or restricting options, so it's a good thing only when there's a clear business objective behind it.
The trouble is, IT isn't terribly good at standardization. In our InformationWeek Standardization Survey, we asked 400 business technology professionals to grade their organizations on how they're doing with standards: Did they enforce rigorous standards when needed but allow more agile behavior when possible? Only 9 percent gave themselves an A, and more than 40 percent gave themselves a C or worse.
We asked the question with that balance of rigor and agility on purpose. Standards without a corresponding and well-understood objective can hurt IT's relationships throughout the company. If a developer can't use some time-saving step because the infrastructure team has an operating system or server standard that thwarts it -- without any clear, expressed reason -- will that developer be happy? I don't think so.
Outside IT, employees understand when a company standardizes on PCs, not Macs, for an established cost objective, and because most everyone knows how to use a PC. But end users are skeptical of more granular standards that they see as arbitrary and senseless.