Businesses are straining to keep up with the exploding volume of data online. The Open Compute Project wants to help. Its goal is to disrupt networking technology by applying open-source development, just as open-source concepts previously disrupted software development.
The proliferation of mobile devices has led people to take for granted that they'll be able to access data anywhere and at any time, according to Frank Frankovsky, vice president of hardware design and supply chain operations at Facebook and chairman and president of the Open Compute Project. "Knowing that the water is flowing is good, but we don't always think about where the water is coming from and the plumbing it flows through," he said during a keynote address at the Interop Conference and Expo this week.
Keeping up with that demand is a big challenge, and it's getting bigger. He cited an IDC study that found there were 800 exabytes of data -- 800 billion gigabytes -- on the Internet in 2010. That more than tripled to 2.8 zettabytes by last year. By 2020, IDC expects 40 zettabytes of data to be online. Storing that much data would require roughly 13 billion standard 3TB SATA drives, Frankovsky said.
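As a quick sanity check (our arithmetic, not the article's), the cited figures can be verified with back-of-envelope math. A minimal sketch in Python, assuming decimal SI units (1 EB = 10^18 bytes, 1 ZB = 10^21 bytes, 1 TB = 10^12 bytes):

```python
# Back-of-envelope check of the storage figures cited above (decimal SI units assumed).
internet_2010_bytes = 800e18   # 800 exabytes on the Internet in 2010
internet_2012_bytes = 2.8e21   # 2.8 zettabytes by 2012
internet_2020_bytes = 40e21    # 40 zettabytes forecast for 2020
drive_bytes = 3e12             # capacity of one 3TB SATA drive

# 800 EB expressed in gigabytes: 8e11, i.e. 800 billion GB, as stated.
gigabytes_2010 = internet_2010_bytes / 1e9

# Growth factor 2010 -> 2012: 3.5x, so "more than tripled".
growth = internet_2012_bytes / internet_2010_bytes

# Drives needed to hold the 2020 forecast.
drives_needed = internet_2020_bytes / drive_bytes
print(f"{drives_needed:.2e}")  # prints 1.33e+10, i.e. roughly 13 billion drives
```

Note the result is on the order of 13 billion drives; a reader working in binary (1024-based) units would land in the same ballpark.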
Businesses will need innovation in datacenter and networking technologies to deliver that data. "We don't want technology to be a limiting factor to the richness of experience online," Frankovsky said.
That's where the Open Compute Project comes in. The open-source concept has brought huge benefits to software innovation, and this project is looking to bring the same benefits to datacenter hardware (its focus since it was founded by Facebook) and, beginning this week, to networking.
The Open Compute Project's hardware engineering has already brought big benefits to Facebook, which maintains a major datacenter investment to feed the demands of its 1.1 billion active users. Frankovsky said his company has reduced power consumption by 38 percent and related capital expenditures by 24 percent using Open Compute Project innovations.
When the project was founded in 2011, its participants came predominantly from customer companies, though some vendors took part. Facebook later realized the mission was too big for one company to own, so it spun the project off as a not-for-profit. Its mission is to deliver tangible products, not wish lists. Its accomplishments include developing cold storage technology for archived data. Facebook uses that technology for the older photos its users post; demand for a photo declines rapidly after it has been online for 30 days. The project has also developed Open Rack (a rack design specifically for datacenters) and the Group Hug standard for hardware planes.
This week, the project announced that it is expanding its mission to include networking -- specifically switches. "Unfortunately, networking hardware and software is a black box, an appliance," Frankovsky said. "But it should be just another server." Rather than deploying proprietary switches, networking managers should be able to roll out commodity hardware, then download and install switch operating systems on it. That's the Open Compute Project's next mission. Servers and clients often run open-source software, but networks remain proprietary.
"We've lovingly created islands of open software architecture, but the way we connect them is proprietary switches," Frankovsky told us after his keynote.
Ars Technica has more in-depth coverage of the Open Compute Project's networking initiative.
What do you think? Are customers locked in to proprietary networking hardware? Is the networking industry ripe for open-source disruption? Let us know.
— Mitch Wagner, Editor in Chief, Internet Evolution