A shift in focus has left the modern data center looking little like the data center of even a few years ago. While the data center has traditionally been characterized by specialty hardware engineered for a specific purpose, the focus moving forward is going to be on software.
The problem with continuing to use hardware as the focal point for innovation is that businesses are moving faster and adapting more rapidly to the economy, competitors, and the political landscape than ever before. Hardware is a prohibitively inflexible place for the value in a data center to live. It’s expensive, inherently finite, and cumbersome to manipulate. Software, on the other hand, can be easily squeezed, stretched, and morphed to meet the changing needs of the business very quickly.
As such, the software-defined data center (SDDC) is quickly becoming the new standard in data center architecture. In this article, you’ll learn about some of the key tenets of the SDDC philosophy: virtualization, automation, and self-service.
The concept of server virtualization is how most IT professionals start to become familiar with the SDDC. It’s a great example as it perfectly illustrates the value in abstracting a data center construct we associate with hardware into a logical, software entity. The SDDC takes the same abstraction notion that you’re probably familiar with in server virtualization and smears it all across the data center: storage, network devices, deduplication appliances, WAN accelerators. No hardware is exempt from this virtualized fate.
The advantage, of course, of moving data center constructs into software is that once they’re virtualized, they’re no longer bound by the physical limits of a box in a rack. Rather than provisioning single-purpose devices with a fixed amount of CPU and memory capacity and a limited amount of space to fit disks and peripherals, the SDDC is built from pools of shared resources that can flex with the workloads atop them.
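The pooling idea above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's implementation; the `ResourcePool` class, its field names, and the vCPU figures are all assumptions made up for the example. The point is that workloads draw from and return capacity to one shared pool, rather than being pinned to the fixed capacity of a single box.

```python
# Illustrative sketch only: a shared pool that flexes with its workloads.
# Class name, fields, and capacities are hypothetical, not a real product API.

class ResourcePool:
    def __init__(self, total_vcpus):
        self.total_vcpus = total_vcpus
        self.allocated = {}  # workload name -> vCPUs drawn from the pool

    def available(self):
        return self.total_vcpus - sum(self.allocated.values())

    def allocate(self, workload, vcpus):
        """Grant capacity if the pool (not any one box) can cover it."""
        if vcpus > self.available():
            return False
        self.allocated[workload] = self.allocated.get(workload, 0) + vcpus
        return True

    def release(self, workload):
        """Return a workload's capacity to the pool for others to use."""
        self.allocated.pop(workload, None)

# Capacity freed by one workload is immediately usable by another.
pool = ResourcePool(total_vcpus=32)
pool.allocate("analytics", 24)
pool.release("analytics")
pool.allocate("web-tier", 16)
```

Contrast this with a rack of single-purpose appliances: there, the 24 vCPUs freed by retiring one workload would be stranded in that one chassis instead of flowing back into a shared pool.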
You probably know that computers are significantly faster than humans at performing tasks that can be programmatically defined. In a world where IT organizations are under tremendous pressure to deliver and often aren’t being given more resources to deliver with, automation is the secret weapon for many IT professionals. Moving routine operations out of the domain where people are responsible and into a system where machines take care of it is a great way to increase leverage.
Heretofore, automation has been challenging, and that’s putting it nicely. The complexity and variability in many legacy data centers relegated heavy automation to being nothing more than a pipedream. However, as the SDDC creeps in and takes over data centers, an inherent standardization also begins to take place. The software-centric architecture lends itself extremely well to a high degree of automation. And as more software vendors come around to the notion that programmability is critical these days, it’s becoming more likely that any given piece of data center software will have a solid application programming interface (API) that can be leveraged for automation.
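To make the API-driven automation idea concrete, here is a small hedged sketch. The resource type, field names, and the shape of the payload are invented for illustration; no particular vendor's API is implied. It shows the leverage described above: a declarative list of specifications is turned into uniform API payloads programmatically, the kind of repetitive task that is error-prone by hand but trivial for software.

```python
import json

# Hypothetical example: building request bodies for an SDDC provisioning API.
# "VirtualMachine", the field names, and the specs are assumptions for
# illustration, not a real vendor schema.

def build_provision_request(name, cpus, memory_gb, network="default"):
    """Construct the JSON body an automation script might POST to an API."""
    if cpus < 1 or memory_gb < 1:
        raise ValueError("A VM needs at least 1 vCPU and 1 GB of memory")
    return {
        "resourceType": "VirtualMachine",
        "name": name,
        "spec": {"cpus": cpus, "memoryGB": memory_gb, "network": network},
    }

def provision_batch(specs):
    """Expand a declarative list of specs into API payloads."""
    return [build_provision_request(**spec) for spec in specs]

# Ten identical web servers become one line of input, not ten manual builds.
batch = provision_batch(
    [{"name": f"web-{n:02d}", "cpus": 2, "memory_gb": 4} for n in range(1, 11)]
)
payload_text = json.dumps(batch, indent=2)
```

In a real environment, each payload would be POSTed to the platform's API endpoint; here the sketch stops at payload construction so the standardization benefit is visible without depending on any specific product.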
Likely the biggest challenge for any IT organization today is competing against the promises that IT providers outside the organization are making to their end users. Amazon Web Services can create just about any infrastructure construct your developers can dream of in a matter of minutes and charge it to their credit card. Dropbox can provide terabytes of publicly shareable storage capacity with little more than a few clicks. Why is this so tempting to users?
End-users and IT professionals alike are enamored with the idea of self-service. That is, to be able to request IT resources on demand and have them be configured and billed appropriately and nearly instantly, without the need to interact with the IT department at all. When the data center is automated to the degree that was just discussed, building a portal that allows users to request and approve requests for resources without any interaction with IT becomes much simpler than it was in the past.
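The request-and-approve flow described above can be sketched simply. This is an illustrative toy, not a real portal: the quota value, function names, and auto-approval policy are all assumptions. The point is that once provisioning is automated, the portal's job reduces to recording requests and applying policy; requests that fit within quota can be approved with no human in the loop.

```python
# Hypothetical self-service sketch: requests within an assumed per-user
# quota are auto-approved and handed to the automation layer; the rest
# are held for review. All names and limits are illustrative.

QUOTA_VCPUS = 16  # assumed per-user policy value

def submit_request(user, vcpus, requests_db):
    """Record a resource request; auto-approve it if it fits the quota."""
    approved_usage = sum(
        r["vcpus"] for r in requests_db.get(user, []) if r["approved"]
    )
    request = {
        "vcpus": vcpus,
        "approved": approved_usage + vcpus <= QUOTA_VCPUS,
    }
    requests_db.setdefault(user, []).append(request)
    return request

requests_db = {}
first = submit_request("alice", 8, requests_db)    # fits quota, auto-approved
second = submit_request("alice", 12, requests_db)  # would exceed quota, held
```

An approved request would then be passed straight to the automation layer discussed earlier, which is what makes near-instant, hands-off fulfillment possible.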
Calling the data center “software-defined” seems fairly exclusive and almost implies that hardware isn’t part of the picture anymore. In reality, hardware is still very much important. No matter how highly abstracted, all software must ultimately run on some type of hardware.
One thing that does change, however, is the look and feel of the hardware in the SDDC. While the traditional data center was packed full of proprietary, single-purpose hardware, the SDDC is a bit more bland. You’ll likely see quite a bit of commodity hardware that all looks the same. That’s because the value of the SDDC isn’t that it doesn’t need hardware (it does); it’s that the value is no longer uniquely provided by the hardware. The hardware that supports the software becomes less important.