When considering a move to a cloud-based IT model, there are many familiar arguments that more than justify the decision: cost, agility, scalability, reduced capital expenditure. All of these benefits are real and quickly achievable in the cloud.
One of the most important benefits of cloud, long discussed but only lately recognized as a core reason to make the switch, is the ability to use precisely the amount of resources needed at any given moment. Thanks to automated scaling, cloud can tie infrastructure to usage (and thus to revenue) more closely than ever before.
Before cloud, a company had to anticipate exactly how many resources an application might need. This led to a certain "over-enthusiasm" (let's call it) in many IT departments, which contributed, to a degree, to the ballooning of IT budgets. Combine the volume of servers and equipment purchased with typical lead times of weeks or months (and the associated costs), and the process of anticipating, purchasing, installing and finally using infrastructure resources consumed an enormous amount of time and money.
The overestimation, however, didn't come from a bad place. IT departments wanted to ensure that an application would be adequately supported, and they simply didn't have access to tools that would allow an easier match between usage and resources. With the advent of virtualization, companies could generate resources more quickly, but over-provisioning persisted: even with virtualization, it took a lot of discipline and planning (not to mention money) to drive application utilization above 50%. The automation tools, smart processes, and monitoring and management systems of the pre-cloud period required a substantial investment of money, people and time, and were often out of reach for the average small to medium business.
As cloud has developed over the last five years into the arguably mature IT paradigm it is today, its adoption is both a challenge and a solution to the way IT used to function for organizations big and small. Perhaps the pushback is rooted in fears about how cloud will change an organization from an HR perspective, but in many ways the change it creates has more to do with IT budgets and the way IT works at a strategic level.
In a cloud-based organization, applications can be developed, tested and launched in fully cloud environments, helping admins and systems staff anticipate more accurately than ever the baseline resources an application will need. Once the app is launched, auto-provisioning via a public cloud quickly, and at low cost, matches any increase in demand. When the spike in traffic subsides, so do the resources required and, in parallel, the money being spent. This changes how financial resources are used in an IT organization, since most cloud resources can be acquired without purchasing and installing large systems and servers.
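The scale-up, scale-down cycle described above boils down to a simple threshold rule. The function below is a minimal, hypothetical sketch of that logic; the names and thresholds are illustrative, not any cloud provider's actual API:

```python
def scale(current_instances, cpu_utilization,
          scale_up_at=0.70, scale_down_at=0.30,
          min_instances=1, max_instances=10):
    """Return the new instance count for one auto-scaling decision.

    Hypothetical threshold-based policy: add capacity when utilization
    runs hot, shed it (and the associated spend) when demand subsides.
    """
    if cpu_utilization > scale_up_at and current_instances < max_instances:
        return current_instances + 1   # demand rising: add an instance
    if cpu_utilization < scale_down_at and current_instances > min_instances:
        return current_instances - 1   # demand falling: remove an instance
    return current_instances           # utilization in band: no change

# A traffic spike drives utilization up, so capacity (and cost) follows:
n = scale(2, 0.85)   # -> 3
# When traffic subsides, capacity and spend fall back in parallel:
n = scale(n, 0.20)   # -> 2
```

The key point for the budget argument is that the same rule runs in both directions: the instance count, and therefore the bill, tracks demand down as readily as it tracks it up, which is exactly what pre-cloud capacity planning couldn't do.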
With this change in how usage and resources are anticipated and accounted for, IT organizations are increasingly at the forefront of the business rather than relegated to the "back office" of years past. Over-provisioning still occurs, but the decision to do it is far more deliberate and closely scrutinized than ever before. With cloud, the IT organization can tie its efforts and expenditures directly to revenue in a way that makes IT more important than ever, not less.
By Jake Gardner