The Importance of Knowing the Difference Between Virtualization and Cloud Computing
Microsoft raised some eyebrows when Corporate VP Brad Anderson boldly proclaimed that “Virtualization is not cloud computing.” While the statement garnered a lot of buzz due to its timing – Anderson said it during VMware’s annual VMworld event – that doesn’t make it any less true.
It is critical to remember that the principal concepts behind both cloud computing and virtualization have been in practice for decades. It is only enormous leaps in PC hardware architecture and the reliability of the Internet that have brought these terms into the general public’s technical vocabulary. Virtualization and cloud computing are major catalysts for flexibility and innovation in the deployment of business applications. However, they have been erroneously lumped together by the public, and I’d like to take a moment to set the record straight.
Virtualization is a systems administration and data management tool with many technical uses, most of which have nothing to do with the cloud. The technology allows enterprises to use a single piece of physical hardware to perform the work of many. Multiple operating system instances running on one hardware device are far more economical than a dedicated piece of hardware for every server task. Still, there is no direct link from pure virtualization to the cloud.
Cloud computing, on the other hand, is access through the Internet to business applications running in a non-local environment. Cloud computing can certainly take advantage of virtualization, but it can be (and has been) accomplished without it.
One way to describe the difference between virtualization and cloud computing in non-technical terms is through the following thought experiment. Imagine you could be controlled by command inputs from another person – as though you had a keyboard interface in your brain. Now imagine that the person controlling you is viewing you through a closed-circuit TV link (a network). That is cloud computing.
To describe virtualization, imagine that you are standing between two mirrors and see dozens of copies of yourself, each capable of performing tasks independently of the others. All the copies are “on” one piece of hardware (the real you). That is virtualization.
To bring the two together, take all those virtualized copies of you and beam them out over many different links to individuals who would control and interact with them remotely. Now you have cloud computing leveraging virtualization!
The delineation between virtualization and cloud computing is important to make at this critical juncture in the technologies’ adoption.
Unlike most IT projects, cloud computing and virtualization impact almost everyone in the enterprise, which means that non-technical people are involved in the implementation and deployment processes. By trying to “dumb down” the technical language, vendors are leading their customers astray. This is not a simple case of semantics: there are distinct circumstances in which an organization would prefer virtualization over cloud computing, and vice versa – especially since the two concepts are not one and the same.
Cloud computing and virtualization may be the linchpin of modernizing today’s IT business environment. Together, they are ushering in a new era in which companies are free to run their workspaces without having to perform non-strategic IT functions such as patches, updates, and backups. Not knowing the difference between virtualization and cloud computing can be costly as the hype around these two technologies reaches a fever pitch. Do yourself a favor: learn the strengths and weaknesses of virtualization and cloud computing and map them to your organization’s needs before moving forward with an implementation.
By Charles Buck, COO and Co-Founder of independenceIT