By Leena Rao
McKinsey & Company released a report Wednesday, “Clearing the Air on Cloud Computing,” claiming that large corporations could lose money by adopting cloud computing. The report paints cloud computing as over-hyped and maintains that cloud services like Amazon Web Services (AWS) overcharge large companies for something those companies could do better on their own. The study also says that while cloud computing is optimal for small and medium-sized businesses, large companies will spend less running traditional data centers. Virtualization is the optimal route, says McKinsey: by implementing it in-house, corporations can reduce costs once depreciation and tax write-offs are factored in. Virtualization, which McKinsey says can boost server utilization to 18% from 10%, lets you treat one machine like many by carving a server into multiple virtual machines, so that software can squeeze more work out of a single box and scale more easily. Not only is this cost-effective for companies; cloud computing itself is built on virtualization.
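To see why that utilization jump matters, here is a back-of-the-envelope sketch of the consolidation math. Only the 10% and 18% utilization rates come from McKinsey's report; the workload and per-server capacity figures below are made-up illustrative numbers.

```python
import math

def servers_needed(total_demand: float, capacity_per_server: float,
                   utilization: float) -> int:
    """Physical servers required when each runs at a given average utilization."""
    effective_capacity = capacity_per_server * utilization
    return math.ceil(total_demand / effective_capacity)

demand = 100.0      # hypothetical aggregate workload (arbitrary units)
per_server = 10.0   # hypothetical capacity of one physical server

before = servers_needed(demand, per_server, 0.10)  # pre-virtualization
after = servers_needed(demand, per_server, 0.18)   # with virtualization

print(before, after)  # 100 vs. 56 -- roughly 44% fewer physical machines
```

Under these assumed numbers, raising average utilization from 10% to 18% cuts the physical server count nearly in half, which is the source of the depreciation and tax savings McKinsey points to.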
The report makes some thought-provoking points but neglects a few key trends in cloud server services. Innovation in the cloud is rapid. The space is still very much a work in progress, and the big cloud computing providers, AWS, Google (GOOG), Sun Microsystems (JAVA) and Microsoft (MSFT), are regularly rolling out new products. As these companies throw their hats into the cloud computing ring, AWS will face increased competition, which could drive prices down as providers fight for market share.
Amazon’s (AMZN) cloud computing services, in particular, are constantly evolving. What started out as pay-by-the-drink storage (S3) and computational processing (EC2) now includes a simple database (SimpleDB), a content delivery network (CloudFront), and computer-to-computer messaging (SQS). Most recently, Amazon added a web-scale data processing engine, Amazon Elastic MapReduce, a framework for processing data stored in file systems and databases. It allows developers to leverage Amazon’s cloud computing power by writing applications that process huge reservoirs of data (conveniently stored in Amazon S3) in parallel.
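For readers unfamiliar with the paradigm: Elastic MapReduce is built on Apache Hadoop, and the idea is that a "map" step runs over many chunks of data in parallel while a "reduce" step merges the results. The toy word-count below is a local, single-process sketch of that pattern, not Amazon's actual API; the documents and names are invented for illustration.

```python
# Toy illustration of the map/reduce pattern behind Amazon Elastic
# MapReduce. Local and single-process -- not Amazon's API.
from collections import defaultdict
from itertools import chain

def map_phase(document: str):
    """Map step: emit a (word, 1) pair for every word in one document."""
    return [(word.lower(), 1) for word in document.split()]

def reduce_phase(pairs):
    """Reduce step: sum the counts emitted for each distinct word."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# In Elastic MapReduce the documents would sit in S3 and the map tasks
# would run in parallel across EC2 instances; here we just chain them.
documents = ["the cloud scales", "the cloud computes"]
word_counts = reduce_phase(chain.from_iterable(map_phase(d) for d in documents))
print(word_counts)  # {'the': 2, 'cloud': 2, 'scales': 1, 'computes': 1}
```

Because each map task touches only its own chunk of input, the work spreads naturally across many machines, which is what makes the model a fit for the "huge reservoirs of data" the article describes.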
The next generation of enterprise apps is already being written with both the cloud and virtualization in mind. At that point, it doesn’t make much sense to do everything through conventional data centers when you can optimize through other services and get the best of both worlds. Indeed, many large companies already use cloud services for a segment of their data storage while also virtualizing within their own conventional data centers.
Microsoft recently announced Exchange 2010, a new suite of Microsoft Office-related products designed to be deployed and managed on servers on-premises or from the cloud. Microsoft’s Azure OS, which is expected to roll out in the fall, can host these Office-related products in the cloud. It’s not a huge stretch to speculate that in the not-too-distant future, Microsoft will integrate on-premises storage and Azure storage, allowing companies to tap into both in the same application.
The report seems to overstate cloud costs and understate the rapid changes in cloud market conditions, and the resulting innovation and price cutting, that will take place in the near future.