From an executive perspective, I think we have been waiting for "the cloud" for some time now. Ever since the advent of the PC and its insertion into the corporate environment, we have been trying to figure out how to manage users and data.
In the mid-90s, I was involved in rolling out IBM Java-based thin clients as terminal replacements hanging off their midrange platforms. This was a small step in the right direction (IP-enabled, Ethernet-attached, GUI-based), but it was still just a terminal with no real ability to match the performance characteristics or application availability of the PC, so it died on the vine.
Around the same time, we saw a hint of things to come with the Application Service Provider. Sure, people were tired of paying over and over again (capex) for updated versions of software that gave them no real advantage. Most people just want to create simple documents. Did each new version of Word make us that much more creative and powerful? The problem at that time was a lack of affordable bandwidth to deliver those apps as a service.
Then the telecom bubble expanded, and miles and miles of fiber were laid to tap into the greatness the Internet was quickly becoming. Then that bubble burst, 9/11 happened, and things shrank back from the bold new horizons we had been looking toward. Throughout the late '90s and into the early 2000s, we were also hearing that we should focus on our core business, that IT was "context" and just a burden. We understood where that was coming from, but at the time there was really no alternative.
And, finally, during that time, the last piece of the puzzle was slowly emerging: virtualization technology.
So now, here we are. We have the mindset, the bandwidth, and the virtualization technology that lets us decouple our data and applications from the underlying hardware, which removes geographical limitations. We have all the pieces to build out Cloud 1.0. There are still a lot of questions around access, security, and offerings, but we are at a point where widget manufacturers can start to focus more on manufacturing widgets and begin to outsource those IT functions that have ceased to provide a competitive advantage but are still a necessity to function.
I think Nicholas Carr, in his book The Big Switch, does a great job of explaining why most IT functions will move to a utility model. He likens it to electricity. In the early days of electric power, companies had to build their own power-generation capabilities on premises and then staff those systems to keep them operational. One day, someone came along and offered them an outlet into which they could plug their machinery and pay only for what they used, as they used it. Now the textile mill could get out of the electricity business and leave the generation of that power to someone who could do it at a much greater scale and provide it at a lower price than the mill could generate it for itself. He sees the core functions of IT (computing and storage) as commoditized utilities that can easily be "generated" and delivered to any company that needs them, ultimately at a lower cost.
If companies can accomplish their business objectives more cost-effectively by pushing the bulk of their IT operations out to a cloud of some sort, then I am betting they will. After all, as Nicholas Carr says: "[In] the end, the savings offered by utilities become too compelling to resist, even for the largest enterprises."
Tuesday, March 16, 2010
It looks as though the ancient core functions of mainframe technology, centralized computing and storage "farms", are the wave of the future. All that is needed are sheep herders to tend the flock. This will save a bundle.