I’m starting this new blog on the Economics of IT because it’s an area that needs additional clarity, and there is a plethora of material on which to comment, or, in today’s vernacular, to blog about. It’s also an arena in which I feel quite comfortable, having earned an undergraduate degree in economics and an MBA from a prestigious graduate school (UVA – Darden), and having served in executive roles as a CIO, a CFO, and a Deputy Assistant Secretary of the US Treasury.
So where to begin? Let’s start with one of the latest fads in IT: Cloud Computing, or, for those of us old enough to remember, “Time Sharing.” One of the early entrants in this arena was Ross Perot and Electronic Data Systems (EDS). As the story goes, Ross was an IBM salesman who would make his quota in the first quarter of each year. He realized that his customers had significantly greater computing capacity than they used and started selling time to smaller companies. As I remember it, the first company to let Perot sell its excess capacity was Frito-Lay in Dallas.
Eventually, Ross was making more money selling time-sharing services than selling IBM iron, and thus EDS was born. So how does Cloud computing differ from selling time on an IBM mainframe? First, the network doesn’t need to be hard-wired to the customer, who only needs ubiquitous Internet access. Second, the hardware is cheaper and more configurable, and capacity can be easily shared using virtualization technologies. Data center facilities, power, redundancy, cooling, and the like remain largely the same.
So what makes Cloud computing of interest today is the notion that it can reduce costs. Meghan Stabler (@MeghanAtBMC) caught my attention with the following post on Twitter:
#Obama’s 2011 Budget Suggests Agencies Use #Cloud to Cut Costs http://bit.ly/9u8vND
I pointed out to Meghan in my reply that Cloud computing can’t, on its own, reduce the cost of computing in the long run; it simply shifts the initial investment from the user to the service provider, who will recover that investment and generate a profit over the course of the contractual relationship. If this were not true, the cloud wouldn’t exist. So if you compare one year of Cloud costs to the first-year cost of providing a similar service internally, it appears there is a substantial savings. It would be like comparing the first-year cost of leasing a car to the first-year cost of purchasing one: the first-year cost of owning always far exceeds the first-year cost of leasing.
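The lease-versus-buy illusion above is easy to see with numbers. The figures below are invented for illustration only; the point is that the first-year comparison is misleading while the full-term totals tell the real story:

```python
# Hypothetical car figures; a minimal sketch of the first-year illusion.
purchase_price = 30_000   # buy: the full cost hits in year one
annual_lease = 6_000      # lease: the cost is spread over the term
years = 5

print(purchase_price)         # first-year cost of owning:  30000
print(annual_lease)           # first-year cost of leasing:  6000
print(annual_lease * years)   # five-year lease total:      30000
```

Over the full term the costs converge; only the timing of the cash outlay differs, which is exactly the shift the Cloud makes.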
The economics of the cloud therefore rests on the time-honored reality that there are tremendous economies of scale in computing, whether in chip production, computers, storage, networks, or elsewhere.
For gigantic organizations such as the Federal government, these economies are best achieved by consolidating its thousands of data centers, not necessarily by purchasing computing services commercially through the Cloud. For smaller organizations, the Cloud can be used to shift the investment in computing infrastructure to the Cloud vendor and, over the short term, lower their computing costs.
My Recommendation: If you want to determine whether your cost of computing is lower using the Cloud, compare the net present value of the cost of building your internal computing capability to the net present value of the cost of using the Cloud over the expected life of the alternative direct investment. In theory, they should be roughly the same. Only if the Cloud vendor can achieve greater productivity in the use of computing resources will there be a long-term cost advantage.
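The recommendation above can be sketched as a simple calculation. Every figure below is hypothetical, and the `npv` helper and the 8% discount rate are my own assumptions for illustration, not numbers from any vendor or budget:

```python
def npv(cash_flows, discount_rate):
    """Net present value of a stream of annual costs (year 0 first)."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

rate = 0.08  # assumed cost of capital

# Build in-house: a large up-front investment, then lower annual operating costs.
build = [500_000, 80_000, 80_000, 80_000, 80_000]

# Cloud: no initial investment, but higher recurring fees over the same life.
cloud = [0, 180_000, 180_000, 180_000, 180_000]

print(f"NPV of building: ${npv(build, rate):,.0f}")
print(f"NPV of cloud:    ${npv(cloud, rate):,.0f}")
```

Whichever side comes out ahead depends entirely on the inputs; the discipline is in comparing the two NPVs over the same expected life, not the first-year figures.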
William A. Crowell