How much does cloud hosting cost providers?

The cloud provider market is getting crowded. Companies are jostling to offer a unique angle, and the price offered to consumers has become a serious indicator of success. But to what extent does that price reflect the true cost to the seller? In this article, we'll discuss the hardware, energy, staffing and bandwidth costs that go into making up a cloud hosting provider's expenditure.
1. Server hardware costs. Servers are not cheap, but they can cost less to buy than regular consumer computers. The main reason is that they are usually bought in bulk; in addition, only certain cloud providers need much graphics processing power (a pricey component), so high-end consumer-grade parts can be skipped in many cases. Of course, drives that support heavy read/write throughput and offer bags and bags of empty space are not cheap either. According to Amazon's figures, a fully set-up 'High-Memory' server can total around $10,000, while a top-spec 'High-Compute' server comes in lower, at around $2,500. Customer surveys suggest that servers last between three and five years on average, so this is not a 'set-and-forget' investment.
2. Network hardware costs. Network hardware is usually estimated at around a fifth of the cost of the server hardware it connects, so depending on the cloud provider's needs this can run anywhere between $1,000 and $2,000. Network hardware has an expected shelf life of around three years, so it will need replacing reasonably frequently.
3. Maintenance costs. Keeping the hardware in tip-top operating condition, essential to avoid downtime that could be catastrophic for any cloud provider, is estimated to run to around 10% of the initial purchase price per year, according to AWS. This is an ongoing cost, so it is reasonable to budget around $1,000 per server per year. To put that in perspective, Intel has around 75,000 servers, Facebook has about 60,000 (as of April 2012), and energy expert Jonathan Koomey reckons Google has nearly a million. IBM has over 8 million square feet of data center space, so it may be pushing Google if that space is used efficiently.
4. OS costs. A Windows license suitable for cloud provision costs $2,999 per server. Linux, on the other hand, is free. This might help to explain why Linux is around twice as popular as Windows on AWS.
5. Electricity costs. It's been suggested that electricity availability is what's holding back cloud development in developing countries right now, and for good reason. Scientists at Switzerland's École Polytechnique Fédérale de Lausanne (EPFL) believe that cloud data centers draw around 2% of the world's entire electricity supply. Doing some back-of-the-envelope analysis of the Amazon guide, each server looks to draw an average of 300W. At $0.09/kWh (the US average in December 2008), a server running for 24 hours costs about 64.8 cents. Going back to the numbers from point 3, that puts Google's bill at almost $648,000 in electricity alone, every day (well over $200 million a year). Costs on that scale are why the largest providers invest so heavily in lower-power servers and efficient facilities. Either way, the costs are not insignificant.
6. Construction costs. Data centers cost a lot to build. Aside from the basic infrastructure, huge investments have to be made in backup and redundancy. AWS reckons a data center costs around $23,000 per kW of planned power capacity. That assumes a 'Tier III' facility, which should offer 99.982% availability.
7. Administration costs. Highly skilled personnel are required to run data centers, with the average wage for an IT manager running to around $90,000 (though this varies by region). Amazon reckons around 50 servers is all each manager can handle; at that ratio, a 10,000-server fleet would need roughly 200 administrators, or around $18 million a year in salaries alone.
8. Bandwidth costs. Where you site your data center makes a huge difference to how much your bandwidth ends up costing you. For example, a data center in Central London can get away with paying a mere $3.13 per Mbps (annually), whereas São Paulo is closer to $30. Again, sketching some back-of-the-envelope mathematics: if you want to be able to transfer 1 GB (8,192 megabits) in and out of your server each second (both uploading and downloading count), you'll be paying an annual fee of roughly $25,000 in London. In São Paulo, that's more like $250,000. That's why bandwidth pricing plays such a big part in deciding where data centers get built. (The sketch after this list rolls these per-server figures together.)
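
To make the arithmetic above easier to play with, here is a minimal sketch that rolls the ballpark figures from this list into a rough annual cost per server. The amortization periods and the 15-year building lifespan are assumptions of mine layered on top of the estimates quoted above, not real provider data, and real deployments will vary widely.

```python
# Back-of-the-envelope model built from the ballpark figures quoted in this article.
# Every number below is an illustrative assumption, not real provider data.

HOURS_PER_YEAR = 24 * 365
WATTS_PER_SERVER = 300            # rough average draw suggested by the Amazon guide
PRICE_PER_KWH = 0.09              # US average, December 2008

# Annual per-server costs: purchase prices amortized over an assumed lifespan.
per_server = {
    "server hardware":  10_000 / 4,      # ~$10k 'High-Memory' box, 3-5 year life
    "network hardware":  2_000 / 3,      # ~a fifth of the server cost, ~3 year life
    "maintenance":      10_000 * 0.10,   # ~10% of purchase price per year
    "os license":            0,          # Linux; use 2_999 / 4 for a Windows license
    "electricity":      WATTS_PER_SERVER / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH,
    "construction":     23_000 * WATTS_PER_SERVER / 1000 / 15,  # $23k/kW, 15-year building
    "administration":   90_000 / 50,     # one $90k admin per ~50 servers
}

annual_total = sum(per_server.values())
for item, dollars in per_server.items():
    print(f"{item:>18}: ${dollars:>8,.0f} / year")
print(f"{'total':>18}: ${annual_total:>8,.0f} / year "
      f"(~${annual_total / HOURS_PER_YEAR:.2f} per server-hour)")

# Bandwidth is better treated as a facility-level cost: 1 GB/s of transfer
# capacity is 8,192 Mbps, priced per Mbps per year depending on location.
for city, per_mbps in {"Central London": 3.13, "São Paulo": 30.00}.items():
    print(f"1 GB/s in {city}: ${8_192 * per_mbps:,.0f} / year")
```

On these assumptions the per-server total lands somewhere around $6,000 to $7,000 a year, or roughly 75 cents per server-hour, with bandwidth handled separately because it is really a facility-level cost.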

That just about wraps up everything a hosting provider has to pay to get itself up and running. As you can see, the costs are not inconsiderable, and yet providers rarely charge more than 50 cents an hour for even intensive use. The margin comes from the sheer number of customers adopting the cloud, and it should continue to hold firm, given recent positive predictions about the future of cloud computing and cloud management.
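
As a purely illustrative sanity check on that 50-cents-an-hour figure, the short sketch below compares continuous instance revenue with the rough per-server total from the earlier sketch. The idea that a single physical server can host more than one billable instance is an assumption on my part, not a figure from the article.

```python
# Purely illustrative: compare a 50-cents-per-hour instance price with the
# rough per-server annual cost from the earlier sketch (assumed figures only).
PRICE_PER_INSTANCE_HOUR = 0.50     # upper end of the hourly pricing mentioned above
ANNUAL_COST_PER_SERVER = 6_700     # ballpark total from the earlier per-server sketch
HOURS_PER_YEAR = 24 * 365

annual_revenue_per_instance = PRICE_PER_INSTANCE_HOUR * HOURS_PER_YEAR  # ~$4,380
instances_needed = ANNUAL_COST_PER_SERVER / annual_revenue_per_instance

print(f"A fully utilised instance earns ~${annual_revenue_per_instance:,.0f} a year; "
      f"roughly {instances_needed:.1f} such instances per server cover the ballpark costs.")
```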