Why I'm on the fence about cloud computing

Originally written on 24 Jul 2023

I've always wanted to run my own servers. The first 'server' I bought was a €20 PC with 2GB RAM, which I attached to my home network and set up as a webserver. I was able to SSH home from work and fool about, but I never did anything more than that. This was in 2014, before I had ever encountered cloud computing. At the time, I recall trying to find ways to host Django applications; Google offered App Engine for running Django projects, but I didn't pursue it since I did not imagine the ongoing costs, however low, would be worth it. Incidentally, I also recall fooling around with Heroku. I set something up which seemed to run but I never proceeded beyond that. Thereafter, I received a million emails telling me to migrate but never really bothered.

After leaving Ireland, I set up the old PC at home again, but this time I was bugged by the public IP changing with every restart of the router, which made the server inaccessible until I noticed. I did come across some clever solutions in which a small program on the server would report to a remote machine every time the IP address changed. Eventually, I implemented my own version, but I knew the best solution would simply be a static IP for the router.
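For the curious, here is a minimal sketch of that kind of updater in Python. It assumes a public 'what is my IP' service (api.ipify.org here) and a hypothetical UPDATE_URL endpoint on the remote machine that records the address; both are stand-ins, not the exact setup I ran:

    import time
    import urllib.parse
    import urllib.request

    UPDATE_URL = "https://example.com/update-ip"  # hypothetical endpoint on the remote server

    def public_ip():
        # Ask an external service which public address the router currently has.
        with urllib.request.urlopen("https://api.ipify.org") as resp:
            return resp.read().decode().strip()

    def notify(ip):
        # Report the new address to the remote server.
        data = urllib.parse.urlencode({"ip": ip}).encode()
        urllib.request.urlopen(UPDATE_URL, data=data)

    last = None
    while True:
        current = public_ip()
        if current != last:  # only report when the router hands out a new address
            notify(current)
            last = current
        time.sleep(300)  # check every five minutes

Run under cron or as a small service, this is essentially what the dynamic DNS offerings do for you.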

In the meantime, my collection of 'servers' grew from the initial cheap PC to a couple of PCs, each with 8GB of RAM and a multicore CPU. I bought them at a work auction for at most £20 each. Thus, so far I had incurred roughly £50-60 in hardware costs, with no ongoing costs except the monthly Internet bill, which I would have to pay anyway even if my applications were hosted externally. In effect, I have no operating costs for running my servers.

When I first encountered offerings from Google and Amazon, I was appalled that a 1GB machine with hardly a CPU to write home about would be billed at around £5 a month. Such a machine could hardly do anything useful. In any event, the question that bothered me was: why had we ever needed Xeons and octa-core processors if I could string together a couple of 1GB machines? Had the wool been pulled over our eyes that servers must be beefy machines? The more I played around with the cost calculators, the less sensible it seemed to pay for cloud. By the time you are on to decent machines, even ones merely comparable to my humble servers, you're paying upwards of £60 per month. On the other hand, eBay is chock-full of old servers, as well as decent old PCs which one can set up as servers. You'd be getting the exact same compute, right? I've yet to find out why I should use a Xeon over an ordinary CPU. Apple ditching Intel and ending up with, at the very least, unbelievable power-consumption stats suggests that we may have been living under a lie for a very long time.
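The arithmetic is blunt. Here is a back-of-the-envelope comparison using the figures above (all assumptions from this post, and ignoring electricity since the Internet bill is paid either way):

    # Rough break-even: ~£60/month for a decent cloud VM versus
    # ~£60 one-off for the second-hand hardware described above.
    cloud_per_month = 60          # £/month, decent cloud machine (assumed)
    hardware_once = 60            # £, auction PCs, one-off (assumed)

    print(hardware_once / cloud_per_month)   # 1.0 - the hardware pays for itself in a month
    print(cloud_per_month * 12 * 5)          # 3600 - £3,600 over five years versus £60 once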

A Market for Fat

I believe that cloud computing only exists because we live in a world in which a handful of tech companies have far more capacity than comparable companies. Big Tech companies such as Alphabet, Amazon, Meta and Microsoft employ armies of some of the best tech talent and managers in the world, and deploy them so effectively that they generate surplus tech capacity which they then sell off to others. It helps that they are extremely profitable, so they can spin the affordability of cloud - the illusion that you only pay for what you use. In reality, using cloud is a permanent operating cost that a business will always have to incur. If the cost of acquiring and managing such infrastructure yourself is substantial, then it makes some sense to go cloud. But just as the tech landscape is extremely uneven (only a handful of companies control more than half of the conversation), so only a small number of companies really need cloud. For the vast majority, it is substantially cheaper to buy cheap PCs, string them together in a makeshift air-conditioned room, and call that their data center. That's how Google started, anyway.

I'm not alone in thinking this:

  • https://deavid.wordpress.com/2018/09/15/the-cloud-is-overrated/


Edge cases

Perhaps you could argue: I would like my data to be as close as possible to my customers! And that is where one of the major costs with cloud comes in: data transfers. Of course, ingress (getting data into the cloud) is free. In fact, so eager are they to get your data into the cloud that they could bring a truck to you just to gulp down all that juice. The truth is that once you get your data to them you're hooked; you lose all options to manage the infrastructure yourself. Furthermore, that convenience of having data next to your users - on the edge servers - costs you. This is egress: moving data out of the cloud or between zones, billed per gigabyte (see the rough sketch below). But how many webpages really need to load that fast? I remember visiting Italy in 2006 and being aghast at the Internet speeds; pages loaded instantly. Nowadays, most pages talk to a thousand APIs before they even load, so whatever gains you acquired by having data at the edge are washed away by the delays. For example, Gmail takes an eon to load, and email really isn't that time-critical (use a phone for instant communication anyway). No one chats anymore; if anything, chat is such an inefficient way to communicate.
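As promised, a rough sketch of what egress can cost, assuming a rate of about $0.09/GB (a commonly published on-demand figure; real tiers vary by provider, region and volume):

    # Egress at an assumed $0.09/GB: serving ~1 TB out of the cloud per month.
    egress_rate_per_gb = 0.09     # $/GB, assumed
    monthly_egress_gb = 1000      # ~1 TB served out per month, assumed workload

    print(monthly_egress_gb * egress_rate_per_gb)   # 90.0 - roughly $90/month just to move data out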