In 1833, British economist William Forster Lloyd described the scenario now known as the “Tragedy of the Commons” (the term itself was coined by ecologist Garrett Hardin in 1968): a situation in which individual users with open access to a collective resource, unimpeded by formal rules governing access and use, act according to their own self-interest and contrary to the common good.
In Lloyd’s famous hypothetical, a group of individual herders share a public pasture for grazing their cattle. As each herder seeks to maximize his or her own economic gain by grazing ever more cows, the commons is eventually depleted to the detriment of all.
In other words, when a finite resource is treated as infinite and “free” — used with little consideration of cost or consequence — its use becomes unsustainable.
There’s a similar phenomenon happening in today’s cloud-first data operations (dataops) environment. The “commons” in this case is the public cloud, a shared resource that appears to be free to the data teams using it since they have little visibility into what their cloud usage actually costs.
Crisis in the cloud
Industry analysts estimate that at least 30% of cloud spend is “wasted” each year — some $17.6 billion. For modern data pipelines in the cloud, the percentage of waste is significantly higher, estimated at closer to 50%.
It’s not hard to understand how we got here. Public cloud services like AWS and GCP have made it easy to spin resources up and down at will, as they’re needed. Having unfettered access to a “limitless” pool of computing resources has truly transformed how businesses create new products and services and bring them to market.