Time and Money in the Cloud

What Could Your Enterprise Do With Extra Time?

Benjamin Franklin is credited with saying, “Time is money.”

The 2011 movie In Time depicts an entire economy based on time. (A friend pointed out that time is not fungible, but I digress…)

So, what could your enterprise do with more time?

When you think about it, this question lies at the heart of many data- and software-related enterprise activities, such as:

  • Disaster Recovery
  • High Availability
  • DevOps
  • Software development methodologies
  • Project management methodologies
  • Lifecycle management

I believe this is an important question – perhaps the important question of the cloud era.
I believe time is replacing money in a way we’ve never before experienced.
I believe the cloud is driving this new economy.

Technology Economy Conversations

Not long ago I wrote of my experience when I left an instance of the Azure Data Factory SSIS Integration Runtime running overnight and it cost me about $30 USD. The post was titled The Cloud Costs Money and reflects some of the thinking behind this post.

Not long after that, I was honored to chat with Stuart Ainsworth (@codegumbo) at Atlanta Azure DataFest:

Chatting with Stu

In this Data Driven DataPoint, captured while attending the inaugural Atlanta Azure Data Fest, I was honored to speak with Rie Irish, Julie Smith, Tim Radney, Geoff Hiten, and Stuart Ainsworth.

The event itself was astounding on two levels:

  1. The velocity of technological innovation is increasing (“well duh, Captain Obvious”) so, if you haven’t attended such an event recently – and by “recently” I mean the past eighteen months – you should attend to see how folks are combining cloud, Internet-of-Things (IoT), analytics, machine learning, artificial intelligence, on-premises, and hybrid technologies to deliver – frankly – amazing solutions.
  2. Community. Networking with people will change your career. It will change your career in a way that will change your life. Ask anyone who is engaged in a Microsoft data community. My synopsis of Atlanta Azure DataFest is here and my theme is “it is not too late to jump in”:

The next Azure DataFest (@AzureDataFest) is in Reston, Virginia, 11-12 Oct 2018!


Learn more and register here!

On Time and Money…

Stu and I spoke about the dynamic of time and money and how both relate to DTUs – the unit of measure for Azure data-related computing. So what’s a DTU?

According to Andy Mallon (@AMtwo) – paraphrasing Microsoft’s documentation in his post titled What the heck is a DTU?:

A [Database Transaction Unit] is a blended measure of CPU, memory, and data I/O and transaction log I/O in a ratio determined by an OLTP benchmark workload designed to be typical of real-world OLTP workloads. Doubling the DTUs by increasing the performance level of a database equates to doubling the set of resources available to that database.

That’s a great definition. But what are the implications?

Stu and I discussed the following data integration scenario: Your on-premises enterprise hardware is fixed – which fixes the capacity of your data-related workload. You can increase that capacity, but only by buying more or better hardware, and that takes both time and money.

Imagine your enterprise migrates your data and data-related workloads to the cloud. (I know a company that can help! :))  After migration, your enterprise can scale hardware up to meet demand, and then scale it back down again when demand drops. The economics of pay-for-only-what-you-need-when-you-need-it is compelling, to be sure, and it drives almost all decisions to migrate to the cloud.

But there’s more.

Time to market matters to many enterprises.
Time to market matters more than ever to some enterprises.
The impact of time to market is easy to underestimate.

Thinking in DTUs

Consider the math: A DTU is a DTU. How the DTU cycles are distributed across time and processors doesn’t really matter.

Let’s say you pay $100 to incrementally load your data warehouse and the load takes 24 hours to execute at the scale you’ve selected in the cloud. Prior to thinking in DTUs, engineers and business people would think, “That’s just the way it is. If I want more or faster, I need to pay for more or faster.” But DTU math doesn’t quite work that way. Depending on your workload and DTU pricing at the time (FULL DISCLOSURE: DTU PRICING CHANGES REGULARLY!), you may be able to spend that same $100 on more compute capabilities and reduce the amount of time required to load the same data into the same data warehouse to minutes instead of hours.
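The arithmetic behind that claim can be sketched in a few lines of Python. This is a minimal illustration, not a pricing calculator: the per-DTU-hour rate below is hypothetical (DTU pricing changes regularly, as noted above), and it assumes perfectly linear scaling, which real workloads only approximate.

```python
# Hedged sketch of "DTU math": total cost depends on DTU-hours consumed,
# not on how those DTU-hours are spread across wall-clock time.
# RATE_PER_DTU_HOUR is a made-up number -- check current Azure pricing.

RATE_PER_DTU_HOUR = 0.042  # hypothetical rate, for illustration only


def load_cost(dtus: int, hours: float) -> float:
    """Cost of running a warehouse load at a given DTU level for a duration."""
    return dtus * hours * RATE_PER_DTU_HOUR


# Scenario A: a modest tier grinding through a 24-hour incremental load.
slow = load_cost(dtus=100, hours=24)

# Scenario B: scale up 48x; assuming linear scaling, the same workload
# finishes in about 30 minutes.
fast = load_cost(dtus=4800, hours=0.5)

print(f"24-hour load at 100 DTUs:    ${slow:.2f}")
print(f"30-minute load at 4800 DTUs: ${fast:.2f}")
# Both scenarios consume the same 2,400 DTU-hours, so the cost is the same --
# but one hands the loaded warehouse back to the business 23.5 hours sooner.
```

Under these (idealized) assumptions, the two runs cost exactly the same; the only difference is how much time you buy back.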

That’s DTU Math.

The shift to DTU thinking is subtle but vital.

We are used to thinking the only way to make things faster is to spend more money. That’s simply no longer accurate. The line between cost and performance may still trend linear, but you can dramatically – and very, very quickly – alter the slope of that line, especially with regard to time.

The fact that the cost/performance curve can be altered in seconds instead of months meta-changes everything.

The statements above are examples of DTU thinking and DTU math. So, please ask yourself: “What could my enterprise do with more time?”

Why is that so important?
Because Ben was right: Time is money.
