
Performance Measured by the Penny


Cloud computing is a game changer for developers. Not because it requires a new architectural model; that is driven as much by fads and fashion as by actual hardware requirements. Nor is it the seemingly endless capacity with near-perfect scalability that the cloud promises. The game changer is that poorly performing code now has a real price in hard currency.

Since personal computers replaced time-sharing systems, performance has been a nice-to-have. Generally speaking, either an application's performance is good enough for the hardware it runs on or it isn't. You gain nothing by dropping your peak CPU utilization from 90% to 81%, except perhaps a small discount on your electric bill.

With a cloud platform, dropping your CPU utilization by 10% directly translates into a 10% reduction in your monthly bill from your cloud provider. For example, Windows Azure costs 12 cents per machine hour of computational time. Armed with this knowledge and a good profiler, you could literally say that a certain block of code is costing the company X dollars per month.
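As a rough sketch of the arithmetic, using the $0.12 per machine-hour rate mentioned above (the profiler figures and instance count below are hypothetical, purely for illustration):

```python
# Rate from the article: Windows Azure at $0.12 per machine-hour.
RATE_PER_HOUR = 0.12

def monthly_cost(cpu_fraction, instances, hours_per_month=730):
    """Dollars per month attributable to a code path that consumes a
    given fraction of CPU time across a fleet of billed instances."""
    return cpu_fraction * instances * hours_per_month * RATE_PER_HOUR

# Hypothetical example: a block profiled at 10% of CPU time on 20 instances.
cost = monthly_cost(0.10, 20)
print(f"${cost:.2f} per month")  # $175.20 per month
```

Note this simple model assumes billing tracks CPU consumption directly, a point the comments below take issue with.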

Once the cost of poorly performing code is known, companies can make economically sound decisions about whether to spend time and money fixing it. Simply by comparing the monthly cost of the code with the salary of a developer tasked with improving it, engineering managers can say with certainty how much time can be spent before the law of diminishing returns kicks in.
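The break-even comparison above can be sketched in a few lines; the savings, hourly rate, and payback horizon here are hypothetical stand-ins, not figures from the article:

```python
def break_even_hours(monthly_savings, dev_hourly_rate, horizon_months=12):
    """Maximum developer-hours an optimization can absorb and still pay
    for itself within the given horizon."""
    return monthly_savings * horizon_months / dev_hourly_rate

# Hypothetical: $175/month of hosting cost saved, a $75/hour developer,
# and a 12-month payback horizon.
print(break_even_hours(175, 75))  # 28.0
```

Past that many hours of effort, the fix costs more than the code it is meant to save.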

The performance=money equation will also bring dynamically typed languages into sharp focus. If we have truly reached the point where dynamically typed languages are "fast enough", that will be reflected in the price of renting cloud servers. If, on the other hand, production costs start to skyrocket, there will be irrefutable evidence that a statically typed language is in order. Of course, this will have to be decided on a case-by-case, project-by-project basis.


Community comments

  • Utilization

    by Dave Rooney,


    While I agree with your premise that poorly performing code costs more (and like that it has been called out that way!), I do have one nit to pick:

    CPU Time != Machine Time

    I remember back in my green screen days on IBM big iron when you were billed for the time the actual processor was used. One of my 'accomplishments' in those days was to rewrite an application such that the CPU cost was an order of magnitude lower.

    Fast forward 20-some years, and I was using Amazon EC2 to host some applications. Even if the application was completely idle and didn't issue a single call to the CPU, I was getting billed for the machine time. To be honest, I like that model better since it's much, much easier a concept to grasp - if the instance is up, the meter is on.

    Dave Rooney
    The Agile Consortium

  • Re: Utilization

    by Jonathan Allen,


Upon rereading the documentation on Windows Azure, I have to agree that it currently follows the EC2 model. So maybe things won't be so cut and dried after all.

  • Re: Utilization

    by Dave Rooney,


    Well, it depends on how an application is running.

If you're using an instance as a server, essentially a virtual host, then you're just changing the billing model for hosting.

    If, however, you fire up an instance, run an application, grab the results and shut down the instance, application performance is much more important and your assertion is quite correct.

    Dave Rooney
    The Agile Consortium

  • App Engine?

    by Thom Nichols,


    I think Google App Engine might charge by CPU time. They do have a CPU quota on operations (presumably to curtail runaway applications) but I think you can pay to increase that quota.

  • Mendacious Weasel Words

    by David Clarke,


    Simply by comparing the monthly cost of the code with the salary of a developer tasked with improving it, engineering managers can say with certainty how much time can be spent before the laws of diminishing returns kick in.

This is a terrific reason to shut down non-performing assets, restructure non-performing teams, and downsize non-performing individuals. I can foresee this being used by middle management to justify their continued employ at the expense of development positions, but it has little to do with the quality of the developer, the quality of the code being improved, or the cost to the organisation of an application that doesn't meet the business needs. CPU Time != Business Benefit.
