
SPEC for SOA

by Mark Little on Sep 22, 2009

In the mid-1980s Jim Gray (et al.) wrote a groundbreaking paper on performance measurements for transaction systems. That paper was instrumental in the creation of the TPC (Transaction Processing Performance Council). SPEC (the Standard Performance Evaluation Corporation) was founded at roughly the same time to look at a wider set of standardized performance tests. Since then TPC and SPEC benchmarks have become commonplace, and they are often used in vendor literature to persuade customers of this or that level of compliance or performance. Despite the high profile of SOA over recent years, the lack of any official SOA-based performance benchmark has led vendors to compete with their own ad hoc approaches, typically measuring ESB or Web Services stacks. However, SPEC recently announced that it is now looking to produce an official benchmark for SOA-based products.

As the announcement states, at present IBM, Oracle and VMware are interested in helping, but the organization would like to hear from anyone who believes they have something positive to contribute, and you don't have to be a SPEC member to join the working group. Andrew Spyker, chair of the working group, is quoted as saying:

The benefits for companies deploying SOA include business flexibility and cost optimization. An industry-standard benchmark will help SOA users understand best practices for improving performance and help vendors deliver performance optimizations based on typical customer scenarios.

As with other SPEC benchmarks, this one is likely to be an iterative affair, so the initial list of things to be tested should be considered in that light. For instance, testing the performance of:

  • Services on top of application servers using web services.
  • ESB technologies that connect and mediate the services.
  • Choreography of services into larger composite applications through BPEL.

As has been reported many times here, SOA is not Web Services, so we hope the benchmark will not be limited to testing SOAP-based technologies. In fact, the organization goes on to say:

The SOA Performance working group realizes that the technologies around which the initial benchmark is being built aren't the only important ones; different aspects of SOA that would introduce additional technologies are likely to be explored in the future.

As with any benchmark (and as Gray tried hard to point out 20+ years ago), relevance depends on how accurately the benchmark measures real-world deployment usage. So the SPEC SOA working group will have to ensure that whatever it comes up with measures what matters to customers. If successful, though, this could lead to a more standardized and accurate way to measure the performance of various SOA-based implementations.


Mark, totally agreed by Andrew Spyker

I think your points are spot on and I've blogged about the same points here:

webspherecommunity.blogspot.com/2009/09/spec-wo...

I'd encourage folks who are interested in the effort to query SPEC about how they can help. As the chair of the working group, I'd love to see the effort become quite successful and, as you said, for that to happen we need to ensure we measure what is important to users of SOA technology.

InfoQ.com and all content copyright © 2006-2014 C4Media Inc.