
Sun claims lowest cost jAppServer2004 benchmark with Glassfish

Possibly responding to a BEA executive's blog comments that Sun's previous SPECjAppServer2004 submissions were "a perfect example of slower software consuming massive amounts of hardware and software licenses", Sun has just posted benchmark results on a completely open source, license-free stack and is declaring it the lowest $/performance submission ever. Sun's latest submission uses Sun Java System Application Server 9, making it not only the first benchmark submission on a certified Java EE 5 server but also the first on (what Sun declares to be) an open source stack (SJS appserver based on Glassfish, Solaris 10, MySQL DB). The submission achieved 712.87 JOPS on 3 Sun Fire X4100 application servers and 1 Sun Fire X4100 database. A Sun blogger is directly comparing this to a previous submission by BEA (the one highlighted in the BEA executive's blog) with the same number of CPU cores, claiming it to be roughly 8x cheaper in terms of $/performance.

SPECjAppServer2004 is a benchmark designed to test the speed of Java application servers using EJBs, Servlets, JSPs, messaging, transactions, and other aspects. The spec includes a test harness and a sample 'dealership management application' against which the tests are run, with the end performance metric being jAppServer operations per second (JOPS), computed as Dealer Transactions/sec + Workorders/sec from the sample app. Vendors submit to the benchmark to compare speed in JOPS. A second valuable comparison metric, $/performance, which was included in previous versions of the jAppServer benchmark series, is absent from v2004: none of the other SPEC-managed benchmarks include it, and the SPEC group found the metric too political and too arbitrary to calculate, preferring to let readers of the benchmark analyze the platforms used and come to their own conclusions about $/performance.
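As a quick illustration, the headline metric is simply the sum of the two throughput figures from the sample application (the numbers below are hypothetical round figures, not taken from any actual submission):

JOPS = Dealer Transactions/sec + Workorders/sec
e.g. 500 Dealer Transactions/sec + 200 Workorders/sec = 700 JOPS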

Despite SPEC's efforts to take $/performance out of the equation, that metric is the real significance of this submission, as Sun is using open source to demonstrate the cost savings of the open source + Sun hardware combination. A Sun employee blog even directly compares Sun's submission to the July 2005 BEA submission highlighted in the BEA executive's blog, which also runs on 12 CPU cores. In the comparison, Sun claims $72/JOPS compared to the BEA submission's $621/JOPS. These $/performance figures are not officially published by SPEC but were derived manually by adding up the costs of hardware and software in each submission's 'bill of materials' section.
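The derivation itself is just division; a sketch with made-up round numbers (again, not figures from either submission) looks like this:

$/JOPS = (hardware cost + software cost from the bill of materials) / JOPS
e.g. $140,000 total cost / 700 JOPS = $200/JOPS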

As experience has taught us, these benchmark comparisons are highly political and of little use for genuinely comparing $/performance. Even comparing the BEA and Sun submissions cannot be truly accurate, as they are based on different hardware combinations (6 dual-CPU single-core servers for BEA vs. 3 dual-CPU dual-core servers for Sun). The benchmarks are, however, useful exercises for vendors to tune their application servers, and they show what's possible. Perhaps the submission most people would like to see is one from JBoss, which has not happened in the five-year history of the benchmark.
