TIBCO's ActiveMatrix BusinessWorks Emerges Winner In SOA TCO Study by PushToTest
PushToTest, makers of the TestMaker open source test tool, announced the results of a research study that evaluates and compares the total cost of ownership (TCO) of SOA development and deployment solutions from IBM, Oracle, and TIBCO in a paper titled "The Composition Approach for Large-Scale SOA", which is part of the revised SOA Knowledge Kit. TIBCO was declared the leading solution on multiple measures, including total cost of ownership. InfoQ spoke with Frank Cohen, CEO and founder of PushToTest, about the underlying mechanics of the study.
InfoQ: What is the motivation to perform a TCO study on SOA stacks and offer support on a kit of collaterals for the community?
As a community leader in the IT industry, I keep looking for an expert source that provides software architects and developers with a way to understand SOA development platforms in terms of interoperability, developer productivity, and performance. PushToTest works with medium and large organizations to define a standard SOA application blueprint that surfaces interoperability, performance, and delivery problems. Adoption of these practices will help PushToTest profitably deliver the knowledge, test tools like TestMaker, and support services to organizations. The SOA Knowledge Kit is the de facto standard at Best Buy, PepsiCo, Deloitte, and 30 other companies.
By urging the standards bodies (OMG, OASIS, W3C, IETF) to adopt these practices, I hope to help the IT industry deliver a more reliable world. PushToTest's work on the Kit appears in the OASIS SOA Blueprints project: http://tinyurl.com/85tsbal. We maintain that work, among others.
We initially published the SOA Knowledge Kit in 2008. I chose the Oracle, IBM, and TIBCO stacks since they provide SOA development and deployment platforms. In 2009 we added JBoss to the Kit. We hope to add other SOA platforms, including Mule, in the near future. The current work updates the Kit to the latest versions: IBM WebSphere Integration Developer V7.0, TIBCO ActiveMatrix SOA Product Suite 3.13, and Oracle SOA Suite 11gR1.
InfoQ: What was the team composition on this project? Can you tell us the average engineer's prior experience with each of the stacks and with web services development and orchestration in general?
PushToTest used two engineering teams to implement the SOA Knowledge Kit. Each team had one architect with 5-8 years of experience building SOA, Rich Internet Applications (RIA, using Ajax, Flex, and Flash), and Web applications, two engineers with 5 years of Java coding experience, and one project manager. The engineers have experience writing EJBs, SOAP and REST-based service interfaces, mediation, orchestration, and workflow. Team A implemented the Oracle and TIBCO projects; Team B implemented the IBM project. It took approximately 3-4 weeks to implement the use case on each stack.
InfoQ: Were the same service patterns used to design and implement services on all the stacks? Can you describe some challenges that were faced with implementing certain service patterns on the individual stacks?
PushToTest defined a use case for a typical manufacturing organization. We implemented the use case on the TIBCO, Oracle, and IBM stacks. We then made changes to the implementation: we added HTTPS/SSL security, changed the message schema, and switched to an asynchronous message delivery transport. We ran a functional test and a performance and scalability test of the finished work. We document the use case and the developers' experiences and publish everything under the free, open source GPL v2 license.
The manufacturing scenario use case implements a 3-step business flow.
1) The Allocate Purchase Order (PO) process opens a new PO. The implementation provides a method through a SOAP interface accessed over HTTP.
2) Reserve Parts uses a service for warehouse just-in-time inventory control to reserve portions of the inventory to meet the delivery needs defined in the purchase order. This is a Spring and Data Access Object (DAO) service that receives a REST-encoded request over a JMS service interface. The message conforms to the inventory control service's XML message schema; we used the OAGIS Business Object Document (BOD) schema. A Web page with Ajax elements provides a human interface to request the service.
3) Price Purchase Order uses a service to assign a price point to the purchase order based on the current price catalog. The system prices products by simulating business functions of an SAP installation, accessed through a simulation of SAP NetWeaver's SOAP Web Service interface and security credential system. We built the mock SOAP services using soapUI.
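The three steps above can be sketched as plain Java. This is a hypothetical illustration only: in the Kit each step is a real service (SOAP over HTTP, REST over JMS, a soapUI-mocked SAP SOAP endpoint), so all names, the PO id scheme, and the pricing rule here are made up to show the business flow, not the Kit's implementation.

```java
import java.util.HashMap;
import java.util.Map;

public class ManufacturingFlow {

    // Step 1: Allocate Purchase Order. In the Kit this is a SOAP/HTTP
    // service; the "PO-" id scheme here is a made-up placeholder.
    static String allocatePurchaseOrder(String customerId) {
        return "PO-" + customerId;
    }

    // Step 2: Reserve Parts. In the Kit this is a Spring/DAO service
    // behind a REST-over-JMS interface; here it is a simple map update.
    static boolean reserveParts(Map<String, Integer> inventory, String part, int qty) {
        int onHand = inventory.getOrDefault(part, 0);
        if (onHand < qty) return false;   // cannot reserve more than on hand
        inventory.put(part, onHand - qty);
        return true;
    }

    // Step 3: Price Purchase Order. In the Kit this calls a mocked SAP
    // NetWeaver SOAP service; here pricing is quantity times unit price.
    static double pricePurchaseOrder(int qty, double unitPrice) {
        return qty * unitPrice;
    }

    public static void main(String[] args) {
        Map<String, Integer> inventory = new HashMap<>();
        inventory.put("widget", 100);
        String po = allocatePurchaseOrder("ACME");
        boolean reserved = reserveParts(inventory, "widget", 40);
        double price = pricePurchaseOrder(40, 2.50);
        System.out.println(po + " reserved=" + reserved + " price=" + price);
    }
}
```

The point of the sketch is that the flow itself is small; the study's effort went into wiring these three calls through each vendor's SOAP, JMS, and mediation machinery.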
The engineers kept a Developer Journal of their experience on each stack. Each Journal is 60-80 pages long and contains our software engineers' step-by-step instructions, comments, and opinions on accomplishing the use case implementation. The Developer Journals describe significant challenges on each stack along with workarounds and solutions.
At a high level, we found the following challenges implementing certain service patterns on the individual stacks:
Oracle's stack provides quick solution creation for engineers, but everything in it requires a code dive and manual adjustments to deployment descriptors, classpaths, and general configuration. Oracle's tools are for developers, not for architects and business analysts. There is little or no model-driven approach to designing, building, and deploying SOA applications in Oracle.
For engineers, identifying the tools for the task using only the Oracle web site was easy. Oracle's Quick Start Guide is straightforward and useful.
Nothing worked out of the box. Everything required community support (especially for the former BEA components), and most solutions came from workarounds found by third parties. For example, we found many tutorials on Web Service creation on the Oracle site. The one we chose failed: http://st-curriculum.oracle.com/obe/jdev/obe11jdev/ps1/webservices/ws.html#t5. We found that soapUI web service mocks cannot be called from Oracle; JDeveloper throws a WebServiceException: Error creating model from WSDL. We document this in the Developer Journal.
Other times we found multiple tutorials, some of which worked and some of which did not. For example, we found a tutorial on building asynchronous services at http://download.oracle.com/docs/cd/E17904_01/web.1111/e15184/asynch.htm#CBHECBFG. It relies on annotations that are not available on the app server: the @AsyncWebService and @PortableWebService tags.
The IBM stack is several big platforms integrated together: WebSphere Application Server (WAS), WebSphere Integration Developer (WID), Rational Application Developer (RAD), and WebSphere Process Server (WPS). IBM RAD is a 6 GB download alone. The platforms have version issues: WID 7.0 requires WAS 7.0, and RAD 8.0.3 requires WAS 8.0.3, so you wind up installing multiple versions of WAS to make the SOA stack work. The parts are enormous and often don't play well together. For example, WebSphere Integration Developer gave us broken WSDL document output.
We encountered many situations where we could not get 'there' from 'here.' For example, we could not create a client for a service we built in WPS. WID refactoring could not handle simple changes to namespace, name, and schema values, and WID created corrupt WSDLs. RAD does not have the capability to develop business processes. And integration and process development happen in different tools: WID and WPS.
The TIBCO SOA stack consists of modeling tools, component development tools, and service grid deployment: ActiveMatrix Service Grid 3.13, BusinessWorks 5.9.2, Enterprise Message Service EMS 6.0.1, Rendezvous RV 8.1, and BusinessWorks Studio (BS).
The tools are model-oriented: they keep better control of the code and abstract away the underlying technologies. There is no need to look at code, just models built from SOA concepts. This is great news for developers because they now have common tools for working with business managers and software architects to model the service interfaces and workflow.
Code-oriented developers without SOA knowledge will face a steep learning curve. The documentation is abundant and quite good, but finding the right guide is hard; the surface-level documentation on the TIBCO Web site often describes the functionality rather than showing how to do things. The modeling language is proprietary, which reduces intuitiveness. BW Studio is an Eclipse-based IDE that will be familiar to many developers. Some concepts will be new, such as asynchronous services using JMS, where code-oriented developers would expect to use Message-Driven Beans (MDBs). And a mediation flow was required to implement asynchronous services.
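The asynchronous-service pattern mentioned here is, at bottom, request/reply over queues with a correlation id. The sketch below is a hypothetical stand-in: java.util.concurrent.BlockingQueue plays the role of the JMS request and reply queues, and serveOne() plays the role of a queue listener's onMessage() (or an MDB in classic Java EE). TIBCO's mediation flow and EMS, and all real JMS APIs, are deliberately elided.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class AsyncServiceSketch {

    static class Message {
        final String correlationId;   // ties a reply back to its request
        final String body;
        Message(String correlationId, String body) {
            this.correlationId = correlationId;
            this.body = body;
        }
    }

    final BlockingQueue<Message> requestQueue = new ArrayBlockingQueue<>(10);
    final BlockingQueue<Message> replyQueue = new ArrayBlockingQueue<>(10);

    // "Service" side: consume one request, produce a correlated reply.
    // In JMS terms this is a queue listener's onMessage() (or an MDB).
    void serveOne() {
        try {
            Message req = requestQueue.take();
            replyQueue.put(new Message(req.correlationId, "priced:" + req.body));
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    // "Client" side: send a request, then block for the matching reply.
    String call(String correlationId, String body) {
        try {
            requestQueue.put(new Message(correlationId, body));
            Message reply = replyQueue.take();
            if (!reply.correlationId.equals(correlationId))
                throw new IllegalStateException("uncorrelated reply");
            return reply.body;
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) throws InterruptedException {
        AsyncServiceSketch sketch = new AsyncServiceSketch();
        Thread service = new Thread(sketch::serveOne);
        service.start();
        System.out.println(sketch.call("PO-1", "order"));
        service.join();
    }
}
```

In real JMS the correlation id travels in the JMSCorrelationID header; the sketch only shows why a code-oriented developer used to MDBs finds the same pattern under TIBCO's mediation flow.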
InfoQ: Can you shed some light on the functional and performance testing methodology for the SOA app?
We applied Agile software development practices when building the Kit. We paired our developers with testers and created unit tests as we built the services. Some unit tests were Java JUnit tests that made class/method calls to object interfaces; others were soapUI TestSuites that made SOAP and REST calls to the services. We used PushToTest TestMaker to repurpose the tests as functional tests and as load and performance tests. TestMaker deploys the tests to a grid of test servers in a QA lab and also to cloud computing environments (Amazon EC2, GoGrid, CollabNet, and Rackspace). TestMaker produces a set of reports showing the root cause of functional issues and performance bottlenecks.
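The "repurpose a unit test as a load test" idea can be shown in a few lines of plain Java. This is a hypothetical sketch, not TestMaker's API: checkService() is a made-up stand-in for a real SOAP/REST call, and the grid/cloud distribution TestMaker performs is elided. The same functional assertion runs once for a functional pass and N times, timed, for a crude load pass.

```java
public class RepurposedTest {

    // The functional check. A real test would call the service over
    // SOAP or REST; this placeholder just simulates a reply.
    static boolean checkService() {
        String reply = "PO-ACME";   // hypothetical service response
        return reply.startsWith("PO-");
    }

    // Functional mode: run the check once and report pass/fail.
    static boolean functionalPass() {
        return checkService();
    }

    // Load mode: run the same check n times and return elapsed millis.
    // Any functional failure under load aborts the run.
    static long loadPass(int n) {
        long start = System.nanoTime();
        for (int i = 0; i < n; i++) {
            if (!checkService()) {
                throw new AssertionError("functional failure under load");
            }
        }
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        System.out.println("functional=" + functionalPass()
                + ", 1000 calls took " + loadPass(1000) + " ms");
    }
}
```

The design point is that the assertion logic is written once; only the driver around it changes between functional and performance runs.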
The SOA Knowledge Kit comes with the tests we implemented and a copy of PushToTest TestMaker. Rather than publish the performance results based on our available hardware, we deliver the ability for anyone to run the tests in their own environment to prove out the performance differences between each of the SOA stacks in their own data centers.
InfoQ: Can you share some details on the TCO model that was computed from development effort?
We did a time/motion analysis of each step of the services lifecycle to reveal the amount of time and effort required to build, integrate, deploy, and manage the range of services needed to assemble a composite application. The goal of the study is to compare the cost savings that can be achieved through greater developer productivity, resulting in significantly reduced TCO. The TCO model is an OpenOffice/MS Excel spreadsheet that assigns costs to each step of the Kit's development. The model comes with the Kit.
In side-by-side product implementations, TIBCO ActiveMatrix and BusinessWorks provided the greatest productivity savings: TIBCO required 29 percent less time and development cost than Oracle, and 22 percent less than IBM.
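For clarity, a "percent less" comparison of this kind is just relative savings against a baseline: (baseline - measured) / baseline. The snippet below shows that arithmetic; the hour figures in it are illustrative placeholders, not the study's data, which lives in the Kit's TCO spreadsheet.

```java
public class TcoComparison {

    // Relative savings of 'measured' versus 'baseline', as a percentage:
    // 100 * (baseline - measured) / baseline.
    static double percentLess(double measured, double baseline) {
        return 100.0 * (baseline - measured) / baseline;
    }

    public static void main(String[] args) {
        // Made-up example hours chosen only to illustrate the formula.
        double tibcoHours = 355, oracleHours = 500;
        System.out.printf("TIBCO took %.0f%% less time than the baseline%n",
                percentLess(tibcoHours, oracleHours));
    }
}
```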
InfoQ: Did you utilize SOA infrastructure capability such as repositories, ESBs and other intermediaries from individual vendors as prescribed? Can you describe some of the high level details?
We followed the published best practices from each of the stack vendors, including use of their repositories, ESBs, and mediation services. For IBM that meant using WebSphere Application Server V8.0.3, IBM Rational Application Developer V8.0.3 Multiplatform Multilingual, IBM WebSphere Integration Developer V7.0, IBM WebSphere Application Server V7.0 Multiplatform Multilingual, and the IBM WebSphere test environment. For Oracle that meant using Oracle SOA Suite 11gR1 (2 parts), Oracle Database XE 10g 10.2.0.1, Oracle WebLogic Server 10.3.5, Coherence, OEPE, the Repository Creation Utility, JDeveloper, and Oracle Service Bus. And for TIBCO that meant ActiveMatrix Service Grid 3.13, BusinessWorks 5.9.2, Enterprise Message Service EMS 6.0.1, Rendezvous RV 8.1, TIBCO Runtime Agent 5.7.1, ActiveMatrix Sample Examples BWSE 5.9.2, Business Studio (BS) VS, and Designer Design Time DABS 1.3.1.
InfoQ: What is the support plan for future versions of SOA stack solutions from the three vendors: IBM, TIBCO and Oracle?
We plan to refresh the Kit for the Oracle, TIBCO, and IBM stacks in 2012; that will give the vendors some more time to make major new releases. We are looking for developer community feedback to identify additional stacks to add to the Kit. For example, we would like to add Microsoft stacks (perhaps BizTalk Server and ASP.NET) and more open source software projects. Please let us know.
The SOA Knowledge Kit is available for free download at http://soakit.pushtotest.com
The report quoted here is from February 2008, yet this article makes it seem like something PushToTest has recently released. Please post the latest one or do not link to such an old report!
Disclaimer: I work for TIBCO Software, one of the vendors evaluated in the Push To Test study
PushToTest released a refresh of the SOA Knowledge Kit last week, with updates to the latest platforms from IBM, TIBCO, and Oracle. It is entirely new and updated. The white paper I wrote on the Composition Approach to Scalable Systems is from the original 2008 report because it still applies to SOA development today.
The URL for the kit is soakit.pushtotest.com
Clarifications on conclusions around IBM stack used
In the slide deck you published on Slideshare you made a number of assertions about your teams' use of the WebSphere products. In particular I'm baffled as to why you chose to use both WebSphere Process Server 7.0 and WebSphere Application Server 8.0 when WPS is a superset of WAS and could easily have run any service logic you decided to deploy to a separate WAS instance. There was no real reason (that I can see) for using different versions.
In this article you also repeat the statement that
integration and process development happen in different tools: WID and WPS.
That's really not correct. WebSphere Integration Developer (WID) is an IDE for developing WebSphere applications and business processes (NB it can now be combined with WebSphere Lombardi in the BPM 7.5 product which you don't appear to have used). WPS is an application and process runtime. The two, WID+WPS, work together and are not intended to be "different tools". You can't develop anything in WPS, you develop in WID and deploy to WPS.
In the list of products at the end of this interview you go on to omit WPS altogether!
Needless to say, whilst I'm surprised by your conclusions I really cannot comment on offerings from other vendors. However, I would point out that there appear to be misunderstandings of the IBM stack here, and that may well significantly skew the results.
Re: Clarifications on conclusions around IBM stack used
Thanks for your feedback. Even though you say this is not your area of expertise I appreciate your effort and comment.
Our method had a small team of engineers with backgrounds in Web, App, SOA, and BPM development implement the use case on the IBM platform. We turned them loose on the IBM.com site and on Google to find community and third-party sites supporting the IBM platform. We told them: no hand-holding from the platform providers; you can only use solutions you find in published, publicly available information.
In this case the developers chose RAD as their first option and reported all the challenges they encountered with it. As you point out, there are other ways, including doing the development through WID. Had they chosen that path, you would have had another set of entries in the Developer Journal on its pros and cons. In the end, a developer has to pick one path.
Your feedback on the SOA Knowledge Kit is welcome as a subjective opinion on the Kit; it shows that different points of view exist before starting any development. Also, the Kit highlights the challenges met in the work performed in WID.
We distribute the Kit under an open source software license to encourage you to do your own implementation. That effort would show, on a level playing field, how your work supports IBM's claims to being the best for developer productivity and performance.
Lastly, I wish we could have done this on WebSphere Lombardi 7.5, but it was not available a few months ago when we started this project. Would you be interested in implementing the Kit's use case on 7.5? I would be happy to work with you and certify the results.