XML Overload: Bad Design or Necessary Evil?

The available styles of SOAP attachments and binary payloads are expanding and adapting, and SOAP messages themselves are reaching epic sizes. Many real-world SOA implementations report XML documents of multiple megabytes and beyond.
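One reason message sizes balloon is that binary content embedded directly in a SOAP body is typically base64-encoded, which inflates it by roughly a third before any envelope overhead is counted. A minimal sketch in Java (the 1 MB attachment here is invented for illustration):

    import java.util.Base64;

    public class Base64InflationSketch {
        public static void main(String[] args) {
            // A hypothetical 1 MB binary attachment destined for a SOAP body.
            byte[] binary = new byte[1_000_000];
            String encoded = Base64.getEncoder().encodeToString(binary);
            // Base64 maps every 3 input bytes to 4 output characters.
            System.out.printf("raw: %d bytes, base64: %d bytes (+%d%%)%n",
                    binary.length, encoded.length(),
                    (encoded.length() - binary.length) * 100 / binary.length);
        }
    }

Mechanisms such as MTOM exist precisely to avoid this inflation, which is part of why attachment styles keep evolving.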

A new survey of 570 developers, sponsored by Rogue Wave Software, finds that XML documents are getting huge. Thirty-six percent of respondents report they now need to parse XML documents larger than one megabyte, and half of those say they are already dealing with documents that exceed five megabytes. Fifty-eight percent reported more than six data formats within their organization, and 25 percent use more than 20 different formats.

The take-home message is that XML document size is expanding as SOA matures. But is this a problem? How big a problem is it? Does the survey provide any clues about its magnitude?

35 percent of respondents see achieving greater throughput, and 26 percent see memory efficiency, as the greatest challenge in XML parsing.
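Those two pressures point at the same underlying choice: tree-based parsers such as DOM hold the entire document in memory, which becomes painful at multi-megabyte sizes, while streaming parsers keep memory use flat regardless of document size. A minimal sketch of the streaming style using the standard Java StAX API, assuming a hypothetical orders.xml made up of repeated <order> records:

    import java.io.FileInputStream;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;

    public class StreamingParseSketch {
        public static void main(String[] args) throws Exception {
            XMLInputFactory factory = XMLInputFactory.newInstance();
            // Only one parse event is held in memory at a time, so document
            // size no longer dictates heap usage.
            try (FileInputStream in = new FileInputStream("orders.xml")) {
                XMLStreamReader reader = factory.createXMLStreamReader(in);
                long orders = 0;
                while (reader.hasNext()) {
                    if (reader.next() == XMLStreamConstants.START_ELEMENT
                            && "order".equals(reader.getLocalName())) {
                        orders++; // process the record, then let it go
                    }
                }
                reader.close();
                System.out.println("Parsed " + orders + " orders");
            }
        }
    }

The trade-off is a less convenient programming model: the application has to track its own context instead of navigating a tree, which is one reason throughput and memory remain competing concerns.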

The survey participants were selected from the Rogue Wave developer community and from visitors to a Microsoft .NET community site. This selection process may have introduced some bias; however, it is not obvious that Rogue Wave customers or .NET users would deal with substantially different XML document sizes than the developer community at large.

These results suggest that document sizes are growing and that the growth has side effects, but the survey does not establish how badly the increase is actually hurting organizations. The closest data point in the report is that 42 percent of respondents have explicit requirements to increase the scalability of their applications; of those, 15 percent need to scale by more than 10x, and a further 23 percent by between 5x and 10x.

The typical response is to throw more hardware and networking equipment at the problem (Gilder's law suggests that bandwidth grows three times faster than compute power). In some sense, the emergence of software systems that rely on such large payloads is an evolutionary function of the availability of that bandwidth.

A more architectural view, of course, is that better software design would substantially decrease the size of these transmissions.
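As a small illustration of how much design alone can matter (the order record and its field names are invented for the example), carrying the same data as attributes rather than nested elements already trims the payload before any deeper schema redesign:

    import java.nio.charset.StandardCharsets;

    public class PayloadSizeSketch {
        public static void main(String[] args) {
            // Verbose design: every value wrapped in its own named element.
            String verbose = "<order><orderId>42</orderId>"
                    + "<customerId>7</customerId>"
                    + "<status>SHIPPED</status></order>";
            // Leaner design: the same data carried as attributes.
            String compact = "<order id=\"42\" customer=\"7\" status=\"SHIPPED\"/>";

            int v = verbose.getBytes(StandardCharsets.UTF_8).length;
            int c = compact.getBytes(StandardCharsets.UTF_8).length;
            System.out.printf("verbose: %d bytes, compact: %d bytes (%.0f%% smaller)%n",
                    v, c, 100.0 * (v - c) / v);
        }
    }

Multiplied across millions of messages, savings of this order are exactly what the design-versus-hardware trade-off weighs.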

The question usually boils down to a simple equation: is the time and effort of good design worth the savings in network latency, complexity, hardware cost, and application stability?

Discuss.
