Ars Magna: the revolution is overdue
The past sixty years have seen numerous "revolutions" in the realm of software development: ‘structured,’ ‘CAD,’ ‘objects,’ ‘SOA,’ and most recently, ‘Agile.’ Despite these advances, our ability to successfully complete software projects has improved only marginally. The time for a real revolution is overdue, and this essay introduces and outlines a direction that such a revolution might take - the establishment of a Great Art (Ars Magna).
[Brief title note: There have been numerous works titled Ars Magna. Two of the most relevant are: Gerolamo Cardano, 1545 - "One of the three greatest scientific treatises of the early Renaissance." Cardano’s book dealt with algebra and contained the first published solutions for both cubic and quartic equations. Ramon Lull, 1305 - considered the first work on computational (combinatorial) science, and one that influenced Leibniz’s work almost 400 years later.]
The past sixty years have been dominated by a specific idea about software development - that it is, or should be, a scientific and engineering discipline. Every advancement in the profession - with the exception of object orientation and agility - has been based on that idea. The scientific-engineering approach has been highly successful - sometimes!
The World runs on software, software that works.
Airplanes - that could not fly without software - cross the sky. Banks process billions of transactions daily, using software that is reliable to greater than Six-Sigma standards. Millions of people have hearts that beat 24/7/365, thanks to software. Modern life - absent the Internet and the software that keeps it running and growing - is almost inconceivable.
Everyone knows of Moore’s Law and the dramatic increase in the computational power of hardware chips. But part of that increase is due to software - specifically, replacing hardwired instructions with software equivalents. It was software (parallel programming) that allowed the most significant advances in supercomputing - replacing special purpose hardware with off-the-shelf generic components and allowing massive scaling. Computational theory (the math) has advanced significantly and we are on the verge of creating an entirely new kind of computing - quantum computing.
Today there is little doubt that we can build artifacts - devices that combine leading edge hardware with leading edge software to do amazing things. We obviously can use software to create experiences - movies and games - that rival, in terms of our sensory and perceptual abilities, anything the natural world can provide. We can build special purpose applications - an email system, perhaps - that work, work well, and work effectively all the time.
Unfortunately, our failures outweigh our successes, as the following depressing numbers indicate:
- $250 billion expended to develop 175,000 projects.
- 31% of projects are cancelled before completion
- 57% of projects show 189% cost overruns
- 16% of small/medium projects are delivered on time, within budget
- 9% of large projects are delivered on time, within budget
- Large projects are delivered with 42% of features specified/planned
- Small/medium projects are delivered with 78% of features specified/planned
- 94% of projects are restarted at least once
- Project development time averages 18-24 months for small/medium scale projects, and 3-5 years for large scale
Objects and Agility were deliberate attempts to define and follow a different path. Sadly, as they became ‘mainstream,’ both were rapidly co-opted, diluted, and modified to the point that, as Alan Kay said about objects, "The revolution has yet to happen."
We expect that the situation will only get worse. The pace of change is accelerating and enterprises no longer have the luxury of spending 18 months to 5 years developing software that will be effectively obsolete before it is delivered. The problems we are trying to solve with computing technology are increasingly "wicked" in their nature. We are beginning to recognize that the systems into which we deploy software and the systems we use to develop software are "complex" - non-deterministic and with self-organizing and emergent properties. Ultra-Large Scale systems are qualitatively different from anything we have attempted before - and like the Internet, are likely not amenable to being designed and implemented using formal engineering techniques.
The absence of a revolution does not mean a lack of revolutionary ideas. Clearly, agile and objects were self-conscious attempts to radically re-invent software development. Numerous other ideas have been introduced over the years; "rapid-prototyping," "software development as Theory Building," and the "West Coast" - "fuzzy" - AI of the Lisp community are but three examples. There is also a long minority tradition arguing for software as an art form rather than an engineering method. The divide between academia and practice - with practice regularly diverging from the formalism of science and engineering - is another source of revolutionary ideas.
The fact that none of these revolutionary ideas has resulted in an actual revolution reflects the power of the science-engineering paradigm coupled with the relatively sporadic and isolated attempts to introduce radical change. If a real revolution is to occur it must be grounded in an alternative to the prevailing paradigm.
Points of Divergence
Traditionalists - software engineers and computer scientists - claim that the "failures" noted above result from a lack of rigor. If everyone would simply adopt more formal (mathematical) techniques for specification, programming algorithms, testing, and proof, all software projects would succeed. Quality and process advocates suggest that adoption of a more formal and mechanical process (e.g. Six Sigma, TQM, CMM Level 5, and even Lean development) will solve most, if not all, ills.
The traditionalists may have a point - we actually lack the kind of empirical information that would allow us to decide. But we do know that teams and organizations are "trying to do it right" (i.e., according to the precepts of software engineering), but are still failing. Despite our best efforts, our ability to consistently deliver high quality software on time and within budget has only marginally improved over the past sixty years. This leads to the conclusion that traditional approaches are doing something wrong, or are approaching the problem from the wrong perspective.
An alternative perspective can best be shown by contrasting fundamental "points of divergence" - similar in spirit to the Agile Manifesto. Nine of the points are conceptual and address the assumptions or presuppositions that differentiate Ars Magna (AM) and traditional Software Engineering (SE). Four of the points introduce values that may be shared by AM and SE, but that are emphasized in, and central to, the practice of Ars Magna.
- Artifact-versus-system. SE focuses on the artifact to be delivered - the program. When SE talks of systems they mean the inter-related construct of programs and hardware. An artifact can be specified and engineered to meet specifications. Specification conformity is the measure of success in building an artifact. AM focuses on the complex adaptive system in which the software is ultimately embedded - i.e. the business enterprise or the socio-cultural system. Success is measured in terms of the stability, flexibility, adaptability, and "enhancement" of the contextualizing system.
- Deterministic-versus-non-deterministic. SE is grounded in a world-view that came from 19th century physics - i.e., the universe is a deterministic system (a clockwork, a machine) that can be understood and manipulated via formally defined laws and relationships. AM views the world, instead, as a complex adaptive system, non-deterministic and highly dynamic.
- Production-versus-Theory. SE is committed to the concept of a formally defined, manageable, controllable, and ultimately automated production process. AM adopts the position advanced by Peter Naur that development is ultimately a human act of "Theory Building." (We will have more to say about this later.)
- Mechanical-versus-organic. This difference is most obvious in the choice of metaphor, borrowed concepts, and the definition of fundamental terms. SE speaks of bridges and buildings - AM talks of cells and ecologies. SE believes in types - AM in prototypes (à la Lakoff and Johnson). SE defines objects as abstract data types with attributes and functions - AM objects are anthropomorphic homunculi with knowledge and behavior.
- Prediction-versus-Exploration. This is most obvious with regard to process and management, with SE convinced that enough can be known to accurately estimate and predict the course of the future - e.g. "Plan the Work and Work the Plan." AM favors trial-and-error exploration and bases projections of what might be on empirical observation. SE conceptualizes (somehow) ‘The Answer,’ which is then simply implemented. AM ‘pokes around’ until ‘The Answer’ reveals itself.
- Episodic-versus-Continuous. SE is focused on discrete, isolated projects with clearly defined beginnings and ends, which proceed according to their own internal logic in timeframes that are inconsistent with the natural cycles of the enterprise or organization that chartered the project. AM is focused on continual, but small, changes to a living and highly dynamic system. Changes are limited in scope and scale to that which can be accomplished in short, natural cycles - i.e., consistent with the embedding system - measured in hours to weeks, seldom months, and never years. The AM development process resembles the SE "maintenance" process.
- "Self-conscious"-versus-"non-self-conscious." The terms come from Christopher Alexander’s first book, Notes on the Synthesis of Form. A thorough discussion of these ideas is not possible here (we will note, however, that Alexander disparaged the self-conscious process), but SE is self-conscious and AM is non-self-conscious.
- Control-versus-Coordination. The concept of centralized hierarchical control has been central to SE since the days of structured programming. The old Program Structure Chart with its Afferent, Efferent, and Transform modules reporting to a single Master Control Module is still visible in most programming languages (e.g. the mandatory ‘main’ function in a Java program). AM has failed to totally divorce itself from ‘control’ - witness the Model-View-Controller architecture at the root of early Smalltalk implementations - but, philosophically, it eschews control in favor of object autonomy, cooperative distribution of workload, and coordination.
- Art-versus-science. An old debate, with SE clearly against any vestige of ‘art’ while Ars Magna IS Art.
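The mechanical-versus-organic and control-versus-coordination contrasts above can be hinted at in code. The following is a minimal, hypothetical Python sketch - the names (`AccountRecord`, `master_control`, `Account`) are invented for illustration and do not come from the essay - showing a passive "abstract data type" manipulated by a master routine versus an autonomous object that is simply asked to act:

```python
# SE style: a passive data record; all logic lives in a central controller
# that reaches in and manipulates the record's attributes directly.
class AccountRecord:
    def __init__(self, balance):
        self.balance = balance

def master_control(record, amount):
    # The controller decides everything on the record's behalf.
    if record.balance >= amount:
        record.balance -= amount
        return True
    return False

# AM style: an autonomous "homunculus" that holds its own knowledge and
# responds to a message, deciding for itself how (and whether) to act.
class Account:
    def __init__(self, balance):
        self._balance = balance

    def withdraw(self, amount):
        """Respond to a 'withdraw' message using private knowledge."""
        if self._balance < amount:
            return False
        self._balance -= amount
        return True

rec = AccountRecord(100)
assert master_control(rec, 30) and rec.balance == 70

acct = Account(100)
assert acct.withdraw(30)        # the object handles its own behavior
assert not acct.withdraw(1000)  # and enforces its own invariants
```

The behavior is identical in both halves; the point of divergence is only where the knowledge and the decisions live.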
History. Few disciplines are as ignorant of their own history as computing. Ars Magna stresses the need to be aware of our own technical past. Awareness and acknowledgement of "Old Masters" and the study of their work and ideas is essential.
Philosophy. We focus too much on practices, techniques, and tools and far too little on the philosophical presuppositions, the cultural biases, and the unfounded assumptions behind the particular. Philosophy is critical to understanding why things are the way they are. Philosophy provides us the knowledge foundation that supports decisions to adopt technology and practices and to appropriately adapt them to new circumstances.
Liberal Arts. The Ars Magna professional must have an understanding of an incredibly broad range of knowledge. The computer scientist can be focused within the discipline but software developers must be able to comprehend myriad domains, be able to recognize and exploit, as metaphors, ideas and concepts from other fields of enquiry, and, perhaps, most importantly, be able to integrate diverse threads of knowledge.
Humanity. Systems are designed and crafted by humans, for humans.
The concepts and values just noted define the point where the practice of Ars Magna diverges from mainstream software development methods. But what is the Great Art itself? The art of what?
Simply put, Ars Magna is the great art of Theory Building.
In 1985, Peter Naur (co-author of the programming language syntax notation, "Backus-Naur Form" or BNF) challenged the prevailing notion that "programming" was "the production of a program and certain other texts." The word programming is in quotes because Naur used the single word, "to denote the whole activity of design and implementation of programmed solutions."
Then, and now, mainstream software development is premised upon the idea that the goal is to produce an artifact (or artifacts) using a rational and replicable process - a production model. When Agile, specifically XP and Crystal, was first introduced, it was a conscious effort to escape the production mindset. Alistair Cockburn goes so far as to reprint Naur’s 1985 paper, "Programming as Theory Building," in his book, Agile Software Development. Unfortunately, Agile has moved closer and closer to the mainstream notion of production. Scrum and Lean are the two most popular and widely used variations on Agile ideas: Scrum is all about process management and Lean (both the production and the product variants) is all about process optimization.
Although the implementation of Agile has come to be about production and the management of production, Agile values and practices, especially as articulated in extreme programming (XP), are directly supportive of Theory Building and will also be Ars Magna practices and values.
A Theory, according to Naur, is the shared mental understanding, held by a group of people, of "an affair of the world and how software will handle or support it." The Theory inside the head of an individual programmer (remember Naur’s more expansive definition of this term) involves: " ... the activity of matching some significant part and aspect of an activity in the real world to the formal symbol manipulation that can be done by a program running on a computer."
It is critical to note that in both the group and the individual case, Theory involves an understanding of the World and of what goes on inside the computer on behalf of the World. Theory is focused on a single system - the World or an enterprise - composed of discrete elements. Elements are defined by the role they play in the system and by their interactions with other elements. Some of those elements might be ‘improved’ - better able to fulfill their role in the system - if they were automated, i.e., implemented as programs running on a computer.
This view of theory radically redefines the software development work unit. Strictly speaking, any given act of programming will involve a single element of the overall system and perhaps a single behavior associated with that element, or, even more precisely, "some significant part and aspect of an activity." This redefinition is not totally alien, however; XP does essentially the same thing, defining a unit of work in terms of a User Story - one thing the user wants or expects from the ‘system.’
Partitioning the software development workload in terms of individual system elements, individual element behaviors, or single User Stories has the necessary effect of simplifying the programming task. Individual User Stories should take a team (at minimum a pair of tester/programmers and an On-site Customer) no more than a couple of days to implement and deploy. The solution, once in production, provides feedback - is it usable, useful, and helpful? - which confirms (or contradicts) the Theory.
A Theory is a gestalt - a comprehensive and complete understanding of the single system. The Theory, therefore, encompasses the understanding of any need to coordinate and communicate across multiple elements. The System that is understood by the Theory provides both the architecture and what XP called the System Metaphor. Theory provides the necessary context for detailed understanding of any individual element of the system, what it is supposed to do (including obligations to all other system elements), and how it is to do it.
A team possessing a Theory can, according to Naur:
- " ... explain how the solution relates to the affairs of the world that it helps to handle."
- " ... explain why each part of the program is what it is, in other words is able to support the actual program text with a justification of some sort."
- " ... respond constructively to any demand for a modification of the program so as to support the affairs of the world in a new manner."
It cannot be demonstrated here, but a Theory of a single system with some automated elements has the effect of dramatically simplifying the software. No more million-line programs; no more thousand-line ‘main routines’ with complex Boolean expressions, nested ifs, case statements, or high cyclomatic complexity. This simplicity, coupled with the redefinition of a unit of work, means that any piece of software can be changed or replaced in a day or two, and that the software and the system in which the software is embedded (the enterprise) change and evolve in the same timeframe. Because a Theory is of both "an affair of the World and how the software handles or supports it," software and business elements are always reflective of each other and in sync - a long sought goal, "business ecology," is realized.
An important caveat: the vast majority of any Theory exists only in the minds of the people who developed the theory! For Naur:
"A main claim of the Theory Building View of programming is that an essential part of any program, the theory of it, is something that could not conceivably be expressed, but is inextricably bound to human beings."
A consequence of this view is that ‘documentation’ is of very little value to anyone except those possessing the Theory in their collective heads. (It also means that one team cannot simply ‘document’ requirements, design, or a code base and pass those documents to a different team, which will somehow obtain the Theory merely by reading and understanding the documentation.) "Documentation" does have a role and a purpose. Reinhard Keil-Slawik shows how artifacts, like whiteboard drawings and Story cards, facilitate communication and collective understanding - i.e., the actual development of the Theory - and serve as evocative triggers that bring certain aspects of a Theory to conscious attention.
Naur suggests that it is critical that "the programmers having the Theory of the program remain in charge of it." In today’s work environment this is not practical - but XP concepts like Whole Team, On-Site Customer, Pair Programming, and Collective Code Ownership expand the size of the group holding a Theory to a "critical mass" that can accommodate normal organizational turnover without losing the Theory. The ultimate goal would be for everyone in the organization to possess a common ‘Theory of the Enterprise.’ This common Theory would not require everyone to understand all aspects of the Theory - coders would still know more about ‘how’ an automated system element does what it does, and a CPA would still know more about why a tax calculation formula has to be what it is. But everyone would share a common understanding of the system, the behavior of its elements, and the relationships among those elements. Everyone would, therefore, be in a position to recognize potential innovations and either implement them directly or communicate them to others who can do so.
A Theory is what is built - the result of Theory Building.
Ars Magna is not a process, nor is it a method. It is a set of activities and practices that reflect values and ideas. Ars Magna is antithetical to the notion of method in the same way that Theory Building is contrarian.
"To begin with, what is a programming method? ... Here a programming method will be taken to be a set of work rules for programmers, telling what kind of things the programmers should do, in what order, which notations or languages to use, and what kinds of documents to produce at various stages.
A method implies a claim that program development can and should proceed as a sequence of actions of certain kinds, each action leading to a particular kind of documented result. In building the theory there can be no particular sequence of actions, for the reason that a theory held by a person has no inherent division into parts and no inherent ordering. ... As to the use of particular kinds of notation or formalization, again this can only be a secondary issue since the primary item, the theory, is not, and cannot be, expressed.
It follows that on the Theory Building View, for the primary activity of programming there can be no right method. ... the quality of the theory built by the programmer will depend to a large extent on the programmer’s familiarity with the model solutions of typical problems, with the techniques of description and verification, and with principles of structuring systems consisting of many parts in complicated interactions. ... Where the Theory Building View departs from that of the methodologists is on the question of which techniques to use and in what order. On the Theory Building View this must remain entirely a matter for the programmer to decide, taking into account the particular problem to be solved."
Naur’s depiction of Theory Building can be simplified to, "Do the Right Thing at the Right Time as a function of the Right Context." This ability is based on a comprehensive knowledge of the problem domain and how it works, the problem, exemplar solutions, and the ability to think and not simply follow rote procedures. Kent Beck talks about XP in similar terms when he describes three phases of XP as: 1) out of the box - simply do what the book tells you to do; 2) adaptation - modify your book learning to suit particular contexts and problems; and 3) "transcendence" - do the right thing at the right time as a function of the right context.
This description should not lead one to believe that individuals or teams must establish a Theory before any productive work is done. Theories are developed the exact same way that Agile teams develop software - in an iterative-exploratory-incremental manner. In fact, you could simply repurpose the Agile (XP) practices as Theory Building practices and recognize that, at any given time, any set of practices might be the most appropriate actions to take in order to advance and deepen the Whole Team’s Theory.
We can also relate common practices and tools directly to items that enhance the quality of a Theory - e.g. Patterns ("model solutions of typical problems"); TDD ("techniques of description and verification"); and User Story, Planning Game, and Retrospective ("principles of structuring systems").
Ars Magna requires thinking - as does Theory Building. And, ultimately, It’s The Thought That Counts.
Ars Magna recognizes five different "modes" of thought that must be mastered:
- Systems Thinking - In addition to the General Systems Theory that Gerald Weinberg, and others, attempted to bring into software development and computer science, it is also necessary to understand Complex Adaptive Systems and concepts like emergence, self-organization, and scale-freeness. Systems must be understood as more than a complicated interacting set of hardware and software components.
- Object Thinking - The ability to use the object metaphor (a homunculus with behavior and knowledge that is used to provide service to others) to decompose complex systems into their constituent elements and distribute responsibilities across those elements in an appropriate manner.
- Agile Thinking - A comprehensive understanding of the values, principles, and philosophy that shaped particular Agile practices and how to use that understanding to adopt, adapt, and eventually transcend the practices.
- Design Thinking - Importing and repurposing design concepts and tools - e.g. Design Brief, prototyping, visual thinking, metaphoric reasoning, liminality, connection and juxtaposition, divergent and convergent thinking, and bounding constraints - into Theory Building and Ars Magna.
- Computational Thinking - A deep understanding of what goes on inside the machine and the "how" of automating World System elements without adversely affecting their role or their essential nature. [Computational thinking should NOT be used as the foundation for understanding the World around us, as some have recently advocated.]
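Object Thinking, as described above, can be suggested with a small sketch: a system decomposed into collaborating elements, each holding its own knowledge and behavior, with work accomplished by sending messages rather than by a central controller reaching into shared data. This is a hedged, hypothetical Python illustration - the `Book` and `Member` names and their protocol are invented for the example, not taken from the essay:

```python
# Each object is a small "homunculus": it knows something, does something,
# and responds to requests from its collaborators.
class Book:
    def __init__(self, title):
        self.title = title
        self._on_loan = False   # private knowledge; no one else tracks this

    def lend(self):
        """Respond to a request for a loan; the book itself decides."""
        if self._on_loan:
            return False
        self._on_loan = True
        return True

    def accept_return(self):
        self._on_loan = False

class Member:
    def __init__(self, name):
        self.name = name
        self.borrowed = []

    def borrow(self, book):
        # The member asks the book; no central routine checks loan state.
        if book.lend():
            self.borrowed.append(book)
            return True
        return False

alice, bob = Member("Alice"), Member("Bob")
moby = Book("Moby-Dick")
assert alice.borrow(moby)       # Alice asks; the book agrees
assert not bob.borrow(moby)     # the book itself refuses a second loan
```

Note that the responsibility for loan state lives in the book, not in a "library controller" - each element fulfills its role through its interactions with the others, which is the decomposition the essay's object metaphor points at.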
This essay is an intentionally provocative and controversial call for a real revolution in how we conceive of and practice software development. It asserts that most of software development (excluding low-level software, from device drivers up to, perhaps, operating systems) should be recognized as an Art, not a Science or Engineering discipline. It suggests some conceptual foundations on which the Art - Ars Magna - might be based, along with connections to important but neglected or abused ideas in the profession: theory building, objects, and agile, primarily.
The intent is to raise awareness and stimulate discussion, not to offer a prescriptive and comprehensive definition of Ars Magna (something that will require one or more books to treat properly).
[Note: The data behind this assertion are metrics that were observed in early (before they were diluted and corrupted) examples of Agile projects and Object projects.]
That's a great essay. I would however argue that no matter what you do, it takes about a week to produce 250 LOC (lines of code) in a finished state. That is probably even true of someone writing an article or a book. Modern SE has compounded the problem by creating heterogeneous architectures that require lots of boilerplate code to bridge the conceptual mismatch between the different layers of an architecture. I personally don't see much of a solution to the problem, short of drastically reducing the size of the artifacts, if we assume / convince ourselves that the 250 barrier is a constant.
Now, to be more specific on your proposal:
1. Systems Thinking: this is part of the problem; the time when a piece of software could be built independent of anything is long gone (I sure hope heart software is not connected and not subject to random updates). For most other engineering disciplines, the context is a lot simpler and better known (buildings, cars, airplanes, boats...).
2. Object Thinking: for me this is the problem, the real problem of SE, regardless of what Alan thinks. The notion that you can "decompose complex systems into their constituent elements" and have all these elements end up being "objects" is as flawed as it can be. Sure, I can create such a decomposition, but is it conceptually justified? When we decompose a car, is the parts' metamodel common? I am not sure it is. There are electrical parts, mechanical parts, chemical parts, optical parts... Their assemblies and operating models are vastly different (you don't assemble an electrical system like you assemble a mechanical system). For me it is the same in SE: there are distinct engineering areas that do not all roll up to "objects" (UX, data, computation, security...). Yes, a car is made up of atoms and molecules, but automotive engineers are rarely molecular engineers. Yet software engineers are asked to work at a level so low that it could be compared to the molecular level of a car. Imagine designing the structure of a car at the molecular/atomic level!
3. Agile is a consequence of SE working at such a low level that if you don't validate what you are doing often (i.e. observe macroscopically what you built at the molecular level) you could build the wrong thing. Agile in itself has no value when you can easily visualize what you are building and map it efficiently to requirements.
"Agile in itself has no value when you can easily visualize what you are building and map it efficiently to requirements." I don't disagree with this, but would state that at least half the "problem" in the software industry is that requirements are so difficult to articulate in an unambiguous manner.
Part of what makes all of the Agile processes work is the short feedback loops. They allow both the people building the software and the people for whom the software is being built to reflect on what has been done in the last increment, and verify that it was in fact what was needed. If requirements could be articulated clearly without ambiguity, this feedback would be unnecessary. I don't see that changing anytime soon, though.
What would be interesting is to see what methods the projects were using in their day-to-day activity.
postmodern development - management hijacking improvement initiatives
Dave writes that improvement initiatives (my words) "were readily co-opted, diluted, and modified ..." (his words). My view is that in many organizations grass roots improvement initiatives are, as soon as they show results, hijacked by management - and translated into targets, key performance indicators, and what have you, only to die softly soon after. It's kind of cynical, but it seems to me that it's up to the professionals to guard and protect their own best way of working. As Brooke suggests for designers: "Protect them from management."
A 1998 article by Robinson et al, Postmodern Software Development, offers a somewhat different and even more philosophical look at this subject. I guess it's worth reading, given the suggestion in the article The Seven Dimensions of a True Craftsman: "... the passing of the knowledge from one generation to the next is crucial for the evolution of the profession," also suggested reading by Dave on InfoQ.
Link to Postmodern Software Development by Robinson et al: citeseerx.ist.psu.edu/viewdoc/download?doi=10.1...
"I want to be paid and buy the things important to myself and my family." With all due respect, it's that attitude that is causing a great deal of the damage in the software industry. The people I've seen exhibit that attitude are the ones who learn one way to do something and stick to it. They don't try to further their abilities. They don't find better ways to do things. They only move to new techniques and technologies when they are told to.
In other words, they don't give a crap about their work and the sorry state of software is the result. At least Dave's Ars Magna is trying to change the dismal status quo. Which is more ridiculous?
Yes, we agree 100%, but why is it that a car, a building, or an airplane is not built using Agile principles? The day we realize that "computing" as a Software Engineering paradigm is the wrong direction (whether you dress it up in OO or not) is the day we will make progress. Until then we will keep trying to find remedies to the wrong problem.
"...why is it that a car, a building or an airplane is not using Agile principles?" Because building is the wrong part of those processes to model. Look at how those things are designed, and you will see significant similarities.
Software Development is NOT a construction process, it's a design process.
Re: Comment [on Brett's comment, using "ridiculous" and "stupid"]
I guess Dave was right to write "provocative and controversial." No better recipe for a lively discussion. I agree with you that the idea of being totally committed etc. is ridiculous, and that a theory of everything is out of sight for at least our lifetimes. But that's not what is being asked for: it's the need for a view with enough commonalities to work together effectively. The simple pragmatic point of view: what works is true. That's about all the phenomenology we need.
For my part, I strongly disagree with the way you describe your relation to the company you work for: as a knowledge worker it's in your own good interest to know what, know how, know why, and care why. Effectively meaning that you get better pay if you know why and care why. And for that you need the Theory. Alas.
reply to all
First, I felt a tinge of regret that the US left South Africa too soon (England can take care of itself) - but only a tinge, because the US has yet to recognize what football really is.
Second, nothing suggests that everyone understanding the entire enterprise is a "for free" activity. Naur even pointed out that if the Theory Building approach was taken seriously, programmers would gain in respect and compensation as a result.
As to "Systems" and "Objects"
When systems thinking was first introduced into the world of software development (Gerald Weinberg was probably the primary advocate) it was grounded in General Systems Theory, which included biological and social systems as well as mechanical ones. Later, there was a brief flurry of interest in "composite systems" in the AI community - a composite system being one that included humans as an integral part of the system. CS/SE types quickly found that attempting to deal with such systems was 'too hard' and basically abandoned them in favor of understanding more and more about systems that were fundamentally deterministic (19th century physics, math, buildings, bridges, etc.). They simply made the assertion that all software (regardless of purpose) was such a mechanical system and that "more math" would solve all of our problems. As a couple of you pointed out, this is a basic and critical error.
By the way - the business community made this exact same mistake - thinking that the enterprise was a deterministic system that could be scientifically managed, controlled, and made predictable. The business world, in large part but not totally, recognized their error in the nineties and were talking about agile and complex dynamic systems long before the software world started to.
I am advocating "complex systems thinking" in this essay and elimination of the prevailing "deterministic systems thinking" at the core of CS and SE.
Objects are a universal and a natural way of doing decomposition. It is ingrained in our culture and in every natural language of which I am aware -- as nouns. The car example: when I look at a car I see nothing but a collection of objects - bulbs, wires, oil, cams, pistons, seats, body panel, etc. etc. And they do share the exact same meta-protocol - i.e. all of them are identifiable, have behavior, knowledge/access to knowledge, and respond to messages.
The problems arise when we attempt to implement this very straightforward notion of objects using computer languages. There are translation errors - even in Smalltalk. An object is NOT an abstract data type, although you might choose to use an ADT to implement a particular kind of object. But the biggest problem was an inversion of understanding about objects. This inversion can be seen in the first book ever published about objects - Grady Booch's Object Analysis. First he says that objects lead to radically different conceptions and architectures - which is right (but never really happened outside the Smalltalk community) - but then says that "just as structured analysis arose from structured programming, object analysis arose from object programming." This is absolutely not true. Objects as concept and decomposition metaphor/tool came well before the existence of any programming language. You can also see this in the transformation of Simula (not a programming language) into Simula 1 (a programming language). The implementation (which was flawed) perverted the idea. And the mainstream rushed to adopt the perversion, only to be disappointed later and then to erroneously blame the concept.
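The object-versus-ADT distinction can be sketched in a few lines of Python (a hypothetical illustration with invented class names, not anything from the essay): the sender names a message, and any object that responds to it participates, regardless of its internal representation - much like the car parts described above.

```python
# A minimal sketch of the "meta-protocol" described above: every object
# is identifiable, carries knowledge, has behavior, and responds to messages.
# (Illustrative only; the class names are invented for this example.)

class Bulb:
    def __init__(self, watts):
        self.watts = watts          # knowledge the object holds

    def describe(self):             # behavior exposed as a message
        return f"bulb rated at {self.watts}W"

class Piston:
    def __init__(self, bore_mm):
        self.bore_mm = bore_mm

    def describe(self):
        return f"piston with {self.bore_mm}mm bore"

def send(obj, message, *args):
    """Dispatch a message by name: the sender knows only the protocol,
    never the receiver's type or internal representation."""
    handler = getattr(obj, message, None)
    if handler is None:
        return f"{obj.__class__.__name__} does not understand '{message}'"
    return handler(*args)

for part in [Bulb(55), Piston(86)]:
    print(send(part, "describe"))
print(send(Bulb(55), "ignite"))     # a graceful "does not understand"
```

The point of the sketch is that `send` depends only on the shared meta-protocol (respond to messages), not on a type hierarchy or a data representation - which is exactly where an ADT-centric translation loses the idea.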
Thanks again for your comments and for taking the time to read the essay.
It's all product development, so we need a product development process, not a manufacturing process. And guess what: a big area of engineering is concerned with product development.
Maybe that's why car design has a more stable context and a higher degree of repetition. Yet, it is a hell of a complex process. A new car model is a multi billion investment!
I find it interesting that car manufacturers are facing the same problems as we do when things get more individual. The design variance due to the increasing number of interacting components in a car is one of them. Sound familiar?
I agree with JJ about the OO paradigm. My gut feeling is that it stands in the way often without solving any real problem. "The domain object layer is there, because we do things that way". For me, it is becoming too much of a habit everybody follows without questioning.
* $250 billion expended to develop 175,000 projects.
* 31% of projects are cancelled before completion
* 57% of projects show 189% cost overruns
* 16% of small/medium projects are delivered on time, within budget
* 9% of large projects are delivered on time, within budget
* Large projects are delivered with 42% of features specified/planned
* Small/medium projects are delivered with 78% of features specified/planned
* 94% of projects are restarted at least once
* Project development time averages 18-24 months for small/medium scale projects, and 3-5 years for large scale
Do you have the source for these statistics?
On the defense of SE
Let's start with the basics: Engineering, as it is, is a line BETWEEN science and art, not entirely part of one or the other. I know you've seen modernist buildings nowadays that don't seem to have much art in them, but you may have seen enterprise applications too - and to the contrary, you may have also seen beautiful websites.
Second: I don't believe that 60 years is enough for an engineering discipline to form: we haven't done it long enough yet. We know a few things already - lots of things, in fact - that things may have states, that the category system of Aristotle can be effectively used in programming (read: classes), and that the look [view] and behaviour [model] should have a connection [controller]. We also know a lot about the forms of data and the usual forms of applications.
It IS normal that at the beginning of a new engineering discipline, some perceive it as an art form. Still, it's not the case.
- Engineering, as it is, is about the socio-technical system, not the artifact itself: architecture cannot be taught without thinking of the community of people using the building as an artifact, for an extended period of time
- Everything once built, even in thought, is deterministic. Have you ever tried to convert a Sherlock Holmes novel to, let's say, a Danielle Steel-style romantic short story? There are three things to face: 1) the world is deterministic; 2) even if it's deterministic, you cannot prepare for everything for an extended amount of time (Buckingham Palace did not have toilets, Win95 did not have mp3 support); 3) even if you cannot prepare for everything, there are things that you can take for granted (a modern building needs a toilet, a webshop needs a buy button). It is the thing built, and not the world, that is deterministic, exactly because it was created.
- Engineering process cannot be automated, as engineering is about building usable art for humans.
- Technology has mechanics; communities have mechanics. Engineering is not about deciding whether the world is predictable or not: it is about making predictions and assuming they'll be invariant for some time
- Mass production cannot be done without predictions. You say software is mass produced: true, so may we have predictions then? If not, how do you think software should be built in such numbers?
- The map is not the landscape: the model of the episodic/isolated project is not how projects behave. But it's easier to talk about a map than to try to show people around the landscape.
- If AM is not self-conscious, what have you written, and what do you expect us to recognize?
- Construction of buildings isn't controlled: it is coordinated. So is software construction
- You cannot make a program without using science as a basis. It's really nice to see how media artists create primitive programs in Processing, yet they're hardly usable for anything, and they would have problems if the mathematics of the platforms beneath weren't thought out.
So, all in all, this is not an art. Doing it as an art only liberates you from deadlines and so on; who needs an artist with responsibility? Yet I don't find your self-liberation justified.
An engineer is responsible to humanity at large, and to the community [of users] he's working for at a small scale. It's even in the ACM's Software Engineering Code of Ethics.
Perhaps we should think about engineering more openly, rather than have preconceptions of it.
projects rarely fail to address the problem, not fail to build
Of course, the people who are blamed are the delivery teams rather than the steering committees and sponsors, because they don't understand their own roles in driving the project toward failure and coincidentally are the ones who get to define success after the fact, decide on the root cause and pass judgement. The delivery teams are technical and lack training in commercial management, so they try to think of technical solutions to the problem and we end up with these kinds of discussions.
That's not to say this discussion isn't worthwhile; it's just delusional to think an improvement in the space discussed will dramatically affect the statistics presented at the start of this article.
Tan Hui Onn
but the software industry has been improving since then.
“This year (2009)’s results show a marked decrease in project success rates, with 32% of all projects succeeding which are delivered on time, on budget, with required features and functions” says Jim Johnson, chairman of The Standish Group, “44% were challenged which are late, over budget, and/or with less than the required features and functions and 24% failed which are cancelled prior to completion or delivered and never used.”
Especially if such numbers are cited, I would love to see a proven improvement from any proposed course of action.
I think it's time to start being more careful about using the CHAOS report figures to scare ourselves into thinking that we must be working in a dark and filthy medieval dungeon of an industry.
If you dig into the Standish Group's programme for promoting and gathering data, it looks more and more like the Church of Scientology for software people: you pay up at every new level you attain to get more advice and the secret behind their data. That data cannot be verified other than by running your own comparative study, since it is cleansed, anonymized, aggregated, interpreted, and not published - not to mention that the participants get paid to tell them what they want to hear.
1. Look at these statistics from the Standish group!
2. Therefore, here is how to fix it!!!!
The hacker ethic
May I add this to my collection of theory-building articles?
I'm collecting significant responses to, as well as academic commentary on, Naur's paper in a PDF of my own (Discuss programming as theory building). May I have your permission to reprint your essay within that PDF, for noncommercial study? (For example, Alistair gave me permission to include his appendix which discusses programming as theory building.)
SE vs AM - ?, should be SE + AM
(talk about using the SOFTWARE DEVELOPMENT IS WAR metaphor a la Lakoff & Johnson :)
Now, I don't fall into either camp for business software development - SE or AM (and I agree, we need to make a distinction here: software development of hardware drivers and of other deterministic, non-human-centric systems does look more like an engineering discipline where the "traditional" SE principles would apply without much argument).
Rather, I think the truth is in between: software development combines both Art and Science, and the key to its ultimate success is the yet-undiscovered-and-still-being-sought perfect balance.
What we have de-facto is a weird and inconsistent mix of elements of SE and AM (another Theory), which cannot be explained, expressed or formalized in any rational way (or there's little attempt to do so).
I also know that taking sides is dangerous and if one goes too far on either side of this dynamic equilibrium, one would tip the scales and fall.
Too much disregard for the engineering principles and formalities of software development is often at odds with the structure and workings of the modern enterprise (which is often very process-centric and disciplined), including mundane things like budgeting, project timelines, etc. We still live in a real capitalist world that has its own laws and constraints, which are universal whether one develops software or develops a piece of land for construction.
OTOH, too much preoccupation with the engineering aspects and formal SDLC methodologies leads to stillborn software that is not what business users wanted - rigid, not adaptable to evolving business needs, millions of LOC, and maintenance nightmares (regardless of how many tons of documentation are produced as a result).
I am not sure we need a revolution here or there is ever going to be a revolution, though.
I come from a country that once made a revolution and tried to reinvent the world. We all know how it ended and how much (and how many) was sacrificed.
I think we need to be more gentle here and look for synergies - and, perhaps, borrow from the success stories of traditional engineering disciplines when they are able to produce quality products without resorting to smoke and mirrors and other paranormal activities, while keeping in tune with the ways modern business develops, how modern humans behave, and what humans want.
I would like to see some synthesis rather than a contradiction.
We'll see whether I write an article of my own one day.
Although I must say I am a busy software practitioner (a "software GP") rather than an academic/scientist, so I have not read even a quarter of those excellent books by the "patriarchs" like Naur, Beck, Kay et al., and I don't have a lot of knowledge baggage to fall back on (although I have lots and lots of experience baggage... too much to take on a plane!).
Once again - a brilliant and thought provoking article!
Re: reply to all
I agree with your consideration about objects, but I understand where the opponents are coming from.
This is because the terms 'objects' and 'object oriented' are often perceived as having a bad connotation resulting from the disillusionments of the 90's, in particular OOP failing miserably in distributed computing.
Nothing wrong about the ideas, implementation was bad.
Objects and OOP are often associated with things like 'inheritance', 'tight coupling', etc., each of which gives seasoned developers shivers down the spine (myself included).
The industry has learned the lesson and introduced SOA concepts like that of a loosely coupled and autonomous "service" which has a defined contract and behaviour (i.e. knowledge) and is communicated with using messages.
From that perspective, the SO "service" is in many ways similar to Smalltalk's object, but it's far removed from the C++ "object/class" paradigm.
What I absolutely agree on is that we need a consistent way of breaking down the complexity of the whole system into smaller and simpler parts with inherent internal behaviours but adaptable "interfaces" - sort of like humans: we're all complex inside, but we can communicate with and adapt to each other (well, most of us, most of the time :).
What we call those "parts" - components, objects, services, etc. - in the end doesn't matter.
The roots are in Smalltalk.
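That parallel - a service with a defined contract and behaviour, communicated with via messages - can be sketched with Python's structural typing (a minimal illustration; the service name and the messages are invented for this example):

```python
from typing import Protocol

class OrderService(Protocol):
    """The published contract: callers depend only on this message shape."""
    def handle(self, message: str, payload: dict) -> dict: ...

class InMemoryOrderService:
    """One autonomous implementation; its internals are invisible to callers."""
    def __init__(self):
        self._orders = {}

    def handle(self, message: str, payload: dict) -> dict:
        if message == "place_order":
            order_id = len(self._orders) + 1
            self._orders[order_id] = payload
            return {"status": "ok", "order_id": order_id}
        return {"status": "error", "reason": f"unknown message '{message}'"}

def client(service: OrderService) -> dict:
    # The client is loosely coupled: any object honoring the contract works.
    return service.handle("place_order", {"item": "widget", "qty": 2})

print(client(InMemoryOrderService()))
```

Like a Smalltalk object (and unlike a C++ class hierarchy), the client depends on the message contract alone; swapping in a remote or persistent implementation would not change a line of calling code.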
So, even if we wanted to get wildly optimistic and say that 50% of our projects succeed, 40% are challenged, and only 10% fail (my own belief is that the breakdown is more like 20/60/20), that still means business is throwing away roughly $70 billion a year.
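For what it's worth, one back-of-envelope route to a figure of that order, starting from the $250 billion annual spend cited in the statistics earlier (the waste fractions below are assumptions made for this sketch, not the author's):

```python
# Back-of-envelope reconstruction of the "roughly $70 billion" figure.
# ASSUMPTIONS (not from the source): failed projects lose 100% of their
# spend; challenged projects waste 45% of theirs.
total_spend = 250e9          # annual spend cited in the statistics above
failed_share = 0.10          # the "wildly optimistic" split: 50/40/10
challenged_share = 0.40
challenged_waste = 0.45      # assumed fraction of a challenged budget wasted

waste = total_spend * (failed_share * 1.0 + challenged_share * challenged_waste)
print(f"estimated annual waste: ${waste / 1e9:.0f} billion")
```

Under these assumed fractions the estimate lands near $70 billion; different waste assumptions would move the total, but any plausible choice keeps it in the tens of billions.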
Re: On the defense of SE
First, Engineering needs no defense; results speak for themselves. However, there are limits to what can be engineered. No one (except a long line of failed utopians) seriously believes you could 'engineer' a society or a culture or a biological ecosystem. It is my belief that "applications software", because it is integrally tied into systems that are inherently not amenable to being engineered, will share that trait.
Second, the world has not been deterministic since the advent of quantum mechanics. Most natural systems have long been recognized as non-deterministic. This does not mean that you cannot have some degree of predictability - but it will be statistical, not algebraic.
Third, even engineers recognize that understanding complex and wicked problems and finding solutions to them requires a different kind of thinking than that required to faithfully implement the solutions once found. This different kind of thinking is NOT grounded in Aristotle, Rationalism (Descartes, Leibniz), or the Scientific Method (which probably does not exist except as a post facto justification). This kind of thinking is more often found in Art and Design than in engineering or science.
Re: May I add this to my collection of theory-building articles?
Core competencies align with your Theory
He stated that "when you look at an organization’s core competencies as its most valuable resources, you can begin to think of learning, creating strategy, and innovation as parts of a single long journey. The journey is iterative, interactive, and full of small steps. Nobody gets a big aha one day. Instead, there is searching; there are missteps, experiments, and doubt."
Article is at:
Are you pulling a Sokal?
I have long held that programming is a research and design activity, rather than a process of mechanical assembly. You can predict in advance how long it will take to lay 10,000 bricks. You cannot predict how long it will take to figure out the right way to write an algorithm to solve a problem you haven't seen before.
The problem, of course, is that business needs (or thinks it needs) predictable timelines. But good businesses manage R&D facilities all the time. So it should be possible to manage development as an R&D effort! It makes things a lot harder to plan--the marketing program has to be just as agile as the developers, for example. But it's a model that makes a lot more sense.
Re: projects rarely fail to address the problem, not fail to build
I believe Agile is having great success when adopted as intended because it solves the problem by recognizing/understanding the relationship between requirements+interpretation (specs) and delivery. And the solution to that problem targets both the creator(s) and the worker(s) by proposing 1) cross-functional teams to define the product and 2) self-organizing teams to build/design and deliver the product.
It would be interesting to have a more precise differentiation of project failures based on:
- did the development team understand the specs and accepted the deadline and failed to meet the delivery time?
- did the development team understand the specs but fail to estimate properly?
- did the development team fail to understand the specs and delivered the wrong product?
- did the specs fail to reflect the requirements but the team delivered the right product?
In other words, it is important to point out the source of the failure in the statistics.
Anatole Tresch Mar 03, 2015