
IT Values Technologies Over Thought

by Mark Little on Jul 08, 2012

Steve Jones, from Capgemini, has written a lot over the years on various aspects of SOA, REST and IT. Recently, in an article titled 'Thinking is dead', he has written about how he believes that IT values technologies over thought. But what exactly does he mean by this? Steve starts off by citing Anne Thomas Manes' article from 2009, in which she declared SOA dead, and goes on to state:

The value of 'thought' and thinking in IT has diminished, in a way that mirrors society at large, to the stage where design, planning, architecture and anything else other than just banging away at a keyboard appear to have been relegated behind opinions and statements as fact.

To illustrate this, Steve looks at REST over the past few years. As we've reported several times in the past, Steve has been critical of the hype surrounding it. Whether it's asking whether or not REST is really successful in the enterprise, or if there are fundamental issues with the way in which it has been sold to the IT community, Steve's opinions on it are best summarised in his own words:

So the last 5 years have been poor ones for enterprise IT. WS-* is the only viable system to system integration mechanism for large scale programmes, but its stagnating. REST has some cool stuff at the front end and for people who can invest in only the very highest calibre individuals but is delivering bugger all for the average enterprise environment.

He believes that this shows that, whether or not it makes good architectural or implementation sense, the latest cool thing on the hype curve is likely to get more attention than the more mundane, tried-and-tested approaches that would likely have had a far more immediate impact on the business. And it is not just with REST that this has occurred. Steve believes there are similar issues around Big Data and Hadoop adoption.

The massive amount of information growth is complemented by an equally large amount of bullshit and a huge absence of critical thinking.  What is the major barrier to things like Hadoop?  "Its lack of real time ability" I hear people cry.  Really?  You don't think its that we have millions of people out there who think in a SQL Relational way and who are really going to struggle thinking in a non relational non-SQL type of way?  You don't think that the major issue is actually in the acquisition and filtering of that information and the construction of the very complex analytics that are required to ensure that people don't go pattern matching in the chaos and find what they are looking for.

From his experience, Steve is seeing planning, architecture and design within IT ignored or given a bad reputation, despite the weight of evidence behind the success of things like TDD and contract design. As the article states, the adoption of new and unproven technologies based solely on their hyped expectations, in preference to approaches that have proven themselves time and again but don't have the associated "twitterati in thrall", is rife in the industry.

'Experts' in this arena has come to mean 'people who shout loudly' in a similar manner to US politics.  Facts, reason and worst of all experience are considered to be practically a disadvantage when being an expert in this environment.

This is something we have seen before with, say, REST, where arguments for it have sometimes been based solely on "shouting", as Steve puts it, and less on rational and logical discussion. And it seems that Steve has found himself on the receiving end of just such an argument:

I was recently in formed (sic) that my opinion on a technology was 'tainted' as I'd used multiple other technologies that competed with it and therefore was 'biased against it'.  I'd also used the technology in question and found that it was quite frankly rubbish.  Sure the base coding stuff was ok but when I looked at the tooling, ecosystem and training available from those competitors I just couldn't recommend something that would actually work for the client.  Experience and knowledge are not bias, thinking and being critical of new approaches is not a bad thing, thinking is not dirty.

As a result of all of this, he believes that design and architecture are disappearing skills, with critical (scientific) assessments replaced by "shouty fanaticism".

The focus of shiny technology over business outcomes and the focus of short term coding over long term design will ensure that IT departments get broken up and business folks treat IT as a commodity in an ever growing way. Thinking, design, planning, architecture and being skeptical on new technologies is the only hope for IT to remain relevant.

One of the commenters on Steve's article believes that we are seeing a relatively new wave where every new technology is deemed to be the silver bullet to solve all IT problems, and that the reason for this is that IT today is driven too much by people in "suits". However, Steve believes it is worse than that and the core members of IT, such as developers, architects etc. are not really thinking:

I wish it was just the suits, the real issue is that too much of IT is being delivered by people who think that formalism and rigour is a bad thing and that the important thing is that they need to be 'shiny'.

If the people delivering the implementations that are supposed to be solutions to business problems aren't looking beyond the hype and considering alternatives, especially when those alternatives may have been tried and tested for many years, then we are in for some very interesting times ahead. But maybe Steve is wrong? Perhaps this is an issue limited to his engagements and not that widespread?

Community comments

Offering hype while being skeptical of hype by Dean Schulze

Where is "the weight of evidence behind the success of things like TDD"? The only metrics I've seen are described here:

www.infoq.com/news/2009/03/TDD-Improves-Quality

"The pre-release defect density of the four products, measured as defects per thousand lines of code, decreased between 40% and 90% relative to the projects that did not use TDD. The teams' management reported subjectively a 15–35% increase in initial development time for the teams using TDD, though the teams agreed that this was offset by reduced maintenance costs."

One person pointed out that the study isn't valid because it doesn't determine what would happen on the non-TDD projects if they invested that extra 15-35% of project time into QA. The last sentence, about reduced maintenance costs, is speculation.

There is a video from the recent Norwegian Developers Conference where a group of agile luminaries are talking, and Uncle Bob himself admits that no one has any metrics to show that agile works. (If anyone has that link, I would appreciate it if you would post it.)

TDD has been offered up as a silver bullet, but so far the results haven't lived up to the hype.

The lone voices in a sea of noise. by Ethar Alali

I love this guy!

Do I think this is the norm in industry? I have to say, in my experience, the answer is yes.

I work as a consultant in agile, lean and more heavyweight software engineering spaces (specifically UML and RUP). I work in both development and architecture roles, and having been through a number of organisations at different points on their agile 'journey', I see a lot of tech used to justify the lack of thinking or, worse, of needing to think. Many cite TDD as something that will solve all the problems, but don't actually know what 100% (or even adequate) coverage is. So they try to move to continuous deployment too early, and it causes them problems as bad as or worse than chaotic development. This makes already nervous businesses even more nervous, cuts back on the slack given to the developers, and sometimes causes the business to take control of the development capability within the organisation.

Unfortunately, agile methods were introduced by developers for developers, and appealed to their innate hatred of thinking outside their comfort zone, and especially to their hatred of modelling, documenting and design. The agile manifesto is more often than not misquoted and used to justify the eradication of modelling and methods (and, in a sense, thinking), yet often the justification for the use of a technology is that it will make the teams more agile... which makes it a 'tool' in the sense of "...processes and tools". Quite ironic, I feel. The bottom line is that it was introduced to a market that was predisposed to it, and so has been taken on by developers who are really just programmers, as well as by the much rarer breed of lean thinkers (who actually do it right, but that is very rare in my experience), and everything in between.

The irony is that if you want to be empowered to do the job and make the company trust you to do the job, you have to have MORE discipline and not less!

As for the evidence for the success of TDD: a comparison of agile versus heavyweight methods, published on the Agile Alliance website, presented an observational study of the structure of organisations where agile works best and also where more heavyweight methods work best. Some companies are not cut out for it, for reasons such as geography, the amount of legacy code etc., and an attempt to introduce it into such organisations will be catastrophic at best and generate a lot of waste.

Then there is something that agilists themselves hold in high regard (but don't really understand), and that is lean. It takes no thinking to be agile, whereas the oft-quoted lean ideals require at least some thinking to become a reality. I would love to see a company claiming to be 'lean' that has made it without measuring its waste. That would be an impossible task.

IT is not about technology anymore, it's about the people doing it by Adam Nemeth

Well, to start with an inconvenient fact:

Most of WS-* is just the thoughtless bullshit the author is claiming against.

However, I agree with the notion that IT has created a kind of "technology politics", in which the winner is whoever can gain the best emotional support, no matter how well it fits the actual problems it ought to solve.

I also agree that architects and software designers aren't needed anymore; I feel it on my own skin, being unable to find a job and trying to start a startup in a specific (non-technical) domain instead. My job isn't needed anymore; it was replaced by people with an equal right to shout whatever they had read in some blog or tweet yesterday. And it's really hard to argue with them rationally.

For most people, IT is too hard to see clearly through. Our solution as a society seems to be to accept this as a fact, and put an emphasis on democratic collaboration instead of understanding problems rationally, in hope of the "wisdom of the crowds".

We know, of course, that this approach has its drawbacks (sometimes it produces utter failures), but on the other hand, everyone in the team feels good.


I don't know what the solution is. I don't fit into today's IT society, as I can't argue with believers: I don't care what you believe about Spring, or Agile, or TDD. Please do argue, and argue with proofs, with support, with anything, but show me how it will solve our problems right now.

That's why I basically stepped out and am trying to start my own startup, despite the fact that I was an architect.

Arguing with users is always different: they usually do have a problem. Sometimes they don't; they have just heard that technology X / Agile / Lean / whatever is cool, and then I'm also helpless, I'm sorry.

How to cope with that?

IT is more and more about people, and all of this doesn't matter. And IT is more and more about the people doing it, not the people who are supposed to benefit from it: does not creating any useful documentation make the users' lives easier? I disagree, but it certainly relieves IT personnel of a boring task which they don't like.

Perhaps we're already past the singularity, and we can't understand our own computers anymore. And then any charlatan with no education, knowledge or ability to think logically is just as good as I am, because it doesn't matter: we can't know anything. At least, everything points in this direction.

And it seems I have to look for a different profession, despite understanding most of it...

In defense of REST as a "thoughtful approach" by Adam Nemeth

(and to support my claim about WS-*)

Personally, I like REST, as it starts with developing every single protocol nearly from scratch, and thinking it through. Oh, I'm pretty aware that this doesn't scale well in one sense, but on the other hand, it scales considerably well in another: it works for the most idiotic web developer you can find as well as for architects at Amazon or Google.

For me, REST is not about all the bullshit RESTafarians pile on Fielding's PhD dissertation, like how to use HTTP verbs and whatnot, but rather about understanding the drawbacks and benefits, the design decisions and the patterns behind HTTP, and how to build on top of those, every single time from the start.
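As a purely illustrative sketch of what "building on the patterns behind HTTP" can mean, here is a minimal Python example that honours ETags so clients and intermediaries can revalidate a representation cheaply; the resource, handler name and payload are hypothetical, not from the article or the comment:

```python
import hashlib
from http.server import BaseHTTPRequestHandler

# A hypothetical resource representation.
RESOURCE = b'{"id": 42, "status": "active"}'

def etag_for(body):
    # A strong ETag derived from the representation itself, so it
    # changes exactly when the representation changes.
    return '"%s"' % hashlib.sha256(body).hexdigest()[:16]

class OrderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        tag = etag_for(RESOURCE)
        # If the client already holds this representation, say so:
        # a 304 costs a few header bytes instead of the full body.
        if self.headers.get("If-None-Match") == tag:
            self.send_response(304)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("ETag", tag)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(RESOURCE)
```

The point is not the handler itself but the habit it reflects: using a caching and revalidation mechanism HTTP already defines, instead of reinventing it per application.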

With WS-*, either you were unsure what happened inside, which, considering its performance and security implications, wasn't necessarily a good place to start, or you did know, in which case either you or the vendor you employed decided to implement things differently to support a given task better. That brought you to the same "design from scratch" table as REST does, except that now you had at least two problems: the problem of your domain, and the problems with your WS-* implementation, usually together with a bunch of mysterious bugs.

Adam, FWIW, I feel your pain :-) by Ethar Alali

You are absolutely right. I made these arguments back in 2002, as part of an email tennis game to Ron Jeffries. I hate to say that this has come to pass.

This has been compounded by the fact that development frameworks/platforms such as .NET and Java hide a lot of the detail from developers. So, as you say, it can allow people into the industry who do not have the skills to carry out the role beyond throwing code at a problem (a million monkeys, a million typewriters, and a productivity tool such as CodeRush/ReSharper).

You are also right that there is no consideration of trade-offs by developers. They see shiny, they see more fun programming, and they introduce it into systems for that reason: because it is fun. Even keeping the code tidy, which I agree is important, is often a suboptimal decision that can fail a system when inspected at the enterprise level.

Take IoC/Dependency Injection containers, for example. They are a good idea, I agree, but the stated reason for introducing an IoC framework (they 'shout') is to reduce coupling. However, they seem to ignore the fact that introducing the IoC framework actually couples them to the framework (and hence the deployment process needs to take it into account, and maybe a shared repository needs to 'store' it). So the benefit is only drawn once you get past the point at which the reduction in coupling it gives is offset by the coupling to the framework, and that is not to mention the difficulties in testing with some IoC containers. However, some developers these days don't measure coupling unless it interests them (and they don't even like the idea of drawing up 'as-built' models to actually 'see' it, so they don't even know there is a problem *facepalm*).
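The coupling trade-off described above can be sketched without naming any particular container. In this hypothetical Python example (all class names invented for illustration), the domain class receives its dependency through the constructor and never imports or registers with a framework, so it can be wired by hand, by any container, or by a test:

```python
class SmtpMailer:
    # A real transport would live here; for the sketch it just pretends.
    def send(self, to, body):
        return "sent to %s: %s" % (to, body)

class WelcomeService:
    # The dependency arrives through the constructor; WelcomeService has
    # no import of, or registration with, any IoC framework, so nothing
    # couples it to a container.
    def __init__(self, mailer):
        self.mailer = mailer

    def greet(self, user):
        return self.mailer.send(user, "welcome!")

# "Wiring by hand" at a composition root: the one place that knows how
# the object graph fits together. Swapping in a fake for a test needs
# no container either.
class FakeMailer:
    def __init__(self):
        self.sent = []

    def send(self, to, body):
        self.sent.append((to, body))
        return "recorded"

service = WelcomeService(FakeMailer())
print(service.greet("alice"))  # prints "recorded"
```

The design choice is that only the composition root knows about wiring; adopting a container later changes that one place, whereas registering every class with the container spreads the framework dependency everywhere, which is exactly the hidden coupling the comment is pointing at.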

Trade-off analysis has been around in many forms, as you are obviously aware. It was the reason for ATAM, and in some ways it is encouraged in the TOGAF process. Measurement of waste is something espoused by developers these days, but nobody knows what it means or how to do it. Agile is inherently wasteful, but it delivers in a chaotic environment. BDUF is wasteful only in the sense that you may be paying developers to sit around whilst a big, heavy architecture is being created (and it can thus introduce risk: if, say, 50% of the project time is taken up with architecture activities, there is now only 50% of the project time left to deal with issues should they arise, which is somewhat risky).

The choice of methods really depends upon the organisation one finds oneself in. Even Fowler suggests "If you can't change your organisation, change your organisation", which implies you should leave if the organisation is not listening or you hit a glass ceiling, and thereby that developers have the power to leave en masse if they wish. The fact of the matter is that I have found more resistance from developers than from anyone else, as I at least attempt to bridge gaps. The more passionate and capable the developers, the harder the inertia is to change. I sometimes just shut up and adapt when I know there are better ways, because, as you say, "the wisdom of the crowds" will out. Ironically, that often makes me more agile than the more intransigent teams, but [partially] joking aside, developers/architects leaving for that reason will find themselves leaving the industry, simply because all companies will be populated by the vast majority who really don't understand anything about engineering software.

It is something that requires a fundamental cultural shift. In the UK (as in the US) we will not get this right! Societal fundamentals mean that the inherently Japanese cultural ways of working will not port unmodified to this side of the world, so we have to transition into them. I personally have added Agile and Lean methods to my repertoire as an architect and developer, and when I go to a company I will decide which weight of method to use, if I am empowered to make that decision. If I see it is chaotic, with lots of collocated teams, then agile it is; otherwise, it is heavyweight. But as you say, the perceived 'need' for architects is dwindling, and fast, even though Kent Beck said in XP that there is a place for them.

Godspeed! :-)

Quote ... by Ejaz Mohammed

I recollect this quote ...

Nothing is wrong if most support it and nothing is right if most oppose it.

Quote: Indeed. by Ethar Alali

That quote is a double edged sword. A Barnum statement if you will.

We critics can use it to indicate that the vast majority, who don't understand or know other methods, will 'vote' against those who do (and with architecture that is easy; after all, in the UK developers outnumber architects by about 30 to 1). The supporters will use it to say that everyone is in favour of the method, so it must be right! This is especially true when developers talk to developers, as anyone else has zero credibility with them. However, they will gloss over the fact that most people with any influence in most companies are against Agile methods. It takes productive action to make it visible.

Also, a very non-committal statement I would say :-D

We are getting to a point in history where the first waves of senior developers have come through who have never experienced developing in more rigorous environments (including those of RUP with small cycles etc., which can outperform Agile methods when performed properly, since after a small spec for the iteration, the testing and development are written wholly in parallel). The bottom line is that the development world is effectively evolving the role of the architect out of existence. This was not the intention of the Agile manifesto, nor was it the intention of XP. It certainly will not suit all companies. Worse, the lack of talk about the failures of Agile projects and the lack of a development paper trail (indeed, even the accounting for developers is shielded from the business, as it is often presented simply as a summary) mean that there is no evidence to say Agile projects fail - only the fanfare and celebration when they succeed in that organisation.

Hence even if we did have some metrics resulting from companies, the reporting of these is very likely to be skewed by the lack of negative evidence.

Re: Adam, FWIW, I feel your pain :-) by Paul Beckford

Hi Guys,

I get the sentiment, but to hear you guys speak you would think that there was some golden age of "Software Engineering". Well I started programming professionally in 1990 and it definitely wasn't a golden age. What I remember of back then was big teams of 30+ developers suffering on 2 year plus death marches, that inevitably ended in a train wreck.

Regularly spending over a week trying to track down C++ bugs put paid to the idea that software development was a science :)

As for the idea of it being Engineering: well, I am a BEng (Computers and Communications) and a certified CMM project assessor, and I lived through the pretence that software is Engineering. I saw numerous projects implode under the weight of their own bureaucracy at the altar of so-called "process". The military (where these ideas come from) used to love this stuff, ending up with monstrosities like Ada and millions of dollars of taxpayers' money wasted.

If you go back and read the proceedings of the NATO conference on Software Engineering in 1968, you will see that the attendees were very unsure how to categorise software development. They knew it wasn't Engineering, but they lacked a more appropriate metaphor, so the "Engineering" label stuck. The rest is history.

People produce software. It's a craft. Good people produce good software... period. Oh, one other point: software development doesn't scale very well, so small teams improve the likelihood of success :)

One last point. The problem with the role of the so-called Architect, or any other role that doesn't include programming, is that you end up with a group whose design decisions are unvalidated. How do you know that your glossy UML diagram is actually going to work, never mind prove maintainable? You don't, yet many feel comfortable imposing half-baked BDUF onto others, who are left with the unenviable task of getting it to work! Hardly an egalitarian approach!

Which brings me to values. Good people with the right values, and human-scale organisation (7 +/- 2). That's what it takes, and that is what the essence of Agile is actually all about. But I agree: like everything else, Agile has become yet another product to flog and, as a result, yet another fad.



Paul.

We can't continue our journey in a car when we hit the shores of the ocean. by Andras Ludanyi

I agree that there is no critical thinking, but I would be very careful about other things, because we can actually miss the point (and focus on the wrong questions). Architecture? Design? We must first define the question and its boundaries. The question is not whether there is a need for architecture and design in agile (actually there is more architecture and design in agile, because it is "alive", iterative, and it changes all the time); the question is how we should do architecture and design. Everything at the beginning, or continuously to the end of the project? Does the project actually end? Because software has a lifecycle and it can live a long time... These are the questions we have to answer.

But this is not the real issue in IT, because this will be settled one way or another within the next few years. The real problem in IT today is that we are in uncharted territory (we hit the ocean and our car can't serve us anymore, despite the fact that it served us so well until now; we need a ship...). What I am talking about is the fact that until now IT was mostly about human-machine interaction. Now there will be many more machines than humans, and the majority of interaction will be between machines (it's all about automation: no human intervention, or very little at all). So the old way of forcing the machine to emulate human processes is over; what we need is software and technologies that will embrace and build on machine-specific capabilities, without trying to emulate the analogue processes we humans are used to working with.

This means that data processing, messaging etc. will also need to change. This means that it's finally time to use these multiprocessing, super-fast, large-memory machines as computers and not as human-being emulators. Shifting data from A to B is what we humans want, not what machines need; the "natural" way for machines is working with references and pointers, not copies; with snapshots and differences, not separate copies of data with little changes. Tracking is hard for us but easy for machines, which is why human processes are about minimising tracking and keeping relevant data as close to us as possible. We have short memories; machines don't. We can handle only a few items of information at any given time; machines are different...

Solving these sets of problems will be the real issue in the next 10 years. Finding people who are capable of understanding, designing and implementing these new patterns and algorithms will be the real challenge, especially after the last 10-20 years (CS departments are practically decimated and at most universities have become more or less insignificant, and most courses are significantly inferior to what they were before...)

There is a reason why "shouting" is the winner right now (just as there is a reason for that in politics), and that reason is not so much that we are not capable of handling the problems we have had up to now, but that we don't know how to handle the new problems... there is an ongoing transition and, to be honest, nobody knows how to handle this new world... yet.

Good Discussions by Faisal Waris

Very interesting and lively discussion.

I tend to agree with Paul and Andras most (but others make very valid points).

The main point I want to make is that software today is built on top of many other layers - many of which you don't control. These 'layers' can be:
  • persistence-management technologies (e.g. RDBMS)

  • UI technologies (e.g Web, Mobile, 'Apps')

  • other systems and services (i.e. distribution)

  • a long list of frameworks and libraries (open source and commercial)

  • a multitude of standards


The amalgamation of all these layers makes the software environment chaotic. I think today you have to build software experimentally (Agile) rather than empirically.

With COBOL, IMS and 'green screens' it was a different world and you could 'spec out' most of the software in advance.

Today, you can only partially specify software - the rest has to be 'discovered'.

Also, there are rapid developments (social, mobile, etc.) that we have yet to fully grapple with.

I for one am very hands-on and do like to try out any new technologies before establishing strong opinions about them (but it's impossible to 'test-drive' everything out there).

For the record, the SOA vs. 'REST' discussion (at my workplace) was resolved with due deliberation, resulting in appropriate guidelines for each.

Re: Adam, FWIW, I feel your pain :-) by Ethar Alali

Your comparison appears to be about the old waterfall vs. agile perspectives again, which is definitely not what I am getting at.

UML is a language for defining the static and dynamic elements of a model. Nothing more. A full 'spec' of the use case under discussion (including OCL elements attached to the UML diagram, and maybe standing alone) will allow you to validate your design without recourse to writing a single line of code. Not just that: after what I consider the point of 'just enough' design, you then parallelise the technical testing and development tasks, which you can run through an automated unit-testing framework. The spec in this case uses OCL, but it can be anything at any level: integration tests, BDD, whatever. The point is to define the contract at the boundaries and work from that (design by contract in the true sense; sorry to those who believe it is only about language-specific interface development, but there is more to it). When defining interactions between systems, you are defining a contract. When talking to a stored proc, that is similarly a contract. A class-to-class interaction is a contract. UX/UI designers experiment to define a contract. Of course, service/message/data contracts are contracts.
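The "contract at the boundaries" idea can be made concrete as executable assertions. OCL attaches such constraints to a UML model; the same pre- and postconditions can also be written directly in code, as in this hypothetical Python sketch (the function and its rules are invented purely to show the shape of a contract):

```python
def transfer(balance, amount):
    """Contract: amount must be positive and covered by the balance;
    the result equals balance - amount and is never negative."""
    # Preconditions: the caller's obligations at the boundary.
    assert amount > 0, "precondition: amount must be positive"
    assert amount <= balance, "precondition: insufficient balance"

    new_balance = balance - amount

    # Postcondition: the implementer's obligation back to the caller.
    assert new_balance == balance - amount and new_balance >= 0
    return new_balance
```

Either side of the boundary can now be implemented, or replaced, by a different team: a caller that violates a precondition fails loudly at the boundary, and an implementation that breaks the postcondition is caught there too, which is what makes the contract checkable rather than just documented.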

When the contracts are intra-team, there is no need to make this piece explicit; it can evolve, and it only makes sense in that context. But when it is not, scaling up across teams, where the contract boundaries exist, can be a massive problem if the coordination of the teams, or even the bigger picture, is not known.

Additionally, I happen to agree on the human-scale organisation point. However, systems and enterprises are much bigger than such human-scale groups can achieve on their own, especially when you factor in the computational influences that Andras Ludanyi mentions. The best development achievements 'back in the day' were already aware of what the lean crowd call Conway's law (and they were not Agilists in the pure sense, bear in mind). You divide your systems architecture by 'component' and assign the development of those components to individual, collocated, human-scale teams, to deliver to an agreed component contract which has been experimented with and played out. It required a coordinating role to do that and to reduce the large amount of rework otherwise required to make it happen.

I happen not to believe that architecture is a hierarchical position. It is unfortunate that it has been seen that way, but the reason has traditionally been a supply-and-demand issue (inflating salaries/rates) as much as a viewpoint among management that one needs to exist (maybe due to the parallels with building construction, which I happen to think are valid). An architect not able to write any code is not a very useful communicator to developers who do, but they can pair with those who can and are the experts, to afford that coordinating guidance. All these techniques existed way before the agile movement came about, and a whole lot more besides, but the 'besides' have often been thrown out.

So I put it to you: in terms of staff, what are the right values? What does that mean? Who defines it? Cockburn's agile scale of people? Who interprets that scale? The business, at the end of the day, needs a number. That number is the bottom line. Agility doesn't change the bottom-line costs. Even Cockburn has acknowledged that agility does not change the parameters of software cost in an organisation, which is a line I definitely take and which is worth repeating: it doesn't save money. What, in my experience, it does do is provide more value by delivering in increments (which even what agile people consider BDUF approaches actually do; consider TOGAF and RUP and their incremental delivery mechanisms). This brings delivery out of the chaos of changing requirements and effectively adds to the value stream for the same cost (over waterfall especially). So I personally recommend people use agile methods where their business is susceptible to changing direction very quickly. However, if they want to save money, or pivot the business quickly, they need agility at a much higher level than developers know or are concerned about (see Agile-EA for a discussion of this), and this almost always involves domains which are not standard IT system domains.

In any case, in order to get that saving of cost, you have to be lean. Agility on its own does not give you that, and the use of burn-up/down charts, cycle time, throughput, cumulative flow etc. is a wholly inadequate mechanism to track it in any level of detail if nobody is tracking the bigger picture.

Re: We can't continue our journey in a car when we hit the shores of the oc by Ethar Alali

    To answer Andras' point about uncharted territory. I agree that some areas of this are uncharted. However, those with specific programming expertise are not the best placed to transition into the new worlds.

    To stretch the analogy you used: your car mechanic has driven into the ocean. How good are his boat-building skills? At best you don't know; but put a formally trained mechanical engineer in the car, one who truly understands statics and dynamics, and you stand a much better chance of transitioning to (building) a boat, or a plane, or a rubber tube, if that is what you need. It is the reason why good agilists rate polymaths, but also why I rate those with mathematical backgrounds over those with computer science degrees any day (especially in this day and age).

    But then, I am biased. My undergrad is in software engineering whilst my postgrad is in applied maths (I did it the 'stupid way round', because I felt there was no rigour in CS). Platforms and software environments change so rapidly that universities simply can't keep up. Here in the UK, the demand for funding over recent years has also pushed universities to shift what they teach, resulting in the incredibly poor IT-related degrees you mentioned. Industry has been crying out for capable coders (and thinkers) as a result. The recent increase in tuition fees will make students more picky about what they choose, and given you can get into the industry without a degree, that will be the preferred choice, closing the door on CS degrees as a result. This will happen not because it is good engineering/science/craft, but out of necessity. That is what people are shouting about, not that it is better. Just as some have argued that the older dinosaurs protest their position, the new crowd (some of whom have never seen methods outside agile, or 'nothing') protest their superiority. Neither wants to be driven into the ground by the other, and both believe they are right. Indeed, when I was younger, I did.

    Re: Adam, FWIW, I feel your pain :-) by Adam Nemeth

    One last point. The problem with the role of the so-called Architect, or any other role that doesn't include programming, is that you end up with a group whose design decisions are unvalidated. How do you know that your glossy UML diagram is actually going to work, never mind prove maintainable? You don't, yet many feel comfortable imposing half-baked BDUF onto others, who are left with the unenviable task of getting it to work! Hardly an egalitarian approach!


    First off, I don't want to be egalitarian: the respect an architect commands is based on the fact that he's the best, or at least one of the best. This is not an equal situation. I don't want it to be an equal situation. Hierarchies are good, as long as they represent the right value system.

    And I believe that IT is full of people who shouldn't have a key role in decision-making: some people simply don't have the background. I'm sorry, but I won't argue with people who only read blog posts. Sometimes I have tried.

    And that's why I won't argue with you too much: if you can't test a design without actually building it, if you know no other way than to deploy a full application, then we're not on the same ground. Learn UX. Learn modeling. Learn anything; follow studies, conduct focus-group surveys, do prototyping. I mean, hey, if the only thing you can do is code things fully and then see it's a train wreck, you're a waterfall yourself.

    Those glossy UML diagrams are to be validated by the time they reach programmers. They are to be validated by business people who can't understand code, by previous successes, by mathematical models, by many, many more ways the unit-testing community can't even dream about. And yes, you can't validate a full-blown existing system: you can only validate whether its simplified model matches some criteria.

    Re: We can't continue our journey in a car when we hit the shores of the oc by Adam Nemeth

    Andras, I don't agree with you.

    While I understand your point, I think that computers exist to solve human needs. No matter what complex system you build, in the end it is supposed to solve existing problems for a community of people: it could be accounting, it could be some leisure activity, it could be communication, it could be banking, finance, healthcare, anything. At the end of the wire, there are and will always be humans.

    I expect this not to change.

    And I expect our lives to become tangled with computing, creating a kind of hybrid symbiosis: just as we're dependent on electricity, so are we more and more dependent on our computer networks. We're asking more and more complex questions, some of which would be unthinkable without our lives being tangled up in computers.

    As for teaching machines how to communicate: Hungarian CS education is perfect for backend work. It's much harder to find someone who actually understands that at the end of the wire there is a human, and that human has needs, has a problem to be solved. You just can't get a bunch of programmers to understand that their hyper-inter-connected-whatever backend architecture is sometimes failing human needs. That stupid user, the one who doesn't want to understand even a bit, a byte, a single line of it: that is the one who has a problem right now.

    Hungarian IT people are really good at telling machines how to talk to each other. We still haven't solved the part where these machines talk to humans, as some IT guys went into IT precisely because they are reluctant to do that as well.

    Re: Adam, FWIW, I feel your pain :-) by Paul Beckford

    I've been an Architect and I'm sure that there are many good ones. My point is that the best architects program too, since this is the only real way to validate their technology/design decisions. Incidentally, your point about architects being the best programmers isn't borne out in my experience. That would assume a meritocracy, which is seldom the case in many organisations; hence my point about values.

    Assuming that the Architects are the best programmers, my question would be why aren't they programming? How are the more junior programmers to learn, if not by example?

    When Michelangelo painted the Sistine Chapel, he didn't produce drawings and then leave his junior apprentices to it. No, he painted the most challenging parts himself whilst mentoring and coaching his less able students on the job. This is what I mean when I say what we do is more of a craft than engineering or science. We've chosen the wrong metaphor.

    Paul.

    Re: We can't continue our journey in a car when we hit the shores of the oc by Andras Ludanyi

    The problem is not that we use computers to solve human problems (that's normal); the problem is that we force the machine to be a human accountant. We don't use the true capabilities of the machine. We have to upgrade accounting (and not just accounting) to new levels, implementing it in ways humans can't even imagine, not in the way we have always done it. In reality the computer has replaced the human in many roles (it is a fast human emulator that can't make a mistake), while in fact it can be much more than that. Focus on human problems, but solve them in new, innovative ways, ways that we humans can't emulate (not even at a very slow pace).

    What I am trying to say is that there is way too much MANUAL work which could be automated, especially if we adapted the process to the machines, not to humans (which must be the case today because of the manual work involved). We don't automate 10% of the things which could be automated. The "old" UNIX guys automated much more than we automate today, exactly because their mindset was machine-oriented... modern software is way behind hardware capabilities.

    Re: We can't continue our journey in a car when we hit the shores of the oc by Andras Ludanyi

    I would also like to add that if we really want to move forward, we need to accept that most of the things machines will do, they will do without human interaction. If we insist on using machines in a way where we track and know everything they do, we won't be able to do much: our numbers and capabilities are limited, machines' are not, and we can scale machines at a very high rate. So why do we limit the capabilities of the hardware with software made in a way that we humans can understand and use, instead of in a way which optimally uses the hardware and communications infrastructure?

    There is only one little part of the software stack which MUST be designed with humans in mind, and that is the USER INTERFACE. All the other layers in the stack, if made with humans in mind, are a waste of machine time and capability and, what is even worse, a waste of human time as well.

    Re: Adam, FWIW, I feel your pain :-) by Paul Beckford

    Hi Ethar,

    Have you heard of emergent design? The formal planned "top down" approach you describe is one way of organising and scaling software development, but it is not the only way.

    Agile started life amongst the Smalltalk community, who had developed a culture of using rapid feedback to validate design/architectural decision-making. This emergent approach to design can also work at scale. It places more emphasis on bottom-up decision-making rather than top-down orchestration. The problem it faces is that it often finds itself at odds with enterprise management culture.

    So at root we are talking about a people problem. The Smalltalk community showed that bottom-up emergent design using feedback is technically preferable to speculative BDUF. The challenge is to get development organisations to organise into small teams (7 +/- 2), aligned with the business, and working on small projects. The business-aligned teams release quickly to gain real-world feedback, and collaborate to allow enterprise architectures to emerge over time in response to market and business needs.

    This type of emergence is much better suited to the fast pace of business change today. Have you ever seen a top-down, enterprise-wide IT strategy reach maturity? I haven't. Long before that happens, the favoured technology of the day is no longer hip, or the business has restructured or merged with another with a conflicting IT approach.


    Paul.

    Yes, of course I have heard of emergent design... by Ethar Alali

    Hi Paul,

    ...and yes, I know that 'top down' is not the only way it can be done (hence I added that I would advise people to use agile methods/modelling in situations where the enterprise landscape is not well understood or is chaotic). It isn't really top down either, as the delivery of use cases is done in iterations (I'm not sure you quite appreciate RUP to that degree).

    Emergent/middle-out architectures often necessitate the introduction of much the same mechanisms that drive other types of architectural concern, and indeed development concepts (such as low, thin coupling between systems; composition of, say, contracts instead of extending existing contracts; etc.).

    I am TOGAF 9 certified. Even within that, there is only a nod to the focus on longer-term strategy. The ADM is iterative in nature and aims to deliver the strategy through incremental delivery of capability (through projects and transition architectures). The bottom-up approach you mentioned is again predicated on the assumption of BDUF (which, yet again I hasten to add, I didn't advocate; but I appreciate it is convenient to overlook this from the perspective of planting the agile flag :).

    What you have said about aligning small teams with the business domains is effectively applying Conway's law. How does one small project then communicate with another to evolve a shared interface while maintaining 'symmetry of conformance', say? Does the organisation have to purchase/learn a large ESB tool to allow that decoupling, and do the work to develop a CDM? If so, that is an architectural concern and favours processes and tools over individuals and interactions (which is against an agile principle). Also, who coordinates it?

    You can certainly be emergent without architecture. However, the rework is simply a shifting of costs that would otherwise have been incurred at other times in the lifetime of a project (using other methods), for the benefit of delivering something into the value stream. Such development is certainly not lean by default. More often than not it increases the combinatorial complexity of systems, potentially duplicates interfaces and, as a result, increases 'architectural rework' (which often affects several teams), and so the agility benefits of these smaller teams are lost in the effects of architectural rework, which is worse.

    BTW, a note about your definition. I will assume you mean 'enterprise solution architecture', as enterprise architecture is more often than not, also concerned with delivering the related business capabilities too (depending on the value stream).

    I am not 100% anti-agile. I am 100% against the mindset that seems to have evolved from it, that will (at this stage) prevent the vast majority of staff from stepping up to leaner practices.

    Cheers


    E

    Re: We can't continue our journey in a car when we hit the shores of the oc by Ethar Alali

    This is an interesting point. Are you an accountant? If you can't do the accounting, how do you get a computer to do it? Are you willing to wait for a genetic algorithm to evolve a program to do it? If so, and you do get a computer to do accounting, how do you then deal with any mismatch between your accounting process and what governments and regulatory authorities require? Do they then have to adopt your model? What about other companies?

    OK, say the government can evolve its accounting process. That means you have to adhere to it, in which case you have a contract to work with, which is an architectural concern and gives you acceptance criteria to 'evolve to'. So that doesn't really wash.

    Re: Yes, of course I have heard of emergent design... by Paul Beckford

    Hi Ethar,

    Sorry I wasn't clear. I meant emergent architecture along with emergent design. I also agree with you that Agile isn't a silver bullet. You mentioned Alistair Cockburn; well, his Crystal series of methodologies varies the amount of formal documentation with team size, so I agree, one size doesn't fit all.

    I guess what I'm suggesting is that an Agile mindset may fit more situations than you may think (not knowing what you think, I can only guess :)).

    The biggest obstacle I find is the organisational culture, and the unwillingness to give teams autonomy. You suggest a number of technical solutions which would make teams and the business units they represent more autonomous (loose coupling).

    This autonomy leads to greater flexibility and adaptability to change. An example of this is the success of the web which has emerged bottom up, versus the many top down planned public networks that once existed, but are no more today.

    Paul.

    Re: Adam, FWIW, I feel your pain :-) by Adam Nemeth


    When Michelangelo painted the Sistine Chapel, he didn't produce drawings and then leave his junior apprentices to it. No, he painted the most challenging parts himself whilst mentoring and coaching his less able students on the job. This is what I mean when I say what we do is more of a craft than engineering or science. We've chosen the wrong metaphor.


    Of course, I also did the heavy lifting in most of my projects, unless there were experts on the topic (e.g. if I have a UX expert, I just check sometimes that everything is OK, but don't do the full UX myself; or if I have a security expert, I just hand over all the info he needs). My role is to be a generalist, but I have to make sure the difficult parts are met.

    And most of my time as an architect was spent coaching: showing junior programmers how to do MVC correctly, explaining why the continuous integration tool complained about their code, why it is still bad despite running, what kind of changes we expect based on the project's historical data or well-known strategic plans, and, in some cases, demonstrating how to write the part correctly, while doing the UML dance with everyone as a visualisation tool.

    Then Agile kicked in, and sooner or later the newly arrived junior devs are crying out to remove style checking from the tools, or they're promoted to senior based solely on time served, etc.

    So I did some one-man shows, doing everything from spec to implementation: not just being the architect, but the sole developer, the UX guy, the project manager, everything that wasn't domain expertise. It's doable; it's just slow and energy-consuming, of course. I did ask for code reviews, and despite the fact that I hadn't written full apps for about five years, they still said it was some of the most beautiful code they'd seen, so I guess I'm not that bad at programming (and the guys who said this were from different teams and listen to punk music, if you know what I mean). I even gave it to "business people", the users, the domain experts; with the semicolons and certain type/visibility information removed, they understood it too.

    It's just, you know, that I hate programming: it's simply boring for me. I do it, of course, especially when it comes to complex tasks, and I regularly do open-source projects and educational code (and, when Agile kicked in, the whole of my codebase), but I just hate it; it's like being a typist.

    I was never an overdesigner. Of course, everyone I met who heard I like to work with UML assumed that I'm one, but that's not true: even in UML it was just enough design to make sure everything works and will work for the foreseeable future. I don't have these GeneratorLocatorFactoryLocatorStrategyTemplateFactory-whatever classes (like Spring Social has), and I try to avoid certain Java libs like the plague.

    It's just that I'm getting bored with problems like "ah, I have to wrap this in a decorator", or "I have to type in about the same amount of code again as unit tests in order to reach 80-85% coverage, only to find out there were no bugs so far". (I do the test-first dance in UML; why do I need to do it in code as well? It's just time-consuming, boring, slow.)

    Yes, with UML you can only be 95% sure that what you wrote is correct, as it does hide the details: but that's why it takes 10 percent of the time! And no, I wouldn't generate code from UML: I'd end up back where I started, because code generation is just programming.

    So, that's why I hate the agile world and try to stay an architect at all costs: first, because I still understand modeling; second, because 90% of all codebases are below my boredom line; and third, because I'm a narcissist dick, and I don't believe that I really have to argue with people who came in from the streets and were accepted onto the project by someone else.

    I do argue with customers, I do communicate with them; it was never a problem. The problem is always with devs: the users have a problem to be solved, while the devs just want to get their toys in, and sometimes this hinders us too much, especially when their toy is a full framework or programming language.

    Re: Adam, FWIW, I feel your pain :-) by Ethar Alali

    Yes, exactly! Devs who add the 'shiny', a new framework or new language, increase the potential for error as much as writing code does (indeed, usually more so) and, as I mentioned previously, couple the team to the framework. Additionally, UML validation (especially where someone has a maths background and runs with the OCL) can validate the design in minutes. Whilst a computer can validate code in seconds, you first have to write the code and produce the artefact before you can check it is correct. You can get validators that work on specs as well, thereby validating the analysis/design before it even leaves the BA's hands.
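    The OCL point can be illustrated with a small, hypothetical sketch (the model, invariant and function names below are my own invention, not any real OCL toolchain): an invariant such as `context Transfer inv: amount > 0` can be checked against candidate instances of a model long before any production code exists.

    ```python
    # Hypothetical sketch of OCL-style invariant checking against a model.
    # An OCL invariant like "context Transfer inv: amount > 0" expressed as a
    # predicate keyed by the model class it constrains:
    invariants = {
        "Transfer": lambda inst: inst["amount"] > 0,
    }

    def validate(cls, instance):
        """Check the invariant registered for a model class, if any."""
        check = invariants.get(cls)
        return check is None or check(instance)

    print(validate("Transfer", {"amount": 50}))   # True
    print(validate("Transfer", {"amount": -5}))   # False
    ```

    The design decision is validated against the model's constraints, not against a running system.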

    To the Sistine Chapel example, I ask this: if the Sistine Chapel had been twice as big, would Michelangelo have had the time? Put a deadline on his work and suddenly it is a different ball game. I personally don't agree with the agile school of thought that software will be ready when it's ready (i.e. non-committal processes), as people are paying for that time. If you asked for an extension to your house and were paying for time and materials, would you not expect to ask how long it would take and infer how much it would cost?

    Incidentally, I forgot to clarify something earlier today, as I think there was a misunderstanding. Architects CAN and MUST be able to code, but they are no longer the expert programmers they used to be. They offer guidance and direction and have enough overlap with the roles of developers, TQAs and BAs to be able to pair with any of them and give that guidance. However, I see developers at odds with them all the time.

    @Paul - If you go to the Agile Alliance website and look at the paper by Boehm and Turner, it introduces an analysis of when agility should be used versus more heavyweight methods. If you don't mind dry papers, it is a very useful read. They published a book with a nice graph (for those who like pictures) indicating that there exists a 'sweet spot' for how much up-front design is required, trading off the amount of rework/refactoring in agile methods against the BDUF methods. They even have a radar chart of the critical factors, and go on to indicate how those dimensions should be used, together with advice on mitigating actions in the large grey area that exists between agility and heavyweight methods (which covers the 'more situations than I think' that, you are right, you were guessing about).

    For me, when working in an architecture capacity, I look at these factors. Without them, one or other set of methods will fail at either extreme. Beyond that, a different set of mitigating processes has to be carried out to cover the weaker dimensions of each method if there is no clear distinction.


    E

    Re: We can't continue our journey in a car when we hit the shores of the oc by Andras Ludanyi

    No, I am not an accountant, although I have an advanced understanding of accounting principles; but that isn't the point. Accounting was probably not the best example, and because of its inherent inertia to change (mostly for governmental/bureaucratic reasons) it isn't the place we should start our journey.

    Anyway, I never said it's easy. The problem is exactly that we have no idea how to do this, but the bigger problem is that most of us don't even think about it. To get the idea we must first experiment, try out new concepts... make a large number of mistakes... in short, first comes chaos :) then comes capability...

    A bit of 'refactoring' of your point of view Andras? ;-) by Ethar Alali

    I was only being half facetious, but I did somewhat aim to show an interesting parallel, which I wonder if anyone else spotted (the lack of commentary makes me think nobody did in the extra time I gave this thread before commenting). I shall present it with a fully facetious comment :-D This is not meant to be nasty; I'd best say that first.

    I am a thinker first, doer after. Andras, I can quite safely assume you are a build-first, improve-later sort of person. Take the accountancy analogy: it was your analogy, which you threw into the ring and then backtracked on (having seen it doesn't quite fit). That is a parallel of the fundamental way you think. I decided to rebut the analogy by thinking about it, taking it in and countering, which is fundamentally the way I think (but is certainly not the only way).

    Up until this current post of mine, you had used up two posts to come to the conclusion that accounting was not the best analogy. I only needed one :-D Who generated less waste on the topic?

    I am just joking about the specifics, of course, but the message is somewhat the same. You need to think to improve: to analyse waste, to analyse the efficacy of the small improvements you make. Whilst every system can be described using two things (the initial conditions and the system operation), in order to improve it you need the output result (which includes the output, its analysis and the eventual feedback if working in a closed-loop feedback system, which evolutionary/iterative methods such as agile are).

    You have done that to some degree: you decided the analogy was a bad one, so you backtracked on it. Your feedback from that (regardless of waste) was the new knowledge that the accountancy analogy is a bad one...

    ...and maybe that you shouldn't take on a thinker on his turf ;-)

    (That last comment is a joke and certainly not something I would encourage, as then it encourages the loss of valuable feedback from both sides of the agile divide).

    Re: A bit of 'refactoring' of your point of view Andras? ;-) by Andras Ludanyi

    First: IT is not just about accounting, so...

    Second: No, I am not a build-first, improve-later person either, although I prefer this approach. It is a bit naive, if you have the capacity and the required information to think about the problem first, not to do so; so I would say that whenever I can think first, I do, but not too much (I don't like it when paralysis by analysis hits me). It is also worth saying that engineers learn in order to build, while scientists build in order to learn; if you want to move forward you need the scientist's way, and if you want to get mature and stable you need the engineer's way... so I believe both methods are equally important and valuable if you employ them at the right time.

    Third: I am afraid we missed the point. When I said that we need to embrace the machine and use its capabilities to solve problems in a machine way, I never had the idea of changing accounting or any other human process (although that could most probably come later), but of changing the way the machine internally processes the data.

    For example, consider Git vs. CVS/Subversion. Git was designed by a guy (Linus Torvalds) who speaks "machine language" (who understands CS and how the machine actually works at a low level); that's why he never tried to force the analogue desk/folder/paper model onto the machine. He never got the idea of making 77 copies of the data when he needed 77 branches (that is the analogue human way); instead he simply used pointers and hashing and some other CS stuff, things humans never use but machines do. The real power of Git comes from these and some other similar things, and the real shortcomings of other "simulating the analogue human process" software come largely from ignoring the true capabilities of the machine.
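    The pointers-and-hashing point can be sketched in a few lines. This is a toy illustration of the idea, not Git's actual object format: commits live in a content-addressed store keyed by hash, and a "branch" is nothing but a named pointer to a hash, so 77 branches cost 77 pointers, not 77 copies of the data.

    ```python
    import hashlib

    # Toy content-addressed store (not real Git internals):
    objects = {}   # hash -> commit contents
    branches = {}  # branch name -> hash (a pointer, not a copy)

    def commit(data, parent, branch):
        key = hashlib.sha1(data + (parent or "").encode()).hexdigest()
        objects[key] = {"data": data, "parent": parent}
        branches[branch] = key  # creating/moving a branch just moves a pointer
        return key

    head = commit(b"initial tree", None, "master")
    for i in range(77):                    # 77 new branches...
        branches["branch-%d" % i] = head   # ...are 77 pointers to one object

    print(len(objects), len(branches))   # 1 78
    ```

    One stored object, 78 named pointers: the data is never duplicated, which is exactly what the analogue "copy the folder" model gets wrong.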

    Another example... imagine that, instead of using alpha-beta pruning of game trees, you implemented the game of chess by trying to simulate the human chess-playing process; well, most (non-scientific) enterprise software today does exactly that.
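    For what it's worth, the alpha-beta idea fits in a dozen lines. This is a bare sketch over a hand-made game tree (nested lists whose numeric leaves are static evaluations), nothing like a real chess engine, which would add move generation, move ordering, transposition tables and so on.

    ```python
    import math

    def alphabeta(node, maximizing, alpha=-math.inf, beta=math.inf):
        """Minimax with alpha-beta pruning; leaves are static scores."""
        if isinstance(node, (int, float)):
            return node
        best = -math.inf if maximizing else math.inf
        for child in node:
            score = alphabeta(child, not maximizing, alpha, beta)
            if maximizing:
                best = max(best, score)
                alpha = max(alpha, best)
            else:
                best = min(best, score)
                beta = min(beta, best)
            if beta <= alpha:   # cut-off: the opponent will never allow this line
                break
        return best

    tree = [[3, 5], [6, [9, 8]], [1, 2]]
    print(alphabeta(tree, True))   # 6
    ```

    The pruning step is the "machine way": whole subtrees are discarded without evaluation, something no simulation of a human player would do.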

    In short, what I really wanted to say (without wasting anyone's time) is that our current IT systems move around way too much data, they are mostly suboptimal and, worst of all, the "simulating the analogue way" of internal processing limits true value creation. We need to fix this if we really want to move forward (and it's not going to be easy).

    Re: A bit of 'refactoring' of your point of view Andras? ;-) by Ethar Alali

    Ahh, now we are converging on common ground.

    Firstly, engineering is really the application of science to solve a problem. Science can do that in many ways, such as applying mathematics prospectively in the form of a theory and looking for the empirical evidence, or developing a hypothesis, taking its null and then performing studies on it (such as randomised, double-blind trials) to establish a probability small enough that the null hypothesis is unlikely to be true. Note that engineering also has feedback loops, especially when looking to use materials that are novel or unknown. The sacrifice of both the time and the material for the greater cause is accepted. So in my mind, this is the same as spikes in agile methods.

    I 100% agree with you that there is too much data being pushed around! However, the vast majority of this has come from projects where not enough thought has gone into them. Often the excuse that 'bandwidth is cheap' is used; however, when scaling becomes a problem, suddenly there is a big issue with it, when it is far more expensive to fix.

    As for the comparison of Git and CVS, this is useful, but there is a wider analogy that can be applied.

    Every piece of software addresses a problem posed by some stakeholder. Just like any interacting entities, in any part of life, the stakeholder is a system (whether human or machine) and has an interface by which to interact with any other system, whether a user interface, a socket connection, HTTP or anything else. The key choices of which to use are trade-off decisions, which result in as optimal a system as can be (though in reality there is always a system constraint), and in the wider context some of this can be seen as suboptimal. Science and engineering both help by allowing us to qualify and quantify these trade-offs to pick a solution which is an optimal best fit, "all things being equal" (note, I state this as the first statement of Occam's Razor, not the KISS-type principles we software folk are wrongly used to... IMHO ;)).

    The paradigm of a 'folder', 'desktop' etc. need not actually exist in the computer; but just as we view the world through the projection of our senses, we view these virtual 'folders', 'desktops' etc. through the projection of the computer. So I see no reason why the business stakeholder who wants a bank statement can't have a bank statement produced from a set of transactions in a log (aka rows in a table) and an aggregate (more rows in another table, or a sum over the original table, however it is done). The transaction "rows" are 'projected' into the context of a statement, and that gives the illusion of having an "account". It was never about representing the exact entities in the computer; it was about presenting the illusion/model of having them, which is, after all, what software is about.
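    The projection idea can be sketched directly (the names and data here are illustrative, not any particular banking system): no stored "statement" entity exists; a statement is computed on demand from a log of transaction rows.

    ```python
    # Illustrative sketch: the "statement" is a projection over a transaction
    # log, computed on demand, never stored as an entity in its own right.
    transactions = [
        {"account": "A1", "amount": 120.0},
        {"account": "A1", "amount": -45.5},
        {"account": "B2", "amount": 300.0},
        {"account": "A1", "amount": 10.0},
    ]

    def statement(account):
        """Project the rows for one account into a statement with a balance."""
        rows = [t for t in transactions if t["account"] == account]
        return {"account": account,
                "lines": rows,
                "balance": sum(t["amount"] for t in rows)}

    print(statement("A1")["balance"])   # 84.5
    ```

    The "account" the stakeholder sees is an illusion produced by the projection; only the log rows exist.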

    So in the case of Git, the set of stakeholders includes developers, and there were problems with previous models, so a more optimal solution was implemented in Git (which, I admit, even from my limited use of it, I find more elegant). Torvalds' key insight was to optimise his interpretation of the paradigm to the platform, via his prowess as a programmer. After all, a computer doesn't know what a 'wiki' or 'version control' is; these are domain-specific concepts that belong to the development world. Indeed, at a slight tangent, high-level programming languages are domain-specific languages for a programmer. This is an 'effective model', something that allows us to reason about the problem we are trying to solve. In my mind, being Linus Torvalds and creating concepts are not at all mutually exclusive!

    Optimisation can take many forms, but in software all I seem to see people do is apply micro-optimisations, sometimes merely shifting constraints or suboptimal points. For example, creating indices to speed up a badly written query: adding indices can give you some substantial speed increases, but rewriting the query can give you increases of orders of magnitude; that, however, requires analysis and thought. You can do this via the theory of constraints, algorithmics, linear programming, recurrence relations, bisection or any other method pertinent to the computational domain.
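    The index-versus-rewrite point holds outside SQL too. As a stand-in illustration (my example, not from the thread): finding duplicates by comparing every pair is O(n^2), while rethinking the algorithm around a hash set is O(n); both "queries" return the same answer, but the rewrite changes the complexity class rather than shaving constants.

    ```python
    # Micro-optimising the pairwise version would still leave it O(n^2);
    # rewriting the algorithm changes the complexity class entirely.
    def duplicates_naive(items):
        """Compare every pair: O(n^2)."""
        return sorted({a for i, a in enumerate(items)
                       for b in items[i + 1:] if a == b})

    def duplicates_rewritten(items):
        """One pass with a hash set: O(n)."""
        seen, dups = set(), set()
        for x in items:
            (dups if x in seen else seen).add(x)
        return sorted(dups)

    data = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
    print(duplicates_naive(data))      # [1, 3, 5]
    print(duplicates_rewritten(data))  # [1, 3, 5]
    ```

    Same result, drastically different cost at scale, which is exactly the difference between indexing a bad query and rewriting it.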
