The panel was moderated by Martin Fowler and included POSA author Frank Buschmann (from Siemens), Steve Cook (working on DSL tools at Microsoft), Jimmy Nilsson (author of Applying Domain-Driven Design), and Dave Thomas (founder of OTI, which created VisualAge).
Immediately to my left is Frank Buschmann from Siemens, well known as the lead of the POSA pattern series and a well-known writer in the European and international software scene.
Frank: I'm working with Corporate Technology, whose mission is to explore new and promising trends in all areas of technological interest for the company and to help the business units make productive use of them. I'm doing research and consulting.
Next over from Frank is Steve Cook. Steve was one of the leading people in the OO community in the UK when I was getting started in objects. I would go to conferences like this, see Steve on stage, and find his talks very interesting and useful. He spent quite a lot of time as an independent, spent some years working with IBM, and was involved in creating version 2 of UML. Now he's at Microsoft, working on the Domain Specific Language Tools, developing what might be a future way of building software.
Next over is Jimmy Nilsson. I ran into him about two or three years ago, when I was running around trying to find out if there were any decent books about the design of enterprise software on the Microsoft platform. People kept saying no, but vendors said I might run into this fairly obscure book by this Swedish guy. I picked up a copy and was very impressed. I've been very interested in Jimmy's work, and since then we've been together at various workshops. I picked Jimmy because he's rooted in the present, and he's the kind of person who's probably going to be sceptical of any big advances in the next ten years. I thought he would add a little realism and cynicism to the panel, and that's always a useful thing to have. He's an independent consultant based in Sweden, doing work in Sweden and internationally.
Just in case I didn't have enough cynicism on this panel, I thought I'd bring in one of my favourite panellists. He's a panellist I never want to be on the same panel as, in case I might get into an argument with him, but I'm quite happy as a moderator since I can keep clear of it. That's Dave Thomas, who you may have come across. He's the founder of OTI - Object Technology International - big proponents of Smalltalk from the earliest days of objects. His great ambition was to get Smalltalk running on everything from mainframes down to wristwatches. I'm not quite sure you got as far as wristwatches, but you did get to the mainframe. He was also a professor at Carleton University in Canada; several people I know were his students there. He sold OTI to IBM, bought a Caribbean island to live on, and now wanders around the object scene imparting a good bit of advice to people in the software development area. Dave is always a reliable source of good opinions about what should be done in the software development world.
These are our panellists. I want to focus on the main questions and I'm going to ask each panellist to speak briefly to that.
The theme of the panel is: "Who's going to be developing software in ten years' time? Is it still going to be professional programmers like us? Are we going to see a difference in who is doing the work? How is it going to be done? Is it going to be similar to how software is developed now? Are we going to see any significant differences in how software is done? Where is it going to be done?"
Frank: I don't think that ten years from now there will be too much difference in who is writing the software. I still think there is an enormous need for professionally trained and skilled software developers. There were in the past, there are today, and there will be ten years from now. In addition, I think there will also be domain experts, non-software people, who together with software people will write software systems.
That is due to changes in some of the technologies, such as domain specific languages, which allow better communication between domain experts and software developers, and which help non-software people better understand and configure models from which software is configured or generated with little further human intervention. It is the set of tools that people have available for writing software that makes the difference. Software is still written for humans and mostly by humans.
One difference that can be observed today is that another group of software writers are software systems themselves. We see this in the area of autonomic computing, where there is self-healing, self-configuration, self-repair and self-protection. These systems are able to configure, adapt or evolve by themselves. The reflective community has been into this for years, and I think we'll see more systems of that type; at least I see it in my environment, especially with big, pervasive systems with hundreds of millions of connected devices. The software is self-evolving, and we need mechanisms like reflection for doing this. We may have different ways of software evolution through new ways of interaction between otherwise loosely coupled software modules.
If you were here a couple of years ago, you heard the DARPA director talking about the bio-informatics age, where software is programmed in a genetic way that allows for evolution and modification through itself. That is beyond the ten years. Now, let me answer the "where" question. I think it is everywhere. Being cynical, we have observed with offshoring and outsourcing a trend to move towards the east, but on the other hand this may come back: if you keep going east of China you'll eventually land in the US, and from there back to Europe.
I don't think that offshoring and outsourcing are the ultimate solution. We also see that companies founded in China and other eastern countries have recently opened subsidiaries in Europe to develop software there, closer to where their customers are. So it will be a mix and match. In the past it was North America and Europe only, with few exceptions; in the future it will be everywhere.
Steve: When Martin asked me earlier today, the first thing I thought was: what was it like ten years ago? I wanted to find a bit of a measure of what ten years means. Ten years ago, in 1995, it was C++; we didn't have any of the dot-com nonsense, we had no UML, we had no XML, it was kind of client-server, and computers were a tenth as powerful as they are today.
I think in ten years' time there will be more change from today than there was from then to now. Change is accelerating. There are going to be lots of changes of emphasis in the kinds of things software will be about. I've written down some phrases to describe that: digital convergence, moving from computers to appliances, globalisation, service orientation, and the greatly increasing importance of concerns that don't impact us much today, such as security.
Security is going to become enormously important. Metadata, in a very broad sense; modelling and versioning of things - a lot of software about software. I think this relates to what Frank was saying about autonomic computing. The platform is going to be more introspective and much richer; you'll definitely find, for example, modelling as part of the operating system. That's not a Microsoft promise, but it's the kind of thing you should definitely expect. There will be many changes of emphasis. I reflected on a figure I read not long ago: the percentage of the US gross domestic product which essentially has to do with software. I think today it is about 20%.
I'm going to predict that in ten years' time something like 30%-40% of the gross world product will have directly to do with software. That involves vast new capabilities for creating software. We're seeing it today in India and China; we'll see it all over the place. It also involves vast new markets for software. It would be a terrible mistake to see outsourcing as a kind of threat. The whole thing is expanding, and there will be huge new opportunities for software, such as trying to solve some of the global problems of the world: global warming, poverty, and the new business models resulting from globalisation. Enormous new opportunities for software.
I think we're going to begin to see the software industry starting to become mature. A mature industry is one with a lot of specialisation, automation and mass customisation. Supply chains will evolve in the software industry: there will be suppliers of particular kinds of software and consumers of those kinds of software, who in turn supply other kinds of software to their own consumers, ultimately ending up with the end users.
And how will it be written? It will be written in all kinds of ways, depending on where you are in the supply chain: in some places using domain specific languages, in some using assembler. Where will it be written? Everywhere, by people involved in supply chains. If you want some software, you go to a person close to you who can do that kind of software. They go to a person close to them who can provide what they need, and so on. That's what I reckon is going to happen in ten years' time.
Jimmy: I think that old dream of having business experts help us write some software will finally come true in a couple of years. Not that they will write everything, of course; we will still be needed. Another group of people who will write software are probably our children. Some software will be written by people making films today. Even business systems will focus more on usability, in extreme ways, for example visualising business information in strange ways. Concluding on the "who" question: it's going to be us writing the software, for a long time.
How? One thing we can wish for is that all those nice and good ideas we talk about all the time will come into broad use. For example, sometimes you get the feeling that test-driven development is something everybody is using, but I'm quite sure that's not the case; it's still a niche technique. Maturity is what I mean: we will be a much more mature industry in ten years. Some of my clients keep telling me that for one hour they spend with me they could have one month with some other guys, but they are still buying from me, because it's nice to build software locally. Not to say there won't be any outsourcing, of course; it will be a combination.
Dave: Predicting the future is always very dangerous. The Doug Engelbart video from 1968 demonstrated everything and more that is in modern non-numeric computing: the browser, the mouse, teleconferencing, structured text, version management browsers - it's all there in the 1968 video. We all knew it was going to happen, but we didn't know when. But the internet would happen. Unfortunately we still haven't built the kind of universe that Doug envisioned. No one can fly around documents and drawings like Doug can, even though he's 80 years old and his system still runs on an emulation. It is very hard to tell.
The only thing we seem to know is that every 20 years good ideas come back. Languages and platforms tend to have a half-life on the order of 7-10 years; then something causes them to accumulate their own bulk, and people start looking for other things. I'm actually pretty optimistic about the abysmal situation we live in today, with these really bug-ridden platforms that are so bulky and complicated to use that they enslave young people in jobs that are truly horrible, trying to make the software work.
I include in particular the Eclipse team, because most of what they do is actually working on other people's buggy software and trying to make it work, along with contributing a few bugs of their own. I think it is going to change, because it is too hard. The other issue is that there really is a skills problem, and it doesn't matter that you'll get outstanding skills from other countries, which we will. We've been doing that for a long time, except now they won't have to move to America or Europe any more.
The challenge is that our technology is growing quadratically, but the value in business software is growing exponentially in terms of domain content. Just look at something simple, like the whole point of test-driven development: the key to success in Agile is having acceptance tests, the happy path of a use case or story written by or with a business user. Unless we enable this, we're going to die in the testing hell we've been in for a long time. 50% of any really successful product is testing, so I think shifting that is going to be very important.
The kinds of hardware we're going to see in 2010 and 2015, which is our target research area, will let you directly run those mathematical applications you do in engineering. You're not going to need to use them as a spec that you give to some C++ programmer who ships the code late; you'll be able to execute them directly. Look at the things people are doing in molecular modelling and bioinformatics, at languages like Tysla and so on: these languages are so powerful and flexible that they make many of our current query systems look terrible. They can create text documents, experiments and so on. These have really deep roots in category theory.
There are a lot of people on Wall Street being very successful using radically different technologies, with 512 gigs of memory, doing real-time queries over billions of rows and getting responses back that nobody could dream of getting from the best Oracle or DB2 database. I see lots of evidence that there's going to be a different kind of programming going on. The challenge is that you're going to have to understand the domain, and you'll have to understand how to do something beyond the current kind of object stuff that we have, because the only way we're going to harness the power of these machines (Google has already demonstrated it) is for people to learn some of these old ideas, like MapReduce.
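The MapReduce idea Dave mentions can be illustrated with a tiny sketch, not Google's implementation: a pure map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase combines each group. The function names below are my own for illustration.

```python
from collections import defaultdict

def map_phase(documents):
    # Emit (word, 1) pairs from each document; each document can be
    # processed independently, which is where the parallelism comes from
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    # Group all emitted values by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Combine each key's values into a single result
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["to be or not to be", "to do is to be"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["to"])  # 4
```

In a real system the map and reduce phases run on many machines and the shuffle happens over the network, but the programming model is exactly this small.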
Dave: Since I've referenced Google: they are definitely way ahead in terms of the kind of infrastructure we envisage, people using massive amounts of processors where the processors are not visible to you in your house or your office. The programming models they are using are very interesting; the language technology is really rocks and sticks, sort of Perl and C. But the capability of what they can do! On the other hand, just because a large American corporation can generate a global fabric that's pluggable does not mean they won't suck the dollars out, just like every other large American corporation. Sometimes IBM is popular, when it's friendly and does open source; other times, when it screws people over, it becomes unpopular. This is basically the challenge one has. It's a bad idea to own more than 60% of a market, because you end up getting a lot more competitors; it's actually good to give up some of the market. If Google can find a way to share the wealth, then they are going to be unstoppable.
Frank: Certainly Google has and will have influence on the way software is developed and on how SOAs may evolve. Last year I listened to a talk by somebody from the hardcore embedded community, and he said that 99.98% of all processors are embedded and 98% of all written software is for embedded systems. Google is certainly serving a certain kind of business: web-based, service-oriented architectures. But there are so many other businesses and domains out there where those techniques don't apply, and they are by far the majority. You won't use a web service for drive-by-wire: if you press the brake you want your car to stop. Take steering by wire: there is no physical connection anymore between your pedals or steering wheel and the machinery that does it. You don't want to use SOA, with XML bean changes and protocol negotiations, for that. There are other businesses, and I think Google will have influence in its place but not beyond. Perhaps its ideas will reach beyond, but it will not dominate the world.
Steve: I'm too old. I hope that as the software industry matures, the days when any one company can own the business are gone. I think Frank is right: Google will be successful in their place. They are not going to replace Microsoft. There's room for everyone in the world of the future; there is competition, but the idea that Google is going to take over everything is just absurd. They'll have a little bit of effect on programming; I think .NET has had a bit of effect on it too. I don't actually think that Google is a particularly big player in the question of who's going to build software in ten years' time.
Jimmy: I totally agree there won't be only one market leader. As we have seen a couple of times in the past, there will probably be much more of a split in the market.
Dave: We've just talked about it in the example of the infrastructure. One of the things that's really good is that lots of people are building grid computing. One of the great opportunities is that grid software is complete garbage: it's basically old scientific stuff, people trying to build batch processing on top, trying to build a mainframe out of old scientific software with MPI and all this cruel and unnatural programming. I think there's a great opportunity for software infrastructure that uses higher-order technologies to exploit these machines, because data parallelism is by far the easiest way to take advantage of them. There are tuple spaces and actors as well.
You can't program these things if you have to count the processors; if you actually have to know where the processors are and how many there are, it's impossible to program them. For those kinds of algorithms where you can get dataflow parallelism, or where you can naturally partition the data, you're going to be able to do all sorts of exciting things. Whether it's going to be free or not... someone is going to be paying for the power and the cooling if you want any connectivity at all, if you want to share information. (Google is lucky because it's got stateless applications - not all applications work the way Google's do.)
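The data-parallel style Dave describes can be sketched in a few lines: partition the data, apply a pure function to each partition, combine the results. The logic never mentions where the workers run or how many there are; this is a minimal illustration, with made-up data, not a grid framework.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(chunk):
    # A pure function over one partition: no shared state, so the
    # programmer never needs to know where or how many workers run
    return sum(x * x for x in chunk)

data = list(range(100_000))
chunks = [data[i::8] for i in range(8)]  # partition the data, not the code

with ThreadPoolExecutor() as pool:
    partial_sums = list(pool.map(simulate, chunks))

total = sum(partial_sums)
```

Swapping the executor for a process pool or a cluster scheduler changes nothing in `simulate`; that invisibility of processor count and placement is the point.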
If you have state and you have to share it, you're probably going to see these things in a data centre. Computing on demand basically means "the data centres are back". The real challenge is that you have to redo all the existing applications: you can't do SAP on demand, because SAP is designed to host a single company in one SAP instance. If you really want to serve 20 companies over a group of processors, then you're either going to get a smoking deal on licences from SAP, or you're going to have to rebuild all the SAP-competitive infrastructure spread across a lot of processors and meter it out on demand. There is some interesting metering technology available, so it can be done.
Frank: This is already happening, so it isn't a prediction anymore. It's no big secret that companies like Siemens or IBM build very large-scale systems involving millions of devices. This is already distributed computing. I think the biggest problem with distributed computing today is that many people who build distributed systems don't understand that the network is a domain of its own, with different behaviour: time becomes an issue, latency, other modes of communication. A single paradigm doesn't help. You must be aware of the network, and then you can use it. It is already happening: look at big telecommunication management systems or mobile telephony - these already involve massively distributed computing.
Jimmy: Not that long ago someone on a different panel said that language innovation in programming is no longer important, and I thought that was total rubbish. I think this is another of those areas. It was Herb Sutter, I think, who published an article a year ago about the problem of parallel computing becoming mainstream in the near future. It adds complexity tremendously, and all of us will have to cope with it. I believe this is another area where we will need language innovation; Cω has some of it, for example.
Dave: I just want to comment on multi-core. Multi-core is an example of how hardware vendors can hack hardware to get a new product and make something that's a complete pain in the ass and impossible to program. It is very hard to get the benefit of these multi-cores. Take another example, the Sony/Toshiba/IBM project building the Cell processor: the Cell is a 64-bit PowerPC CPU with, depending on what disclosures you have, 4-8 high-performance vector units that can run 4 MPEG-4 streams concurrently. But programming those streaming vector units, which are very fast, is still challenging. Many scientific applications today are being run on the GPU, because GPU performance keeps going up. I would say it's easier to switch to a GPU than to multi-core if you actually want more performance.
Frank: Another thing I want to add is that it also depends on the domain. Some domains cannot be parallelised, and there parallel computing does not help at all. It comes back to what Dave said at the beginning.
Steve: I have just two comments. One is that parallel distributed computing is extremely mainstream today in the online games industry. There are millions of people around the world playing Counter-Strike online; that's a pretty amazing piece of distributed computing. The other point is that we are going to move into a world where software becomes virtualised, running on distributed virtual machines, so the software can move around, doesn't really care where it is running, and can acquire and discard machines as it likes. That gets us into a very interesting world for security, with interesting possibilities for writing all kinds of extremely virulent software. We have to be quite careful about that, because the scope of damage to the economy will increase exactly in proportion to how much we virtualise the platform.
Frank: I don't know what I would have said ten years ago, but I know what I would have said twenty years ago, because that was when I got my first Macintosh. I had a discussion with my colleagues and professors at the university about how computing would be, and we had a strong discussion about user interfaces. They said: look at the Mac - simple devices, user-centric computation - this will go on. The user needs to be more involved; systems only usable by experts won't work in the future. That became true in some sense, sometimes in a very interesting way. In addition, I thought that the programming languages of the time were all a hack, and that you wouldn't want to use them as systems became larger. I also went into communicating sequential processes and distributed computing; at the time I found it very interesting and expected it to dominate the world some time from the mid-80s. That became true. Some of what I said 20 years ago became true earlier than I expected, some of it later, and some of it didn't become true at all. Predictions are hard. The good thing is that maybe in ten years we'll run a panel on what we (the same panellists) said ten years before and what became true.
Jimmy: I especially remember one prediction I made ten years ago. I said that relational databases would go away and we would just use object databases instead. I'm not sure that was correct.
Steve: I hoped that Smalltalk would take over the world, that everything would be programmed in Smalltalk and that it would become the operating system. I didn't get that one right.
Dave: As I said, I think all you can do is say "this is a strong technology which has a potential impact", but you can't predict random events in the market; you couldn't predict the coffee cup. A cancelled set-top box project at Sun got together with the Netscape browser, there was a coffee cup, and somebody thought that JavaOS could actually take on the Windows desktop. The thin client was the real reason major vendors switched to Java: we can actually get at Microsoft by going with JavaOS. It was JavaOS that drove IBM's decision, not the Java language per se; it was the opportunity to have a thin client that could take them back into the PC business. You can't predict a random event like that, nor the success of Google or Microsoft. You can always be blindsided. The real issue is to plan: this is the best information I have about possible events, and when those events happen or don't happen, you re-plan.
Jimmy: Somebody famous from Aarhus in Denmark, Bjarne Stroustrup, once predicted 15 years ago that using a computer would become as easy as using a telephone. The prediction came true, but in the opposite way to what he thought [that using a phone would become as easy as using a computer]. Martin: In our office in London our telephone actually has a log-in button; you have to log in to the phone to use it.
Dave: I'd like to predict that in 10-15 years the current stacks of software are going to be a much worse legacy and more of a nightmare to maintain. You're going to have employment forever maintaining this stuff. C++, Java code, C# code - this stuff is very complicated and very brittle, with all these class libraries and frameworks. We're digging ourselves into a really big hole, and there will be a lifetime of opportunities for you people to maintain the stuff you're creating.
Frank: I can also predict a solution to that: we add yet another level of indirection on top, and even bigger middleware.
Massively distributed applications - non collaborative vs collaborative
something to do with the platform
Today humans interface more via the web than through direct desktop apps. The web has changed everything and forced us to re-invent a lot of the development concepts and styles we had a long time ago. For example, it's a bit ridiculous that we (or maybe just I) get excited when web apps become more like GUIs, with widgets like modal dialogues. We had all this before, just on a different (OS-specific) platform.
I think in 10 years we will see computers all over the home in all kinds of devices that you wouldn't imagine them in, such as the current infant movement of home media computers. How that will impact human-computer interaction, I don't know. :)
Re: something to do with the platform
> I think in 10 years we will see computers all over the home in all kinds of devices that you wouldn't imagine them in, such as the current infant movement of home media computers.
Wasn't this the initial assumption behind creating Java? And I don't think things have moved very far in this direction. Getting everyone to accept the ubiquity of computers may take longer than some expect.
IMO the big shift is parallel computing. We will need to come up with better concepts and tools to ease our lives when developing for multi-cores. And the developers will be much the same, probably just learning to interact more and more with the end users (and sometimes getting them directly involved through smart DSLs).
:Architect of InfoQ.com:
.w( the_mindstorm )p.
Re: Massively distributed applications - non collaborative vs collaborative
A few years ago, my Ericsson T39 could run for between three days and a week on one battery charge. These days I charge my mobile every night, just to be sure it won't die on me in the middle of the day while I use it to check my email, read news, play games and so on. The main power consumers are the high-res back-lit colour screen, the RF circuit and the CPU.
Re: Massively distributed applications - non collaborative vs collaborative
> for long-running applications whose data can be
> partitioned well?
Unfortunately, most tasks are a lot heavier on the data requirements than they are on the crunching requirements... that's why most problems tend to scale poorly in stateless systems (because most problems have state ;-).
Look at it this way: if an execution unit requires B bytes of data and saves P processor cycles, and assuming there are no QoS issues with the data (i.e. it can be stale, so distributing it to a cell phone isn't a problem), then you have to compare the cost (call it "C") of CPU power and other resources at the data centre for pushing out B bytes of data against the value of the P processor cycles you save. If the management cost C exceeds the value of the P processor cycles from the cell phone, you achieve negative scalability (bad). Most problems will fit this pattern, unfortunately.
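The tradeoff above can be written out as a toy calculation; all of the cost figures below are made up purely for illustration, not measured from any real system.

```python
# Hypothetical figures, purely illustrative
BYTES_PER_UNIT = 2_000_000      # B: bytes of data the execution unit needs
CYCLES_PER_UNIT = 50_000        # P: data-centre processor cycles it would save
COST_PER_BYTE_PUSHED = 0.001    # data-centre cost to push one byte out
VALUE_PER_CYCLE = 0.01          # value of one data-centre processor cycle

distribution_cost = BYTES_PER_UNIT * COST_PER_BYTE_PUSHED   # C
cycles_saved_value = CYCLES_PER_UNIT * VALUE_PER_CYCLE

# Negative scalability: shipping the data costs more than the compute it saves
offload_pays_off = distribution_cost < cycles_saved_value
print(offload_pays_off)  # False for these figures
```

With these numbers the distribution cost (2000) swamps the value of the cycles saved (500), which is the "most problems fit this pattern" case the comment describes.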
Oracle Coherence: Data Grid for Java and .NET