
Ralph Johnson, Joe Armstrong on the State of OOP


1. If you look back at where object orientation came from, with Smalltalk-80 and message passing, and look at the current state of inheritance and things like that, have we gone down the wrong path? Should we go back to basics?

Ralph Johnson: One of the things that always happens when an idea comes out is that it's too radical for most people. Most people don't adopt the whole thing; they take a piece of it, and then you get this approximation. I can remember a time when people refused to use Smalltalk because it had garbage collection, it had bytecodes and a virtual machine, and who would want to use something that was so inefficient - everything needed to be compiled. Those sorts of claims are gone.

Nobody says that anymore. There are certainly special cases where garbage collection gets in the way, if you are doing low-level or real-time embedded programming, but by and large people can see the value of it. Now, thinking of Java, I think the static type system gets in the way, and people are putting generics in to make the type system more powerful. So all of this is getting more and more complicated. It always happens: you pick a language, and if it's a little bit too restricted then you have to keep loosening it up, and you go down a particular road that fits that strategy.

C++ of course has had that same thing; it was important to be compatible with C, and that really was more about being compatible with the C way of thinking than with a particular C compiler. In that sense it has been successful, and it has stayed compatible with the C way of thinking. But as a person who is not a C++ programmer looking at it, it is incredibly complicated. They made this horribly complicated language, but the people who use it are happy with it.

For an outsider to complain about it is sort of pointless, because it's not my language, but I see that things like Ruby are a lot closer to Smalltalk.

   

2. What did Smalltalk get right and wrong?

Ralph Johnson: Again, I was sort of thinking of Smalltalk as the ideal, but one of the things that occurred to me a few years ago is that I think Smalltalk made a fundamental error. I think it's hard for people who aren't Smalltalk programmers to appreciate this, but when you are programming in Smalltalk, when you are debugging in Smalltalk, you are debugging the whole system.

The debugger is just an application that's running inside; there is a compiler, there are the collection classes, there are all these things. When you are debugging, what you are doing is editing objects. There is no standard way to print Smalltalk code on paper; the Smalltalk code is the objects. It's like this guy who edits the PostScript directly. That seems really bizarre to us, but that is in fact what Smalltalk programmers are doing; it's just that it's easy to understand in Smalltalk.

We've got all these tools to present it and make a really nice interface. Smalltalk programmers often make complaints like "These people think you want to store code in file systems, store it as text files. How antique and 20th century!" In fact, that's so 1970s, we would say, because by 1980 we had better ways of doing that. But there is a downside to that, which is that it encourages people, whenever they want to add something, to just make the image bigger. The image gets bigger and bigger.

What occurred to me was that at the same time Smalltalk came out on the Xerox machine, they had another operating system, a more Pascal-ish type of operating system. When you debugged on this operating system, you'd say "Debug"; it would save all the memory to disk and reboot the operating system with the debugger, and then you would be single-stepping through your code - you'd be in the debugger doing everything on the disk image.

You had that image, and if you decided to debug the debugger, you'd hit the button again, push the debugger out onto the disk, pop up another image, and now you'd be running your debugger on the debugger on the original program, as many times as you want. It obviously got slower every time you did that, but it was easy to keep track of the differences: if I have a new version of the debugger, I of course have to debug it with my old debugger, on my application, and it's easy to keep track of all of these things.

In Smalltalk, because you have everything in one image, you can't keep track of the versions between the old one and the new one - it's a pain in the neck. But also, now we are moving to distributed computing and parallel programming. People say "We want to have multiple threads inside Smalltalk." No, you don't want to do that, because you are just getting back to all those problems. What you want to do is have multiple images sending messages back and forth, if you want fault tolerance.

It started years ago, but because we had this way of doing things, we just put everything in one image. There is also the issue of complexity. You build a system until it gets to the limit of what a few people can do, and there Smalltalk doesn't work too well. If it actually took 20 people to build your system, Smalltalk is not very good. If you could build it with 4-5 people, fine; they all sit in the room, Smalltalk is just fabulous, and you could build with 4-5 people something that would take 50 people in Java. But what if it would take 200 people in Java?

You're not going to do it with 4-5 people in Smalltalk. It gets to the point where it's really designed for smaller systems, small being relative to good programmers who are programming at really high speed. But still there is that limit, and if instead we were doing it by having multiple images and doing message sending, it would not just deal better with fault tolerance - Joe is an expert at that - but I think it would scale to larger groups, because that is an issue there, and it would help with some of the tool problems.

I can look at Smalltalk and think "Here is something that's wrong", but by and large I love the dynamic nature of it, the ability to change things so easily; the reflective nature of it is really powerful and lets you do all sorts of things. By and large, when I look at the rest of the world, what I think is "They are slowly catching up". Look at Ruby, which has a lot of that stuff, but for some reason the Ruby people don't believe that tools are important, which is just an odd thing to me. Anyway, there are a lot of good things going on out there and people are gradually catching up - that's my feeling.

   

3. Is Erlang object oriented?

Joe Armstrong: Smalltalk got a lot of things right. So if your question is about what I think about object oriented programming, I've sort of changed my mind about that. I wrote an article, a blog thing, years ago - why object oriented programming is silly. I mainly wanted to provoke people with it. There was quite an interesting response to it, and I managed to annoy a lot of people, which was part of the intention actually. I started wondering about what object oriented programming was, and I thought Erlang wasn't object oriented, that it was a functional programming language.

Then my thesis supervisor said, "But you're wrong, Erlang is extremely object oriented." He said object oriented languages aren't object oriented. I might think - though I'm not quite sure if I believe this or not - that Erlang might be the only object oriented language, because the three tenets of object oriented programming are that it's based on message passing, that you have isolation between objects, and that you have polymorphism.

Alan Kay himself wrote this famous thing and said, "The notion of object oriented programming is completely misunderstood. It's not about objects and classes, it's all about messages." He wrote that, and he said that the initial reaction to object oriented programming was to overemphasize the classes and methods and underemphasize the messages, and that if we talked much more about messages it would be a lot nicer. The original Smalltalk was always talking about objects: you sent messages to them and they responded by sending messages back.

But you don't really do that, and you don't really have isolation, which is one of the problems. Dan Ingalls said something yesterday about messaging that I thought was very nice: once you've got messaging, you don't have to care where the message came from. You don't really have to care; the runtime system has to organize the delivery of the message, and we don't have to care about how it's processed. It sort of decouples the sender and the receiver in this kind of mutual way. That's why I love messaging.

The three things that object oriented programming has are, first, messaging, which is possibly the most important thing. The next thing is isolation, and that's what I talked about earlier: my program shouldn't crash your program. If the two things are isolated, then any mistakes I make in my program will not crash your program. This is certainly not true with Java. You cannot take two Java applications, bung them in the JVM and keep them isolated: if one of them halts the machine, the other one will halt as well. You can crash somebody else's application, so they are not isolated.
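
A minimal Erlang sketch of the kind of isolation Joe is describing; the module and function names here are illustrative only, not something from the interview:

    -module(isolation_demo).
    -export([start/0, worker/1]).

    %% Two unlinked processes share no state and only interact by
    %% message passing, so a crash in one does not touch the other.
    start() ->
        spawn(?MODULE, worker, [stable]),
        spawn(?MODULE, worker, [crashing]).

    worker(crashing) ->
        %% This process dies immediately...
        exit(deliberate_crash);
    worker(stable) ->
        %% ...while this one keeps running and printing.
        receive
        after 1000 ->
            io:format("still alive~n"),
            worker(stable)
        end.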

The third thing you want is polymorphism. Polymorphism, especially with regard to messaging, is just there for the programmer's convenience. It's very nice for all objects, or all processes, or whatever you call them, to have a printMe method - "Go print yourself" - and then they print themselves. If they all had different names for it, the programmer is never going to remember them, so that's polymorphism. It just means, "OK, all objects have a printMe method. All objects have a what's-your-size method, or an introspection method."
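
As a rough illustration of that kind of polymorphism in Erlang terms - the print_me message and the shape processes are invented for this example, not anything from the interview:

    -module(poly_demo).
    -export([start/0, circle/1, square/1]).

    %% Two different kinds of process understand the same
    %% {print_me, From} message; the sender never needs to know
    %% which kind it is talking to.
    circle(Radius) ->
        receive
            {print_me, From} ->
                From ! {printed, io_lib:format("circle of radius ~p", [Radius])},
                circle(Radius)
        end.

    square(Side) ->
        receive
            {print_me, From} ->
                From ! {printed, io_lib:format("square of side ~p", [Side])},
                square(Side)
        end.

    start() ->
        Shapes = [spawn(?MODULE, circle, [3]), spawn(?MODULE, square, [4])],
        [S ! {print_me, self()} || S <- Shapes],
        [receive {printed, Text} -> io:format("~s~n", [Text]) end || _ <- Shapes].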

Erlang has got all these things. It's got isolation, it's got polymorphism and it's got pure messaging. From that point of view, we might say it's the only object oriented language, and perhaps I was a bit premature in what I said about what object oriented languages are about. You can try it and see for yourself.

Ralph Johnson: The thing about Erlang is that it's in some sense two languages, or at least you program it at two levels: one is the functional language that you use to write a single process, and then there is the level where you think about all these processes and how they interact, one process sending messages to another. At that higher level Erlang is object oriented; at the lowest level it's a pure functional language, and that's how it got advertised for a long time.

At the higher level, when you are looking at it more from an architectural and high-level design point of view, it is quite object oriented. I think you are redefining isolation a little, though; it's all running on one computer, and if one process goes wild, it hogs the processor. I think it's more like the importance of garbage collection, so that you don't have to make sure everybody agrees on how you're releasing things. The only way in Smalltalk to interact with an object is to send it a message, but the issue is what messages you have. It's the same thing in Erlang.

If you allowed a huge number of messages - messages that return all the values of your local variables, or that let everybody send you messages that set the values of your local variables - if you did something like that, you'd basically lose a lot of the value of the isolation. That's why you have to design things properly. The language only offers the mechanism.
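
As a sketch of those two levels and of a deliberately narrow message interface - the counter example and its message names are made up to illustrate the point, not taken from the interview:

    -module(counter).
    -export([start/0, increment/1, value/1]).

    %% The process exposes only two messages; it never hands out
    %% its internal state wholesale, which is what preserves the
    %% isolation Ralph is talking about.
    start() ->
        spawn(fun() -> loop(0) end).

    increment(Pid) ->
        Pid ! increment,
        ok.

    value(Pid) ->
        Pid ! {value, self()},
        receive
            {counter_value, N} -> N
        end.

    %% Inside the process it is just a pure, tail-recursive function.
    loop(Count) ->
        receive
            increment ->
                loop(Count + 1);
            {value, From} ->
                From ! {counter_value, Count},
                loop(Count)
        end.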

Joe Armstrong: It's useful when we've got million-core computers.

   

4. The role of OOP over the past decades and in the future.

Ralph Johnson: When you actually start having multiple processors, you really want that. Looking at how a new idea comes in: Jim Coplien said something today about object oriented programming not being a real paradigm shift, because it's all something you can do in Pascal. I don't know why he says that, because he was around watching people struggle. The paradigm shift comes from how people think, from a new way of thinking, and that takes a very long time.

I remember I was at the very first OOPSLA, which was in '86, and one of the things people discussed there was "Will there be a 10th OOPSLA?" We sort of decided there wouldn't be, because one of two things would happen. One, object oriented programming was as great as we thought it was, in which case the whole world would be doing it 10 years later, and having an object oriented programming conference would be about as silly as having a structured programming conference. We all do it, so why have a conference on it?

The other, of course, was that it wasn't so good, we were deluded, in which case it would go away. Of course, neither one of those things happened. They actually decided last year would be the last year the big conference was going to be called OOPSLA; they were going to have a different name the next year. That was 25 years - that's how long it just took; it was a lot harder for it to spread out. It just takes a long time for ideas to spread. We all like to think we learn stuff so quickly, that everybody else is going to learn it so quickly, that new ideas are just going to spread across the world like that. Not true at all! It takes a very long time for even good ideas to spread.

Joe Armstrong: They spread inside their little community, but not outside. Smalltalk is an example - lovely stuff, 20-30 years ago.

Jul 08, 2010


Community comments

  • News Flash!

    by Matt Giacomini,


    Joe doesn't really like much beyond Erlang. More at 10:00...

    I actually really like Joe, just thought I would poke fun ;)

  • This is unexpected...

    by Eric Aguiar,


    Ralph is indeed an important figure for all developers... but he is a tad rude for just babbling on like an old man in this, very inconsiderate of him but ok. I hope he reads this and at least gives any future interview partners more of a discussion or feedback loop; a more communicative discussion next time instead of trying to talk until you're blue in the face from being out of breath.

  • I guess the article title

    by gauthier segay,


    is what makes everyone puff, looking toward the statelessness of functional programming

  • Re: News Flash!

    by Paul Beckford,


    I blogged about Joe's misconceptions on OO before. I'm glad he is big enough to recognise when he has made a mistake:

    pab-data.blogspot.com/2008/11/why-bad-oo-sucks....

    The bigger issue is the fact that the dominant crop of OO languages aren't actually OO, and very few people recognise that fact.

    Joe is also wrong in claiming that Erlang is the only true OO language out there. One of the goals of Alan Kay's team was uniformity, where the language consists of message sends all the way down (in Smalltalk a character literal is an object capable of receiving messages). Erlang is sort of two languages, as Ralph describes.

    For a true successor to the Smalltalk OO crown, try looking at Self, or better still Newspeak from Gilad Bracha. Newspeak builds on Alan Kay's vision and extends the idea of uniformity beyond classes with architectural modules which are themselves literal objects.

    newspeaklanguage.org/

    And it's not just a research toy. Gilad has focused on making Newspeak production ready whilst fixing many of the design flaws in Smalltalk (including the reliance on a single monolithic image as mentioned by Ralph).

    Paul.

  • OOA/OOD/OOP crap

    by Denis V,


    BTW, the first OOP language was Simula-67: en.wikipedia.org/wiki/Simula

    Simula is a name for two programming languages, Simula I and Simula 67, developed in the 1960s at the Norwegian Computing Center in Oslo by Ole-Johan Dahl and Kristen Nygaard. Syntactically, it is a fairly faithful superset of ALGOL 60.

    Simula 67 introduced objects, classes, subclasses, virtual methods, coroutines, and discrete event simulation, and featured garbage collection.

    [...]


    When thieves came out in the early 90's to inflate the .COM bubble and later steal 25% of all the savings the Americans had accumulated since the First Great Depression -- something around 5 to 10 trillion USD -- some of the scammers hijacked the original OOP terminology. Remember Uncle Grady's "OOA/OOD" book which was published almost 20 years ago -- in January, 1990? Claim after claim, without any theoretical foundation.

    Watch how the OOP scam is implemented:

    ocw.mit.edu/courses/electrical-engineering-and-...

    [...]

    As we talk about this, as people talk about this, in the context of our object-oriented programming, they typically will talk about it in terms of message pass, a message passing metaphor. I want to mention it's just a metaphor, just a way of thinking about it, it's not anything very deep here

    [...]


    So, according to them, sending a message is "just a metaphor", i.e. a sequential call (not even an asynchronous one; obviously there are no timers involved). LOL. Joe was right when he came up with the notion of agent-oriented programming, since the original OOP needed to be separated from the "modern OOP", which is one of the cornerstones of the .COM bubble. There is no benefit to the "modern OOP", other than adding another level of modularization in the form of the class.

    Here is a rule of thumb: if some "technology" is not based on any sound theoretical foundation, just ignore it or at least don't take it seriously, since it's there for a different reason -- it's a kick-back- and greed-oriented pattern. Say, Ted Codd on his own created the relational model of data (actually, a relational theory of data), which is based on predicate logic, when the so-called network and hierarchical "models" started failing. And what do we see now? We see morons talking about XML data management (i.e. going back to the hierarchical model). It's not crazy, it's just sickening.

    PS
    Do yourself a favour and learn lambda-calculus and combinatory logic, and start using functional programming, since everything else is just an anti-scientific pile of crap.

    And yes, they started teaching Scheme at MIT again -- based on one of the best CS books of all time: mitpress.mit.edu/sicp/full-text/book/book-Z-H-4...
