InfoQ Interviews: Philip Wadler on Functional Programming

Philip Wadler on Functional Programming


1. I’m here with Philip Wadler at QCon. So, Philip, can you tell us a bit about yourself?

I work as a professor at the University of Edinburgh. I’m a professor of Theoretical Computer Science, but I think they hired me because I try to take the theory and apply it in very practical ways. I helped take ideas of parametric polymorphism, which appear in languages like Haskell, and move them into languages like Java. I’ve done some work with taking Monads, which were an idea from algebraic topology developed in the 1950s; other people like Eugenio Moggi figured out how to apply them to semantics, and then I figured out how to apply them to structuring libraries for programming languages. Then people in the database field realized they could apply those ideas, and later I took some of the ideas that they developed and moved them back into programming languages again, doing work with XML. I have a language called Links, which is trying to do very similar things to what Microsoft’s LINQ does. They have very similar names, they came about at similar times, and they come from the same roots in Haskell, in fact, but the fact that they have the same name is complete coincidence.


2. We’ll get back to Monads. Haskell is like ten years old now or something?

No, it’s more than twenty years old. We wrote a twenty year retrospective paper for History of Programming Languages a few years ago, so that’s how I know it’s twenty years old.


3. So it’s twenty years old, and a lot of concepts from Haskell in particular, and functional programming in general, are going mainstream; everyone is talking about things like Monads and type classes. Is this the success of functional programming, finally becoming mainstream?

It’s very interesting to see now that a lot of these ideas that have been around for a long time, that the mainstream is beginning to pay attention to them. We’ve been saying for years and years that functional languages can be particularly helpful when you need to deal with concurrency, because the kind of interacting side effects that you get in imperative object-oriented languages generally does not happen in a functional language. When it does happen, we use techniques like Monads for taming that. So they are supposed to work better with concurrency, and now, all of a sudden, that people need to do a lot of concurrency, that seems to be actually panning out, and I think that is one reason for the increased interest. We have always thought, I think, that if you do a good language then everybody will use it, and it does not really work out that way; there are what economists call "network effects", which means the languages that people use are the languages that people use a lot.

So it’s not that everybody looks at Haskell and says "Oh yes, I’ll use that!", because they’ve got to use whatever they are using at work. What does seem to be happening gradually is that ideas from functional languages get adopted by more mainstream languages. The other thing that is happening now that I think is really very exciting is that there is a new generation of languages that pay close attention to interoperating with what is already out there. So there is Scala, which works well with Java; there is F#, which works well with .Net; and there is also Erlang, which is fairly well integrated with some libraries, but is particularly good for building concurrent and reliable systems. All three of these are now getting a lot of mainstream attention, and that is very exciting to see, so I think that functional languages will have more and more impact on what people are doing.


5. Why did it take so long for functional programming to come to the mainstream? Is it performance?

I’m not a mainstream programmer, you tell me. It used to be that people would complain about the performance, but the performance has been excellent for many years; that did not matter, people would still complain about the performance even when the performance was very good. I think people just find reasons that they want to put forward for not adopting new things, and that does not apply just to functional programming: any new thing can fall prey to that.


7. Don’t you feel a bit sad that people are getting interested just because it is good for concurrency, rather than for all the other modular aspects of the concepts and of the languages that exist?

When people learn the ideas for whatever reason then they’ll see what ideas are useful to them, then they pick up and use those, and I think that is fine.


8. First you applied Monads to programming, and then a lot of people followed; it’s been a few years now, and even Microsoft took Monads into C# to represent several things, to abstract over several cases, and a lot of other languages took them too, right? What do you see as the most interesting use of Monads today?

The use of Monads for structuring computations in F#, for doing things like concurrency, I think is very interesting. The other thing that I found very interesting was what I mentioned before, when Peter Buneman picked this up and discovered that they are very helpful for structuring query languages for databases. I guess those are the two that I find most interesting.
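The structuring idea behind Monads can be shown with a minimal sketch: a "bind" operation that chains computations which may fail, short-circuiting on failure. This is a Python illustration of the pattern only, not the F# or Haskell machinery Wadler is referring to.

```python
# A minimal sketch of the monad pattern: an Option/Maybe-style "bind"
# that chains computations which may fail, represented here by None.

def bind(value, fn):
    """Apply fn to value unless the computation has already failed (None)."""
    return None if value is None else fn(value)

def parse_int(s):
    try:
        return int(s)
    except ValueError:
        return None  # failure propagates through every later bind

def reciprocal(n):
    return None if n == 0 else 1.0 / n

# Chain the steps; a failure at any stage short-circuits the rest.
result = bind(bind("4", parse_int), reciprocal)     # 0.25
failed = bind(bind("oops", parse_int), reciprocal)  # None
```

The point of the pattern is that the error-handling plumbing lives in `bind`, so each step stays a plain function.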


9. Now do you still work with Haskell?

From time to time, yes, I teach it to my students.


10. What are you mainly working on? What is the area that really gets you interested?

What I have spent a lot of time over the last few years looking at is something called the "Blame Calculus", and the purpose behind it is to take an untyped language like Scheme and a typed language like Haskell and make the two interoperate. To do that, what you basically want to do is take the types and turn them into checks that run automatically on the untyped functions, so when you pass something untyped into the typed language you can guarantee that it has the properties that you expect, or raise an error message automatically if it does not. The person writing the typed code does not have to keep writing tests by hand for the untyped code. These days that has become increasingly interesting, because you have untyped languages like JavaScript and typed languages, like the different .Net languages, F# or C#, or like Java, that need to interoperate with them. I’m trying to work on the fundamental ideas, but I hope that they have a real impact, because there is a real need right now to get dynamically typed and statically typed languages to play well together. That is one thing that I’ve been working on.
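The core move, turning a type into an automatic check at the typed/untyped boundary, can be sketched as a wrapper that checks a function's argument and result and assigns blame on violation. The names here (`cast`, `Blame`) are invented for illustration; this is not the Blame Calculus itself, only the runtime-check idea behind it.

```python
# Sketch of the boundary-checking idea behind the Blame Calculus: a cast
# wraps an untyped function with checks derived from a type, so the typed
# side can rely on the contract and a violation is blamed on the untyped side.

class Blame(Exception):
    """Raised when the untyped side violates the declared type."""

def cast(fn, arg_type, ret_type, label):
    """Wrap an untyped function so its argument and result are checked."""
    def checked(x):
        if not isinstance(x, arg_type):
            raise Blame(f"{label}: bad argument {x!r}")
        result = fn(x)
        if not isinstance(result, ret_type):
            raise Blame(f"{label}: bad result {result!r}")
        return result
    return checked

# An "untyped" function, imported as if from a dynamically typed language.
def untyped_double(x):
    return x + x

# The typed side sees a function checked against int -> int.
typed_double = cast(untyped_double, int, int, "untyped_double")
print(typed_double(21))
```

Passing a string instead of an int raises `Blame` at the boundary, so the typed code never sees an ill-typed value.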

Another thing that I’ve been looking at recently, and the talk that I gave here was focused on it, is the idea that Lambda Calculus was discovered twice: once by a logician, actually by two logicians, one interested in it as logic and one interested in it as a programming language, which is pretty amazing because computers didn’t exist back in the 1930s and 1940s when it was developed. The fact that it was discovered independently twice gives you real assurance that it’s a good idea that will stand the test of time. Now, concurrency is very important, and what you’d like to do is find a language for concurrency that was discovered independently twice, because that would give you some assurance that this is a very fundamental way of going about things. People have been looking for such things for a long time. A very important step was when Jean-Yves Girard developed something called Linear Logic, and from the very beginning people said "This has something to do with concurrency."

A little bit after that, people came up with something called Session Types, and they use some of the notations from Linear Logic; there was some clear overlap, but it did not look like exactly the same thing, it was just related somehow. There is some old work by Samson Abramsky saying "Well, you can make this sort of look like Pi Calculus", and some very recent work by Luis Caires and Frank Pfenning and others saying "Actually, we can make it look just like the session types that people thought had something to do with Linear Logic; now you can get a much closer match." I find that very exciting, so I’ve been doing some research in that area, because if we can end up having something that deals with concurrency and comes both from the programming side and from the logic side, that could be very powerful.
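To make the session-type idea concrete: a session type prescribes the sequence of sends and receives a channel must follow. The toy checker below, entirely invented for illustration, enforces a protocol given as a list of (action, type) steps and fails on any deviation, which is what a session type system would rule out statically rather than at run time.

```python
# Toy illustration of session types: a channel whose allowed operations
# are dictated by a protocol, a list of ("send" | "recv", type) steps.

class ProtocolError(Exception):
    """Raised when an operation deviates from the session protocol."""

class Channel:
    def __init__(self, protocol):
        self.protocol = list(protocol)
        self.inbox = []  # loopback buffer, just for demonstration

    def send(self, value):
        self._step("send", type(value))
        self.inbox.append(value)

    def recv(self):
        value = self.inbox.pop(0)
        self._step("recv", type(value))
        return value

    def _step(self, action, value_type):
        if not self.protocol:
            raise ProtocolError("protocol already finished")
        expected_action, expected_type = self.protocol.pop(0)
        if action != expected_action or not issubclass(value_type, expected_type):
            raise ProtocolError(f"expected {expected_action} of {expected_type}")

# Protocol: send an int, then receive an int back, then stop.
ch = Channel([("send", int), ("recv", int)])
ch.send(5)
reply = ch.recv()
```

A real session type system performs this bookkeeping in the type checker, so a program that sends twice, or sends the wrong type, simply does not compile.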


11. Are there any languages that implement this for the concurrency?

Well, there are lots and lots of interesting programming languages for concurrency coming out of the theory community. There are fewer mainstream languages based on this, but a lot of it is rooted in something designed by Robin Milner called the Pi Calculus, and many ideas from Pi Calculus have made it into various things, including XML languages for orchestrating processes. Session types per se have shown up in Microsoft's Singularity project, but less so in other mainstream projects.


12. When we talk about functional programming concepts that are coming into the mainstream, we also talk about continuations. Is there a mathematical model behind continuations, and what is it?

There are some very nice, simple models of continuations going back to their origins, due to Gordon Plotkin and John Reynolds and others. That was just looking at continuations as a programming language feature, and there is a lot of interest in that. I mentioned this idea that Lambda Calculus is wonderful because it was invented twice, once by Gentzen and once by Church, and that particular correspondence is for something called Intuitionistic Logic, which does not have the law of the excluded middle and does not have the law of double negation: you have A implies not not A, but not conversely. In classical logic, of course, you have it both ways around. There is a lot of interesting work developing continuations, and a researcher named Tim Griffin had a look at one of the popular operations for continuations, something called "call/cc". He wrote out the type of it, stared at it for a moment, and said: "Wait a minute, that is Peirce's Law", which is one of the laws that underlies classical logic. So there are various laws, the law of the excluded middle, the law of double negation, Peirce's Law, any of which you can add to intuitionistic logic to get classical logic.

So all of a sudden there was this huge line of research on how classical logic corresponds to a programming language, with many interesting ideas, one of them called the Lambda Mu Calculus, with work by Curien [and others], and I did some work on that as well; it turns out that you can get a particularly nice correspondence. When you learn classical logic, it’s all done in terms of "and" and "or" and De Morgan’s laws, which relate "and" and "or" via negation. You tend not to use implication, because implication has no dual. So instead you use the fact that A implies B can be defined as not A or B. Implication in the logic corresponds to functions. Not A, it turns out, corresponds to continuations, and so the right way to understand continuations in terms of classical logic is to say: "Right, you take continuations as primitive and you define functions in terms of them, rather than the other way around."
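The notion of a continuation as "the rest of the computation" can be made concrete with continuation-passing style, where each function takes an extra argument k representing what happens next. This is a Python sketch of the programming-language view of continuations, not of the logical correspondence itself.

```python
# Continuation-passing style (CPS): each function takes a continuation k,
# "the rest of the computation". Returning a value directly instead of
# calling k, as call/cc allows, abandons all pending work: product_cps
# skips every queued multiplication when it sees a zero.

def product_cps(xs, k):
    """Multiply the elements of xs, passing the result to continuation k."""
    if not xs:
        return k(1)
    if xs[0] == 0:
        # Escape: return 0 directly, discarding the continuation k
        # and all the multiplications it was going to perform.
        return 0
    return product_cps(xs[1:], lambda rest: k(xs[0] * rest))

result = product_cps([2, 3, 4], lambda x: x)  # 24
short = product_cps([2, 0, 4], lambda x: x)   # 0, no multiplications run
```

The escape on zero is the essence of what operators like call/cc provide: the ability to discard or re-enter the rest of the computation as a first-class value.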

And one thing that falls out of this is that there are two main ways of evaluating a functional language: call by name or call by value. One of the things that emerged from this line of work, and that I was able to write down in a particularly clear form, is that under De Morgan duality, call by name and call by value are exact duals, which was quite a surprise when it was first discovered. You have these lovely things that just fall out of nowhere, and that is how you know you are really on to something: when you bump into an unexpectedly beautiful relationship. That’s what should be at the core of the development of all systems.


13. That’s what should be at the core of all information systems. So can you talk about the Links project that you mentioned earlier?

Links is a project that was less motivated from the theory side and more motivated from the side of saying: "There's this real problem out here. People need to program the web; how do you go about doing it?" One of our routes into that was research that had been done by Christian Queinnec and others, again dealing with continuations, saying that the right way to view responding to a web page is to structure it in terms of continuations, which gives you the inversion of control that makes it much easier to build these kinds of interactive applications. This was back in the Web 1.0 days, not the JavaScripty Web 2.0 days. So we used that as one route to make it easy to do interactive web applications. Then, really, what the web is all about is tying it to databases, so we wanted tight integration of databases into our programming language, and the notion is that instead of writing queries in SQL that are generated by programs written in Java, we just write all our programs in Links, and parts of the programs then get compiled into queries. One of the interesting things was working out how you make it clear which bits are suitably simple to be compiled into queries and which bits can't be.

So it turns out that one of my students, Ezra Cooper, and others discovered that a very old type-system idea, effect systems, could be applied to this exactly, and with Sam I'm still working out the ramifications of this, but you can very simply describe which bits of a programming language can be converted into SQL. These days this idea of saying "Right, I'm going to write my program in one language but then compile it to many targets" is becoming very important, because you might want to compile it to multiprocessors, you might want to compile it to GPUs, you might need to compile part of it to run in the database. So learning fundamental things about how to integrate programming languages with all the other things we might be using, I hope, will turn out to be of value in the long term.
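The "compile the suitably simple bits into queries" idea can be sketched as a toy translator: a query written as ordinary program data is turned into SQL text. The mini representation below (table name, a single (field, op, value) predicate, a column list) is invented for illustration and bears no relation to the actual Links implementation.

```python
# Toy sketch of the Links/LINQ idea: represent a simple filter-and-project
# query as program data, then compile it into an SQL string.

def compile_query(table, predicate, columns):
    """Translate one filter-and-project query into SQL text."""
    field, op, value = predicate
    cols = ", ".join(columns)
    # Quote string literals; leave numbers as-is.
    literal = repr(value) if isinstance(value, str) else str(value)
    return f"SELECT {cols} FROM {table} WHERE {field} {op} {literal}"

sql = compile_query("employees", ("salary", ">", 50000), ["name", "dept"])
print(sql)  # SELECT name, dept FROM employees WHERE salary > 50000
```

The hard part Wadler describes is the inverse judgment: using an effect system to decide, in the types, which fragments of a general-purpose program are simple enough to be translated this way at all.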


14. Philip, can you tell us about three books that you find interesting?

There are many books that I find interesting; I love reading fiction, I love reading science fiction, I love reading popular science. One book that I really enjoyed was "The Selfish Gene", and the whole notion of how evolutionary theory works is very interesting. It connects a lot to computing; in one of Dawkins's later books he used computer models of evolution to try to understand these things. So I really liked that book, and I particularly liked how he tried to write it both to explain what he was doing to his fellow scientists and to explain it to the general public at the same time. I think it is a great idea to have both audiences in mind simultaneously. Another book that I really liked along those lines was "Guns, Germs, and Steel" by Jared Diamond, which talks about human history in a very broad scope: Europe seems to have ended up technically ahead of the rest of the world, and how did that happen? He tracks down the broad swathes of history with some understanding of what is going on there.

Again, both are directed at experts and at a broad audience. I guess I have to pick one science fiction book, so I’ll pick Charles Stross; he wrote a book called "Halting State", which I really loved because it was all about information technology, and there is a lot of use of what we now call "Augmented Reality" in the book. It was also set in Edinburgh, so it was set where I was living, in a future that my colleagues are helping to create, and he is a great writer, so I really enjoyed that.

May 03, 2012