Tech Mesh, the alternative programming conference, focuses on promoting useful non-mainstream technologies to the software industry. The underlying theme is "the right tool for the job", as opposed to automatically choosing the tool at hand. By bringing together users and inventors of different languages and technologies (new and old), speakers will get the opportunity to inspire, to share experience, and to increase insight.
Werner: We are here at Tech Mesh 2012 in London, and I’m sitting here with David Nolen. David, today you’re going to give a talk called “Post-Functional Programming”. Post-functional: what’s that about? Functional is the future; there’s nothing after that.
Yes, I picked a controversial title on purpose. For me this conference has been really great, because there is a real love of functional programming here, whether it’s a dynamically typed functional language like Erlang, or whether it’s Haskell. It’s really great to see people excited about solving engineering problems with functional programming; I think it does simplify lots of things. The idea behind the talk, though, is that there are two kinds of programming, and people often say they like functional programming because it is more declarative: the way you write the code focuses more on the “what” versus the “how”.
But even with functional programming, what you encounter is that there’s often a lot of “how” involved. There’s a really great talk from Guy Steele whose subtitle is “Why foldl and foldr are considered harmful”, and his point is that functional programming, because it comes from recursion theory and has strong foundations in logic and induction, has encoded into our functional programs this notion of sequence. That is really bad when you want to build a truly concurrent application: you really don’t want to be too specific about the order of evaluation, but that’s where the “how” creeps into our functional programs. So part of my talk is that the logic programming community, because they are a little bit further down the declarative path, tend to write code that is more amenable to concurrency optimizations, and I don’t think it’s a coincidence that Erlang started in Prolog. There is something about a purely declarative Prolog program where you look at it and time doesn’t even seem involved in the code; it seems like “Oh, we could start anywhere and it’s still going to work out”. I think that’s really powerful, and part of my talk is getting functional programmers to look at this stuff, because with functional programming we’ve found an excellent “how” language, and I think we can actually use functional programming as a foundation for something higher-level: we can use functional programming to implement logic programming ideas.
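Steele’s point about sequence can be made concrete with a small sketch (illustrative Python, not code from the talk): `foldl` and `foldr` each bake a fixed evaluation order into the program, while an associative operation permits an order-free, tree-shaped reduction whose halves could run concurrently.

```python
def foldl(f, init, xs):
    """Left fold: ((init . x0) . x1) . x2 ... -- a fixed left-to-right order."""
    acc = init
    for x in xs:
        acc = f(acc, x)
    return acc

def foldr(f, init, xs):
    """Right fold: x0 . (x1 . (x2 . init)) ... -- a fixed right-to-left order."""
    acc = init
    for x in reversed(xs):
        acc = f(x, acc)
    return acc

def tree_reduce(f, xs):
    """Order-free reduction: combine adjacent pairs, halving each pass.
    Only valid when f is associative; each pass could run in parallel."""
    while len(xs) > 1:
        pairs = [f(xs[i], xs[i + 1]) for i in range(0, len(xs) - 1, 2)]
        if len(xs) % 2:
            pairs.append(xs[-1])  # carry an odd leftover element forward
        xs = pairs
    return xs[0]

xs = list(range(1, 9))
# For a non-associative op the two folds disagree: the "how" (order) leaks in.
print(foldl(lambda a, b: a - b, 0, xs))       # -36
print(foldr(lambda a, b: a - b, 0, xs))       # -4
# For an associative op any evaluation shape gives the same answer.
print(tree_reduce(lambda a, b: a + b, xs))    # 36
```

The two subtractions differing is exactly the encoded sequence Steele warns about; the addition working under any bracketing is what leaves the runtime free to parallelize.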
Yes, and not just Prolog; I think the area of logic programming is quite wide. I’ve been working on a library that tries to embed some of the semantics of Prolog; the library is called core.logic, for Clojure, and it’s about two years old now. A year ago I would have said, “Yes, it’s just Prolog”, but it’s now becoming a general constraint logic programming framework. There’s a language called Mozart/Oz which is really fantastic, and there’s an amazing book called “Concepts, Techniques, and Models of Computer Programming” which a lot of people say is the book you should read after “Structure and Interpretation of Computer Programs”, and I totally agree. For me, constraint logic programming has been very eye-opening: the types of problems you can solve with this much higher-level tool, and it’s much more descriptive. So my talk explores how you can play with these ideas in a functional programming context; you can take logic programming and constraints and just mix them up with functional programming, and I think it’s pretty cool.
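The core mechanism core.logic embeds from Prolog is unification over logic variables. A minimal sketch of that idea in the miniKanren style (illustrative Python; `Var`, `walk`, and `unify` are hypothetical names, not core.logic’s actual API): a logic variable is a placeholder, and unification extends a substitution until the two terms are equal or fails.

```python
class Var:
    """A logic variable; only its identity matters."""
    def __repr__(self):
        return f"_{id(self) % 1000}"

def walk(term, subst):
    """Follow bindings until reaching a non-variable or an unbound variable."""
    while isinstance(term, Var) and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst):
    """Return an extended substitution making a and b equal, or None on failure."""
    a, b = walk(a, subst), walk(b, subst)
    if a is b or (a == b and not isinstance(a, Var)):
        return subst
    if isinstance(a, Var):
        return {**subst, a: b}
    if isinstance(b, Var):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):           # unify element-wise, threading subst
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

q, r = Var(), Var()
s = unify((q, 2, r), (1, 2, 3), {})
print(walk(q, s), walk(r, s))   # 1 3
```

Because variables unify at most once, unification can run in either direction: the same call both matches and generates, which is where Prolog’s “run it backwards” flavor comes from.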
So, again, Prolog has this problem, which is that the foundation of Prolog is still about lists and sequences, and it’s very hard in Prolog, for example, to do declarative programming with arithmetic. They actually realized this in the ’80s: all that beautiful stuff they can do with data structures just doesn’t work with numbers. You add numbers to your program and suddenly all the declarative beauty of Prolog disappears; it becomes, again, very “how”-oriented programming. The solution they eventually came upon was a framework for building constraint languages and embedding them into Prolog to regain declarative programming. And this is an area of research that’s still very active, because there are still questions about how to make it efficient, and so on.
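The constraint idea that restored declarative arithmetic can be sketched very simply (illustrative Python, not any real solver’s API): each variable carries a finite domain of candidate values, and each constraint repeatedly prunes values that cannot appear in any solution, until nothing more can be removed.

```python
# A minimal finite-domain propagation sketch: `propagate`, the variable names,
# and the constraint encoding are all illustrative, not core.logic's API.

def propagate(domains, constraints):
    """Narrow each variable's domain to a fixed point.
    constraints: list of (var, keep) where keep(value, domains) says whether
    the value could still participate in a solution."""
    changed = True
    while changed:
        changed = False
        for var, keep in constraints:
            narrowed = {v for v in domains[var] if keep(v, domains)}
            if narrowed != domains[var]:
                domains[var] = narrowed
                changed = True
    return domains

# x, y in 0..9 with x + y == 10 and x < y, stated as pruning rules per variable.
domains = {"x": set(range(10)), "y": set(range(10))}
constraints = [
    ("x", lambda v, d: any(v + w == 10 for w in d["y"])),
    ("y", lambda w, d: any(v + w == 10 for v in d["x"])),
    ("x", lambda v, d: any(v < w for w in d["y"])),
    ("y", lambda w, d: any(v < w for v in d["x"])),
]
propagate(domains, constraints)
print(sorted(domains["x"]), sorted(domains["y"]))  # [1..8] and [2..9]
```

Note that propagation alone only shrinks the domains (here to x in 1..8, y in 2..9); a search step is still needed to pin down a single answer, which is where the strategies discussed below come in.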
Things I think are worth looking at that actually have nothing to do with Prolog are things like Gecode. Gecode is a C++ constraint programming framework that you can use. There’s another one, I’m not sure if I’m saying it correctly, called JaCoP, which does the same thing for Java. The difference between core.logic and those tools is that core.logic is designed to be deeply integrated with Clojure, so there are a lot of conveniences there that you wouldn’t get from these other constraint solvers.
So, again, with the “what” and the “how”: what’s fascinating about this is that we can program in a language that is about the “what”, and the more we understand about the “what”, you’ll encounter problems where you say, “Well, I can describe the ‘what’, but the default strategy is very slow”, and then you’re like, “Ah, this constraint logic programming thing is a scam, this doesn’t work”. And a ton of research has been put into exactly this: we want to take the declarative specification of what we want, and then you as the programmer can come with domain-specific information. We don’t want to change the specification; we actually want to leave it alone. But maybe there’s a hook in the solver where you can stick in a little bit more information about what you want the solver to look at when it tries to run the problem. Again, this is what we want: leave the description alone, and maybe tweak the strategies that are at play.
That is very much true, and this is again why I brought up Mozart/Oz. If you look at Mozart/Oz, they show you how you can write the programs and they work; and what’s beautiful is that once you’ve got a working program, they show how, if you think a little bit about the domain, you can add a little bit more information to make the solution run faster. There’s a whole suite of strategies, so you do have to learn which strategies to tweak for a given problem. But it still feels better, because the strategies are often themselves declarative components; it’s a lot less “how” and still more “what”. A common case is this: say you have some set of constraints over numbers, and these variables range over certain domains. The constraint propagator will eliminate some values, and often, when you need to start searching for an answer, what you want to say is, “Let’s look at the variables that have the smallest number of possibilities first”, because assigning those will constrain other variables, and their domains will shrink much faster. That strategy is called “fail first”: search over the variables that have the smallest number of possibilities. It’s one of the simplest effective strategies, and you often see results like, “Oh, it’s seven times faster now”, from that tiny bit of information you added.
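The fail-first strategy described above can be sketched as a tiny backtracking search (illustrative Python; `solve` and the toy problem are made up for this example, not a real solver): at each step it branches on the variable with the fewest remaining values.

```python
def solve(domains, ok):
    """Find one assignment satisfying ok, branching fail-first.
    domains: var -> set of candidate values; ok: predicate on a full assignment."""
    unbound = {v: d for v, d in domains.items() if len(d) > 1}
    if not unbound:
        assignment = {v: next(iter(d)) for v, d in domains.items()}
        return assignment if ok(assignment) else None
    # Fail first: pick the variable with the smallest remaining domain,
    # so contradictions are discovered as early as possible.
    var = min(unbound, key=lambda v: len(domains[v]))
    for value in sorted(domains[var]):
        attempt = dict(domains, **{var: {value}})   # try binding var := value
        result = solve(attempt, ok)
        if result is not None:
            return result
    return None                                     # exhausted: backtrack

# Toy problem: x + y == 10 and x < y, with x in {2, 3} and y in 0..9.
# x is branched on first purely because its domain is smallest.
domains = {"x": {2, 3}, "y": set(range(10))}
print(solve(domains, lambda a: a["x"] + a["y"] == 10 and a["x"] < a["y"]))
```

A real solver would interleave propagation with this search instead of testing only complete assignments, but the heuristic itself, “smallest domain first”, is the whole of fail first.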
Werner: So in essence in the future we will all be database administrators or database optimizers because…
That’s partially true. Again, my experience is that there is a ton of work to be done on the solver side, but the declarative specification is actually quite nice, because we live in a world where, as software engineers, we have to work with people, with experts, and the experts really understand their domains, whereas we as engineers try to write programs that solve those problems for them. So we are kind of allowed to focus on the thing that we are good at: algorithms and data structures. All that stuff still has to be worked on, and yet it’s all working towards a specification, hopefully simple, perhaps even complex, that is declarative and not polluted with how it’s going to work.
Werner: And I guess this also features in Rich's work on Datomic essentially, where they use Datalog.
No, not at all. So, again, that’s what’s so exciting for me about the logic programming space: there are a lot of approaches, and it’s good to understand the tradeoffs involved in each. Datomic is a database, so what they really want is to be set-oriented: they want to say, “I want the tuples that match this”, and they don’t want to iterate over them, they just want to lift out the whole set of matching things, exactly like relational databases do. With something like Prolog there’s a different set of goals in mind; you might want to generate information. You’re not going to use Datomic to do constraint solving; that might be something you really want to do with core.logic. So it’s just a different set of tradeoffs for the types of problems you might want to solve.
I mean, a lot of the strategies are… so again, if you’re into Prolog, you would have had some experience with this; most modern Prologs actually come with constraint logic programming facilities. Then you have things like branch and bound: you want to minimize, you want to maximize, you want to fail first. The point of my talk is that if you haven’t heard about this stuff, it’s really cool, and it’s worth digging in to get familiar with some of these ideas.
So Mozart/Oz is really cool. It’s much more: Mozart/Oz is what happens when you get really crazy Prolog hackers to sort of reinvent programming. One of the people who worked on Mozart also worked on this thing called Aquarius, which is pretty fascinating: an experiment in the early ’90s to see if they could get C-competitive performance out of Prolog. They were actually able to do this, and there are very powerful commercial Prolog compilers that generate very good code based on that research. So they went off and said, “Can we have a truly multi-paradigm language? Can we take some of the lessons from Prolog, from object-oriented programming, from dataflow languages, and put them in a cohesive conceptual framework?” Mozart works that way: you can do OO programming, you can do functional programming, and at the same time you can do this sort of logic programming and constraint logic programming. And Mozart/Oz, like Erlang, sort of emphasizes distributed computing. But again, it’s nice because there’s a declarative essence to the language, and it’s cool how they were able to build these paradigms on top of that.
I am not exactly sure what the name refers to. I know there may have been a prior language called Mozart, and then, because it started pretty early, I think in the early ’90s, it evolved into Mozart/Oz, but I don’t know exactly where that name came from.
Werner: We’ll just googlify it.
Yes, I am stealing from it right now.
Yes. There are so many great ideas in there, and the problem with Mozart, I think, is just that not a lot of people know about it. People will read the book, and the book is incredible; you don’t actually have to really get into Mozart/Oz to appreciate the text. But it seems unfortunate to me that I don’t have access to those powerful primitives in the programming languages I’ve been using today, and part of why I work on core.logic is that Clojure has some draw, and I’m trying to get people to say, “Hey, try this stuff out in this language that you’re having fun with; this is really cool stuff”, in the hope that people will continue to spread these ideas to other languages out there and use them.
So the dataflow stuff is sort of implicit in Prolog; dataflow falls out of the fact that you have logic variables and they unify once, so you get deterministic behavior. I don’t think we’ll ever get to what Mozart/Oz has: it has Erlang’s sort of super-lightweight thread and process model, and Clojure, being hosted on the JVM, really doesn’t. The JVM is getting there, and we can do some things with Fork/Join, but the runtime just isn’t designed that way. Haskell has super-lightweight processes, Erlang has super-lightweight processes, Mozart/Oz does; but I think you’ll still be able to get a lot of the benefits of dataflow if you use core.logic in your programs.
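The “unify once” determinism can be illustrated with a single-assignment dataflow variable in the Mozart/Oz spirit (a minimal Python sketch using OS threads rather than lightweight processes; `DataflowVar` is a made-up name, not an API from core.logic or Mozart): readers block until the variable is bound, and it can be bound exactly once.

```python
import threading

class DataflowVar:
    """Single-assignment cell: get() blocks until bind() has happened,
    and bind() may only ever happen once."""
    def __init__(self):
        self._bound = threading.Event()
        self._value = None

    def bind(self, value):
        if self._bound.is_set():
            raise ValueError("dataflow variable is already bound")
        self._value = value
        self._bound.set()   # wake every blocked reader

    def get(self):
        self._bound.wait()  # block until some thread binds the variable
        return self._value

x = DataflowVar()
results = []
reader = threading.Thread(target=lambda: results.append(x.get() + 1))
reader.start()   # the reader blocks: x is not bound yet
x.bind(41)       # binding wakes the reader
reader.join()
print(results[0])   # 42
```

Because a bound variable never changes, the program’s result is the same no matter how the threads interleave; that order-independence is the determinism being described, even without a lightweight process model underneath. (The check-then-set in `bind` is not atomic; a production version would guard it with a lock.)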
Werner: So I think that we have a lot to read about, the library is core.logic, we’ll just googlify that and thank you very much, David Nolen.