BipIO Goes Beta - An Interview With Michael Pearson
BipIO is a light-weight open source IPaaS which lets you visually interconnect various cloud services as micro-apps or personal workflows. After running as a private beta, BipIO recently opened up to the public. We spoke with Michael Pearson, Founder and Technical Lead of BipIO, about the platform and his experience developing it using NodeJS and lots of public APIs.
InfoQ: Congratulations on opening up BipIO to a public beta. Can you give us an overview of what BipIO does?
MP: Thanks, I'm thrilled the project is up and running! BipIO is a light-weight open source Integration Platform as a Service (IPaaS) for orchestrating your digital self. It lets you visually interconnect various cloud services as micro-apps or personal workflows, which can optionally be shared and improved upon socially. It was inspired by Yahoo Pipes but goes beyond RSS feed transformation, letting you leverage a wide variety of APIs for stream processing and personal content curation and production, with new ones being added every week. If there's a service you love that isn't yet supported, no worries: the project is open to contributors.
It uses a concept of 'bips', which are ephemeral, graph-based pipelines. Each vertex in a graph is an external API call, and messages are transformed across edges, either explicitly or as a learnt behavior. The best thing about the approach is its versatility: vertices are hot-swappable or extensible with minimal fuss, whether the 'bip' is handling content from an incoming web-hook, email or event trigger. It's something you can just play with and get to know without enterprise overhead, but if you want to get serious, you can. It's built for People and Robots: people and their programs.
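That graph model can be sketched in a few lines of JavaScript. This is a minimal illustration only; the vertex names, `edges` layout and `run` function are invented for this sketch and are not BipIO's actual data format:

```javascript
// Minimal sketch of a 'bip'-style pipeline. Vertices stand in for external
// API calls (channels); edges carry a transform applied to messages in
// transit. All names here are illustrative, not BipIO's real wire format.
const bip = {
  vertices: {
    source:    msg => msg,
    uppercase: msg => ({ ...msg, subject: msg.subject.toUpperCase() }),
    publish:   msg => `[${msg.subject}] ${msg.body}`
  },
  // [fromVertex, toVertex, transform applied between them]
  edges: [
    ['source', 'uppercase', m => m],
    ['uppercase', 'publish', m => ({ ...m, body: m.body.trim() })]
  ]
};

// Depth-first walk: run a vertex, then feed its transformed output into
// every adjacent vertex. Leaves yield the pipeline's final outputs.
function run(graph, vertex, msg) {
  const out = graph.vertices[vertex](msg);
  const outgoing = graph.edges.filter(([from]) => from === vertex);
  if (outgoing.length === 0) return [out];
  return outgoing.flatMap(([, to, transform]) => run(graph, to, transform(out)));
}
```

Hot-swapping a channel then amounts to replacing one function in `vertices`, leaving the edges and transforms untouched.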
InfoQ: You've created connectors (Pods) for about 40 different APIs now; are there any lessons learned from that? Is API design improving, or is it still all over the place, as George Reese lamented in his API Design book?
MP: The nice surprise has been that the API experience across the providers BipIO so far supports has been remarkably consistent. Clearly every provider has their quirks but I think at this point most developer teams understand REST, security, OAuth (mostly), streaming, realtime, the web gamut and are driving towards interoperability because it’s such a natural evolution. It’s been good to see some normalizing forces at work.
The integrations I've done, however, have been 100% hand-made, because there are so few machine-ingestible schemas. API-first is a cultural shift for many organizations, and there's invariably some divergence between the product and engineering sides on how they can deliver the most value in this new industry facet. So while APIs have been productized and are generally a joy to work with, the important work of automatic discovery hasn't really been solved yet. For example, just having a RAML file or JSON-Schema that could be imported (like MuleSoft's RAML Notebook) would have made things so much easier. There's tooling, but it's not ubiquitous, so that's a larger challenge.
That's not a glowing endorsement across the board: there were a few providers I had problems with, or never ended up integrating because they were so far outside the norm. But overall I don't think things are getting any more chaotic.
The nice thing about standards is that you have so many to choose from.
- Andrew S. Tanenbaum
InfoQ: One feature of BipIO is that you can download and run it on-premise. Why have you chosen to do that? Are you seeing much adoption there or is the bulk of the utilization via the SaaS platform?
MP: The server is open source and downloadable because it's the kind of product I wanted to use myself: something that gives you the option of making things personal if you want, taking total ownership of data and security, but that you can also build a rapid application prototype on with minimal overhead and investment. Being GPLv3 open source aligns well with that goal. BipIO sits somewhere in between being a platform for makers and experimenters and solving real problems. There's about a 50/50 split between self-hosted installations and users running through the public site, with several power users really testing its limits (hundreds of thousands of messages each month) and providing valuable feedback. The platform becomes more useful the more people share their bips and message transforms, so the focus now is on cultivating a social aspect to API integration and making the SaaS platform more broadly appealing.
InfoQ: Your implementation uses NodeJS. We've all heard cautionary tales about callback hell. What's been your experience with that platform?
MP: NodeJS was a natural choice for building a light-weight network application like BipIO, so I've focused on exploiting what it does best without trying to make it a silver bullet for everything product related. The pipe-and-filter pattern just fits cleanly into NodeJS's core design principles. For instance, early on the graphs were resolved using node's native pipes, with external APIs (channels) being potential single points of I/O; it took no time to swap in RabbitMQ when parallelism became a problem, with no change to the data structures. I love that flexibility. The async nature of the node runtime presented its own challenges, as it's a relatively new addition to the web development toolkit, but there are plenty of control-flow libraries around, such as async and Q (promises), which make the application structure easier to debug, measure and optimize. Generally I have had a lot of fun with it.
It has a great, active developer community and modules for everything, making it very easy to get something up and running. The huge variety of modules has been a bit of a double-edged sword, though, and it's been some effort to curate the best modules for the server, since many aren't suited to production environments. I hope that by giving real-world examples and feedback to package maintainers we can all help turn NodeJS into a very legitimate web application platform.
InfoQ: Your API is split into two parts: one RESTful and the other RPC. What's the reason for that? Is it architectural on your side or a usability thing?
MP: The server was built with an ‘API First’ mindset, so that has largely driven the design choices. The public site/dashboard for instance was built a fair way after the server software and is just a very thin client which uses all the endpoints provided by the server. It’s how users are able to ‘mount’ BipIO servers from a browser even if behind a firewall or across a VPN.
I think it's important not to conflate paradigms when designing an API, as that drives the kind of standards fragmentation everyone is working hard to avoid. By supporting both REST and RPC, the API conveys semantically the right approach for a resource when building a client. The REST endpoints are core system resources; they behave in a consistent way and there are really no surprises. RPCs, on the other hand, may have more complex flows, which a browser can generally follow to resolve a resource; 3rd-party OAuth negotiation is a great example of where a RESTful approach would break down very quickly. In many cases, the RPCs the server provides encapsulate or proxy 3rd-party API calls, which are outside BipIO's domain of responsibility. As machine-comprehensible schemas become more standardized and clients can be built automatically, the semantics of REST vs. RPC will have less relevance than they do now.
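The split might be pictured as two route tables with different client contracts. The paths and handlers below are invented for illustration and are not BipIO's actual endpoints:

```javascript
// Hypothetical route tables contrasting the two styles.
const restRoutes = {
  // Core resources: uniform verbs, predictable behavior, no surprises.
  'GET /bips':     () => ({ bips: [] }),
  'POST /bips':    body => ({ created: body.name }),
  'GET /channels': () => ({ channels: [] })
};

const rpcRoutes = {
  // Multi-step or proxied flows that don't map cleanly onto a resource,
  // e.g. kicking off a third-party OAuth dance in the browser.
  'GET /rpc/oauth/auth': () => ({ redirect: 'https://provider.example/authorize' })
};

// A client can treat the two groups differently: REST calls are safe to
// generate mechanically, while RPC calls may need a browser in the loop.
function dispatch(method, path, body) {
  const handler = restRoutes[`${method} ${path}`] || rpcRoutes[`${method} ${path}`];
  if (!handler) throw new Error('404');
  return handler(body);
}
```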
InfoQ: BipIO has a nice graph processing structure. Were you inspired by Apache Storm?
MP: Actually that's very astute: yes, it was! The first iteration of the server was built on Pylons/Storm/Postgres a few years ago as a personal tool, and I was having trouble modelling my vision of dynamic pipelining with Storm's static topologies. As I was working with NodeJS and RabbitMQ every day anyway, I decided to take a leap and prototype a distributed graph system on a completely different stack. BipIO doesn't try to be a stand-in for Storm, or Kafka, or anything like that; it's specifically light-weight, to lower the opportunity cost for developers who don't want or need to plan elaborate infrastructure.
The graph system utilises the pipe-and-filter pattern, where the outputs of channels are transformed across adjacent edges as inputs to their downstream descendants. A graph is also very easy to code a UI for, and compact enough to view a large workflow on small devices. BipIO does this by abstracting and normalizing the API resources of all supported providers into a set of import/export JSON-Schemas, so all you have left to visualize are the connections between the channels themselves. Developers can use these ad-hoc standards without knowing the underlying implementation of an API. It's these schemas and transforms making up the graphs that BipIO learns over time, to provide best-guesstimate message transforms for novice users.
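A schema-driven transform between two channels could look something like the sketch below. The schemas, field names and the naive matching heuristic are simplified inventions for illustration; BipIO's actual learning draws on the transforms users have shared:

```javascript
// Simplified import/export JSON-Schemas for two hypothetical channels.
const emailExports = {
  properties: { subject: { type: 'string' }, body: { type: 'string' } }
};
const tweetImports = {
  properties: { status: { type: 'string' } }
};

// A naive 'guesstimate': map each import field to the first export field
// of the same type. (A learnt transform would instead be the mapping most
// often chosen by other users for this pair of channels.)
function guessTransform(exportsSchema, importsSchema) {
  const mapping = {};
  for (const [field, spec] of Object.entries(importsSchema.properties)) {
    const match = Object.entries(exportsSchema.properties)
      .find(([, s]) => s.type === spec.type);
    if (match) mapping[field] = match[0];
  }
  return mapping;
}

// Apply a mapping to a concrete message from the source channel.
function applyTransform(mapping, message) {
  const out = {};
  for (const [toField, fromField] of Object.entries(mapping)) {
    out[toField] = message[fromField];
  }
  return out;
}
```

Because every channel is described the same way, the UI only has to render field-to-field connections, never provider-specific payloads.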
InfoQ: The leading edge of the API market seems to be shifting from providing to consuming. How do you see that panning out in the next few years?
MP: APIs have evolved into products in their own right, and if not outright then as parts of larger core products, where a well-documented, accessible API augments and enhances an existing service. With the productization of APIs and their increasing accessibility and user-friendliness, a developer can mash together content from different services to create something greater than the sum of its parts with relatively little effort, and people more generally are starting to understand that. Part of the broader movement towards consumption and automation has been enabled by the Internet of Things, because it puts the power of APIs directly in people's homes, where it's most contextual and relevant to their day-to-day. The natural progression from that point might be, 'well, why can't my IoT movement sensor alert me of a break-in through all my other channels at once?'. So for me there are a lot of convergent forces transforming APIs from development tools and B2B value-adds into a more organic type of marketplace. The more interesting aspect of that, I think, is that it's about exercising your options with a level of personal freedom, and taking control.
With BipIO I'm trying to facilitate a more consumer-oriented approach to APIs, making it a tool where you can actually take control of the content and services you care about and turn them into something more personal, useful and relevant. The project is really about bringing content, personal automation and data ownership closer to home. There's obviously the automatic, machine-discovery aspect of API consumption, but I think that's more of a developer concern (though the most difficult and ambitious to deal with). I'm looking forward to a future of API ubiquity, not just as a developer but as a consumer as well.
InfoQ: Michael, thanks for your time.
Steven Ihde, Karan Parikh, Mar 29, 2015