3scale Targets API Consumers with APItools Offering
Earlier this year, 3scale launched APItools, a product targeted at API consumers. InfoQ spoke to Vanessa Ramos, product and marketing lead for APItools at 3scale, regarding the motivation, underlying architecture, use cases, roadmap and community initiatives around APItools.
InfoQ: What is the motivation for APItools and who is the intended audience?
Today’s web and mobile applications depend on a wide range of backend APIs to function, but staying in control of these external systems is a huge pain point. To help application developers with this challenge, APItools lets them quickly and easily track and monitor all of an application’s API traffic. APItools is designed for API consumers, to help both during development and operations.
APItools is a backend proxy built using Lua + Nginx (OpenResty) that allows developers to track, transform and analyze API traffic.
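As the examples later in the interview show, an APItools middleware is simply a Lua function that receives the incoming request and a `next_middleware` callback. A minimal pass-through sketch, based on the shape of those examples, looks like this:

```lua
-- A minimal pass-through middleware: every APItools middleware is a Lua
-- function taking the request and a next_middleware callback.
return function(request, next_middleware)
  -- Inspect or transform the request here (headers, URI, body)
  local res = next_middleware()  -- forward the call to the API (or to the next middleware)
  -- Inspect or transform the response here before it is returned
  return res
end
```

Anything the middleware does before calling `next_middleware()` applies to the outgoing request; anything after applies to the response.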
InfoQ: Why is it a separate offering from the 3scale API management platform? Do they share the same underlying technology?
At this point, the needs and requirements of those who create and manage APIs on our platform, and many of the developers who then consume them, are quite different. So we developed a new product, APItools, that clearly targets what is now an underserved segment.
3scale and APItools share some foundational technologies like Redis and Nginx, but APItools' custom parts are brand new and built from the ground up. It’s an opportunity to reuse a common, strong core and extend it to meet some very specific needs.
InfoQ: What phases of the API lifecycle is it designed for and how are API consumers expected to use it? Can it be used in production?
First, it’s important to clarify that we are talking about the API lifecycle from an API consumer's perspective: specifically the exploration, integration and development, and production phases. APItools covers each phase to an extent, and is useful for testing and debugging while developing, but the primary focus is production.
InfoQ: How does it identify and categorize traffic from individual end users in production for troubleshooting?
APItools does not have a built-in concept of end users. It acts as an intelligent proxy, in the sense that it sees traces, does some (optional) work with them and then retransmits them, but it does not "pay special attention to user_ids". If someone wants to isolate the traffic of one particular user to study it, they’ll need that user's requests to contain identifying information; examples would be header fields with a token or the use of different URLs per user. APItools makes it easy to filter on existing fields, but it does not create new ones unless instructed to via a middleware.
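For instance, a short middleware could copy identifying information from the request into the response so that one user's traces are easy to filter. This is a sketch reusing the `request`/`next_middleware` API from the examples below; the `X-User-Token` and `X-Traced-User` header names are hypothetical:

```lua
return function(request, next_middleware)
  -- 'X-User-Token' is an assumed header name; use whatever identifying
  -- field your clients already send.
  local token = request.headers['X-User-Token']
  local res = next_middleware()
  if token then
    -- Echo the token into a response header so traces for this user
    -- can be filtered on an existing field.
    res.headers = res.headers or {}
    res.headers['X-Traced-User'] = token
  end
  return res
end
```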
InfoQ: What about out-of-the-box policies for API security, traffic management, mobile optimization, etc. for middleware? Is the community expected to contribute?
Yes. The first step we are taking in this direction is to launch an on-premise version. Our next milestone will be to create a middleware component sharing layer and to build a community around the middleware. We already have a nice collection of middleware, but as said, first things first: right now our next step is making it possible to use APItools in an on-premise as well as a hosted environment.
InfoQ: Why did you choose Lua as the language for defining the middleware policies?
InfoQ: Do you plan on exposing an API for APItools?
Yes, there will be an API. We are still deciding which data to expose, because we want to see what information developers would be most interested in consuming.
InfoQ: What are the use-cases for Active Docs?
There are two main use cases. The first one is a ‘visual curl’, which pre-fills the request structure and enables faster iteration by making calls and seeing the results. We’ve seen too many people editing 200-character curl commands with the arrow keys. The second is knowing how much of the surface area of an API is being used, and seeing real usage. For internal apps that communicate one to one, you have a console to reproduce the calls. Think of it as generating a (basic) GUI for internal APIs.
InfoQ: The most challenging task for API consumers is maintenance through continual updates to accommodate API changes. Can APItools ease code maintenance activities?
Definitely. For starters, you can set up email alerts to get a notification when, for example, HTTP(S) requests throw errors or when requests take longer than expected. Here's an example of a middleware that sends an email alert when an HTTP(S) response is not a 200:
return function(request, next_middleware)
  local five_mins = 60 * 5
  local res = next_middleware()
  local last_mail = bucket.middleware.get('last_mail')
  if res.status ~= 200 and (not last_mail or last_mail < time.now() - five_mins) then
    send.mail('email@example.com', "Something's going on...", "request is not a 200, see " .. request.uri_full)
    -- record the send time so we alert at most once every five minutes
    bucket.middleware.set('last_mail', time.now())
  end
  return res
end

APItools can also be used as a "legacy tool" or a "deprecation tool". Depending on how complex your API change is, you can implement fallbacks to previous versions in APItools itself.
For example, if in the new version of your API a field (user_id) has been changed, you could use a middleware to provide a "transition implementation":
return function(request, next_middleware)
  local res = next_middleware()
  -- Transform the user_id back to a string if the user requested version 1
  -- of the API, but add a deprecation warning too
  if request.headers['Version'] == "1" then
    local body_data = json.decode(res.body)
    body_data.user_id = tostring(body_data.user_id)
    body_data.deprecation_warning = "WARNING: Version 1 of the API will stop being supported on 2015-01-01. Please update to Version 2"
    res.body = json.encode(body_data)
  end
  return res
end
The advantage of this mechanism is that, instead of making changes directly on your servers, you can simply remove the middleware when version 1 of the API reaches end-of-life (EOL). You can also do more advanced things to warn about deprecations, like sending an email. Your main API code will be easier to maintain, since you can start on version 2.0 from the beginning. There is some managerial cost, since you would have code in two places, your API and APItools; the tradeoff depends on the complexity of the version change and the number of clients you estimate will remain on version 1. In the mid term, you will also be able to see activity related to a certain API, such as comments on an API's latest changes, patches, documentation updates, etc.
InfoQ: What are some upcoming features that will be of interest to the community?
Apart from the on-premise version, which we’ll be launching very soon, we are planning to build a middleware component sharing layer. With this we aim to make it easier for developers to consume APIs by letting them reuse or adapt components created by other developers to their own needs.