These are the news items I've curated in my monitoring of the API space that have some relevance to the API definition conversation, and that I wanted to include in my research. I'm using all of these links to better understand how the space is defining not just their APIs, but their schema, and other moving parts of their API operations.

05 Jul 2017
I saw a blog post come across my feeds from the analysis and visualization API provider Qlik, about their Qlik Sense API Insights. It is a pretty interesting approach to trying to visualize the change log and road map for an API. I like it because it is an analysis and visualization API provider who has used their own platform to help visualize the evolution of their API.
I find the visualization for Qlik Sense API Insights to be a little busy, and not as interactive as I’d like to see it be, but I like where they are headed. It tries to capture a ton of data, showing the road map and changes across multiple versions of sixteen APIs, something that can’t be easy to wrap your head around, let alone capture in a single visualization. I really like the direction they are going with this, even though it doesn’t fully bring it home for me.
Qlik Sense API Insights is the first approach I’ve seen like this to attempt to quantify the API road map and change log--it makes sense that it is something being done by a visualization platform provider. With a little usability and user experience (UX) love I think the concept of analysis, visualizations, and hopefully insights around the road map, change log, and even open issues and status could be significantly improved upon. I could see something like this expand and begin to provide an interesting view into the ever-changing world of APIs, and keep consumers better informed, and in sync with what is going on.
In a world where many API providers still do not even share a road map or change log I’m always looking for examples of providers going the extra mile to provide more details, especially if they are innovating like Qlik is with visualizations. I see a lot of conversations about how to version an API, but very few conversations about how to communicate each version of your API. It is something I’d like to keep evangelizing, helping API providers understand they should at least be offering the essentials like a road map, issues, change log, and status page, but the possibility for innovation and pushing the conversation forward is within reach too!
The new Microsoft Excel API has its own built-in chart resource, allowing you to drive visualizations from the spreadsheet API. I'm sure the suite of business focused visuals they provide by default will meet a few of the common needs of the average business user. However, knowing the appetite of the average Excel user for charts, graphs, and other visual eye candy, I predict someone is going to do well if they get to work providing a robust, plug and play, API-driven visualization solution on top of the spreadsheet API.
The leading analytics players like Tableau will be quick to serve the space, but I think that a more open approach using D3.js would do well. Please don't go thinking you have the latest startup idea, but I bet if you start crafting a set of D3.js visualization solutions that work easily with the Excel API, you will do well--especially if you start looking at delivering some niche solutions, beyond basic charts, which D3.js is well suited for.
I recommend taking a look at OAuth.io for the authentication portion of it, and deploying each solution as an individual GitHub repository. OAuth.io lets you handle the OAuth dance 100% client side, the Excel API will provide the data, and D3.js embedded on HTML pages will provide the eye candy. One of the most frustrating portions of reverse engineering D3.js visualizations is the data connector layer, and if you swap that out with a seamless Excel API browser--game over! #winning This is your revenue generator.
To make the Excel API + D3.js visualizations more embeddable and shareable, I recommend providing a caching option that would take a JSON snapshot of the spreadsheet data and allow it to be shared via link, embedded with a copy / paste, email, or any other common channel for collaboration. If done right, this embed and social sharing strategy would act as the marketing and word of mouth for the visualization solution.
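A minimal sketch of what that snapshot caching layer could look like, assuming the range values have already been fetched from the Excel API (the function name, output directory, and embed path are all hypothetical, not part of any real Excel API tooling):

```python
import hashlib
import json
from pathlib import Path

def snapshot_range(values, out_dir="snapshots"):
    """Freeze a spreadsheet range (e.g. the rows already pulled from the
    Excel API) into a JSON snapshot that a D3.js embed can load by link,
    without the viewer re-running the OAuth dance."""
    payload = json.dumps({"values": values}, sort_keys=True)
    # Content-addressed id, so identical data shares a single snapshot.
    snapshot_id = hashlib.sha256(payload.encode("utf-8")).hexdigest()[:12]
    path = Path(out_dir)
    path.mkdir(exist_ok=True)
    (path / f"{snapshot_id}.json").write_text(payload)
    # The embed code would point D3.js at this URL instead of the live API.
    return f"/snapshots/{snapshot_id}.json"
```

The embed snippet, email link, or copy / paste widget would all just reference the returned snapshot URL, keeping the visualization working even when the live spreadsheet is unavailable.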
I am just sharing some of my thoughts, as I'm playing with the Excel API some more. I am not in the business of acting on most of these opportunities, and would rather articulate them, and put them out there for others to step up and provide a solution.
This weekend I took my API Stack tag cloud, and made it driven by API collections defined using APIs.json and OpenAPI Spec. Instead of driving it from a simple tag JSON file, I wired it up to the APIs.json for each of my research projects, and it loops through each API that is indexed, finds their OpenAPI Spec, and uses various elements to publish as tags.
Then I wanted to scale it, and see what the tag cloud would look like when applied to a larger collection:
I'm not sure if these visualizations offer me any value, but it gets me thinking about APIs at the macro level, considering different ways to slice and dice the information available as part of any of the APIs indexed. The verb tag cloud is extracted from an API I have that returns the verb count for any APIs.json collection, which gives me one possible data point to consider when quantifying how open, or closed, an API is. It's not always consistent, due to the wide variety of ways people design their APIs, but when you see an API that is all GET, there is a good chance they are pretty tight with their resources.
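As a rough sketch of that verb-count data point, here is how it could be tallied from a parsed Swagger/OpenAPI definition (the function name is mine, and the spec is assumed to already be loaded as a dictionary):

```python
from collections import Counter

HTTP_VERBS = {"get", "put", "post", "delete", "options", "head", "patch"}

def verb_counts(spec):
    """Tally HTTP verbs across all paths of a parsed OpenAPI/Swagger spec.
    An API that is all GET is likely read-only--one signal for how open
    or closed its resources are."""
    counts = Counter()
    for path, operations in spec.get("paths", {}).items():
        for verb in operations:
            # Path items can also carry non-verb keys like "parameters".
            if verb.lower() in HTTP_VERBS:
                counts[verb.lower()] += 1
    return dict(counts)
```

Running this across every OpenAPI Spec indexed in an APIs.json collection yields exactly the kind of aggregate that drives the verb tag cloud.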
There are a number of areas across the API life-cycle that are being expanded upon in the current space, thanks to the evolution of API definition formats like Swagger, API Blueprint, and RAML. One area where I haven't seen as much growth as I'd like is visualizations driven by API definitions.
There are two distinct pools of API definition driven visualization: 1) letting you visualize the surface area of an API, and 2) letting you visualize the resources made available via an API. One area my friend the @APIHandyman has been exploring is the surface area of APIs.
@APIHandyman has a nice prototype created that he is calling "Swagger Specification Visual Documentation". The API definition driven visualization uses a D3.js visualization to help you explore the surface area of any API that is defined using Swagger. I have written about API definition driven visualizations before, so I am happy to see the concept being pushed forward, as we have a lot of iterations to cycle through before we find a visualization format(s) that works for different API designers, architects, and developers.
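To give a sense of how a Swagger definition can feed a D3.js view, here is one possible way to fold an API's paths into the nested tree shape that d3.hierarchy() consumes--this is purely my own illustration, not the format @APIHandyman's tool actually uses:

```python
def paths_to_tree(spec, title="API"):
    """Fold the path strings of a parsed Swagger/OpenAPI spec into a
    nested {"name": ..., "children": [...]} tree, the shape D3.js
    hierarchy layouts (tree, pack, sunburst) expect as input."""
    root = {"name": title, "children": []}
    for path in sorted(spec.get("paths", {})):
        node = root
        for segment in filter(None, path.split("/")):
            # Reuse an existing child for shared path segments.
            child = next((c for c in node["children"] if c["name"] == segment), None)
            if child is None:
                child = {"name": segment, "children": []}
                node["children"].append(child)
            node = child
    return root
```

Serialized to JSON, the resulting tree can be handed straight to a D3.js layout on the client side to render the surface area of the API.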
The visual documentation that @APIHandyman created runs on GitHub, and he is looking for feedback on the micro tool, and where he should take it next. He recently added a bigger information display area, but could use the community's ideas on how to make it more useful. This type of work is a time drain. Every time I started playing with Swagger + D3.js I would lose an entire evening, and have very little to show for my work, so I know how valuable feedback can be.
I strongly feel that API definition driven D3.js visualizations will be the future of API design, management, and orchestration. APIs are going to continue to grow in number, and scope, and we will need simple, visual ways we can quickly traverse the landscape and make sense of things. If you are working on an API definition driven visualization tool, either for the surface area of an API, or helping visualize the actual resources being served up, please let me know so I can showcase it.
As I am working on the API and JSON driven visualization strategy for my Adopta.Agency open data work, I saw cloud monitoring platform Librato publish their new "Space" interface as a Heroku add-on. I like dashboards and visualization tooling that can live on multiple platforms, and that is engineered to be as portable and deployable as possible.
In a perfect world, infographics would be done using D3.js, and would all show their homework, with JSON or API definitions supporting any visualizations. All of my Adopta.Agency projects will eventually possess a simple, embeddable, D3.js visualization layer that can be published anywhere. Each project will have its JSON localized in the publicly available Github repository, and be explorable via any browser using Github Pages.
The Librato approach reminded me that I'd also like to see modular, containerized versions of more advanced tooling, dashboards, and visualizations around some projects. This would only apply in scenarios where a little more compute is needed behind the visualizations than can be delivered with simple D3.js + JSON hosted on GitHub. Essentially this gives me two grades of portable visualization deployment: light and heavy duty. I like the idea that it could be a native add-on, wherever you are deploying an open API or dataset.
I still have a lot of work to do when it comes to the light duty blueprint of JSON + D3.js, and API + D3.js, to support Adopta.Agency. I will focus on this, but keep in mind doing modular cloud deployments using Docker and Heroku for the datasets that require more heavy data lifting.
I came across an interesting piece of technology today while doing curation for API.Report: RASON, an approach to API driven analytics, and potentially UI and visualization, that kind of resembles what I have been envisioning for one possible future. The analytics tool is created by a company called Frontline Systems, and I’ll let them articulate what it is:
RASON™ software is designed to make it much easier to create, test and deploy analytic models that use optimization, simulation, and data mining. RASON stands for Restful Analytic Solver™ Object Notation.
RASON targets analytical professionals, Excel power users, and web app developers, but here is where it gets over my head: "Problems you can solve with the RASON service include linear programming and mixed-integer programming problems, quadratic programming and second-order cone problems, nonlinear and global optimization problems, problems requiring genetic algorithm and tabu search methods -- from small to very large." — sounds impressive to me!
I signed up and played with RASON a little bit, but it wasn't as intuitive as I hoped. I think I have a little more to learn about RASON. The RASON models are very cool, I like the airport hub, I just don’t have enough knowledge to make it work right now. However, I’m digging the idea, and it reflects what I’ve been seeing in my head when it comes to defining API driven analysis--when you connect that with API generated visualizations, hypermedia, spreadsheets, APIs.json, and more--I start to get a little too excited.
Anyhoo. Just sharing a new technology that I found. Once I learn more about RASON, hopefully I will be able to see where RASON fits into the bigger API life-cycle, and share a little more.
I was reviewing one of the many entries in my review queue of companies who are doing interesting things with APIs, and stumbled across the data visualization API—Lightning. Their implementation grabs my attention on several fronts, but their focus on delivering their API within your own infrastructure via a Heroku button, is one of the most relevant aspects.
This approach reflects a seismic shift occurring in how we deploy APIs, and how we deploy architecture overall. You need a data visualization API, let me deploy my API into your cloud, or on-premise infrastructure using popular approaches to virtualization—developers do not need to go to the API, the API will now come to you, and live within your existing infrastructure stack.
I’ve been talking about wholesale APIs a lot lately, showcasing the white label approach by some API providers, and exploring within the evolution of my own infrastructure, and as more savvy API providers jump in on this opportunity, you’ll see more stories emerge trying to understand the shift going on. Lightning is accomplishing this with Heroku, and their embeddable button, but companies who embrace a containerized micro-service centered approach to API deployment, will have a wide open playing field for buying and selling of the wholesale API driven technology being deployed across the emerging API economy.
One of the side effects of being so open and transparent about my ideas, like the one I had around API visualizations, is that people who are doing similar things, like Ardoq, eventually find you. Even better is when someone closely follows your thoughts, takes your idea, and sets it into motion in a way that will allow it to become bigger than the original.
Last week Chris Spiliotopoulos (@chefarchitect) sent me an email, with a simple Chrome extension attached, asking what I thought. After installing the add-on (just drag it onto your Chrome extensions page), I visited my notebook of Swagger defined APIs over at API Stack, and when Twilio’s Swagger definition loaded in the browser, I saw a little Swagger icon show up in my browser’s address bar--you know, kind of like when there is an RSS feed available.
I clicked on the icon, and a new layer of my browser opened up, with a simple, crude, yet potentially powerful visualization of Twilio’s API surface area.
Honestly, the visualization does little for sorting out the complexities of the API, but it demonstrates a possible future, where we can browse the Internet, stumble across APIs and their machine readable definitions, and open up an entirely new, visual layer that helps us quantify, and make sense of, what an API does. APIs are a very abstract concept, and helping developers understand the scope and value of an API can be difficult--the introduction of API definition driven visualizations goes a long way in helping speed up the conversation.
Imagine being able to immediately understand the scope of a microservice. How big is micro? Can I see a tag cloud of parameters? Is there a visualization layer to explore the underlying data model? Can you provide me with a visualization diff between two similar APIs, defined using Swagger or API Blueprint? I’m just getting going on brainstorming ideas for visualizations, and so is Chris—a conversation that I think will be never-ending as we continue to work to understand the digital resources being deployed across the API landscape.
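The visualization diff idea could start from something as simple as comparing the path + verb combinations of two parsed definitions--a hypothetical sketch of the data layer, before any D3.js rendering:

```python
def surface_diff(spec_a, spec_b):
    """Compare the surface area of two parsed Swagger/OpenAPI specs,
    returning the path + verb combinations unique to each side--raw
    material a visualization diff could then render."""
    def operations(spec):
        return {
            (path, verb.lower())
            for path, ops in spec.get("paths", {}).items()
            for verb in ops
        }
    a, b = operations(spec_a), operations(spec_b)
    return {"only_in_a": sorted(a - b), "only_in_b": sorted(b - a)}
```

A visual layer could color the shared operations gray, and highlight each side's unique operations, making the difference between two similar APIs immediately scannable.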
I have an alpha version of an APIs.json and Swagger editor that I’m using for the redesign of my platform using microservices, and I don’t see the conversation Chris has started as being exclusively about visualizing the meta layer of APIs, but also about directly connecting and exploring the valuable data, content, and other digital resources being made available via APIs. I also envision being able to explore collections of APIs defined using APIs.json, allowing not just software to navigate between many APIs, but also humans—something that is core to the APIs.json vision.
Swagger.ed is open source, and available in a GitHub repo. It is just an initial prototype, but imagine what is possible when you can take a machine readable Swagger or API Blueprint file, and instantly explore the meta layer of any API, then also visually explore the data returned, like the FDA is doing with clinical drug trials. If you have any visualizations you’d like to see, feel free to submit them as an issue on Chris’s work, or in the comments here. I would love to better understand how visualizations can help you understand your own APIs, and the APIs littered across the online landscape that you depend on.
I'm just going to keep putting my ideas out there, so that y'all will build what is needed for the API space. In support of my API design tool, and my interactive API documentation tool, I want a Swagger generated visualization layer for APIs, using D3.js.
I’m envisioning a whole marketplace of visualizations I can choose from, driven from various popular APIs like Twitter, Crunchbase, OpenCorporates, and much more. There are a number of proprietary data visualization tools emerging out there (I’m watching you), but what I'm looking for is specifically an open solution using D3.js.
I don't have a problem if there are premium layers, and features that are driven by commercial APIs, or there being added charges for API consumption in general, but I want the visualization, and underlying JSON to be open and configurable—encouraging access and re-use.
It can be tough to help people understand exactly what an API is, and I feel like an open visualization layer using D3.js, driven from Swagger, would be one of the quickest ways to help people understand exactly what value an API can deliver.
I have been doing a lot of research into the world of financial APIs, specifically looking at some of the larger companies providing APIs that deliver market news, data, corporate profiles, and other data that make markets go round.
As I consider the common building blocks across many financial APIs, real-time data frameworks and visualization tools are two of the top items that I think will be part of every financial API stack in the future. Almost every API I looked at had some sort of real-time stream, promising faster data, as well as a way to extract meaning from these streams using templated or custom visualizations.
I’m tracking on real-time API services and tools, and I’ve been seeing some of these frameworks, like Firebase, getting baked in by default to some API platforms. I am also tracking on visualization tools, I just don’t have the research published as a GitHub repository yet, like I do with my real-time research.
I will keep tracking on API providers who are doing interesting things with real-time or visualizations, and hopefully be able to publish more examples. I can’t help but think there are some pretty interesting opportunities for open frameworks, and white label solutions for API providers when it comes to real-time, and visualization layers on top of their existing APIs.
I recently reviewed a new API initiative from the Food & Drug Administration, called OpenFDA. I gave a whole list of things that they did right when launching the API, but one item that I thought was particularly interesting, was the actual interactive documentation for the Drugs API endpoint.
I talk a lot about interactive documentation for APIs, something that has become commonplace, and a building block that developers are starting to expect. What is different about the OpenFDA Drug API, is that the interactive documentation provides a visual interface for building API calls, going beyond the interactive, and often very form based documentation that is commonly seen in other developer areas.
Via the OpenFDA Drug API documentation you can actually build an API query by selecting from radio button values, which then updates the resulting URL query, some summary text, and generates a graph visualization of the resulting query. After building your filter, you can run the API query, and see the request and response, which is a common feature of other interactive API documentation implementations.
The addition of a visualization that is driven by each endpoint is very interesting, and something I’d like to see baked into the DNA of interactive API documentation. Helping me build an API call, then visualize and understand the value contained within an API, has huge potential. Currently we have Swagger, API Blueprint, and RAML generated interactive documentation solutions, which are pretty similar—I’d love to see more visualizations integrated into future interactive documentation implementations.
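The query building step behind this kind of documentation is worth sketching out. Something like the following could assemble an openFDA-style URL from selected filter values, using the search and count parameters the API documents (the example field names are illustrative, and the helper is my own, not OpenFDA's code):

```python
from urllib.parse import urlencode

def build_openfda_query(endpoint, filters, count_field=None):
    """Assemble an openFDA-style query URL: each selected filter becomes
    a field:"value" clause joined with +AND+ in the `search` parameter,
    and an optional `count` field drives the visualization."""
    search = "+AND+".join(f'{field}:"{value}"' for field, value in filters.items())
    params = {"search": search}
    if count_field:
        params["count"] = count_field
    # Keep +, :, and " literal so the query reads like openFDA's examples.
    return f"https://api.fda.gov/{endpoint}?" + urlencode(params, safe='+:"')
```

In the interactive docs, every radio button change would re-run this assembly, update the displayed URL and summary text, and re-render the graph from the count results.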
There are some really great examples of embeddable, open data goodness over at the OpenSpending project, which is operated by the Open Knowledge Foundation, a non-profit with a mission to promote open knowledge and data.
The OpenSpending platform has a wealth of data regarding spending budgets from all over the world, providing key data that allows anyone to track government and corporate financial transactions globally.
The OpenSpending platform has plenty of data visualizations available for use, but until recently these tools were fixed, and available only on the OpenSpending site. Now they have begun developing and publishing a handful of cool, embeddable widgets that can be published anywhere:
All three visualizations are available as open-source code on Github. OpenSpending also provides examples you can play with using jsfiddle:
The embeddable tools provided by OpenSpending are exactly the types of tools I want to organize as part of my Hacker Storytelling work. I’m looking to build a wealth of embeddable tools that help people tell more meaningful, data driven stories.
I will be curating and tagging as many examples like this as I can, and continue to publish via Hacker Storytelling, for anyone to use.
API driven analytics and visualizations is one of the new areas of API usage I'm tracking on. There are many “big data” platforms emerging these days, but I’m looking for dead simple tools and services anyone can use to generate analytics and visualizations via APIs.
Think of reciprocity providers like IFTTT, Zapier, and Elastic.io. These new API driven service providers make it easy to migrate data between cloud services, using a simple set of source APIs, triggers, actions, and target APIs--with a dead simple icon and wizard based UI, allowing any user to put the platform to work.
I want this approach for embeddable analytics, visualizations and other widgets that are easily generated via APIs. Last week I came across a new platform called Ducksboard, which allows you to easily generate some pretty sophisticated analytical widgets from common API sources. The platform even comes with a marketplace where you can find other widgets. The only problem is that Ducksboard is meant to generate dashboards, and not really open and portable widgets. We'll keep an eye on this platform, see where it goes.
At first glance, this space can feel just like earlier waves of widget building platforms of the Web 1.0 & 2.0 worlds. But I think we hadn’t reached critical mass, in the number of available API resources, as well as an awareness of APIs in general, in order to realize the true potential of widgets. I think we are getting closer in 2013.
I’m optimistic that a new breed of API driven analytics and visualization tools will emerge, offering dead simple, icon based interfaces that allow anyone to generate useful widgets from valuable API resources and embed them anywhere on the open web or in private portals.
Are there any other similar platforms that I'm missing?
- Here is an overview of this tool: http://news.prnewswire.com/DisplayReleaseContent.aspx?ACCT=104&STORY=/www/story/04-08-2009/0005002822&EDATE=
- Here is a demo of the actual tool: http://pivot.panorama.com/Panorama/FoodMart.htm (make sure to drill down in the spreadsheet, it's huge)
- Here is a video explaining how it's used: http://www.panorama.com/google/pivot-table-tutorial/
The Google Visualization API also provides a platform that can be used to create, share and reuse visualizations written by the developer community.
You can embed visualizations directly into your website and display attractive data on your website by choosing from visualizations created by the developer community.
The Google Visualization API provides simple Gadget extensions to its API to create visualization Gadgets. Publish these here or in the Gadget directory. Become an active participant in the developer community; reuse and share visualizations with others.
Create extensions to Google products: Write visualization applications for Google products such as Google Docs. With a growing list of products that support Gadgets, syndicate your app.
You can use many data sources with one API. Visualization apps created using the API are able to access any compliant data source with no required code changes to your application. Developers can start building apps immediately using Google Spreadsheets as a supported data source.
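To illustrate the data side of this, here is a sketch of assembling the cols / rows DataTable JSON that the Visualization API's DataTable object consumes (the helper name and sample columns are my own, not part of Google's API):

```python
def make_datatable(columns, rows):
    """Build the DataTable JSON structure used by the Google
    Visualization API: a list of typed column descriptors plus rows of
    cell objects, each cell carrying its value under "v".

    columns: list of (id, label, type) tuples, e.g. ("y", "Year", "string")
    rows: list of value lists, one per row, matching the column order
    """
    return {
        "cols": [{"id": cid, "label": label, "type": ctype}
                 for cid, label, ctype in columns],
        "rows": [{"c": [{"v": value} for value in row]} for row in rows],
    }
```

Any compliant data source that emits this shape can feed the same visualization, which is what makes the "many data sources with one API" promise work.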
There are a number of examples of visualizations created with the Google Visualization API in the Gadget Gallery.
If you think there is a link I should have listed here feel free to tweet it at me, or submit as a Github issue. Even though I do this full time, I'm still a one person show, and I miss quite a bit, and depend on my network to help me know what is going on.