{"API Monetization"}

Lots Of Talk About Machine Learning Marketplaces

I spent last week in San Francisco listening to Google's very machine learning focused view of the future. In addition to their Google Next conference, I spent Tuesday at the Google Community Summit, getting an analyst look at what they are up to. Machine Learning (ML) was definitely playing a significant role in their strategy, and I heard a lot of talk about machine learning marketplaces.

Beyond their own ML offerings like the video intelligence and vision APIs, Google also provides you with an engine for publishing your own ML models. They also have a machine learning advanced solutions lab, are throwing a machine learning hackathon, and are pushing a machine learning certification program as part of their cloud and data offerings. As the Google machine learning roadmap was being discussed throughout the day, the question of where you can publish your ML models, and begin selling them, came up regularly--something I feel is going to be a common theme of the 2017 ML hype.

I'm guessing we will see a relationship emerge between the Google ML Engine and Google Cloud Endpoints, and eventually some sort of ML marketplace like we have with Algorithmia. We are already seeing this shift in the AWS landscape, between their Lambda, ML, API Gateway, and AWS Marketplace offerings. You see hints of the future in the AWS serverless API portal I wrote about previously. The technology, business, and politics of providing retail and wholesale access to algorithms and machine learning models in this way fascinates me, but as with every other successful area of the API economy, about 90% of this will be shit, and maybe 10% will actually be doing something interesting with compute and APIs.

I'm doing all my image and video texture transfer machine learning model training using AWS and Algorithmia. I then use Algorithmia to get access to the models I've trained, and if I ever want to open up partner level (wholesale) or public (retail) access to my ML models, I will use Algorithmia, or an API facade on top of their API, to open up access and make them available in the Algorithmia ML marketplace. I'm guessing at some point I will want to syndicate my models into other marketplace environments, with giants like Google and AWS, but also other more niche, specialty ML marketplaces, where I can reach exactly the audience I want.

See The Full Blog Post


Dreaming Of A More Modular Event Driven API Monetization

While learning about the approach Amazon has taken with their serverless API developer portal, and highlighting their approach to API plans, I couldn't help but think there was more to it all than just rate limiting your API. Amazon's approach to API plans is in alignment with other API management providers, allowing you to deploy your APIs, then meter, rate limit, and charge for access to them--standard business of APIs stuff.

Controlling access to a variety of API resources is something that has been well-defined over the last decade by API management providers like 3Scale, and now Tyk and DreamFactory. They provide you with all the tools you need to define access to APIs, and meter access based upon a wide variety of parameters. While I haven't seen the type of growth I expected in this area, API management providers are helping to standardize how access is defined and metered--something that will grow significantly now that these capabilities are available from cloud providers like AWS, Microsoft, and Google.

We have a lot of work ahead of us, standardizing how we charge for API consumption at scale. We have even more work ahead of us to realize that we can turn all of this on its head, and start paying for API consumption at scale. I do not understand how we've gotten so hung up on the click and the view, when there are so many other richer, more meaningful actions already occurring every second of each day online. We should be identifying these opportunities, then paying and incentivizing developers to consume APIs in the most valuable ways possible.

With modern approaches to API management, we already have the infrastructure in place. We just need to start thinking about our APIs differently. We also need to get better at leveraging POST, PUT, and PATCH, as well as GET, when it comes to paying for consumption. Imagine a sort of event-driven API affiliate layer across web, mobile, device, and even conversational interfaces--where developers get paid for making the most meaningful events occur. It makes the notion of paying per view or click seem really, really, shit simple.
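
To make this concrete, here is a minimal sketch in Python of what such an event-driven affiliate ledger might look like--the event names, resources, and payout rates are entirely hypothetical, just enough to show meaningful writes earning more than views:

    from collections import defaultdict
    from dataclasses import dataclass, field

    # Hypothetical payouts: writes that create real value earn more than reads.
    PAYOUTS = {
        ("POST", "orders"): 0.50,      # a new order is a meaningful event
        ("PATCH", "profiles"): 0.10,   # enriching a profile has some value
        ("GET", "listings"): 0.001,    # a view is worth almost nothing
    }

    @dataclass
    class AffiliateLedger:
        balances: dict = field(default_factory=lambda: defaultdict(float))

        def record_event(self, api_key: str, method: str, resource: str) -> float:
            """Credit the developer behind api_key for the event they made happen."""
            payout = PAYOUTS.get((method, resource), 0.0)
            self.balances[api_key] += payout
            return payout

    ledger = AffiliateLedger()
    ledger.record_event("dev-123", "POST", "orders")    # a meaningful write: $0.50
    ledger.record_event("dev-123", "GET", "listings")   # a view: barely anything
    print(ledger.balances["dev-123"])                   # 0.501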

Anyways, just a thought I needed to get out. The lack of innovation, and abundance of greed when it comes to API monetization and planning always leaves me depressed. I wish someone would move the needle forward with some sort of modular, event-driven API monetization framework--allowing some different dimensions to be added to the API economy.

See The Full Blog Post


Including End Users In the Conversation About Their Bits Being Sold

Fitbit recently announced a program to pay their wearable users up to $1500 for integrating their Charge 2 into the UnitedHealthcare Motion program powered by Qualcomm Life’s 2net Platform. UnitedHealthcare Motion "is an employer-sponsored wearable device wellness program that offers financial incentives for enrollees who meet daily step goals." It pulls back the curtain, just a little bit, on the value of your Internet of Things data, and specifically the data from the devices you strap to your body.

I am not a fan of corporations strapping devices to their employees as part of these wellness programs (or for any reason), and using cash incentives to achieve the desired behavior. I feel this is a doorway to some pretty dark human resources strategies, but I do think these events pull back the curtains on what is going on, even just a little bit, for users. I am sure it's not Fitbit's intention to include end-users in all of the monetization of their data, but I see this as an opportunity to educate end-users in these situations.

Most Internet users are not aware of the amount of information being gathered, bought and sold when they use the Internet, and their mobile phones. This lack of awareness is translating pretty nicely to the world of connected devices, adding some valuable demographic dimensions, to an already valuable user profile. While insurance companies are interested in improving their margins with this data, they are also interested in the new revenue streams they can create by selling data to other brokers, hedge funds, and more. 

One of the only hopes I have in this area is that startups will continue to pull back the curtain on this behavior, either intentionally or unintentionally, with products, services, and programs like the wellness program that Fitbit is offering. Showing users that there is value in their data can be a positive first step in educating them about what is happening in the tech world. Once they get a taste of making some actual cash from their data, I'm hoping that more users establish an appetite for understanding the value of their information.

I'm counting on future waves of startups blindly disrupting industries by continuing to pull back the curtain, as well as existing companies looking to gain a competitive advantage by operating this way. Leveraging market forces against industry leaders is one of the most important tools we have in our toolbox to combat exploitation of our data. You will find me encouraging companies to do this as a disruptive tactic, and as a competitive edge, not because I want to help them, but because I want to help end-users with each wave. It is my way of using their quest for startup success against the practices of the wider industry, and leveraging it to help end-users in any way that I can.

See The Full Blog Post


Thinking About The Monetization Layer For Public Data

This is my walk-through of the concepts involved with the monetization of public data using APIs. In this work I am not advocating that companies should be mindlessly profiting from publicly available data. My intent is to provide a framework for organizations to think through the process of generating revenue from commercial access to public data, acknowledging that it costs money to aggregate, serve up, and keep data up to date and usable for the greater public good--if public data is not accessible, accurate, and up to date, it is of no use to anyone.

I have long argued that companies and even government agencies should be able to charge for commercial access to public data and be able to generate revenue to cover operational costs, and even produce much-needed funds that can be applied to the road map. My work in this has been referenced in existing projects, such as the Department of Interior and Forest Service looking to generate revenue from commercial access and usage of public data generated by the national parks systems. In my opinion, this conversation around generating revenue from publicly available digital assets should be occurring right alongside the existing conversations that already are going on around publicly available physical assets.

Building Upon The Monetization Strategies Of Leading Cloud Providers
My thoughts around generating revenue from public open data are built upon monitoring the strategies of leading online platforms like Amazon Web Services, Google, and others. In 2001 a new approach to providing access to digital resources began to emerge from Internet companies like Amazon and Salesforce, and by 2016 it had become a common way for companies to do business online, providing metered, secure access to valuable corporate and end-user data, content, and other algorithmic resources. This research looks to combine these practices into a single guide that public data stewards can consider as they look to fund their important work.

Do not get me wrong, there are many practices of leading tech providers that I am not looking to replicate when it comes to providing access to public data, let alone generating revenue. Much of the illness in the tech space right now is not due to the usage of APIs, it is due to a lack of creative approaches to monetizing digital assets like data and content, and terms of service that do not protect the interests of users. My vantage point is the result of six years studying the technology, business, and politics of the API sector, while also working actively on open data projects within city, state, and federal government--I'm looking to strike a balance between these two worlds.

Using Common Government Services As A Target For Generating Much-Needed Revenue
For this research, I am going to use a common example of public data: public services. I am focusing on this area specifically to help develop a strategy for Open Referral, but it is also a potential model that I can see working beyond just public services. I am looking to leverage my existing Open Referral work to help push this concept forward, but at the same time, I am hoping it will also provide me with some public data examples that are familiar to all of my readers, giving me some relevant ways to explain some potentially abstract concepts like APIs to the average folk we need to convince.

For the sake of this discussion, let's break things down and focus on three areas of public data, which could be put to work in any city across the country:

  • Organizations - The business listings and details for public and private sector agencies, businesses, organizations, and institutions.
  • Locations - The details of the specific locations where organizations are providing access to public services.
  • Services - The listings and details of public services offered at the municipal, county, state, or federal levels of government.

Open Referral is a common definition for describing public services organizations, locations, and services, allowing the government, organizations, institutions, and companies to share data in a common way, which focuses on helping them better serve their constituents--this is what public data is all about, right? The trick is getting all players at the table to speak a common language, one that serves their constituents, and allows them to also make money.

While some open data people may snicker at me suggesting that revenue should be generated on top of open data describing public services, the reality is that this is already occurring--there are numerous companies in this space. The big difference is it is currently being done within silos, locked up in databases, and only accessible to those with the privileges and the technical expertise required. I am looking to bring the data, and the monetization out of the shadows, and expand on it in a transparent way that benefits everybody involved.

Using APIs To Make Public Data More Accessible and Usable In A Collaborative Way
Publicly available data plays a central role in driving websites, mobile applications, and system to system integrations, but simply making this data available for download only serves a small portion of these needs, and often does so in a very disconnected way, establishing data silos where data is duplicated, and the accuracy of data is often in question. Web APIs are increasingly being used to make data not just available for downloading, but also to allow it to be updated and deleted in a secure way, by trusted parties.

For this example I am looking to provide three separate API paths, which will give access to our public services data (a quick sketch of consuming them follows the list):

  • http://example.com/organizations/ - Returns JSON or XML listing and details of organizations for use in other applications.
  • http://example.com/locations/ - Returns JSON or XML listing and details of organizational locations for use in other applications.
  • http://example.com/services/ - Returns JSON or XML listing and details of public services for use in other applications.
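
To make these paths concrete, here is a minimal Python sketch of consuming them with the requests library--the example.com base URL comes from the list above, while the response handling is an assumption for illustration:

    import requests

    BASE = "http://example.com"

    def fetch_listing(resource: str) -> list:
        """Pull the JSON listing for organizations, locations, or services."""
        response = requests.get(f"{BASE}/{resource}/",
                                headers={"Accept": "application/json"})
        response.raise_for_status()
        return response.json()

    for resource in ("organizations", "locations", "services"):
        records = fetch_listing(resource)
        print(f"{resource}: {len(records)} records")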

A website provides HTML information for humans, and web APIs provide machine readable representations of the same data, making it open for use in a single website, but also potentially multiple websites, mobile applications, visualizations, and other important use cases. The mandate for public data should ensure it isn't just available on a single website, but in as many scenarios that empower end-users as possible. This is what APIs excel at, but it is also something that takes resources to do properly, making the case for generating revenue to properly fund the operations of APIs in the service of the public good.

The Business of Public Data Using Modern Approaches to API Management
One of the common misconceptions of public web APIs is that they are open to anyone with access to the Internet, with no restrictions. This might be the case for some APIs, but increasingly government agencies, organizations, and institutions are making public data available securely using common API management practices defined by Internet pioneers like Salesforce, Amazon, and Google over the last decade.

API management practices provide some important layers on top of public data resources, allowing for a greater understanding and control over how data is accessed and put to use. I want to provide an overview of how this works before I dive into the details of this approach by outlining some of the tenets of an API management layer:

  • Users - Requiring users to register, establishing a unique account with which all API and public data activity is associated.
  • Applications - Requiring users to define the application (web, mobile, visualization, etc.) and other relevant information regarding their access to the public data.
  • Keys - Issuing unique API keys for each application, and requiring their inclusion in all consumption of public data via the API.
  • Service Composition - Placement of public data resource (organizations, locations, services) into tiers, defining which APIs different users have access to and the scope of that access.
    • Resource Specific - Allowing access to specific data resources to a select audience.
    • Read / Write - Restricting write access to select users and applications. 
    • Data Specific - Limiting which data is returned, filtering based on who is requesting it.
  • Rate Limits - All APIs are rate limited, allowing for different levels of access to public data resources, which can be defined in alignment with the costs associated with operations.
  • Logging - Each API call is logged, along with the required user and application keys, as well as details of the request and response associated with each call.
  • Analytics - The presence of a variety of visual layers that establish an awareness of who is accessing public data APIs, what they are accessing, and details on how and where it is being applied.

These seven areas provide some very flexible variables which can be applied to the technical, business, and politics of providing access to public data using the Internet. Before you can access the organizations, locations, and service information via this example public services API you will need to be a registered user, with an approved application, possessing valid API keys. Each call to the API will contain these keys, identifying which tier of access an application is operating within, which API paths are available, the rate limits in existence, and logging of everything you consume and add so it can be included as part of any operational analytics. 
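
Here is a rough Python sketch of how those seven areas come together in a single gatekeeper check--the tier names, paths, and limits are hypothetical placeholders, not recommendations:

    import time
    from collections import defaultdict

    TIERS = {
        "public":  {"paths": {"/services/"}, "rate_limit": 1000},  # calls per day
        "partner": {"paths": {"/services/", "/locations/", "/organizations/"},
                    "rate_limit": 100000},
    }
    KEYS = {"key-abc": {"user": "jane", "app": "city-map", "tier": "public"}}

    usage = defaultdict(int)   # metering
    log = []                   # feeds the analytics layer

    def gatekeeper(api_key: str, path: str) -> bool:
        """Check a single API call against keys, service composition, and limits."""
        identity = KEYS.get(api_key)
        if identity is None:
            return False                            # no registered user / app / key
        tier = TIERS[identity["tier"]]
        if path not in tier["paths"]:
            return False                            # outside this tier's composition
        if usage[api_key] >= tier["rate_limit"]:
            return False                            # over the rate limit
        usage[api_key] += 1
        log.append({"key": api_key, "path": path, "at": time.time()})
        return True

    print(gatekeeper("key-abc", "/services/"))       # True
    print(gatekeeper("key-abc", "/organizations/"))  # False -- not in the public tier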

This layer enables more control over public data assets, while also ensuring data is available and accessible. When done thoughtfully, this can open up entirely new approaches to monetization of commercial usage by allowing for increased rate limits, performance, and service level agreements, which can be used to help fund the public data's mandate to be accessible by the public, researchers, and auditors.

Providing The Required Level Of Control Over Public Data Access
Understandably, there are concerns when it comes to publishing data on the Internet. Unless you have experience working with modern approaches to delivering APIs, it can be easy to worry about losing control over your data when publishing it on the web--when in reality, stewards of public data can gain more control over their data by using APIs than by just publishing it for complete download. There are some distinct ways that API providers are leveraging modern API management practices to evolve greater levels of control over who accesses data, and how it is put to use.

I wanted to highlight what can be brought to the table by employing APIs in service of public data, to help anyone make the argument for why providing machine readable data via APIs is just as important as having a website in 2016:

  • Awareness - Requiring all data to be accessed via APIs which require keys to be used for ALL applications, combined with a comprehensive logging strategy, brings a new level of awareness regarding which data is accessed, and how it is being used, or not used.
  • Security - While API keys are primarily used for logging and analytics, they also ensure that all public data resources are secured, providing tiered levels of access to 3rd parties based upon trust, contributing value to the data, and overall platform involvement--allowing data to be securely made available on the open web.
  • Quality Control - APIs act as a central gatekeeper regarding how data is updated, evolved, and deleted, allowing for a centralized, yet also potentially distributed, and collaborative approach to ensuring public data is accurate, possessing a high level of quality control.
  • Terms of Service - All API usage is governed by the legal terms of service laid out as part of platform operations, requiring all users to respect and follow terms of service if they expect to maintain their public data API keys.
  • Governance - Opening up the overall management of the availability, usability, integrity, and security of the public data, which may include oversight from a governing body or council, a defined set of procedures, and a plan to execute those procedures.
  • Provenance - Enabling visibility into the origins, history, and oversight of public data, helping establish the chain of custody regarding shared use of valuable data across platform operations.
  • Observability - Allowing for the observability of data resources, and their contributors and consumers, using existing platform outputs and mechanisms, enabling high levels of awareness through the API management framework employed as part of platform operations, meeting service level agreements, and expected availability.

It is important to discuss, and quantify this control layer of any public data being made available via APIs if we are going to talk about monetization. Having APIs is not enough to ensure platform success, and sometimes too strict of control can suffocate consumption and contribution, but a lack of some control elements can also have a similar effect, encouraging the type of consumption and contribution that might not benefit a platform's success. A balanced approach to control, with a sensible approach to management and monetization, has helped API pioneers like Amazon achieve new levels of innovation, and domination using APIs--some of this way of thinking can be applied to public data by other organizations.

Enabling and Ensuring Access To Public Data For Everyone It Touches
Providing access to data through a variety of channels for commercial and non-commercial purposes is what modern API management infrastructure is all about. Shortly after possessing a website became normal operating procedure for companies, organizations, institutions, and government agencies, web APIs began to emerge to power networks of distributed websites, embeddable widgets, and then mobile applications for many different service providers. APIs can provide access to public data, while modern API management practices ensure that access is balanced and in alignment with platform objectives--resulting in the desired level of control discussed above.

There are a number of areas of access that can be opened up by employing APIs in the service of public data:

  • Internal - APIs can be used by all internal efforts, powering websites, mobile applications, and other systems. The awareness, logging, and other benefits can just as easily be applied to internal groups, helping understand how resources are used (or not used) internally.
  • Partner - After internal access to data resources, access levels can be crafted to support partner tiers of access, which might include access to special APIs, read and write capabilities, and relaxing of rate limits. These levels often include service level agreements, additional support options, as well as other benefits.
  • Public - Data can be made publicly available using the web, while also maintaining the quality and security of the data, keeping access as frictionless as possible, while ensuring things stay up and running, and of expected quality and availability.
  • Privacy - Even with publicly available data there is a lot to consider when it comes to the privacy of organizations, locations, and services involved, but also the logging, and tracking associated with platform operations.
  • Transparency - One important characteristic of an API platform is transparency in the API management layer, being public with the access tiers, resources available, and how the platform operates--without the necessary transparency, consumers can become distrustful of the data.
  • Licensing - Ideally all data and all schema in this scenario would be licensed as CC0, putting them into the public domain, but if there are license considerations, these requirements can be included along with each API response, as well as in platform documentation.
  • Platform Meta API - APIs do not just provide access to the public data, they also provide access to the API management layer for the public data. Modern API management allows for API access to the platform in several important ways (sketched as example paths after this list):
    • Users - Providing API access to users' data and usage.
    • Apps - Providing API access to application level data and usage.
    • Tiers - Providing API access to platform tiers and details.
    • Logs - Providing API access to the platform log files.
    • Billing - Providing API access to the platform billing for access.
    • Analytics - Providing API access to the analytics derived from logs, billing, and usage.
  • Observability - An API management layer on top of public data makes data access observable, allowing platform operators, government agencies, and potentially 3rd party and independent auditors to monitor operations--observability will define both the control of, as well as access to, vital public data resources.
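
For the platform meta API specifically, here is a sketch of what those bullets might look like as actual paths--every path here is hypothetical, just illustrating management-layer access delivered the same way as the data itself:

    import requests

    BASE = "http://example.com/platform"
    META_PATHS = {
        "users":     f"{BASE}/users/",      # user data and usage
        "apps":      f"{BASE}/apps/",       # application level data and usage
        "tiers":     f"{BASE}/tiers/",      # platform tiers and details
        "logs":      f"{BASE}/logs/",       # the platform log files
        "billing":   f"{BASE}/billing/",    # billing for access
        "analytics": f"{BASE}/analytics/",  # derived from logs, billing, and usage
    }

    def meta(resource: str, api_key: str) -> dict:
        """Query the management layer the same way you query the data itself."""
        response = requests.get(META_PATHS[resource], params={"api_key": api_key})
        response.raise_for_status()
        return response.json()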

In a world that is increasingly defined by data, access to quality data is important, and easy, secure access via the Internet is part of the DNA of public data in this world. API management provides a coherent way to define access to public data, adhering to the mandate that the data is accessible, while also striking a balance to ensure the quality, reliability, and completeness of the public data.

There have been a lot of promises made in the past about what open or public data can do by default, when in reality opening up data is not a silver bullet for public services, and there is a lot more involved in successfully operating a sustained public data operation. APIs help ensure data resources are made available publicly, while also opening up some new revenue generation opportunities, helping ensure access is sustainable and continues to provide value--hopefully finding a balance between the public good and any sensible commercial aspirations that may exist.

APIs Open Up Many Potential Applications That Support the Mission
As doing business on the web became commonplace in the early 21st century, Amazon realized that they could enable the sale of their books and other products on the websites of their affiliate partners by using APIs. In 2016 there are many additional applications being developed on top of APIs, with delivering public data to multiple websites being just the beginning.

  • Web - It is common for websites to pull from a database. Increasingly APIs are being used to drive not just a single website, but networks, and distributed websites that draw data and content from a variety of sources.
  • Mobile - APIs are used to make data and content available across a variety of mobile applications, on different platforms.
  • Embeddable - Delivering data to buttons, badges, bookmarklets, and widgets that can be embedded across a variety of websites, and applications.
  • Conversational - Using data in conversational interfaces like bots, messaging, and voice-enabled applications.
  • Visualizations - Including data in visualizations, showing API consumption, and platform usage around public data.
  • iPaaS / ETL - Enabling the migration of public data to and from other external 3rd party platforms using traditional ETL, or more modern iPaaS solutions powered via the API.
  • Webhooks - Notifying external systems of relevant events (a location or service update) by pushing to URLs via what is called a webhook--see the sketch after this overview.
  • Spreadsheets - Publishing of data to Microsoft Excel or Google Spreadsheet using the public data APIs, as well as spreadsheet APIs.

This is just an overview of the number of ways in which a single API, or multiple APIs, can be used to deliver public data to many different endpoints, all in service of a single mission. When you consider this in support of public services, a bigger picture emerges of how APIs and public data can be used to better serve the population--and the first step always involves a standardized, well-planned set of APIs being made available.
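
Since webhooks are one of the less familiar items on the list above, here is a minimal Python sketch of pushing a service update to subscribers--the subscriber URL and payload shape are assumptions:

    import requests

    # URLs registered by consumers who want to be notified of relevant events.
    subscribers = ["https://partner.example.org/hooks/public-data"]

    def notify(event_type: str, record: dict) -> None:
        """Push a relevant event (e.g. a service update) to each subscriber."""
        payload = {"event": event_type, "record": record}
        for url in subscribers:
            try:
                requests.post(url, json=payload, timeout=5)
            except requests.RequestException:
                pass  # a real platform would queue a retry and log the failure

    notify("service.updated", {"id": "svc-42", "name": "Food Assistance"})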

The Monetization Requirements Around Public Data API Operations
This is where we get to the portion of this discussion that is specifically about monetization of the operations around publishing and maintaining high-quality sources of public data. Before a sensible monetization strategy can be laid out, we need to be able to quantify what it costs to operate the platform, and what value everyone at the table expects it to generate.

What are the hard costs that should be considered when operating a public data platform and looking to establish some reasonable monetization objectives?

  • Data Acquisition - What one-time, and recurring costs are associated with acquiring data. This might include ongoing payouts to API contributors who are adding, updating, and validating data via the API.
    • Discover - What was spent to discover data, and identify its use on the platform.
    • Negotiate - What time do I have invested in actually getting access to something.
    • Licensing - Are there licensing costs or fees involved in the acquisition of data.
  • Development - What one-time, and recurring costs are associated with platform development.
    • Normalization - What does it take to clean up and normalize a data set, or across content. This is usually the necessary janitorial busy work.
    • Validation - What is involved with validating that data is accurate and correct, providing sources, and following up on references.
    • Database - How much work is being put into setting up the database, maintaining it, backing it up, and delivering optimal levels of performance.
    • Server - Defining the amount of work put into setting up, and configuring the server(s) to operate an API, including where it goes in the overall operations plan.
    • Coding - How much work goes into actually coding an API. Ideally, open source frameworks are employed to reduce overhead, maintenance, and the resources needed to launch new endpoints.
  • Operational - What one-time, and recurring costs are associated with platform operations.
    • Compute - What costs are associated with providing server compute capacity to process and deliver public data via APIs.
    • Storage - What costs are associated with on-disk storage, for both the database and other images, video, and related objects.
    • Network - How much bandwidth in / out is an API using to get the job done, as well as any other network overhead.
    • Management - What percentage of API management resources is dedicated to the API. A flat percentage of API management overhead until usage history exists.
    • Monitoring - What percentage of the API monitoring, testing, and performance service budget is dedicated to this API. How large is the surface area for monitoring?
    • Security - What does it cost to secure a single API, as part of the larger overall operations? Does internal resource spend time, or is this a 3rd party service.
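
Rolled up, these hard cost areas can be expressed as a simple model that puts a monthly number on operations--all figures in this Python sketch are placeholders to be swapped for your own:

    MONTHLY_COSTS = {
        "acquisition": {"discovery": 200.00, "negotiation": 150.00, "licensing": 0.00},
        "development": {"normalization": 400.00, "validation": 250.00,
                        "database": 100.00, "server": 80.00, "coding": 600.00},
        "operational": {"compute": 120.00, "storage": 40.00, "network": 30.00,
                        "management": 60.00, "monitoring": 45.00, "security": 90.00},
    }

    total = sum(sum(area.values()) for area in MONTHLY_COSTS.values())
    print(f"Total monthly operating cost: ${total:,.2f}")
    # This is the number any monetization plan has to cover before any revenue
    # can be applied to the roadmap.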

Understand The Value Being Generated By Public Data
Now that we understand some of our hard costs, let's have an honest conversation about what value is being generated. First, public data has to offer value, or why are we doing all this hard work? Second, nobody is going to pay for anything if it doesn't offer any value. Let's stop for a moment and think about why we are doing all of this in the first place, and what value is worthy of carving off to drive monetization efforts.

  • Direct Value Generation - What direct value is being generated by the public data platform operations.
    • Usage - How is usage wielded as part of value generation? Is value about the increased usage of a resource, or possible value generated by a minimum usage of a resource? Usage is an important dimension of determining how value is generated as part of API operations.
    • Users - How is the value generated on a per user level? Are more users valuable? or possibly more targeted users? Teams, groups, and many other ways to determine how users impact positively or negatively the value generated from platform usage.
    • Relationships - How can relationships between users, or companies be applied to value generated? Will access to more relationships positively or negatively impact how value is generated for the platform and consumers?
    • Data Acquisition - Is the acquisition of data part of the generation of value via the public data platform, encouraging the growth of data.
    • Applications - Is value generated looked at on a per application basis? Does having multiple applications impact the value generated? Coming up with interesting ways to understand how applications impact platform value for everyone.
    • Integrations - What external integrations are available? How can these be leveraged to enhance the value for consumers? Are some integrations part of base operations, where others are accessible at higher levels, or on a one-off basis.
    • Support - Is access to support something that impacts the value being generated? Does access to support resources introduce the optimal levels of value consumers are looking for? How is support wielded within overall platform monetization?
    • Service Level Agreements - Are we measuring the number of contracts signed, and partner agreements in place? And how we are delivering against those agreements?
    • Revenue - What revenue opportunities exist for the ecosystem around an API and its operation, sharing in the money made from platform operations? Obviously, anything beyond operating costs should be applied to expansion of efforts.
  • Indirect Value - What is some of the indirect value being generated by the public data platform operations.
    • Marketing Vehicle - Having an API is cool these days, and some APIs are worth just having because of the PR value, and discussion.
    • Traffic Generation - The API exists solely for distributing links to the web and mobile applications, driving traffic to specific properties - is this tracked as part of other analytics?
    • Brand Awareness - Applying a brand strategy, and using the API to incentivize publishers to extend the reach of the brand and ultimately the platform - can we quantify this?
    • Analysis - How can analytics be leveraged as part of API value generation? Are analytics part of the base of operations, or are they an added value incentive for consumers, and platform operators.
    • Competitiveness - Is the public data effort more agile, flexible, and competitive because it has an API and can deliver on new integrations, partnerships, and to new platforms easier, and more cost effectively?
    • Public Service - Making data available for use on many different web, mobile, and other applications demonstrates a commitment to public service, and the good public data can do.

While there may be other hard costs associated, as well as other areas of value being generated, this should provide a simple checklist that any open data provider can use as a starting blueprint. Additional costs can be included in these existing areas, or added as new areas as deemed relevant--this is just about getting the monetization conversation going.

There are two main objectives in this exercise: 1) understanding the hard costs and value associated with operations, and 2) assembling them into a coherent list that we can use to explain things to others as part of transparency efforts. When it comes to the business of public data, it is more than just generating revenue, it is about being upfront and honest about why we are doing this, and how it is done--mitigating the political risk involved with doing business with public resources out in the open.

Putting Together A Working Plan Involving Public Data
With an understanding of the hard costs of providing a public data platform and an awareness of the intended value to be generated via operations, we can now look at what details would be involved in a plan for executing this monetization strategy. API management practices are architected for metering, measuring, and limiting access to data, content, and algorithmic resources in service of a coherent, transparent public data monetization strategy. 

Here is a core framework of API management that can be applied to public data that can be used to drive monetization efforts:

  • Access - What are the default access levels for public data access via the API.
    • Self-Service - Public access to the platform via user registration, or 3rd party authentication like Twitter, Facebook, or Github.
    • Approval - Access level(s) that require the approval of user or application before they are able to access platform resources.
  • Tiers - What are the multiple tiers of access to all data resources available via API.
    • Public - Define the default public access for the platform, with a free, limited access tier that is obtainable via a self-service registration without approval.
    • Contributor - Providing a tier of access to contribute content, and validate and manage data on the platform.
    • Service Provider - Providing a tier of access for service providers involved with public data operations.
    • Internal - Access tier for internal groups, used by all websites, mobile applications, and system integrations.
    • Partner - Tier(s) of access designed for data, and other partners involved in the management of public data resources.
    • Commercial - Access tier(s) for commercial usage of public data resources, with higher levels of access for set fees.
    • Non-Commercial - Access tier(s) for non-commercial usage of public data resources, with specific access waiving fees.
    • Government - A set of API resources is available specifically for government access.
    • Auditor - Access across APIs specifically designed for external 3rd party auditors.
  • Elements - What are the core elements that make up the service composition for the monetization plan(s).
    • Paths - Establishing plans based upon the availability and access to different API endpoints, including the platform meta API.
    • Read / Write - Restricting read and write access to public data to specific tiers, limiting who writes data to only trusted applications.
  • Time Frames - What are the timeframes that impact the public data / API monetization plan(s) and consumption.
    • Daily - What are the details for managing, guiding, and restricting plan entries each day.
    • Weekly - What are the details for managing, guiding, and restricting plan entries in weekly timeframes.
    • Monthly - What are the details for managing, guiding, and restricting plan entries on a monthly basis.
  • Metrics - What is being measured to quantify value generated, providing a framework to understand monetization possibilities.
    • API Calls - Each call to the API is measured, providing the cornerstone of monetizing access and contribution to public data--remember not all calls will cost, some will add value with contributions.
    • URL Clicks - Each click on a URL served up via API-driven data and content is measured, providing details on value delivered to internal and external websites--a URL shortener is required for this.
    • Searches - All searches conducted via the API are measured, providing details on what users are looking for.
    • Users - Measure user acquisitions and history to keep track of the value of each platform user.
    • Applications - Measure the number of applications added, with details of activity to understand value generated.
  • Limits - What are the limitations imposed across all tiers of access as part of the API monetization plan.
    • API Calls - How many API calls any single application can make during a specific time frame.
    • IP Address - Which IP addresses an application can make requests from, limiting the scope of where data is served.
    • Spend - How much any user can spend during a given time period, for each user or application.
  • Pricing - What prices are set for different aspects of monetizing the platform.
    • Tiers - What are the base prices established for each tier of API access.
    • Unit - What are the default unit prices of per API call access for each tier.
    • Support - What charges are in place for receiving support for platform applications.
    • SLA - What costs are associated with delivering specific quality or levels of service and availability?

These are the moving parts of a public data monetization strategy. It allows any public data resources to be made available on the web, enabling self-service access to data 24/7. However, it does it in a way that requires accountability by ALL consumers, whether they are internal, partner, or the public at large. This API management scaffolding allows for frictionless access to public data resources by the users and applications that are identified as worthwhile, while imposing limits and fees for higher volume and commercial levels of access.
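
Expressed as a machine-readable plan definition, the moving parts above might look something like this Python sketch--all tier names, limits, and prices are placeholders, not recommendations:

    PLAN = {
        "tiers": {
            "public": {
                "access": "self-service",
                "paths": ["/organizations/", "/locations/", "/services/"],
                "write": False,
                "limits": {"calls_per_day": 1000},
                "pricing": {"base": 0.00, "per_call": 0.00},
            },
            "commercial": {
                "access": "approval",
                "paths": ["/organizations/", "/locations/", "/services/"],
                "write": True,
                "limits": {"calls_per_day": 1000000, "spend_per_month": 5000.00},
                "pricing": {"base": 50.00, "per_call": 0.001, "sla": 250.00},
            },
        },
        "time_frames": ["daily", "weekly", "monthly"],
        "metrics": ["api_calls", "url_clicks", "searches", "users", "applications"],
    }

    # Service composition then becomes a lookup against the plan definition.
    print(PLAN["tiers"]["commercial"]["limits"])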

Speaking To A Wide Audience With This Public Data Monetization Research
I purposely wrote this document to speak to as wide an audience as possible. In my experience working with public data across numerous different industries, there can be a wide variety of actors involved in the public data stewardship pipeline. My objective is to get more public data accessible via web APIs, and generating revenue to help fund this is one of my biggest concerns. I am not looking to incentivize people to make unwarranted profits on top of public data--this is already going on. My goal is to open up the silos of public data out there right now and make them more accessible, while opening up the opportunity for delivery to a variety of applications, while also funding this important process.

I wanted to help anyone reading this to craft a coherent argument for generating much-needed revenue from public data, whether they are trying to convince a government agency, non-profit organization, institution, or a commercial company. Public data needs to be available in a machine-readable way for use in a variety of applications in 2016--something that takes resources and collaboration. APIs are not another vendor solution, they are the next step in the evolution of the web, where we don't just make data available for humans by publishing as HTML--we need the raw data available for use in many different applications. 

See The Full Blog Post


Exploring The Economics of Wholesale and Retail Algorithmic APIs

I got sucked into a month long project applying machine learning filters to video over the holidays. The project began with me doing the research on the economics behind Algorithmia's machine learning services, specifically the DeepFilter algorithm in their catalog. My algorithmic rotoscope work applying Algorithmia's Deep Filters to images and drone videos has given me a hands-on view of Algorithmia's approach to algorithms, and APIs, and the opportunity to think pretty deeply about the economics of all of this. I think Algorithmia's vision of all of this has a lot of potential for not just image filters, but any sort of algorithmic and machine learning API.

Retail Algorithmic and Machine Learning APIs
Using Algorithmia is pretty straightforward. With their API or CLI you can make calls to a variety of algorithms in their catalog, in this case their DeepFilter solution. All I do is pass them the URL of an image, what I want the new filtered image to be called, and the name of the filter that I want to be applied. Algorithmia provides an API explorer you can copy & paste the required JSON into, or they also provide a demo application for you to use--no JSON required. 
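
For reference, here is roughly what that call looks like using Algorithmia's Python client--the input fields mirror the description above (image URL, save path, filter name), but the exact field names and algorithm version should be verified against Algorithmia's documentation:

    import Algorithmia

    client = Algorithmia.client("YOUR_API_KEY")        # hypothetical key
    algo = client.algo("deeplearning/DeepFilter")      # check docs for the version

    result = algo.pipe({
        "images": ["http://example.com/photos/drone-shot.jpg"],
        "savePaths": ["data://.my/rotoscope/drone-shot-filtered.jpg"],
        "filterName": "gan_vogh",                      # one of the style filters
    }).result
    print(result)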

Training Your Own Style Transfer Models Using Their AWS AMI
The first "rabbit hole" concept I fell into when doing the research on Algorithmia's model was their story on creating your own style transfer models, providing you step by step details on how to train them, including a ready to go AWS AMI that you can run as a GPU instance. At first, I thought they were just cannibalizing their own service, but then I realized it was much more savvier than that. They were offloading much of the costly compute resources needed to create the models, but the end product still resulted in using their Deep Filter APIs. 

Developing My Own API Layer For Working With Images and Videos
Once I had experience using Algorithmia's DeepFilter via their API, and had produced a handful of my own style transfer models, I got to work designing my own process for uploading and applying the filters to images, then eventually separating out videos into individual images, applying the filters, and reassembling them into videos. The entire process, start to finish, is a set of APIs, with a couple of them simply acting as a facade for Algorithmia's file upload, download, and DeepFilter APIs. It provided me with a perfect hypothetical business for thinking through the economics of building on top of Algorithmia's platform.

Defining My Hard Costs of Algorithmia's Service and the AWS Compute Needed
Algorithmia provides a pricing calculator along with each of their algorithms, allowing you to easily predict your costs. They charge you per API call, plus the compute usage by the second. Each API has its own calculator, and average runtime duration costs, so I'm easily able to calculate a per image cost to apply filters--something that grows quickly when you are applying filters to 60 frames (images) per second of video. Similarly, when it comes to training filter models using an AWS EC2 GPU instance, I have a per hour charge for compute, storage costs, and (now) a pretty good idea of how many hours it takes to make a single filter.
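
Here is the back-of-the-envelope math in Python--every rate below is a placeholder to be replaced with numbers from Algorithmia's pricing calculator and your AWS bill:

    PER_CALL_FEE = 0.0001        # Algorithmia royalty per API call (placeholder)
    COMPUTE_PER_SECOND = 0.0001  # metered compute cost per second (placeholder)
    AVG_RUNTIME_SECONDS = 8.0    # average DeepFilter runtime per image (placeholder)

    def cost_per_image() -> float:
        return PER_CALL_FEE + COMPUTE_PER_SECOND * AVG_RUNTIME_SECONDS

    def cost_per_video(duration_seconds: float, fps: int = 60) -> float:
        """Every frame becomes one filtered image, which is one API call."""
        return duration_seconds * fps * cost_per_image()

    print(f"one image:      ${cost_per_image():.4f}")   # $0.0009
    print(f"30s drone clip: ${cost_per_video(30):.2f}") # 1,800 frames -> $1.62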

All of this gives me some pretty solid numbers to work with when trying to build a viable business on top of Algorithmia. In theory, when my customers use my algorithmic rotoscope image or video interface, as well as the API, I can cover my operating costs, and generate a healthy profit by charging a per image cost for applying a machine learning texture filter. What I really think is innovative about Algorithmia's approach is that they are providing an AWS AMI to offload much of the "heavy compute lifting", with all roads still leading back to using their service. It is a model that could quickly shift algorithmic API consumers from being just retail level API consumers to being wholesale / volume consumers.

My example of this focuses on images and video, but this model can be applied to any type of algorithmically fueled API. It provides me with a model of how you can safely open source the process behind your algorithms as an AWS AMI, and actually drive more business to your APIs by evolving your API consumers into wholesale API consumers. In my experience, many API providers are very concerned with malicious users reverse engineering their algorithms via their APIs, when in reality, in true API fashion, there are ways you can actually open up your algorithms, make them more accessible and deployable, while still helping contribute significantly to your bottom line.

See The Full Blog Post


Investing In Your API Community Like Amazon And Slack

The messaging platform Slack made waves when they launched their Slack Fund as part of their API release, putting up $80M to invest in developers who were interested in building cool things on the API. Slack has continued telling their story, talking about how they have invested some of the fund and are being pretty transparent about which API integrations made the cut. 

After reading about the Slack Fund I wanted to see which other APIs were investing in their communities like this. Next, I came across the Alexa Fund, where Amazon is investing in developers building voice-enabled apps, giving their Echo platform the boost of applications it will need to find success. After poking around the AWS platform, you find they have also had their AWS Grants program, investing in the best of breed projects that use the AWS platform to give back to the world.

I came across a number of other announcements about available funds, only to find the pages gone, but MailChimp had an interesting accounting of their fund, which they started in November of 2010, pledging $1M to "provide developers with funding assistance to build integrations on our email platform". I also found another developer fund out of Cisco, to encourage development with Cisco Spark, Tropo, and the other APIs they provide at Cisco DevNet, that was noteworthy.

There are not enough examples of API investment funds out there for me to add the concept as what I'd consider a common API building block, but there are enough out there, especially from leading pioneers, to keep me paying attention. With API developer funds, it doesn't seem like you have to go as big as AWS or Slack did--it seems to me that if you provide a small fund like MailChimp did, it could incentivize development significantly, or at least keep your launch from being met with crickets.

See The Full Blog Post


Adding A 3rd Dimension To My API Monetization Thinking

When it comes to the API space, it always takes numerous conversations with API providers and practitioners before something comes into focus for me. I've spent five years having API management conversations, an area that is very much in focus for me when it comes to my own infrastructure, as well as a metric I apply when reviewing other public and private APIs regularly.

While I have been paying attention to API monetization for a couple years now (thank you @johnmusser), in 2015 I find myself involved in 20+ conversations, forcing the topic to slowly come into focus for me, whether I like it or not. When talking to companies and organizations about how they can generate revenue from their APIs, I generally find the conversation going in one of two directions:

  • Resource - We will be directly monetizing whatever resource we are making available via the API. Charging for access to the resource, and composing of multiple access tiers depending on volume, and partnerships.
  • Technology - We believe the technology behind the API platform is where the money is, and will be charging for others to use this technology. Resulting in a growing wholesale / private label layer to the API economy. 

90% of the conversations I engage in are focused on the first area, and how to make money off API access to a resource. The discussion is almost always about what someone will pay for a resource, something that is more art than science--even in 2015. The answer is, we don't know until there is a precedent, resulting in an imbalance where developers expect things for free, and providers freak the fuck out--then call me. ;-)

As more of my thoughts around API monetization solidify, a third dimension is slowly coming into focus, one that won't exist for all API providers (especially those without consumers), but is something I think should be considered as part of a long term roadmap.

  • Exhaust - Capture, and make accessible the logs, usage, tooling, and other resources that are developed and captured through the course of API operations, and make available in a self-service, pay as you go approach.

There are many ways you can capture the exhaust around API operations, and sell access to it. This is where the ethics of APIs come into play--you either do this right, or you do it in a way that exploits everything along the way. This could be as simple as providing an API endpoint for accessing search terms executed against an API, all the way to providing a franchise model around the underlying technology behind an API, with all the resources someone needs to get up and running with their own version of an API. If you are very short-sighted this could be just about selling all your exhaust, behind the scenes to your partners and investors.
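
As a concrete version of the simplest case mentioned above--an endpoint for the search terms executed against an API--here is a minimal Flask sketch in Python; the paths and framework are illustrative, not any specific product:

    from collections import Counter
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    search_terms = Counter()  # filled in as the main search API gets used

    @app.route("/search")
    def search():
        term = request.args.get("q", "")
        if term:
            search_terms[term] += 1   # capture the exhaust as a side effect
        return jsonify(results=[])    # the actual search logic is elided here

    @app.route("/exhaust/search-terms")
    def exhaust():
        # In a real platform this endpoint would sit behind its own paid,
        # metered plan, just like any other resource.
        return jsonify(terms=search_terms.most_common(25))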

To me this is all about stepping back and looking at the big picture. If you can't figure out a theoretical, 3rd dimension strategy for making money off the exhaust generated by the resource you are making available via an API, and the underlying technology used to do so--there probably isn't a thing there to begin with. And if you can't do this in an ethical way, one that you will want to talk about publicly, and with your grandmother, you probably shouldn't be doing it in the first place. I'm not saying there isn't money to be made, I'm saying there isn't real value, and money to be made, that also lets you sleep at night.

This opens up a number of ethical benchmarks for me. If you are looking at selling the exhaust from everything to your VC partners, and never open it up via your public APIs, you probably are going to do very well in the current venture capital (VC) driven climate. What I'm talking about is how you generate a new layer of revenue based upon the legitimate exhaust that is generated from the valuable resource you are making available, and the solid technological approach that is behind it. If there is really something there, and you are willing to talk about it and share publicly, the chances I'm going to care, and talk about it on my blog, increase dramatically.

If you do not have a clue what I'm talking about, you probably aren't that far along in your API journey. That is fine. This isn't a negative. Just get going as soon as you can. If you are further along, and have people and companies actually using your API, there is probably a lot of value already being generated. If you partner with your ecosystem, and educate, as well as communicate with end-users properly--I am just highlighting that there is a lot of opportunity to be captured in this 3rd dimension.

See The Full Blog Post


Are There Really Any Monetization Opportunities Around Open Data And APIs?

One of my readers recently reached out to me, in response to some of my recent stories of monetization opportunities around government and scientific open data and APIs. I'm not going to publish his full email, but he brought up a couple of key, and very important, realities of open data and APIs that I don't think get discussed enough, so I wanted to craft a story around them, to bring them front and center in my work.

  • Most open data published from government is crap, and requires extra work before you can do anything with it
  • There currently are very few healthy models for developers to follow when it comes to building a business around open data
  • Business people and developers have zero imagination when it comes to monetization -- aka ads, ads, ads, and more ads.

My reader sums it all up well with:

I don't dispute that with some pieces of government data, they can be integrated into existing businesses, like real estate, allowing a company to add value. But the startup space leveraging RAW open, or paid government data is a lot harder. Part of my business does use paid government data, accessible via an API, but these opportunities the world over are few and far between in my mind.

I think his statement reflects the often unrealized challenges around working with open data, but in my opinion it also reflects the opportunity when it comes to the API journey, applied to this world of open data.

APIs do not equal good, and if you build a simple API on top of open government data, it does not equal instant monetization opportunity as an API provider. It will take domain experts (or aspiring ones) to really get to know the data, make it accessible via simple web APIs, and begin iterating on new approaches to using the open data to enrich web and mobile applications in ways that someone is willing to pay for.

Taking an open data set, cleaning it up, and then being able to monetize access to it directly via an API is simply not a reality, and is something that will only work in probably less than 5% of the scenarios where it is applied. However, this doesn't mean that there aren't opportunities out there when it comes to monetizing adjacent to, and in relationship to, the open data.

Before you can develop any APIs that a business or organization would want to pay for, you have to add value. You do this by adding more meaningful endpoints that do not just reflect the original data or database resources, and provide actual value to end users of the web and mobile applications being built--this is the API journey you hear me speak of so often.

You can also do this by connecting the dots between disparate data-sets, in the form of crosswalks, and by establishing common data formats that can be applied across local and regional governments, or possibly an industry. Once common data formats and interface models are established, and a critical mass of high value open data exists, common tooling can begin to evolve, creating opportunities for further software, service, and partnership revenue models.
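
A crosswalk can be as simple as a mapping of each source's field names onto the common format--all field names in this Python sketch are hypothetical:

    CROSSWALKS = {
        "city_a": {"biz_name": "name", "addr": "address",
                   "insp_date": "inspected_at"},
        "city_b": {"establishment": "name", "location": "address",
                   "date_of_inspection": "inspected_at"},
    }

    def to_common(record: dict, source: str) -> dict:
        """Translate a source record into the shared schema via its crosswalk."""
        mapping = CROSSWALKS[source]
        return {common: record[local]
                for local, common in mapping.items() if local in record}

    print(to_common({"biz_name": "Corner Cafe", "addr": "4 Main St",
                     "insp_date": "2015-06-01"}, "city_a"))
    # {'name': 'Corner Cafe', 'address': '4 Main St', 'inspected_at': '2015-06-01'}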

The illness that exists when it comes to the current state of open data is something partly shared between early open data advocates, when it came to over-promising the potential of open data and their own under-delivery, as well as governments' under-delivery when it came to the actual roll-out and execution of their open data efforts. Most of the data published cannot be readily put to work, requiring several additional steps before the API journey even begins--making more work for anyone looking to develop around it, and putting up obstacles instead of removing them.

There is opportunity to generate revenue from open data published by government, but it isn't easy, and it definitely isn't a VC scale opportunity. For companies like HDScores, selling aggregate restaurant inspection data to companies like Yelp, there is a viable business model. Companies that are looking to build business models around open data need to temper their expectations of being the next Twitter, and open data advocates need to stop trumpeting that open data and APIs will fix all that is wrong with government. We need to lower the bar, and just get to work doing the dirty work of exposing, cleaning up, and evolving how open data is put to work.

It will take a lot of work to find more of the profitable scenarios, and it will take years and years of hard work to get government open data to where it is open by default, and the cleanliness and usefulness levels are high enough, before we see the real potential of open data and APIs. All this hard work, and the shortage of successful models, doesn't mean we shouldn't do it. For example, just because we can't make money providing direct access to the Recreation Information Database (RIDB), doesn't mean there aren't potentially valuable APIs when it comes to understanding how people plan their summer vacations at our national parks--it will just take time to get there.

My Adopta.Agency project is purely about the next steps in this evolution: making valuable government "open data" that has been published as CSVs and Excel files more accessible and usable, by cleaning it up and publishing it as JSON and / or APIs. I am just acknowledging how much work there is ahead of us when it comes to making the currently available open data accessible and usable, so we can begin the conversation about how we make it better, as well as how we generate revenue to fund this journey.

See The Full Blog Post


Catching My Breath On My API Monetization Ramblings Before I Enter Into Some New Conversations

I have two more conversations kicking off on the topic of API monetization, so I needed to take a moment to gather up the last wave of posts on the subject, catch my breath, and refresh my overall thoughts in the area. What I really like about this latest wave is that they are about providing much needed funding for some potentially very important API driven resources. Another thing is that they are pretty complicated, unproven approaches to monetizing APIs--breaking ground!!

Over the last couple weeks, I have been engaged in four specific conversations that have shifted my gaze to the area of API monetization:

  • Audiosear.ch - Talking with the PopupArchive team about making money around podcast search APIs.
  • Department of Interior - Providing feedback on the Recreation Information Database (RIDB) API initiative.
  • Caltech Wormbase - Helping organize a grant effort to fund the next generation of research from Wormbase, and other scientific databases.
  • HDScores - Mapping out how HDScores is funding the efforts around aggregating restaurant inspection data into a single, clean API.

As I think through the approaches above, I'm pushed to apply what I can from these discussions to my own infrastructure:

  • My API Monetization - As I look to add more APIs to my stack, I'm being forced to clearly define all the moving parts of my API monetization strategy.
  • Blockchain Delusions - While thinking again about my API pricing and credit layer, I'm left thinking about how the blockchain can be applied to API monetization.

The API Evangelist network is my research notebook. I search, re-read, and refine all the thoughts curated, and published here. It helps me to aggregate areas of my research, especially in the fast moving areas, where I am receiving the most requests for assistance. Not only does it refresh my memory of what the hell I've written in the last couple weeks, I also hope it gives you a nice executive summary in case you missed anything.

If you are looking for assistance in developing your API monetization strategy, or have your own stories you'd like to share, let me know. If you have any feedback on my stories, including feedback for the folks I'm talking to, as well as items missing from my own API monetization approach, or blockchain delusions--let me know!

See The Full Blog Post


Expanding On My API Monetization Strategy And Research

This is a full walk-through of me trying to distill down my approach to API monetization, in a way that can be applied across not just 30 APIs, but potentially 300, or 3000. There are several things converging for me right now, which includes the maturing of my own infrastructure, as well as conversations I'm having with startups, enterprise groups, federal government agencies, and my own partner(s).

I need to keep hammering on this to help me operate my own infrastructure, but I am also partnering with APIWare to help me deliver on much of the API design, deployment, and management, so I need to have a good handle on my costs. As with all of my areas of research, within the area of API monetization I am just trying to get a handle on the common building blocks, and provide a checklist of considerations to be employed when I'm planning and operating my API infrastructure.

To help me build a base, let's walk through some of the building blocks of my own API monetization strategy.

Acquisition
What do I have invested into any single API? Even if I am building something from scratch, what went into it? Every API I possess has some sort of acquisition cost, even if it is just $14.00 for the two pints of beer I bought while I was brainstorming the idea.

  • Discover - What did I spend to find this? I may have had to buy someone dinner or beer to find it, as well as spend time on the Internet searching, and connecting the dots.
  • Negotiate - What time do I have invested in actually getting access to something? Most of the time it's on the Internet, and other times it requires travel, and meeting with folks.
  • Licensing - There is a chance I would license a database from a company or institution, so I want to have this option in here. Even if this is open source, I want the license referenced as part of acquisition.
  • Purchase - There is also the chance I may buy a database from someone outright, or pay them to put the database together, resulting in a one-time fee, which I'm going to call "purchase".

Having a framework to think through the acquisition of each API resource I possess makes it easier to evaluate new API ideas as I brainstorm them, and makes sure I am tracking all the details from the moment of inception, to when I commit to actually making a resource available via an API on my platform.

Development
What does it actually take to stand up an API? There are a lot of moving parts in making an API happen, and not all of them are technical. Am I willing to invest the time necessary to stand up an API, or will it require outside investment, as well as resources? What is needed to take an API from acquisition to actual operation?

  • Investment - Who put up the money to support the development of this API resource? Was it internal, or did we have to take external investment?
  • Grant - Was the development of this API rolled up into a larger grant, or was there a grant specifically for its development, covering the costs involved?
  • Normalization - What does it take me to clean up, and normalize a dataset, or across content? This is usually the busy janitorial work necessary.
  • Design - What does it take me to generate a Swagger and API Blueprint definition, something that isn't just auto-generated, but also has the hand polish it requires?
  • Database - How much work am I putting into setting up the database? A lot of this I can automate, but there is always a setup cost involved.
  • Server - Defining the amount of work I put into setting up, and configuring the server to run a new API, including where it goes in my overall operations plan.
  • Coding - How much work do I put into actually coding an API? I use the Slim PHP framework, and using automation scripts I can usually generate 75% of it, but there is always finish work.
  • DNS - What was the overhead in me defining, and configuring the DNS for any API, setting up the endpoint domain, as well as potentially a portal to house operations?

Historically when it came to APIs, I just dove into writing code with little consideration for what went into it. I'd say one by-product of the microservices way of thinking is that I have decoupled the moving parts of each of my APIs, allowing me to approach development in this way. I'm sure I will keep slicing off other elements within the development process as I progress.

Operation
What goes into keeping an API operational, reliable, and available? How much do I spend on all aspects of an existing API's lifecycle to make sure it meets the standards of API consumers? Ideally operational costs go down the more efficient the platform gets with overall operations, reducing overhead, and streamlining across everything.

  • Definition - How many resources am I applying to creating and maintaining APIs.json, Swagger, and API Blueprint definitions for my APIs?
  • Compute - What percentage of my AWS compute is dedicated to an API? A flat percentage of the server it's on until usage history exists.
  • Storage - How much on disk storage am I using to operate an API? This could fluctuate from month to month, and exponentially increase for some.
  • Bandwidth - How much bandwidth in / out is an API using to get the job done?
  • Management - What percentage of API management resources is dedicated to the API? A flat percentage of API management overhead until usage history exists.
  • Code - What does it cost me to maintain my code samples, libraries, and SDKs for each individual API, or possibly across multiple APIs and endpoints?
  • Evangelism - How much energy do I put into evangelizing any single API? Did I write a blog post, or am I buying Twitter or Google Ads? How is the word getting out?
  • Monitoring - What percentage of the API monitoring, testing, and performance service budget is dedicated to this API? How large is the surface area for monitoring?
  • Security - What does it cost for me to secure a single API, as part of the larger overall operations? Do internal resources spend time on it, or is this a 3rd party service?
  • Virtualization - What am I spending on virtualization for an API, as part of the QA process, for retail sandbox and simulation environments, or for specific partner needs?

Ideally the more APIs you operate, the more detail you will get in each of these areas, and in some of them you should get better deals as more volume runs through--compute and storage costs going down as we do more business, for example. The more we understand the details of operations, the more we can optimize them--the rough cost roll-up sketch below shows how these buckets come together for a single API.
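Here is what that roll-up could look like in PHP--all of the numbers are made up, and the flat 1/30th allocations just stand in for spreading shared monthly costs evenly across a 30 API stack until usage history exists:

```php
<?php
// A rough cost roll-up sketch for a single API, following the acquisition,
// development, and operation buckets above. All numbers are invented, and
// the /30 allocations spread shared monthly costs evenly across a 30 API
// stack until real usage history exists.

$api = [
    'acquisition' => ['discover' => 14.00, 'negotiate' => 0, 'licensing' => 0, 'purchase' => 0],
    'development' => ['normalization' => 250.00, 'design' => 100.00, 'database' => 50.00,
                      'server' => 25.00, 'coding' => 150.00, 'dns' => 10.00],
    'operation'   => ['compute' => 120.00 / 30, 'storage' => 30.00 / 30, 'bandwidth' => 45.00 / 30,
                      'management' => 90.00 / 30, 'monitoring' => 60.00 / 30],
];

$oneTime = array_sum($api['acquisition']) + array_sum($api['development']);
$monthly = array_sum($api['operation']);

printf("One-time cost: $%.2f\n", $oneTime);
printf("Monthly operation cost: $%.2f\n", $monthly);
// Amortize one-time costs over a 12 month horizon to get a full monthly burden.
printf("Monthly burden (12 month amortization): $%.2f\n", $oneTime / 12 + $monthly);
```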

Access Levels
What sort of access levels are we going to provide across ALL APIs? Not all APIs will use all areas, but we should be ready for as many scenarios as we possibly can. We need to be clear about what access is in the free layer, as well as the tiers of access, and any wholesale, partner, or re-seller aspects.

  • Free (unlimited) - This is just a free API, I won't be rate limiting the usage of it. It will act similar to any website I put out there, but instead of HTML it is JSON.
  • Free Trial - I am only going to offer limited use, or a limited time period for accessing a resource, giving just a taste test, but it won't be in the main pool of APIs available.
  • Not For Profit - This API is being subsidized somehow. Either there is direct investment from internal or external resources to subsidize it, or there is a grant involved.
  • Educational Access - Is this API available as an educational resource, with special pricing for students and teachers? This will usually be reflected in the tier, and credit availability.
  • Tier(s) - Which of these service tiers is an API available in, and which endpoint paths + verbs are accessible in the tier (api-pricing definition).
    • Public - General access, you usually don't even need a key. Only limited to specific APIs.
    • Retail - This is the pay as you go level for public access to all APIs. This is where the retail side of business operations will occur.
    • Trusted - These are just a handful of trusted individuals or companies, who may have write access to some endpoints.
    • Education - Providing a specific access tier for education partners, including students, teachers, and institutions. Providing higher levels of free access, and lower price points.
    • Partner - These are partners I have prearranged agreements with, something I will be transparent about, showcasing them on partner page.
    • Wholesale - The wholesale, often non-rate limited portion of my platform, where I deploy APIs in other people's infrastructure, or on our own for flat fees.
    • Platform - This is all internal access by applications I build for my own usage. I still rate limit, and manage this access, I just give myself different privileges.
  • Partner Program - A structured program allowing API consumers to achieve higher levels of access, with reduced pricing levels, flat rate access, and other benefits.
  • Reseller Program - A structured program for allowing API consumers to prove themselves, and share in revenues from API usage, affiliate links, and revenue share.

My intent around access levels is to be as transparent as possible. Not all users will access at all levels, and not all APIs, and their endpoints, will be available at all access levels. The goal is to optimize access, remain as open as makes sense, while also sensibly monetizing resources to cover costs, and make a fair profit--the sketch below shows how these tiers might be expressed as data.
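As a minimal sketch of tiers as data, here is a PHP example--the tier names follow the list above, but the credit amounts and endpoint paths are hypothetical:

```php
<?php
// A minimal sketch of access tiers as data, loosely following the tiers
// listed above. Credit amounts and endpoint paths are hypothetical.

$tiers = [
    'public'  => ['daily_credits' => 100,  'requires_key' => false, 'endpoints' => ['GET /images']],
    'retail'  => ['daily_credits' => 1000, 'requires_key' => true,  'endpoints' => ['GET /images', 'GET /definitions']],
    'trusted' => ['daily_credits' => 5000, 'requires_key' => true,  'endpoints' => ['GET /images', 'GET /definitions', 'POST /definitions']],
];

// Check whether a given tier grants access to an endpoint path + verb.
function canAccess(array $tiers, string $tier, string $endpoint): bool
{
    return in_array($endpoint, $tiers[$tier]['endpoints'] ?? [], true);
}

var_dump(canAccess($tiers, 'retail', 'POST /definitions'));  // bool(false)
var_dump(canAccess($tiers, 'trusted', 'POST /definitions')); // bool(true)
```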

Pricing & Credits
I am employing a universal credit system that will be used by all APIs. The goal is to expand the unit of currency I employ beyond just API calls, and attach a universal unit of value that can be applied across all APIs. API consumers will be given a certain amount of API credits to be used each day, as well as be able to buy and sell credits at various rates.

  • API Value - Each API will have its own credit rate set, where some will be one credit, while others may be 100 credits for a single call--it can be defined per API, or per specific endpoint.
  • Daily Limit - The daily allowed credit limit will be determined by the access level tier a consumer is registered at, starting with daily free public access, on up through retail, trusted, and potentially custom tiers.
  • Usage - How many credits does any one user use during a day, week, or month, across all APIs? When each API is used, it will apply the defined credit value for that single API call.
  • Incentive - How can the platform give credits as an incentive for use, or even pay credits for writing to certain APIs, and enriching the system, or driving traffic.
  • Purchase - What does it cost to buy a credit, something that could fluctuate from day to day, week to week, or month to month.
  • Buyout - Allow API consumers to get paid for the credits on their account, preferably all users are encouraged to spend credits, but buyout is an option.
  • Discounts - Can we give discounts when consumers buy credits through specific channels, promotions, or other types of planned deals?
  • Volume - Are there volume discounts for buying credits, allowing consumers to purchase credits in bulk when they need to, and apply them when they desire?
  • Applying - Can you wait to apply credits you have accumulated? Consumers are given the option with each billing cycle to apply credits, or to hold them and use them at a future date.

I envision credits being the lifeblood of the API monetization strategy for my platform--a quick sketch of the mechanics follows below--and would love to see the concept spread beyond any single API ecosystem, and become something that all providers could put to work. The benefits would be seen by both API providers, as well as consumers, in helping us establish a currency for the API economy.
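Here is a minimal sketch of how the credit mechanics could work, with each API carrying its own credit value per call, and the daily limit coming from the consumer's tier--all of the values are invented:

```php
<?php
// A minimal sketch of the universal credit system described above: each API
// carries its own credit value per call, and a consumer's daily allowance
// comes from their tier. All values are hypothetical.

$creditValues = ['screen-capture' => 1, 'image-filter' => 5, 'ml-model' => 100];
$dailyLimits  = ['public' => 100, 'retail' => 1000, 'trusted' => 5000];

function chargeCall(array $consumer, string $api, array $creditValues, array $dailyLimits): array
{
    $cost  = $creditValues[$api];
    $limit = $dailyLimits[$consumer['tier']];
    if ($consumer['used_today'] + $cost > $limit) {
        // Beyond the daily allowance, calls draw down purchased credits instead.
        $consumer['purchased'] -= $cost;
    } else {
        $consumer['used_today'] += $cost;
    }
    return $consumer;
}

$consumer = ['tier' => 'retail', 'used_today' => 998, 'purchased' => 50];
$consumer = chargeCall($consumer, 'image-filter', $creditValues, $dailyLimits);
print_r($consumer); // daily allowance exhausted, so 5 credits come off the purchased balance
```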

Indirect Value Generation
What value is generated via API operations that isn't directly monetized, but drives value in other ways? These indirect value generators are often overlooked, and under-showcased areas of operation, which often contributes to API failure--always showcase the buzz.

  • Marketing Vehicle - Having an API is cool these days, and some APIs are worth just having for the PR value, and extending the overall brand of the platform.
  • Web or Mobile Traffic - The API exists solely for distributing links to web and mobile applications, driving traffic to specific properties - is this tracked as part of other analytics?
  • Brand Awareness - Applying a brand strategy, and using the API to incentivize publishers to extend the reach of the brand and ultimately the platform - can we quantify this?
  • Data & Content Acquisition - Using the API, and the applications built on top as part of a larger data and content acquisition strategy--can we quantify this?

I could see data and content acquisition grow into an area we can better quantify soon, putting a base value on each resource in the system, and figuring out how much each resource grows in size, and quality over time. Applying value to these indirect areas is something I'd like to expand upon in future iterations.

Partner Revenue Generation
Ideally any platform should be sharing the revenue and value exhaust generated via the ecosystem, providing revenue opportunities for web, and mobile application developers. There are a handful of ways revenue can be shared via API operations.

  • Link Affiliate - What revenue is generated and paid out via links that are made available via the API, with potentially external affiliate links embedded?
  • Revenue Share - What portion of API revenue is paid out to potential re-sellers who drive API usage? Revenue is a percentage of overall credit purchases / usage.
  • Credits to Bill - All revenue is paid in credits on the account, and users can decide to cash out their credits at any time, or possibly use them in other aspects of system operation.

I will be expanding on these areas in the future, as I play with ways to incentivize content or data creation, or just drive API consumption well into the paid tiers--a small revenue share sketch follows below. Right now many API platforms I study are essentially sharecropping plantations, reaping the value generated from developer activity. In the future, developers should be incentivized with cash and credit to help achieve platform monetization goals, which is something I want to explore via my own API resources when I have the bandwidth.
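As a quick sketch of the credits to bill idea, here is how a re-seller's revenue share could accrue as credits that can later be cashed out--the rate and credit price are made up:

```php
<?php
// A minimal sketch of the partner revenue share ideas above: usage-driven
// revenue accrues as credits on the partner's account, which they can later
// cash out. Percentages and amounts are invented.

$revShareRate = 0.20;  // portion of credit purchases driven by a re-seller
$creditPrice  = 0.10;  // dollar value of one credit at buyout

function accrueShare(array $account, float $purchasesDriven, float $rate, float $creditPrice): array
{
    // Convert the dollar share into credits on the account.
    $account['credits'] += ($purchasesDriven * $rate) / $creditPrice;
    return $account;
}

$account = ['name' => 'partner-app', 'credits' => 0];
$account = accrueShare($account, 500.00, $revShareRate, $creditPrice); // drove $500 in purchases
printf("%s holds %.0f credits (worth $%.2f at buyout)\n",
    $account['name'], $account['credits'], $account['credits'] * $creditPrice);
```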

Internal Revenue Generation
Where are we making money? What revenue is generated across the platform, and what are the breakdowns? I want to understand who my rockstar users and applications are, something that isn't isolated to external users. I am looking to craft all of my applications as individual citizens within the API ecosystem, measuring and limiting what type of access they have, and treating them like any other consumer on the platform.

  • Monthly - How much revenue is being brought in on a monthly basis for an API and all of its endpoints.
  • Users - How much revenue is being brought in on a monthly basis for a specific user, for an API and all of its endpoints.
  • Applications - How much revenue is being brought in on a monthly basis for a specific application, for an API and all of its endpoints.
  • Tiers - Which tiers generate the most usage and revenue? This should apply just as easily to platform / internal usage as well.
  • Affiliate Revenue - What was generated from affiliate links made available via APIs, minus what percentage was paid out to API consumers.
  • Advertising Revenue - What advertising revenue was derived from web or mobile application traffic resulting from the API, minus whatever was paid out as rev share to API consumers.

The goal of my platform is not simply to make money. Sure, I like making money, but I'm looking to flesh out a reproducible framework to hang each API on, and make sense of it as part of my larger API platform operations. Not all APIs will be created equally, but I should be able to equally measure what each costs to develop and operate, and apply consistent ways of generating revenue around its use--the rollup sketch below shows the kind of breakdowns I am after.
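Here is a minimal sketch of those breakdowns, rolling a handful of fabricated billing events up by API, user, and application:

```php
<?php
// A minimal sketch of the revenue breakdowns above: roll a month of billing
// events up by API, user, and application. The events are fabricated.

$events = [
    ['api' => 'images', 'user' => 'alice', 'app' => 'gallery-spa', 'amount' => 12.50],
    ['api' => 'images', 'user' => 'bob',   'app' => 'mobile-app',  'amount' => 4.00],
    ['api' => 'definitions', 'user' => 'alice', 'app' => 'gallery-spa', 'amount' => 7.25],
];

function rollup(array $events, string $dimension): array
{
    $totals = [];
    foreach ($events as $event) {
        $totals[$event[$dimension]] = ($totals[$event[$dimension]] ?? 0) + $event['amount'];
    }
    return $totals;
}

print_r(rollup($events, 'api'));  // monthly revenue per API
print_r(rollup($events, 'user')); // monthly revenue per user
print_r(rollup($events, 'app'));  // monthly revenue per application
```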

All of this looks intimidating when you scroll back through. However my goal is to produce a standardized pricing page that can exist across all of my API ecosystem(s), which are growing in number, and prompting me to think in this way. I need a better handle on my costs, and ultimately to generate more revenue to keep a roof over my head, food on the table, and my AWS bill paid.

While I only have a single API portal right now, I'm preparing to deploy a specific collection using APIs.json, and publish it as version 2.0 of my API Evangelist developer portal. I'm also looking to immediately publish a few other API portals, designed to support various collections or stacks of APIs available in my network (images, API definitions, etc.). I need a standard way to deliver on-boarding, and pricing for the APIs, and this backend framework gives me the v1 approach to that.

Each API that I launch will have a pricing page, with each of the available service tiers as a box, and within each box it will list how many free credits you get each day, along with other features available like the per credit rate beyond the daily allowed limit, support elements, and other relevant details for that tier. There should also be links to more detail about partner, re-seller, and wholesale options for each API portal I launch. The API consumer never sees all of this backend detail--this framework is for me to hang each API upon, and think it through in the context of the overall API lifecycle and platform operations.

I'm applying this outline to the 30 APIs I have in my stack, and then also applying it to a handful of new data APIs I'm working on. Along the way I will flesh it out a little more, before I get to work on some of the more advanced pieces like partner and re-seller programs. I'm not a big fan of advertising, but I do have some single page apps that perform pretty well, and it wouldn't be too intrusive to have some advertising on them. All of these SPAs are driven by my APIs, and they often exist as tools across my API driven content network as well.

This post will be published to my API monetization research, and this list will be published as common building blocks that can be considered as part of any API monetization strategy. It makes me happy to see this portion of my research finally move forward, and evolve, especially since it is based upon my own platform goals, as well as my wider monitoring and review of the space.

See The Full Blog Post


An API Monetization Framework To Help Me Standardize Pricing For The APIs I Bring Online

I'm almost to the point with my API stack where I can start plugging in the new APIs I have planned. Up until now, the APIs I have deployed are of little use to a wider commercial audience. However, for some of the APIs I have planned for the next year, I'm looking to monetize their usage, and operate them as part of a larger commercially viable API stack. (practice what I preach baby!)

To run this stack, I need a plug and play way to define what an API is costing me, and potentially how much revenue I am generating from each API. With this in mind, here is my draft look at an API monetization framework that I am employing across my API Stack.

Acquisition (One Time or Recurring)

  • Discover - What did I spend to find this? I may have had to buy someone dinner or beer to find it, as well as spend time on the Internet searching.
  • Negotiate - What time do I have invested in actually getting access to something? Sometimes it's time, and sometimes it costs me.
  • Licensing - There is a chance I would license a database from a company or institution, so I want to have this option in there. Even if this is open source, I want the license referenced.
  • Purchase - There is also the chance I may buy a database from someone outright, or pay them to put the database together, resulting in a one-time fee.

 Development (One Time or Recurring) 

  • Normalization - What does it take me to clean up, and normalize a dataset, or across content? This is usually the busy janitorial work necessary.
  • Design - What does it take me to generate a Swagger and API Blueprint definition, something that isn't just auto-generated, but also has a hand polish to it.
  • Database - How much work am I putting into setting up the database? A lot of this I can automate, but there is always a setup cost.
  • Server - Defining the amount of work I put into setting up, and configuring the server to run a new API, including where it goes in my operations plan.
  • Coding - How much work do I put into actually coding an API? I use the Slim PHP framework, and using automation scripts I can usually generate 75% of it.
  • DNS - What was the overhead in me defining, and configuring the DNS for any API, setting up the endpoint domain, as well as potentially a portal URL.

Operation (Recurring)

  • Compute - What percentage of my AWS compute is dedicated to an API? A flat percentage of the server it's on until usage history exists.
  • Storage - How much on disk storage am I using to operate an API? This could fluctuate from month to month, and exponentially increase for some.
  • Bandwidth - How much bandwidth in / out is an API using to get the job done?
  • Management - What percentage of API management resources is dedicated to the API? A flat percentage of API management overhead until usage history exists.
  • Evangelism - How much energy do I put into evangelizing any single API? Did I write a blog post, or am I buying Twitter or Google Ads? How is the word getting out?
  • Monitoring - What percentage of the API monitoring, testing, and performance service budget is dedicated to this API? How large is the surface area for monitoring?

Pricing (Recurring)

  • Tier(s) - Which of the 7 service tiers is an API available in, and which endpoint paths + verbs are accessible in the tier (api-pricing definition).
  • Credit(s) - How many credits does an API use when any single endpoint is engaged, specified as entire endpoint or individual paths + verbs (api-credit definition).

Revenue (Recurring)

  • Monthly - How much revenue is being brought in on a monthly basis for an API and all of its endpoints.
  • Users - How much revenue is being brought in on a monthly basis for a specific user, for an API and all of its endpoints.
  • Applications - How much revenue is being brought in on a monthly basis for a specific application, for an API and all of its endpoints.

I am looking for this framework to help me set pricing, and rate limits, for any API I publish. My goal is to rapidly make available some valuable databases, and more functional APIs using common open source software--available for free, but also generating enough revenue from high volume users to run the whole thing. To do this, I need to understand exactly what an API is costing, allowing me to set a price with the intent of breaking even, and then generating some revenue where it makes sense.

As part of this work I will be generating an APIs.json type I am calling api-pricing, which I am looking to help me balance out consumption across my API stack--the sketch below shows the rough shape this could take. Using my 3Scale API infrastructure I am able to easily add and subtract credits for API usage across users and apps, then handle the billing based upon the pricing I have set for all my API usage.
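To be clear, the shape of an api-pricing entry is still a guess on my part, but here is a sketch of what one could look like, using the cost buckets above to back into a break-even price per call--every number, URL, and property name beyond api-pricing itself is invented:

```php
<?php
// A sketch of what a machine readable api-pricing entry could look like,
// backing into a break-even price per call from the cost buckets above.
// The api-pricing name comes from the post; its shape here is a guess,
// and every number and property is invented.

$monthlyCost   = 85.00;   // operation costs plus amortized acquisition / development
$expectedCalls = 100000;  // projected monthly call volume
$breakEven     = $monthlyCost / $expectedCalls;

$apiPricing = [
    'name'    => 'api-pricing',
    'url'     => 'https://example.com/api/pricing.json', // hypothetical index location
    'tiers'   => ['public', 'retail', 'trusted'],
    'credits' => ['GET /images' => 1, 'POST /images' => 5],
    'breakEvenPerCall' => round($breakEven, 5),
];

echo json_encode($apiPricing, JSON_PRETTY_PRINT), "\n";
```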

My pricing will not just be about retail usage. Some of my APIs will be deployed in other people's infrastructure, letting them control the pricing, credits, and service composition. This is the wholesale layer of my strategy, allowing me to go beyond my own internal usage, to B2D usage, and open up new opportunities for API deployment and consumption.

As with most of my work, I'm going to be very transparent about my pricing, making sure it is indexed within each APIs.json file, and available alongside each API's Swagger, API Blueprint, Postman Collection, and API Science monitor--encouraging wider consumption, and processing.

See The Full Blog Post


Scientific Database Monetization

Wormbase working.

See The Full Blog Post


Multi-Layers Of Monetization

From my drawing.

See The Full Blog Post


Is Your Monetization Rooted In The Resource Or Experience Side Of Your API Operations?

I had another one of my regular check-ins with Anne and Bailey over at Popup Archive today, with a focus on the monetization strategy for their AudioSear.ch API. I thoroughly enjoy my conversations with the team, because they have worked really hard on their audio API solutions, and are extremely open to discussion, and sharing the stories of their API journey.

During our talk, they tuned me into where they are at with their road-map, which now includes a solid set of API resources that provide access to the episodes, people, and shows involved in podcasts across the Internet. They talked about how they have been iterating on these core API resources, with more experience focused endpoints including relatedness, trending, and tastes, to name a few. All of this really demonstrates that the company is well along in their API journey, moving beyond just the definition of core API resources, and actually getting in tune with how these resources will be put to work.

Next they walked me through some of the core use cases for their podcast API, ranging from radio stations looking to provide better recommendations, to advertisers who want to better understand the exploding world of podcasts, and where the opportunity is. Audiosear.ch is working to quantify the world of podcasting, by indexing not just the metadata surrounding audio files, but also their contents, and the relationships between them--busting open a brand new realm in cyberspace that, like video, is poised for explosive growth, and will be ripe with opportunities.

Now that AudioSear.ch can crack open, index, and develop awareness around the contents of podcasts, which includes the episodes and people involved with shows, they are also faced with the hard part of actually defining, and delivering, APIs that developers will need, and actually pay for. Meaning: which API endpoints do we provide, and what do we charge for them? This is the number one conversation I am having with companies who have embarked on their own API journey.

These conversations always begin with discussing the hard costs: what did we invest in developer hours? what are our licensing costs? what does compute cost? what does our storage cost? what does our bandwidth cost? All of these costs are rooted in the resource side of the conversation. Where do the data, content, and other resources originate? How much work have we put into them? How do we perceive the value of our own resources? This leaves every API provider trying to figure out what people will pay, and calculating their margins--a resource-based monetization strategy.

Resource-based API monetization strategies are where all API operations begin when they are trying to figure things out. Once you are further along in your API journey, like Anne and Bailey are, you need to start thinking about more experience based pricing, something that will be more closely aligned with how API consumers actually see things. As an API provider, you tend to be too closely aligned with your resources, focused on an API design derived from this view, and a monetization strategy that looks to cover the costs, while also bringing home a potentially good margin. When you are rooted in this way of thinking you are always chained to your core resource-based assumptions, which are often very distant from what will be important to your customers.

When you look at how Audiosear.ch is iterating with more experience based API designs like /trends, /relatedness, and /tastes for podcasts, you see how they are beginning to uncover what is actually valuable to end-users. As they keep iterating on these experience based API endpoints, driven by API consumer needs, you can't have your pricing solely anchored to the resource--it needs to also reflect the value perceived by the consumer. You never know when you will find just the right endpoint for a resource, one that has high perceived value to end-users, but relatively low costs associated with actually delivering it--adding entirely new dimensions to your margins.

Approaching API monetization based upon how resources will be experienced, decoupling from a purely resource based approach, will first and foremost increase the chance that anyone will purchase API access for any sustained period, but secondly opens up a whole new world for price increases (or reductions) based upon demand (think AWS). These are just some initial thoughts from our conversation, something I will be thinking about more in coming weeks, hopefully providing some other more concrete examples of this in the wild.

I just needed to work through some initial thoughts as I got out of the Google Hangout--thanks Anne and Bailey #GoodTimes

See The Full Blog Post


Taking API Monetization To The Next Level By Monetizing The Exhaust Around API Consumption

I have spent a lot of time thinking about API analytics. Understanding who is signing up to access API resources, and how they are putting those API resources to use, is one of the most valuable aspects of modern API management solutions in my opinion. The awareness derived from API operations that use analytics can have the biggest impact on not just how you run your API, but how you craft your API products, tailor your monetization strategy, and evolve your overall roadmap and company.

API analytics was the topic of a webinar I conducted with WSO2, one of my partners, this morning. I've been invested in WSO2 since they first contacted me for feedback on the then upcoming release of their open source API management solution. During this morning's webinar, I gave the basics on the "why" of API analytics, then sat back and listened to Nuwan Dias demonstrate the "how" of it.

The first thing that caught my attention was the WSO2 predictive analytics, which is a concept I have to say I have never thought about, but will be pondering more around what this might mean (or not) to the business and politics of API operations. Another thing that caught my attention was an idea Nuwan shared during the Q&A, about API monetization at the API analytics layer--a very interesting thought.

While the conversation is not as far along as I'd hoped by 2015, the concept of using API analytics to help you craft your API monetization strategy is pretty well established. Providers like 3Scale and WSO2 have been helping API providers understand the value delivered via APIs, while also providing a framework for crafting service tiers and pricing, and managing usage and billing for API consumers--what Nuwan is thinking about takes this up another level.

A generation 1.0 API monetization scenario might be an art gallery, who could have an API that feeds the gallery web or mobile app. During the planning of their API they identified that they shouldn't charge for API access, because it just adds value to their overall business objective--getting foot traffic in the door. So the more developers building the art gallery's content, products, and services into other 3rd party apps and sites, the better. Maybe along the way the gallery might also choose to serve up high resolution, or interactive versions of exhibits for a fee, adding another layer to their monetization strategy--this is a classic example of the API monetization discussions I've seen over the last 10 years.

A generation 2.0 of this API monetization scenario, reflecting Nuwan's vision, could involve the art gallery actually selling leads in real-time, to partner galleries, restaurants, and other businesses that their customers might also be interested in. To reference the WSO2 real-time API analytics example, you could craft a definition of what an ideal scenario would look like, and when a user who logs into the web or mobile app matches that profile, the exhaust from their API usage could be accessed via a higher level API crafted off of your API analytics (head scratcher). Of course you'd want to design some sort of layer that requires users to opt-in, so that they are aware their data is being captured, and shared, but in theory the exhaust from a single user's API transactions could provide a whole new API monetization opportunity.

I'm still thinking this through, but gen 1.0 is about identifying the value of the digital resources of the gallery, including the physical locations, art inventory, artist details, etc. You try to establish ways of extracting direct, and indirect value from these digital assets, by making them available via an API. Indirect monetization would be about driving web and mobile usage, as well as in-person foot traffic, where direct monetization would be about selling high resolution images, or possibly virtual gallery experiences. A gen 2.0 API monetization conversation in this art gallery scenario would be about identifying specific personae of users and experiences, via the web, mobile, or even device and sensor based APIs, that could be targeted in real-time, triggering other secondary, API driven events and experiences.

This type of API for the meta layer of API operations seems like more of a partner level API composition layer. Meaning, as an API provider you'd have access to these APIs, but as far as your community goes, you'd probably only extend them to trusted partners, who you have an established relationship with. As I try to think through API monetization at this layer, I'm guessing partners would pay a flat fee, or maybe a per transaction fee depending on the amount of data shared, or perhaps based upon a successful transaction or sale? IDK, just spitball'n--the sketch below plays out what one of these opt-in lead matches could look like.
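To play this out a little, here is a minimal sketch of how an opt-in lead match might work at the analytics layer--the partner profile, user fields, and matching logic are all hypothetical:

```php
<?php
// A sketch of the gen 2.0 art gallery scenario: a partner defines the
// profile they want, and the analytics exhaust only produces a lead when
// the user has opted in and matches. Everything here is hypothetical.

$partnerProfile = ['interest' => 'impressionism', 'visits_per_month' => 2];

function maybeEmitLead(array $user, array $profile): ?array
{
    if (!$user['opted_in']) {
        return null; // no opt-in, no lead -- the exhaust stays private
    }
    $matches = $user['interest'] === $profile['interest']
        && $user['visits_per_month'] >= $profile['visits_per_month'];
    return $matches ? ['user' => $user['id'], 'matched' => $profile['interest']] : null;
}

$user = ['id' => 'u-123', 'opted_in' => true, 'interest' => 'impressionism', 'visits_per_month' => 3];
var_dump(maybeEmitLead($user, $partnerProfile)); // a billable lead for the partner gallery
```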

This is just me thinking through a single hypothetical scenario to help grasp the potential here. It adds a new dimension to both the API analytics discussion, as well as the API monetization one. Lots to chew on, from the business and politics side of API operations. Would love to hear your thoughts...

Disclosure: Both 3Scale and WSO2 are API Evangelist partners.

See The Full Blog Post


Initial Thoughts Around Monetization For An API Deployment Service

I am helping a client work through their monetization strategy for an API deployment service. To give me a starting point for the work, I wanted to take a look at a handful of existing service providers, that may not be a perfect match, but are somewhat in the same realm--API deployment.

For this phase of the work, I looked at six API deployment providers, who in my opinion have a pretty straightforward, modern approach to crafting, publishing, and sharing their pricing. It always amazes me how hard it can be to just find a company's pricing, let alone make sense of it...but I digress, that is another story.

Here are the API deployment providers I reviewed:

  • API Spark, with 5 pricing tiers, broken down by concurrent connections (based upon IP address), number of APIs, and entity and file storage across APIs.
  • APItite, with 3 pricing tiers, broken down by number of APIs, API calls, data transfers, data storage (by row), whether the API is public / unlisted / private, and support.
  • InstantAPI, with 1 pricing tier, with incremental add-ons to get more API call volume, broken down just by the number of API calls, with an option to pay for custom domain.
  • Sheetlabs, with 3 pricing tiers, broken down by storage, number of queries, spreadsheets, APIs, and users.
  • Orchestrate, with 3 pricing tiers, broken down by number of APIs calls and applications, with options for Service Level Agreement (SLA) and support as well.
  • Algorithmia, with 2 pricing tiers, broken down by an equal ratio of credits to API calls. You get 10K credits for free, and can purchase more at 10K per $1.00.

I think these six provide a good test sample to kick off my research. Most of the other API deployment providers didn't have a straightforward pricing model, were open source, or just didn't seem like a good fit. As I looked through these companies' approaches to monetization, I see two of the three hard costs covered by each provider:

  • Storage
  • Bandwidth

The only one not present is "compute", but in my experience this is usually applied across the other pricing variables, as it is harder to calculate. After the hard costs, some of the obvious elements of API monetization are present:

  • APIs
  • API Calls
  • Users
  • Applications

These are the most common elements of API service provider pricing, providing tangible things that users will understand. There were also two additional elements I found, that I would consider less than common, but interesting enough to include:

  • Connections
  • Data Storage by Row

The "connections "concept introduced by Restlet's APISpark platform is an interesting way to try and quantity the compute used by each API consumer. There were also three other I guess, kind of value-add aspects that were included in pricing packages:

  • Support
  • SLA
  • Public / Private

I think the number of APIs, API calls, users, apps, and connections are all just ways for carving up the third hard cost of "compute"--with some profit built in of course. It all gives me a great framework to think about what is important when it comes to crafting a monetization strategy, like the hard costs of compute, storage, and bandwidth, but also provides the tangible elements that potential customers will likely understand, such as the number of APIs, API calls, users, and apps. 

Ultimately, I am a fan of simple, tiered API pricing, providing a free, $9.99, $19.99, $49.99, $99.99, and onward--kind of like with Restlet's APISpark. Each package should be assembled from all the moving parts identified, both the hard costs, and the perceived costs that end-users will understand, like storage, bandwidth, APIs, API calls, users, apps, etc. In my opinion, each tier should allow purchasing of additional units as needed, as well as being able to scale the plan up and down each month--the sketch below shows how a bill might be calculated under this model.
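Here is a minimal sketch of that kind of monthly bill calculation--tier prices, included calls, and overage rates are all invented for illustration:

```php
<?php
// A minimal sketch of simple tiered pricing with add-on units, along the
// lines described above. Tier prices, included calls, and overage rates
// are all invented.

$tiers = [
    'free'    => ['price' => 0.00,  'calls' => 10000,   'overagePer1k' => null], // hard stop
    'starter' => ['price' => 9.99,  'calls' => 100000,  'overagePer1k' => 0.25],
    'pro'     => ['price' => 49.99, 'calls' => 1000000, 'overagePer1k' => 0.15],
];

function monthlyBill(array $tier, int $callsUsed): float
{
    $overageCalls = max(0, $callsUsed - $tier['calls']);
    if ($overageCalls > 0 && $tier['overagePer1k'] === null) {
        throw new RuntimeException('Over the free tier limit -- time to upgrade.');
    }
    return $tier['price'] + ceil($overageCalls / 1000) * ($tier['overagePer1k'] ?? 0);
}

printf("Starter, 140k calls: $%.2f\n", monthlyBill($tiers['starter'], 140000)); // $9.99 + 40 * $0.25
```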

Each of the providers listed above had a "contact us" or "enterprise" option for pricing. This is an area where I'd like to see more transparency around what partnership opportunities there are, as well as potential reseller possibilities--I see no reason why some developers can't earn credits for bringing in other business. I see the partner and reseller layer of any API as the key to platform success, in a way that can also become an external marketing vehicle for the ecosystem.

Alrighty. That gives me a good amount of information to prime the pump for the conversation with my client, as well as help build a foundation for how I can approach future API monetization strategy discussions. I will incorporate this into my api-pricing strategy, where I am trying to define a machine readable format that can be indexed as part of any API's APIs.json index. I am happy with this as a start to what I see as a continuing conversation about not just strategies for API monetization, but also API service provider monetization.

See The Full Blog Post


Thinking Through How We Handle The Internet of Things Data Exhaust, And Responsible API Monetization, With Carvoyant

I'm very fortunate to be the API Evangelist, as I get to spend my days discussing some very lofty ideas with extremely smart and entrepreneurial folks from across many different business sectors. An ongoing conversation I have going with Bret Tobey (@batobey), the CEO of Carvoyant, is around the big data generated around the Internet of Things. Many companies that I engage with are very closed about their big data operations, where Carvoyant is the opposite--they want to openly discuss it, and figure it out as a community, out in the open--something I fully support.

In reality, the Internet of Things (IoT) is less about the devices than it is about the data that is produced. If someone is exclusively focusing on the device as part of their pitch, it is most likely they are trying to distract you from the data being generated, because they are looking to monetize the IoT data exhaust for themselves. Carvoyant is keen on discussing the realities of IoT data exhaust, not just from the Internet connected automobile perspective, but also the wider world of Internet connected devices. If you want to join in the discussion, of course you can comment on this blog, and via Twitter, but you can also come to Gluecon in May, where 3Scale, API Evangelist, and Carvoyant will be conducting an IoT big data workshop on this topic.

If you aren't familiar with Carvoyant, they provide an API-driven device that you can connect to your vehicle, if it was manufactured after 1996, and access the volumes of data being produced each day. Carvoyant's tag line says "Your Car, Your Data, Your API"--I like that. This is why I enjoy talking through their strategy, because they see the world of APIs, connected devices, and big data the way I do. Sure there is lots of opportunity for platforms, and developers, to make money in all of this, but if there is user generated content and data involved, the end-users should also have a stake in the action--I do not care how good your business idea is.

Carvoyant does a great job of acknowledging that the car is central to our existence, and the data generated around our vehicles is potentially very valuable, but it is also a very personal thing, even when anonymized. Here is how they put it:

Connected car data tells the repair industry when a vehicle needs service – the moment it happens. Connected car data tells an insurer how a driver actually drives and if they are eligible for a better priced policy. Connected car data tells gas stations if a vehicle is low on gas. If a vehicle has not been to the grocery store for a while, then it may be time to make an offer.

The difference in this conversation is that Carvoyant isn't just making this pitch to investors, the automobile industry, and developers--they are making it to the automobile owners as well. They are working to establish best practices for gathering, accessing, storing, and making sense of Internet of Things data exhaust, in a way that keeps the end-user's interests in mind. Every platform should think this way. There is too much exploitation going on around user-generated data, and Carvoyant's vision is important.

Monetization Via Classic Affiliate Program
When it comes to figuring out a healthy monetization strategy for the Carvoyant platform, and the resulting data, the company is starting with a familiar concept: the affiliate program. They are establishing potential referral networks, where end-consumers, and the developers of apps, can refer potential customers to businesses, and when there is a successful sale, an affiliate commission is paid out. This is a great place to start, because it is a concept that businesses, developers, and consumers will understand and be able to operate within, without changing much on the ground behavior.

If you identify a customer in need of vehicle servicing, and successfully refer them to a local service center, you can be paid money for making the connection, something that when done right could also be applied to the end-user in the form of credits, discounts, and other loyalty opportunities. An affiliate approach to the monetization of data via the Carvoyant API makes for an easy sell, and one that can be applied to a myriad of business sectors ranging from automobile services to food, shopping, travel, and much, much more. While an affiliate base is being established, Carvoyant can also begin to look towards the future, and shifting behavior.

Monetization Beyond Affiliate
While I'm fine with Carvoyant kicking off their monetization strategy with a classic affiliate program, I feel pretty strongly there are many other opportunities for monetization, something that Bret agrees with too. When you think about the central role cars play in our lives, the opportunities for inciting meaningful experiences are endless. While the real money is probably in the mundane realities of the average car owner, the chance for serendipity beyond these known areas is the exciting aspect. How do you not just help users find the best time to get their oil changed, but also encourage them to take that side street instead of the freeway, which might involve a chance experience ranging from dinner, to a concert, or just finding that right sunset location?

There is a pretty clear conversion event involved with the affiliate model. A user clicks on just the right deal, a sale is made, and revenue is kicked back from the business to the platform, and the developer, and hopefully reaches the end-user in a meaningful way. What other types of "conversion events" (man I hate that phrase) can we identify in car culture? How do we encourage people to take public transit, or share vehicles, and how do we make large fleets operate more in harmony? With the right platform, I think we can quickly go beyond the traditional affiliate transaction, and develop a new wave of monetization around IoT data, that goes beyond just eyeballs and links, and is more about engagement and true experiences.

Experience Credits Not Just Sales 
Just like moving beyond the affiliate conversion event, I think we can go beyond the transaction being currency or sale based. How do we create a more experience based currency or credit system that can be used equally to help businesses generate sales, and establish loyalty with customers, but also allow developers and end-consumers to exchange units of value attached to valuable automobile data? If a $20.00 deal on a $30.00 oil change results in a $2.00 affiliate revenue share, what would pulling off the freeway, taking the side road, and catching just the right sunset picture be worth? How do we incentivize experiences, not just sales? If we are continuing to weave data generated from our physical worlds, it cannot always be about money--there has to be other value generated, that will keep end-users engaged in valuable, yet meaningful ways.

Sharing Economy
As our worlds continue to change, partially because of technology, but also because of other environmental and societal pressures, what value, sales, and experiences can be generated from the data exhaust produced as part of the sharing economy? If it's our car, our data, and our API--does this apply when it involves our ZipCar usage, a rental car, or possibly Uber? How does the sharing economy impact data generated via connected cars? When the end-user is at the center of the conversation, these alternate use cases have to be included, making sure privacy is protected, while also creating opportunities for that data to be securely shared with users. This isn't just a shared automobile data conversation--it is potentially applicable to any other objects we rent and share, like tools, equipment, supplies, and anything else we may use as part of our business and personal lives.

Commercial Fleets
Just like the data exhaust from personally owned vehicles, or automobiles used as part of the sharing economy, the opportunities, and patterns available for commercial fleets will look very different. How do we incentivize efficiency, cost savings, and safety in fleet operations? What do the affiliate deals, and experiences, look like for fleet vehicle drivers, and the companies that own and manage them? Think of the implications of big data exhaust from vehicles in heavily regulated industries, and public service entities like police, and fire. We will also have to think very differently about how revenue is generated and shared, as well as look at privacy and security very differently. This doesn't reduce the opportunity in the area of fleet management, it just requires a significantly different conversation about what we need for this class of automobile.

Establishing Common Blueprints 
Beyond the individual conversion events for individual drivers, or the wider opportunities for sharing economy companies, and commercial fleet operators, where are the opportunities around identifying common patterns of vehicle usage at scale? How does the vehicle usage of the LAPD differ from the NYPD? What does the average residential vehicle owner in San Diego look like, versus the rental car tourist in San Diego? Using connected vehicle technology like Carvoyant opens up a huge opportunity for better understanding car culture at a macro level, beyond what the auto industry, or maybe the Department of Transportation, sees. How do we begin having honest conversations about our vehicle usage, and educate drivers about larger studies, allowing them as a company or individual to opt in, and share data? We have to make sure and consider the bigger opportunities for understanding beyond any single endpoint on the connected car network, and look at entire cities, states, countries, and other meaningful demographics.

Lots More To Discuss, Something That Needs Transparency
This is just the beginning of these types of discussions. I have a handful of companies, like https://www.carvoyant.com, who have access to huge volumes of extremely valuable user-generated data, and who are trying to figure out how to develop useful tech, and make money, all while doing it in a healthy way that protects end-users' privacy and security. I am not opposed to companies making money off their API platforms, and user generated data, I just insist that APIs always be used to make it more transparent, and technology such as oAuth employed to give end-users more control, and a vote in how their data is collected, stored, and shared.

This post is about me working through my last conversation with Bret, and hopefully it will result in several more stories here on the blog. I also want to prime the pump for the APIStrat IoT Big Data Workshop at Gluecon in May, and make sure my readers are aware the workshop will be happening, in case you want to join in the conversation. This is just one of many posts you'll see from me discussing the big data exhaust generated from Internet connected devices, and the potential for transparency and healthier platform operations when APIs and oAuth are employed. These types of conversations are only going to become more critical as more of our physical world is connected to the Internet.

See The Full Blog Post


Combined Calls: Monetization Through The Bundling Of API Calls

I was doing my regular monitoring, and found myself on the AlchemyAPI site. I'm not exactly sure how I got there, but I stumbled across their HTMLGetCombinedData API, which can be used for analysis of HTML content, and is one of 3 separate APIs AlchemyAPI is calling "combined calls".

If you aren't familiar with AlchemyAPI, the company has a number of valuable APIs, which you can use to make sense of content and data from online, or offline sources. I use AlchemyAPI for API Evangelist, to pull the keywords and content out of blog posts, helping me shed the overall look of a site, and any advertisements--getting down to the raw content. What I thought was particularly interesting about this API was their approach to combined calls, and specifically their approach to monetizing these aggregated API calls.

There are three separate APIs that AlchemyAPI considers "combined calls", with HTMLGetCombinedData being the one I stumbled across.

These combined call APIs are only available in the AlchemyAPI pro and enterprise packages, which makes me see this as a potentially new approach to API monetization. I don't see it as something that works for all API providers, but when you have numerous decoupled APIs, which developers may be implementing several of at a time, or daisy chaining together--a combined API call might save some developers valuable time.

Combined API calls also seem like a potential opportunity for API platform developers themselves. If an API platform provides tools for developers to aggregate, and stitch together multiple APIs, and publish their recipes, it could produce some interesting patterns that better deliver solutions to the problems developers actually face during integration. At the very least, allowing developers to publish SDKs that bundle multiple calls might achieve the same thing--the sketch below shows the basic idea of a combined call.
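To show the basic idea, here is a small sketch contrasting three separate calls with a single combined call--the host, endpoint paths, and extract parameter are my own assumptions for illustration, not AlchemyAPI's documented interface:

```php
<?php
// A sketch of why a combined call saves work: one request replaces three.
// The host, endpoint paths, and the "extract" parameter are assumptions
// for illustration, not AlchemyAPI's documented interface.

$base = 'https://api.example.com'; // hypothetical host
$html = '<html><body><p>Some article text.</p></body></html>';

// Without bundling: three round trips, one per analysis type.
$separate = [];
foreach (['entities', 'keywords', 'concepts'] as $feature) {
    $separate[] = $base . '/html/' . $feature . '?html=' . urlencode($html);
}

// With a combined call: one round trip, asking for everything at once.
$combined = $base . '/html/combined?extract=entities,keywords,concepts&html=' . urlencode($html);

echo count($separate), " separate requests vs. 1 combined request:\n", $combined, "\n";
```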

I am just looking to share my thoughts on AlchemyAPI's approach to aggregating their API calls, and specifically the focus on monetization, adding the concept to my research. Maybe it is something others can do for their API platforms, or maybe API developers could provide aggregated API recipes for specific API platforms, or across multiple platforms.

See The Full Blog Post


Resource Based API Monetization vs. Experience Based API Monetization

I’m lost in API monetization land, evaluating the business models of common APIs, so you are just going to have to cope with it until I get through this research. Honestly, I don't care this much about making money off of APIs, it just tells me a lot about the motivations behind many of the APIs I keep an eye on. This particular story is extracted from my research into the monetization strategies of multiple core business sectors in the API space, and from one of the regular conversations I have with an API provider around how they can settle on a sensible monetization strategy.

In the same conversation around how these valuable resources even became APIs, we were immediately led to the topic of what we were going to charge for, and how we were going to make money off these potentially valuable resources. Just as I’m discussing resource based API design, I’m also having a resource based API monetization conversation—which tells me we are still lost in the infancy of all areas of API design, deployment, management, and ultimately monetization.

How are these resources going to be used? How will these API resources actually be put to use by end-users within web, mobile, and IoT applications? It seems like we should be considering how people will experience these APIs as we design the interfaces, let alone before we attach prices to them. If we don’t know how they will be used, how will we ever know what someone will pay for them?

I just don't know the answer. I know that we have hard costs for crafting these API resources--the compute, storage, and bandwidth to serve them up. I do not know if anyone will even pay a dime for any of them, or even need them at all. Where do I focus? Do I concentrate on the resources that I have, thinking about how to make them accessible, while covering my overhead and hopefully making some profit on top? Or do I focus on how they will be used, and the value they deliver to end-users? I can't decide where I should focus: 1) resource based API monetization, or 2) experience based API monetization. #help

See The Full Blog Post


Tracking On The Red Flags For API Monetization

I spend my time gathering what I call "building blocks" as I work my way through the API landscape. I’ve been tracking on the building blocks of API management since 2011, and have expanded that to include API design, deployment, evangelism, integration, and other areas of the API lifecycle in the last three years. As I work my way through the 700 APIs in my API stack, I am looking for industry focused building blocks, as well as the ones associated with an API's business model, or monetization strategy.

I have almost twenty monetization building blocks I'm tracking on, ranging from free access to the availability of partner programs. As I work my way through the various business sectors being impacted by APIs, I'm starting to see interesting patterns, some of which act as red flags telling me there are potential problems within an API's operations. To the uneducated API pundit, the illnesses around APIs arise from simplistic concepts like “being public”, when in reality there are deeper issues at play, usually around the business model, and these patterns begin to really get at the root of the problem, rather than just speculating on the cause.

An example of this has to do with the existence of a handful of management and monetization building blocks, and the absence of others--such as when an API does not have a pricing page, service tiers, or any paid infrastructure, but offers free access to an API. This is amplified when a provider also doesn't communicate any clear rate limits, resulting in a service you do not want to depend on, because I can guarantee that buried in the terms of service is a clause saying they can start charging at any moment. The most well known example of this in the wild is Twitter, who despite having Gnip and DataSift, still does not provide clear and scalable pricing options for its developers.
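
As a rough illustration of how these patterns can be checked programmatically, here is a minimal sketch that scans API profiles for the red flag combination described above. The profile fields are my own hypothetical building block names, not any standard schema:

```python
# Minimal sketch: flag APIs whose monetization building blocks suggest risk.
# The profile fields are hypothetical building block names.

def monetization_red_flags(profile):
    """Return a list of red flags for a single API profile."""
    flags = []
    # Free access with no pricing page or paid tiers means the business
    # model is unclear, and charges could appear without warning.
    if profile.get("free_access") and not (
            profile.get("pricing_page") or profile.get("service_tiers")):
        flags.append("free access with no visible business model")
    # Missing rate limit documentation amplifies the risk above.
    if not profile.get("rate_limits_documented"):
        flags.append("no clearly communicated rate limits")
    return flags

apis = [
    {"name": "Example API", "free_access": True,
     "pricing_page": False, "service_tiers": False,
     "rate_limits_documented": False},
]

for api in apis:
    for flag in monetization_red_flags(api):
        print(f"{api['name']}: {flag}")
```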

As I continue studying the monetization strategies of APIs, I'll work to establish more patterns derived from the presence, or absence, of specific combinations of monetization building blocks that may tell a larger story. Hopefully I can work to counter much of the BS in the space, so when you hear statements like “public APIs are not viable”, we can clarify with “public APIs that don’t have a clear business model, and don't communicate their pricing and rate limits, are not viable”.

See The Full Blog Post


Subject: A New Monetization Opportunity Through Predictive Insurance Policies

Hey Jacob,

This is meant to augment our conversation on the golf course yesterday, and solidify our commitment to this new monetization opportunity. I would say that predictive insurance policies are our greatest opportunity to date, and something we need to put in place immediately.

Now that our governance solution has been mandated in all IoT solutions in the US, and our logging solution has proven to be effective, it is time to scale and take full advantage of the monetization opportunities of predictive insurance. The 43 prescribed deaths that we executed this quarter show that we are ready. We saw 100% of the tailored fleet accidents occur, with 100% compliance during safety audits across the 31 companies where the accidents occurred.

We are now ready to implement predictive insurance policies with the same results. The algorithmic tests have shown that we can keep insurance policies in line with all prescribed deaths, and had it been implemented for the last test phase, it would have represented $22.5M in insurance claim related revenue. I recommend scaling to 75 prescribed deaths in the same regions as before, with the predictive insurance algorithm applied.

While there are numerous opportunities for scaling our service across the transportation industry, including commercial fleets, trucks, heavy equipment, rail, and public transit, I feel the real opportunities will be in the range of IoT devices such as toasters, BBQs, fencing, stairs, doors, and pretty much any part of our physical world that will have a sensor, camera, or other Internet connected layer applied. There are unlimited opportunities for revenue generation across the thousands of unintentional deaths like falling, drowning, poisoning, and other commonplace occurrences.

Regardless of where we apply it, our IoT governance layer will continue to be key, not only as the predictive layer for insurance policies and deaths, but also as the agent for triggering the actual prescribed deaths across the network. I appreciate your attention to detail on the logging layer, making sure all activity that is part of the prescribed death and predictive insurance program remains in the shadows.

Get me back a timeline by this weekend for the next phase rollout--it will be an exciting quarter.

- Frank

See The Full Blog Post


An API Evangelism Strategy To Map The Global Family Tree

In my work every day as the API Evangelist, I get to have some very interesting conversations, with a wide variety of folks, about how they are using APIs, as well as brainstorming other ways they can approach their API strategy to be more effective. One of the things that keeps me going in this space is this diversity. One day I’m looking at Developer.Trade.Gov for the Department of Commerce, the next talking to WordPress about APIs for 60 million websites, and then I’m talking with The Church of Jesus Christ of Latter-day Saints about the Family Search API, which is actively gathering, preserving, and sharing genealogical records from around the world.

I’m so lucky I get to speak with all of these folks about the benefits, and perils, of APIs, helping them think through their approach to opening up their valuable resources using APIs. The process is nourishing for me because I get to speak to such a diverse set of implementations, push my understanding of what is possible with APIs, while also sharpening my critical eye, and my understanding of where APIs can help, or where they can possibly go wrong. Personally, I find a handful of things very intriguing about the Family Search API story:

  1. Mapping the world's genealogical history using a publicly available API — Going Big!!
  2. Potential from participation by not just big partners, but the long tail of genealogical geeks
  3. Transparency, openness, and collaboration shining through as the solution beyond just the technology
  4. The mission driven focus of the API overlapping with my obsession for API evangelism intrigues and scares me
  5. Has an existing developer area, APIs, and the seemingly necessary building blocks, but has failed to achieve platform level

I’m open to talking with anyone about their startup, SMB, enterprise, organizational, institutional, or government API, always leaving open a 15 minute slot to hear a good story--which in this case turned into more than an hour of discussion with the Family Search team. See, Family Search already has an API, they have the technology in order, and they even have many of the essential business building blocks as well, but where they are falling short is when it comes to dialing in both the business and politics of their developer ecosystem to discover the right balance that will help them truly become a platform—which is my specialty. ;-)

This brings us to the million dollar question: How does one become a platform?

All of this makes Family Search an interesting API story. Given the scope of the API, to take something this big to the next level, Family Search has to become a platform, and not a superficial “platform” where they are just catering to three partners, but one nourishing a vibrant long tail ecosystem of website, web application, single page application, mobile application, and widget developers. Family Search is at an important inflection point; they have all the parts and pieces of a platform, they just have to figure out exactly what changes need to be made to open up, and take things to the next level.

First, let’s quantify the company: what is FamilySearch? “For over 100 years, FamilySearch has been actively gathering, preserving, and sharing genealogical records worldwide”, believing that “learning about our ancestors helps us better understand who we are—creating a family bond, linking the present to the past, and building a bridge to the future”.

FamilySearch holds 1.2 billion total records, with 108 million completed so far in 2014, another 24 million awaiting completion, and 386 active genealogical projects underway. Family Search provides the ability to manage photos, stories, documents, people, and albums—allowing people to be organized into a tree, knowing the branch everyone belongs to in the global family tree.

FamilySearch started out as the Genealogical Society of Utah, which was founded in 1894, and was dedicated to preserving the records of the family of mankind, looking to "help people connect with their ancestors through easy access to historical records”. FamilySearch is a mission-driven, non-profit organization, run by The Church of Jesus Christ of Latter-day Saints. All of this comes together to define an entity that possesses an image that will appeal to some, while raising concern for others—making for a pretty unique formula for an API driven platform, one that doesn’t quite have a model anywhere else.

FamilySearch considers what they deliver to be a set of record custodian services:

  • Image Capture - Obtaining a preservation quality image is often the most costly and time-consuming step for records custodians. Microfilm has been the standard, but digital is emerging. Whether you opt to do it yourself or use one of our worldwide camera teams, we can help.
  • Online Indexing - Once an image is digitized, key data needs to be transcribed in order to produce a searchable index that patrons around the world can access. Our online indexing application harnesses volunteers from around the world to quickly and accurately create indexes.
  • Digital Conversion - For those records custodians who already have a substantial collection of microfilm, we can help digitize those images and even provide digital image storage.
  • Online Access - Whether your goal is to make your records freely available to the public or to help supplement your budget needs, we can help you get your records online. To minimize your costs and increase access for your users, we can host your indexes and records on FamilySearch.org, or we can provide tools and expertise that enable you to create your own hosted access.
  • Preservation - Preservation copies of microfilm, microfiche, and digital records from over 100 countries and spanning hundreds of years are safely stored in the Granite Mountain Records Vault—a long-term storage facility designed for preservation.

FamilySearch provides a proven set of services that users can take advantage of via web applications, as well as iPhone and Android mobile apps, resulting in the online community they have built today. FamilySearch also goes beyond their basic web and mobile application services, and is elevated to a software as a service (SaaS) level by having a pretty robust developer center and API stack.

Developer Center
FamilySearch provides the required first impression when you land in the FamilySearch developer center, quickly explaining what you can do with the API, "FamilySearch offers developers a way to integrate web, desktop, and mobile apps with its collaborative Family Tree and vast digital archive of records”, and immediately provides you with a getting started guide, and other supporting tutorials.

FamilySearch provides access to over 100 API resources across twenty separate groups: Authorities, Change History, Discovery, Discussions, Memories, Notes, Ordinances, Parents and Children, Pedigree, Person, Places, Records, Search and Match, Source Box, Sources, Spouses, User, Utilities, and Vocabularies, connecting you to the core FamilySearch genealogical engine.
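
To give a feel for what working against these resources looks like, here is a rough sketch of fetching a single person record from the Person group. The host, path, sample person identifier, and media type are my best guesses based on the public documentation, so treat them as assumptions:

```python
# Sketch of calling the FamilySearch Person resource. The host, path, sample
# person identifier, and media type are assumptions, not verified values.
import requests

ACCESS_TOKEN = "token-from-the-oauth-flow"
PERSON_ID = "KWQS-BBQ"  # placeholder person identifier

response = requests.get(
    f"https://api.familysearch.org/platform/tree/persons/{PERSON_ID}",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/x-fs-v1+json",
    },
)
person = response.json()
print(person)
```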

The FamilySearch developer area provides all the common, and even some forward leaning technical building blocks:

To support developers, FamilySearch provides a fairly standard support setup:

To augment support efforts there are also some other interesting building blocks:

Setting the stage for FamilySearch evolving into being a platform, they even possess some necessary partner level building blocks:

There is even an application gallery showcasing the web, Mac & Windows desktop, and mobile applications developers have built. FamilySearch even encourages developers to “donate your software skills by participating in community projects and collaborating through the FamilySearch Developer Network”.

Many of the ingredients of a platform exist within the current FamilySearch developer hub--at least the technical elements, and some of the common business and political building blocks--but what is missing? This is what makes FamilySearch a compelling story, because it emphasizes one of the core elements of API Evangelist: that all of this API stuff only works when the right blend of technical, business, and politics exists.

Establishing A Rich Partnership Environment

FamilySearch has some strong partnerships that have helped establish FamilySearch as the genealogy service it is today. FamilySearch knows they wouldn’t exist without the partnerships they’ve established, but how do you take it to the next level, and grow a much larger, organic, API driven ecosystem where a long tail of genealogy businesses, professionals, and enthusiasts can build on, and contribute to, the FamilySearch platform?

FamilySearch knows the time has come to make the shift to being an open platform, but is not entirely sure what needs to happen to stimulate not just the core FamilySearch partners, but also establish a vibrant long tail of developers. A developer portal is not just a place where geeky coders come to find what they need; it is a hub where business development occurs at all levels, in both synchronous and asynchronous ways, in a 24/7 global environment.

FamilySearch acknowledges they have some issues when it comes to investing in API driven partnerships:

  • “Platform” means their core set of large partners
  • Not treating all partners like first class citizens
  • Competing with some of their partners
  • Don’t use their own API, creating a gap in perspective

FamilySearch knows if they can work out the right configuration, they can evolve FamilySearch from a digital genealogical web and mobile service to a genealogical platform. If they do this they can scale beyond what they’ve been able to do with a core set of partners, and crowdsource the mapping of the global family tree, allowing individuals to map their own family trees, while also contributing to the larger global tree. With a proper API driven platform this process doesn’t have to occur via the FamilySearch website and mobile app, it can happen in any web, desktop, or mobile application anywhere.

FamilySearch already has a pretty solid development team taking care of the tech of the FamilySearch API, and they have 20 people working internally to support partners. They have a handle on the tech of their API, they just need to get a handle on the business and politics of their API, and invest in the resources needed to help scale the FamilySearch API from being just a developer area, to being a growing genealogical developer community, to a full blown ecosystem that spans not just the FamilySearch developer portal, but thousands of other sites and applications around the globe.

A Good Dose Of API Evangelism To Shift Culture A Bit

A healthy API evangelism strategy brings together a mix of business, marketing, sales, and technology disciplines into a new approach to doing business for FamilySearch--something that, if done right, can open up FamilySearch to outside ideas, and with the right framework allow the platform to move beyond just certification and partnering, to also include investment in, and acquisition of, data, content, talent, applications, and partners via the FamilySearch developer platform.

Think of evangelism as the grease in the gears of the platform, allowing it to grow, expand, and handle a larger volume of outreach and support. API evangelism works to lubricate all aspects of platform operation.

First, let's kick off by setting some objectives for why we are doing this--what are we trying to accomplish:

  • Increase Number of Records - Increase the number of overall records in the FamilySearch database, contributing to the larger goal of mapping the global family tree.
  • Growth in New Users - Grow the number of new users who are building on the FamilySearch API, increasing the overall headcount for the platform.
  • Growth In Active Apps - Increase not just new users, but the number of actual apps being built and used--not just counting people kicking the tires.
  • Growth in Existing User API Usage - Increase how existing users are putting the FamilySearch APIs to use. Educate about new features, increase adoption.
  • Brand Awareness - One of the top reasons for designing, deploying, and managing an active API is to increase awareness of the FamilySearch brand.
  • What else?

What does developer engagement look like for the FamilySearch platform?

  • Active User Engagement - How do we reach out to existing, active users to find out what they need, and how do we profile them and continue to understand who they are and what they need? Is there a direct line to the CRM?
  • Fresh Engagement - How is FamilySearch contacting new developers each week, to see what their immediate needs are while their registration is fresh in their minds?
  • Historical Engagement - How are historically active and / or inactive developers being engaged, to better understand what their needs are, and what would make them active or increase activity?
  • Social Engagement - Is FamilySearch profiling the URL, Twitter, Facebook, LinkedIn, and Github profiles of developers, and then actively engaging via these active channels?

Establish a Developer Focused Blog For Storytelling

  • Projects - There are over 390 active projects on the FamilySearch platform, plus any number of active web, desktop, and mobile applications. All of this activity should be regularly profiled as part of platform evangelism. An editorial assembly line of technical projects that can feed blog stories, how-tos, samples, and Github code libraries should be put in place, establishing a large volume of exhaust via the FamilySearch platform.
  • Stories - FamilySearch is great at writing public, and partner facing content, but there is a need for the writing, editing, and posting of stories derived from the technically focused projects, with SEO and API support by design.
  • Syndication - Syndication to Tumblr, Blogger, Medium, and other relevant blogging sites on a regular basis, with the best of the content.

Mapping Out The Genealogy Landscape

  • Competition Monitoring - Evaluation of regular activity of competitors via their blog, Twitter, Github and beyond.
  • Alpha Players - Who are the vocal people in the genealogy space with active Twitter, blogs, and Github accounts?
  • Top Apps - What are the top applications in the space, whether built on the FamilySearch platform or not, and what do they do?
  • Social - Mapping the social landscape for genealogy, who is who, and who should the platform be working with.
  • Keywords - Establish a list of keywords to use when searching for topics at search engines, QA sites, forums, social bookmarking, and social networks. (This should already be done by marketing folks.)
  • Cities & Regions - Target specific markets in cities that make sense to your evangelism strategy, what are local tech meet ups, what are the local organizations, schools, and other gatherings. Who are the tech ambassadors for FamilySearch in these spaces?

Adding To The Feedback Loop From Forum Operations

  • Stories - Derive stories for the blog from forum activity, and the actual needs of developers.
  • FAQ Feed - Is this being updated regularly with relevant material?
  • Streams - Are there other streams giving the platform a heartbeat?

Being Social About Platform Code and Operations With Github

  • Setup Github Account - Set up a FamilySearch platform developer account, and bring the internal development team under a team umbrella as part of it.
  • Github Relationships - Manage followers, forks, downloads, and other potential relationships via Github, which has grown beyond just code, and is social.
  • Github Repositories - Manage code sample Gists, official code libraries, and any samples, starter kits, or other code generated through projects.

Adding To The Feedback Loop From The Bigger FAQ Picture

  • Quora - Regular trolling of Quora, responding to relevant FamilySearch or industry related questions.
  • Stack Exchange - Regular trolling of Stack Exchange / Stack Overflow, responding to relevant FamilySearch or industry related questions.
  • FAQ - Add questions from the bigger FAQ picture to the local FamilySearch FAQ for referencing locally.

Leverage Social Engagement And Bring In Developers Too

  • Facebook - Consider setting up a new API specific Facebook company page. Post all API evangelism activities, and manage friends.
  • Google Plus - Consider setting up a new API specific Google+ company page. Post all API evangelism activities, and manage friends.
  • LinkedIn - Consider setting up a new API specific LinkedIn profile page that will follow developers and other relevant users for engagement. Post all API evangelism activities.
  • Twitter - Consider setting up a new API specific Twitter account. Tweet all API evangelism activity and relevant industry landscape activity, discover new followers, and engage with followers.

Sharing Bookmarks With the Social Space

  • Hacker News - Social bookmarking of all relevant API evangelism activities as well as relevant industry landscape topics to Hacker News, to keep a fair and balanced profile, as well as network and user engagement.
  • Product Hunt - Product Hunt is a place to share the latest tech creations, providing an excellent format for API providers to share details about their new API offerings.
  • Reddit - Social bookmarking of all relevant API evangelism activities as well as relevant industry landscape topics to Reddit, to keep a fair and balanced profile, as well as network and user engagement.

Communicate Where The Roadmap Is Going

  • Roadmap - Provide regular roadmap feedback based upon developer outreach and feedback.
  • Changelog - Make sure the change log always reflects the roadmap communication or there could be backlash.

Establish A Presence At Events

  • Conferences - What are the top conferences occurring that we can participate in or attend--pay attention to calls for papers of relevant industry events.
  • Hackathons - What hackathons are coming up in 30, 90, 120 days? Which should be sponsored, attended, etc.?
  • Meetups - What are the best meetups in target cities? Are there different formats that would best meet our goals? Are there any sponsorship or speaking opportunities?
  • Family History Centers - Are there local opportunities for the platform to hold training, workshops and other events at Family History Centers?
  • Learning Centers - Are there local opportunities for the platform to hold training, workshops and other events at Learning Centers?

Measuring All Platform Efforts

  • Activity By Group - Summary and highlights from weekly activity within each area of the API evangelism strategy.
  • New Registrations - Historical and weekly accounting of new developer registrations across APIs.
  • Volume of Calls - Historical and weekly accounting of API calls per API (see the sketch below).
  • Number of Apps - How many applications are there?
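
Here is the minimal sketch referenced above, showing how the weekly accounting might be rolled up from raw usage records. The record fields (timestamp, api, app_id) are hypothetical:

```python
# Minimal sketch: weekly platform metrics rolled up from raw usage records.
# The record fields (timestamp, api, app_id) are hypothetical.
from collections import Counter
from datetime import datetime

usage_log = [
    {"timestamp": "2014-09-01T12:30:00", "api": "persons", "app_id": "app-1"},
    {"timestamp": "2014-09-02T08:15:00", "api": "sources", "app_id": "app-2"},
    {"timestamp": "2014-09-02T09:45:00", "api": "persons", "app_id": "app-1"},
]

calls_per_api = Counter()
active_apps_per_week = {}

for record in usage_log:
    # Bucket each call by ISO (year, week) for the weekly accounting.
    week = datetime.fromisoformat(record["timestamp"]).isocalendar()[:2]
    calls_per_api[record["api"]] += 1
    active_apps_per_week.setdefault(week, set()).add(record["app_id"])

print("Calls per API:", dict(calls_per_api))
print("Active apps by week:",
      {week: len(apps) for week, apps in active_apps_per_week.items()})
```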

Essential Internal Evangelism Activities

  • Storytelling - Telling stories about an API isn’t just something you do externally; what stories need to be told internally to make sure an API initiative is successful?
  • Conversations - Incite internal conversations about the FamilySearch platform. Hold brown bag lunches if you need to, or internal hackathons, to get people involved.
  • Participation - It is very healthy to include other people from across the company in API operations. How can we include people from other teams in API evangelism efforts? Bring them to events and conferences, and potentially expose them to local, platform focused events.
  • Reporting - Sometimes providing regular numbers and reports to key players internally can help keep operations running smoothly. What reports can we produce? Make them meaningful.

All of this evangelism starts with a very external focus, which is a hallmark of API and developer evangelism efforts, but if you notice, by the end we bring it home to the most important aspect of platform evangelism: internal outreach. The number one reason APIs fail is a lack of internal evangelism--failing to educate top and mid-level management, as well as lower level staff, failing to get buy-in and direct hands-on involvement with the platform, and failing to justify the budget for the resources needed to make a platform successful.

Top-Down Change At FamilySearch

The change FamilySearch is looking for already has top level management buy-in; the problem is that the vision is not in lock step with actual platform operations. When projects developed via the FamilySearch platform are regularly showcased to top level executives, and stories consistent with platform operations are told, management will echo what is actually happening via the FamilySearch platform. This will provide a much more ongoing, deeper message for the rest of the company, and for partners, around what the priorities of the platform are--making it not just a meaningless top down mandate.

An example of this in action is the recent mandate from President Obama that all federal agencies should go “machine readable by default”, which includes using APIs and open data outputs like JSON, instead of document formats like PDF. This top down mandate makes for a good PR soundbite, but in reality has little effect on the ground at federal agencies. It has taken two years of hard work on the ground, at each agency, between agencies, and with the public, to even begin to make this mandate a reality at over 20 federal government agencies.

Top down change is a piece of the overall platform evolution at FamilySearch, but is only a piece. Without proper bottom-up, and outside-in change, FamilySearch will never evolve beyond just being a genealogical software as a service with an interesting API. It takes much more than leadership to make a platform.

Bottom-Up Change At FamilySearch

One of the most influential aspects of APIs I have seen at companies, institutions, and agencies is the change in culture brought about when APIs move beyond just a technical IT effort, and become about making resources available across an organization, enabling people to do their jobs better. Without awareness, buy-in, and in some cases evangelist conversion, a large organization will not be able to move from a service orientation to a platform way of thinking.

If a company as a whole is unaware of APIs, either within the company or organization, or out in the larger world of popular platforms like Twitter, Instagram, and others, it is extremely unlikely they will endorse, let alone participate in, moving from being a digital service to a platform. Employees need to see the benefits of a platform to their everyday job, and their involvement cannot require what they would perceive as extra work to accomplish platform related duties. FamilySearch employees need to see the benefits the platform brings to the overall mission, and play a role in making it happen--even if it originates from a top-down mandate.

Top bookseller Amazon was already on the path to being a platform with their set of commerce APIs when, after a top down mandate from CEO Jeff Bezos, Amazon internalized APIs in such a way that the entire company interacted, and exchanged resources, using web APIs, resulting in one of the most successful API platforms—Amazon Web Services (AWS). Bezos mandated that if an Amazon department needed to procure a resource from another department, like server or storage space from IT, it needed to happen via APIs. This wasn’t a meaningless top-down mandate; it made employees' lives easier, and ultimately made the entire company more nimble and agile, while also saving time and money. Without buy-in, and execution from Amazon employees, what we know as the cloud would never have occurred.

Change at large enterprises, organizations, institutions, and agencies can be expedited with the right top-down leadership, and with the right platform evangelism strategy--one that treats internal stakeholders not just as targets of outreach efforts, but includes them in operations--it can result in sweeping, transformational changes. This type of change at a single organization can affect how an entire industry operates, similar to what we’ve seen from the ultimate API platform pioneer, Amazon.

Outside-In Change At FamilySearch

The final layer of change that needs to occur to bring FamilySearch from being just a service to being a true platform is opening up the channels to outside influence, when it comes not just to platform operations, but organizational operations as well. The bar is high at FamilySearch. The quality of services, the expectations of the process, and adherence to the mission are strong, but if you are truly dedicated to providing a database of all mankind, you are going to have to let mankind in a little bit.

FamilySearch is still the keeper of knowledge, but to become a platform you have to let in the possibility that outside ideas, processes, and applications can bring value to the organization, as well as to the wider genealogical community. You have to evolve beyond the notion that the best ideas come from inside the organization, or just from the leading partners in the space. There are opportunities for innovation and transformation in the long-tail stream, but you have to have a platform set up to encourage, participate in, and identify value in the long-tail stream of an API platform.

Twitter is one of the best examples of how any platform will have to let in outside ideas, applications, companies, and individuals. Much of what we consider Twitter today was built in the platform ecosystem, from the iPhone and Android apps, to the desktop app TweetDeck, to terminology like the #hashtag. Over the last 5 years, Twitter has worked hard to find the optimal platform balance, regarding how they educate, communicate, invest, acquire, and incentivize their platform ecosystem. Letting in outside ideas goes well beyond the fact that Twitter is a publicly available social platform; with such a large platform of API developers it is impossible to let in all ideas, but through a sophisticated evangelism strategy of in-person and online channels, in 2014 Twitter has managed to find a balance that is working well.

Having a public facing platform doesn’t mean the flood gates are open for ideas and thoughts to just flow in; this is where service composition, and the certification and partner framework for FamilySearch, will come in. Through clear, transparent partner tiers, and open and transparent operations and communications, an optimal flow of outside ideas, applications, companies, and individuals can be established—enabling a healthy, sustainable amount of change from the outside world.

Knowing All Of Your Platform Partners

The hallmark of any mature online platform is a well established partner ecosystem. If you’ve made the transition from service to platform, you’ve established a pretty robust approach to not just certifying and onboarding your partners; you have also stepped it up in knowing and understanding who they are, what their needs are, and investing in them throughout the lifecycle.

First off, profile everyone who comes through the front door of the platform. If they sign up for a public API key, who are they, and where do they potentially fit into your overall strategy? Don’t be pushy, but understand who they are and what they might be looking for, and make sure you have a track for this type of user well defined.

Next, qualify and certify as you have been doing. Make sure the process is well documented, but also transparent, allowing companies and individuals to quickly understand what it will take to get certified, what the benefits are, and see examples of other partners who have achieved this status. As a developer building a genealogical mobile app, I need to know what I can expect, and have some incentive for investing in the certification process.

Keep your friends close, and your competition closer. Open the door wide for your competition to become platform users, and potentially partners. The 100+ year old technology company Johnson Controls (JCI) was concerned about what the competition might do if they opened up their building efficiency data resources to the public via the Panoptix API platform, but after it launched, they realized their competition were now their customers, and partners in this new approach to doing business online for JCI.

When the Department of Energy decides what data and other resources it makes available via Data.gov, or the agency's developer program, it has to deeply consider how this could affect U.S. industries. The resources the federal agency possesses can be pretty high value, with huge benefits for the private sector, but in some cases opening up APIs, or limiting access to APIs, might help or hurt the larger economy, as well as the Department of Energy developer ecosystem—there are lots of considerations when opening up API resources, and they vary from industry to industry.

There are no silver bullets when it comes to API design, deployment, management, and evangelism. It takes a lot of hard work, communication, and iterating before you strike the right balance of operations, and every business sector will be different. Without knowing who your platform users are, and being able to establish a clear and transparent road for them to follow to achieve partner status, FamilySearch will never elevate to a true platform. How can you scale the trusted layers of your platform, if your partner framework isn’t well documented, open, transparent, and well executed? It just can’t be done.

Meaningful Monetization For Platform

All of this will take money to make happen. Designing and executing on the technical and evangelism aspects I’m laying out will cost a lot of money, and on the consumer side, it will take money to design, develop, and manage the desktop, web, and mobile applications built around the FamilySearch platform. How will both the FamilySearch platform, and its participants, make ends meet?

This conversation is a hard one for startups and established businesses, let alone for a non-profit, mission driven organization. Internal developers cost money; servers and bandwidth are getting cheaper, but are still a significant platform cost; and sustaining sales, bizdev, and evangelism also will not be cheap. It takes money to properly deliver resources via APIs, and even if the lowest tiers of access are free, at some point consumers are going to have to pay for access, resources, and advanced features.

The conversation around how you monetize API driven resources is going on across government, from cities up to the federal government, where the thought of charging for access to public data is unheard of. These are public assets, and they should be freely available. While this is true, think of the same situation when it comes to physical public assets that are owned by the government, like parks. You can freely enjoy many city, county, and federal parks, and there are sometimes small fees for usage, but if you want to actually sell something in a public park, you will need to buy permits, and often share revenue with the managing agency. We have to think critically about how we fund the publishing and refinement of publicly owned digital assets; as with physical assets, there will be much debate in the coming years around what is acceptable, and what is not.

Woven into the tiers of partner access, there should always be provisions for applying costs, covering overhead, and even generating a little revenue to be applied in other ways. With great power comes great responsibility, and along with great access for FamilySearch partners, many will also be required to cover the costs of compute capacity, storage, and the other hard facts of delivering a scalable platform around any valuable digital assets, whether privately or publicly held.

Platform monetization doesn’t end with covering the costs of platform operation. Consumers of FamilySearch APIs will need assistance in identifying the best ways to cover their own costs as well. Running a successful desktop, web, or mobile application takes discipline, structure, and the ability to manage overhead costs, while also being able to generate some revenue through a clear business model. As a platform, FamilySearch will have to bring to the table some monetization opportunities for consumers, providing guidance as part of the certification process regarding best practices for monetization, and even some direct opportunities for advertising, in-app purchases, and other common approaches to application monetization and sustainment.

Without revenue greasing the gears, no service can achieve platform status. As with all other aspects of platform operations, the conversation around monetization cannot be one-sided, and just about the needs of the platform provider. Proactive steps need to be taken to ensure both the platform provider and its consumers are monetized in the healthiest way possible, bringing as much benefit to the overall platform community as possible.

Open & Transparent Operations & Communications

How does all of this talk of platform and evangelism actually happen? It takes a whole lot of open, transparent communication across the board. Right now the only active part of the platform is the FamilySearch Developer Google Group; beyond that you don’t see any activity that is platform specific. There are active Twitter, Facebook, Google+, and mainstream and affiliate focused blogs, but nothing that serves the platform, or contributes to the feedback loop that will be necessary to take the service to the next level.

On a public platform, communications cannot all be private emails, phone calls, or face to face meetings. One of the things that allows an online service to expand to become a platform, then scale and grow into a robust, vibrant, and active community, is a stream of public communications, which includes blogs, forums, social streams, images, and video content. These communication channels cannot all be one way, meaning they need to include forum and social conversations, as well as showcase platform activity by API consumers.

Platform communication isn’t just about getting direct messages answered; it is about public conversation, so everyone shares in the answer, and public storytelling, to help guide and lead the platform. Together with support via multiple channels, this establishes a feedback loop that, when done right, will keep growing, expanding, and driving healthy growth. The transparent nature of platform feedback loops is essential to providing everything consumers will need, while also bringing a fresh flow of ideas and insight within the FamilySearch firewall.

Truly Shifting The FamilySearch Culture

Top-down, bottom-up, outside-in, with a constant flow of oxygen via a vibrant, flowing feedback loop, and the nourishing, sanitizing sunlight of platform transparency--week by week, month by month, this is how change can occur. It won’t all be good; there are plenty of problems that arise in ecosystem operations, but all of this has the potential to slowly shift culture when done right.

One thing that shows me the team over at FamilySearch has what it takes: when I asked if I could write this up as a story, rather than just a proposal I email them, they said yes. This is a true test of whether or not an organization might have what it takes. If you are unwilling to be transparent about the problems you currently have, and the work that goes into your strategy, it is unlikely you will be able to establish the amount of transparency required for a platform to be successful.

When internal staff, large external partners, and long tail genealogical app developers and enthusiasts are in sync via a FamilySearch platform driven ecosystem, I think we can consider that the shift to platform has occurred for FamilySearch. The real question is: how do we get there?

Executing On Evangelism

This is not a definitive proposal for executing on an API evangelism strategy, merely a blueprint, the seed that can be used to start a slow, seismic shift in how FamilySearch engages its API area, in a way that will slowly evolve it into a community--one that includes internal, partner, and public developers--and some day, with the right set of circumstances, FamilySearch could grow into a robust, social, genealogical ecosystem where everyone comes to access, and participate in, the mapping of mankind.

  • Defining Current Platform - Where are we now? In detail.
  • Mapping the Landscape - What does the world of genealogy look like?
  • Identifying Projects - What are the existing projects being developed via the platform?
  • Define an API Evangelist Strategy - Actually fleshing out a detailed strategy.
    • Projects
    • Storytelling
    • Syndication
    • Social
    • Channels
      • External Public
      • External Partner
      • Internal Stakeholder
      • Internal Company-Wide
  • Identify Resources - What resources currently exist? What are needed?
    • Evangelist
    • Content / Storytelling
    • Development
  • Execute - What does execution of an API evangelist strategy look like?
  • Iterate - What does iteration look like for an API evangelism strategy?
    • Weekly
    • Review
    • Repeat

As with many providers, you don’t want this to take 5 years, so how do you take a 3-5 year cycle, and execute in 12-18 months?

  • Invest In Evangelist Resources - It takes a team of evangelists to build a platform
    • External Facing
    • Partner Facing
    • Internal Facing
  • Development Resources - We need to step up the number of resources available for platform integration.
    • Code Samples & SDKs
    • Embeddable Tools
  • Content Resources - A steady stream of content should be flowing out of the platform, and syndicated everywhere.
    • Short Form (Blog)
    • Long Form (White Paper & Case Study)
  • Event Budget - FamilySearch needs to be everywhere, so people know that it exists. It can’t just be online.
    • Meetups
    • Hackathons
    • Conferences

There is nothing easy about this. It takes time and resources, and there are only so many elements you can automate when it comes to API evangelism. For something so programmatic, it takes more of the human variable to make the API driven platform algorithm work. With that said, it is possible to scale some aspects, and increase the awareness, presence, and effectiveness of FamilySearch platform efforts--which is really what is currently missing.

While as the API Evangelist I cannot personally execute on every aspect of an API evangelism strategy for FamilySearch, I can provide essential planning expertise for the overall FamilySearch API strategy, as well as regular check-ins with the team on how things are going, and help plan the roadmap. The two things I bring to the table, reflected in this proposal, are an understanding of where the FamilySearch API effort currently is, and what is missing to help get FamilySearch to the next stage of its platform evolution.

When operating within a corporate or organizational silo, it can be very easy to lose sight of how other organizations and companies are approaching their API strategies, and miss important pieces of how you need to shift your own. This is one of the biggest inhibitors of API efforts at large organizations, and one of the biggest imperatives for companies to invest in their API strategy, and begin the process of breaking operations out of their silo.

What FamilySearch is facing demonstrates that an API is much more than the technical endpoint most believe it to be; it takes many other business and political building blocks to truly go from API to platform.

See The Full Blog Post


API Monetization In The Internet of Things @ Nordic APIs

I have a panel this week at Nordic APIs called Business Models in an Internet of Things, with Ellen Sundh (@ellensundh) of Coda Collective, David Henricson Briggs of Playback Energy, Bradford Stephens of Ping Identity, and Ronnie Mitra (@mitraman) of Layer 7 Technologies. My current abstract for the panel is:

We are just beginning to get a hold on monetization strategies and business models for APIs delivering data and resources for mobile development. How will we begin to understand how to apply what we have learned to the Internet of Things, across our homes, vehicles, sensors, and the other Internet enabled objects that are being integrated into our lives?

In preparation for the event I am working through my thoughts around the potential monetization strategies and business models that will emerge in this fascinating and scary new world where everything can be connected to the Internet---creating an Internet of Things (IoT).

Where Is The Value In The IoT?
When it comes to monetizing APIs of any type, there first has to be value. When it comes to IoT, where is the value for end-users? Is it the devices themselves, is it the ecosystem of applications built around a device, or will it be the insight derived from the data exhaust generated by these Internet connected devices?

Evolving From What We Know
After almost 10 years of operating web APIs, we are getting a handle on some of the best approaches to monetization and building business models in this new API economy. How much of this existing knowledge will transfer directly to the IoT? Freemium, tiered plans, paid API access, and advertising--which of these existing models will work, and which won't?

Another existing model to borrow from when it comes to IoT is the telco space. The world of cellphones and smartphones is the seed of IoT, and one of the biggest drivers of the API economy. How will existing telco business models be applied to the world of IoT? Device subsidies, contracts, data plans, and message volumes could all be borrowed from the existing telco world, but we have to ask ourselves: what will work, and what won't?

Will Developers Carry the Burden?
When it comes to API access, developers often pay for access, and for the privilege of building applications on top of API driven platforms. Will this be the case in the IoT? Will the monetization of IoT platforms involve charging developers for API usage, number of users, and features? Is this a primary channel for IoT device makers to make money off their products? In the beginning this may not be the case, with providers needing to incentivize developers to build apps and crunch data, but it is likely that eventually developers will have to carry at least some of the burden.

Micro-Payment Opportunities
The payment industry is booming in the API economy, but micro-payments are still getting their footing, doing better in some areas than others. Certain areas of IoT may lend themselves to micro-payment approaches to monetization. When you pass through toll booths or parking, there are clear opportunities for micro-payments to engage with Internet connected automobiles. Beyond the obvious, think of the opportunities for traffic prioritization--do you want intelligence on where you should drive to avoid traffic, or possibly to pay per mile to be in a preferred lane? Another area is entertainment, generating revenue by delivering music, audiobooks, and other entertainment to drivers or passengers in IoT vehicles and public transportation.

Will IoT Be All About The Data?
As we sit at the beginning of the era of big data, driven by mobile, social, and the cloud, what will big data look like in the IoT era? Will the money be all about the data exhaust that comes from a world of Internet connected devices--not just at the individual device level, with insight delivered to users, but at the aggregate level, understanding parking patterns for entire cities, or the electricity consumption of a region?

Security Will Be Of High Value In IoT
We are already beginning to see the importance of security in the IoT world, with missteps by Tesla and camera maker TRENDnet. Will security around IoT be a monetization opportunity in itself? Device manufacturers will be focused on doing what they do best, and will often overlook security, leaving open huge opportunities for companies to step up and deliver B2B and B2C security options and layers for IoT. How much will we value security? Will we pay extra to ensure the devices in our lives are truly secure?

I Will Pay For My Privacy In An IoT World
When all the devices in my life are connected to the Internet, and the world around me is filled with cameras, sensors, and tracking mechanisms, how will privacy change? Will we have the opportunity to buy privacy in an IoT world? Will the wealthy be able to pay for the privilege of being lost in a sea of devices, not showing up on cameras, and being passed by when sensors are logging data? Privacy may not be a right in an IoT world; it may be purely something you get if you can afford it. Will companies establish IoT business models and drive monetization through privacy layers and opportunities?

An IoT Las Vegas For Venture Capital
With IoT centered around costly physical devices, and potentially large platforms and networks, will anything in the IoT space be able to be bootstrapped like the web 2.0 and mobile spaces were? Or will all IoT companies require venture capital? At first glance IoT looks like a huge opportunity for VC firms, allowing them to specialize for the win, or gamble on the space like they would in Las Vegas.

Will We Plan For Monetization Early On In IoT?
When it comes to IoT, it is easy to focus on monetizing the physical device, either leaving money on the table by missing new and innovative ways of generating revenue, or adopting monetization strategies that operate behind the scenes and are not obvious to users--something that could be damaging to security, privacy, and overall trust in the IoT space.

We learned a lot from the mistakes made in early social, cloud, and mobile API monetization. We need to make sure we have open conversations around healthy IoT business models and monetization strategies. Generating revenue from IoT needs to be a three-legged endeavor that includes not just IoT platform providers, but sensibly includes ecosystem developers, as well as end-users.

The world of IoT is just getting going, but it is picking up momentum very quickly. We are seeing IoT devices enter our homes, cars, clothing, and bodies, and they will become ubiquitous in the world around us, embedded in signs, doorways, roadways, and products, in rural and metropolitan areas. It is clear there are huge opportunities to make money in this new Internet connected world, but let's make sure we have open conversations about how this can be done in sensible ways, so the IoT space grows in a healthy and vibrant way.

See The Full Blog Post


API Monetization

Hopefully by the time you put an API management plan in place, you already have a healthy business model established, which should provide a framework for your monetization goals.

It's not just about how you are going to generate revenue via your API; it is also about how you will keep your API in operation, and performing for consumers.

Not all APIs are created equal. The reasons for deploying an API can vary widely, but we are seeing some common patterns emerge.

Free
A popular way to provide access to an API is by offering a free tier, so that anyone can sign up, start using the API, and understand the value it delivers. This allows consumers to kick the tires, and see if it will meet their needs, before spending money. While free is a good component of many API monetization strategies, it is just one tool in the toolbox of API providers, and by itself, or without the proper up-sell to higher levels, it can prove problematic.

Consumer Pays
After providing free access, the most common approach to API monetization is establishing a price that API consumers will pay. Some API providers have done well by finding a fair price that developers are willing to pay for their resources. We are seeing three common approaches to charging consumers:

  • Tiered - Providing multiple tiers of paid access, such as bronze, gold, or platinum. Each tier has its own set of services and allowances for access to API resources, with pricing stepping up for each tier.
  • Pay as You Go - Other API providers prefer a utility based model, where API consumers pay for what they use. Based on the bandwidth, storage, and other hard costs incurred around API consumption, providers charge based upon their cost, plus a logical profit.
  • Unit Based - Other API providers define each API resource in terms of units, and assign each a unit price. API consumers pay for the number of units they anticipate using, with the option of buying more when necessary.

Some API providers will mix and match different combinations of tiered, pay as you go, and unit based API pricing, to recover operational costs, as well as generate revenue in many cases.
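
As a worked example, here is a minimal sketch of what a monthly bill could look like under each of the three models. The tier names, unit prices, and cost figures are all invented for illustration:

```python
# Minimal sketch of the three consumer-pays models. All prices are invented.

def tiered_bill(tier):
    """Flat monthly price per access tier (bronze/gold/platinum)."""
    prices = {"bronze": 0.00, "gold": 99.00, "platinum": 499.00}
    return prices[tier]

def pay_as_you_go_bill(api_calls, cost_per_call=0.0004, margin=0.25):
    """Provider's hard cost per call, plus a logical profit margin."""
    return api_calls * cost_per_call * (1 + margin)

def unit_based_bill(units_purchased, price_per_unit=0.01):
    """Consumers pre-buy units, and top up when they run out."""
    return units_purchased * price_per_unit

# A consumer making 500,000 calls a month under each model:
print(f"Tiered (gold): ${tiered_bill('gold'):.2f}")
print(f"Pay as you go: ${pay_as_you_go_bill(500_000):.2f}")
print(f"Unit based:    ${unit_based_bill(500_000):.2f}")
```

With these invented numbers, the same 500,000 calls price out at $99.00 on the gold tier, $250.00 pay as you go, and $5,000.00 unit based--showing how differently identical usage can be billed depending on the model chosen.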

Consumer Gets Paid
In some cases, an API will drive other revenue streams for a company, and providers can actually share that revenue with API consumers. This approach acts as an incentive model for API consumers, encouraging integration, and quality implementations of resources that drive the highest possible revenue for the API provider. There are three distinct models for sharing API revenue with consumers that have emerged:

  • Ad Rev-Share - Some API providers offer advertising network as part of their platforms. API consumers embed advertising in their sites and apps, providing revenue for API providers. In turn the API provider returns a portion of the revenue from advertising.
  • Affiliate - Common approaches to monetizing websites have been applied to API ecosystems. Cost Per Acquisition (CPA), Cost Per Click (CPC), and one-time or recurring revenue sharing models are commonly used. Once the key aspects of where revenue is generated on an API-driven platform are identified, it is easy to carve off affiliate models for sharing revenue with API consumers.
  • Credits to Bill - A smaller group of API providers will employ a consumer-pays model, but will credit an API consumer's bill based upon advertising revenue share or affiliate revenue, reducing a developer's overhead for integration and eliminating the need to pay cash out the door (see the sketch after this list).
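
Here is a minimal sketch of how a credits-to-bill calculation might work, assuming a hypothetical 50% revenue share; all rates and names are made up for illustration:

```python
# Hypothetical "credits to bill" calculation: usage charges are offset by
# the consumer's share of advertising or affiliate revenue they generated.

def monthly_invoice(api_calls, price_per_call, ad_revenue_earned, rev_share=0.50):
    """Return what the consumer actually owes after rev-share credits."""
    usage_charge = api_calls * price_per_call
    credit = ad_revenue_earned * rev_share
    # Credits reduce the bill, but never below zero -- no cash goes out the door.
    return max(usage_charge - credit, 0.0)

# 1M calls at $0.0005 is a $500 bill; $700 in ad revenue at a 50% share
# earns a $350 credit, leaving $150 due.
print(monthly_invoice(1_000_000, 0.0005, 700.00))  # 150.0
```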

Indirect
Monetization of an API isn't always about directly generating revenue from API access, advertising, or other revenue streams. There are often indirect ways that an API can deliver value to a company.

  • Marketing Vehicle - APIs can provide an excellent marketing vehicle for a company and its online presence. Through sensible branding strategies, developers can become 3rd party marketing agents, working on behalf of a core company and its brand.
  • Brand Awareness - As a new tool in an overall marketing and branding strategy, an API can provide a type of brand exposure via 3rd party websites and applications, extending the reach of a brand using 3rd party API consumers as the engine.
  • Content Acquisition - Not all APIs are about delivering content, data, and other resources to their consumers. APIs often allow for writing, updating, and deleting content, as well as pulling it. Content acquisition via an API can be a great way to build value within a company and its platform, if done right.
  • SaaS - Software as a Service (SaaS) has become a common approach to selling software online to consumers and businesses. Oftentimes an API will complement the core software offering, providing value to SaaS users. API access is often included as part of the core SaaS platform, but can also be delivered as an up-sell for premium SaaS users.
  • Traffic Generation - APIs can also be used to drive traffic to an existing website or application. By designing an API to use hyperlinks directed at central websites or apps, and encouraging consumers to build their own websites and apps integrated with the API, providers can drive traffic back to their own properties, something many companies find very desirable.

Many companies start by focusing on launching and evolving their API strategy and gaining essential experience before fully executing on their API monetization strategy, relying completely on the indirect value of an API. While it is better to have a monetization strategy in place early, many are finding success by prioritizing the API first and monetization second.

With APIs being deployed in various capacities, within a company, privately between partners, or publicly, a wide mix of monetization strategies can be used. Some API resources just lend themselves better to a pay as you go model, while some markets demand that data be freely accessible without the need to register or be charged for access. There is no one-size-fits-all approach to API monetization.

One way to think about an API is as an external research & development lab within a company, a lab that accepts ideas and integrations from partners, and incubates those ideas, applications, and business relationships. Companies are using APIs to allow the introduction of outside ideas and talent into the mix, in hopes of inciting innovation. Some API providers will hand-select the best integrations, invest in the individuals and companies behind them, sometimes resulting in acquisitions of technology and the talent they possess--creating entirely new approaches to monetization you may not have thought of.

Just like there is no one-size-fits-all approach to API monetization, there are few constants in pricing or access. Even pioneers in the space like Amazon Web Services are constantly adjusting, tweaking, and experimenting, trying to find the most competitive approach possible. APIs are about business development, and about finding new ways to monetize your new and existing resources.

See The Full Blog Post


API Monetization: API Affiliate Network API + Google URL Shortener API

In my quest to understand the monetization opportunities via APIs, I'm studying the possibilities around tracking, and now monetization, of content and URLs served up via APIs.

The other day I considered wrapping URLs for another layer of metrics in your API, and today I'm thinking about how to evolve API monetization beyond advertising, defining entirely new API-driven conversion events where both API owners and consumers can realistically make money.

I first talked about an advertising network dedicated to APIs and developers last year, and every time I come back to it in my Evernote list, I can't help but consider using the Google Affiliate Network as an engine.

I don't have any access to the Google Affiliate Network yet (I submitted a request), so all of this is speculative. But after looking through the Google Affiliate Network API interface, it looks like I can pull advertisers, publishers, and events through the API, so I'm envisioning two scenarios:

  • API Owners - As an API owner I could use the Google Affiliate network to define “events” that could occur via my APIs, and setup my API developers as “publishers”.
  • API Consumers - As an API consumer I could setup various APIs I use as “advertisers”, creating specific events for these APIs, and setup my own applications or sites as “publishers”.

In either case, API owners or API consumers could replace any URLs in content, or URLs directly served up via APIs, with a Google Affiliate Network generated URL with specific conversion events defined, then shorten it using the Google URL Shortener API.

This would create a layer of not just metrics, but conversion events that could be monetized. API consumers could choose from affiliate opportunities available on the Google Affiliate Network, and API owners could do the same, or define their own conversion events that were meaningful to their business goals.
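
As a rough sketch of this flow, here is what wrapping and shortening a URL might look like in Python. The affiliate tracking URL format and the "event_id" parameter are my own hypothetical stand-ins (the Google Affiliate Network's actual link format would differ), and the goo.gl endpoint shown here has since been retired by Google:

```python
# A sketch of the flow described above: wrap an API-served URL in an
# affiliate tracking link carrying a conversion event, then shorten it.
# The tracking URL format and "event_id" parameter are hypothetical
# stand-ins, and the goo.gl endpoint shown has since been retired.
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_GOOGLE_API_KEY"  # placeholder

def wrap_with_affiliate(target_url, event_id):
    """Embed the destination URL and a conversion event in a tracking link."""
    return ("https://affiliate.example.com/click?event_id=%s&url=%s"
            % (event_id, urllib.parse.quote(target_url, safe="")))

def shorten(long_url):
    """Shorten a URL via the (now retired) Google URL Shortener REST API."""
    req = urllib.request.Request(
        "https://www.googleapis.com/urlshortener/v1/url?key=" + API_KEY,
        data=json.dumps({"longUrl": long_url}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["id"]  # e.g. "http://goo.gl/abc123"

tracked = wrap_with_affiliate("http://example.com/products/42", "purchase")
print(shorten(tracked))
```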

These are just some initial thoughts on API monetization using the Google Affiliate Network API and the Google URL Shortener API. It seems like there is a lot of opportunity to monetize the data and resources flowing through API pipes, whether you're an API owner or an avid API consumer.

See The Full Blog Post


YellowAPI.com and the Future of API Monetization

Canada's Yellow Pages Group (YPG) launched a new developer ecosystem to support their local search and location based services (LBS) API, YellowAPI.com.

YPG sees their developer ecosystem much differently than many other companies do; they view the YellowAPI as a direct extension of their core business model, and the API dramatically extends the reach of their business network.

Within many API communities, the API owners can often seem distant from developers, or even view the development community as a problem.

As a developer I've experienced the limitations imposed by Twitter and Facebook, and I often feel ignored by Google when using their APIs.

I see YellowAPI's strategy as the future of API ecosystem management.

Developers can apply to be part of the YellowAPI.com developer community. Once approved, they gain access to a wealth of resources, including consultation on application design, quality assurance, legal guidance, and technical assistance, and can receive funding assistance to take their applications live.

In addition to these resources, YellowAPI.com provides opportunities for developers to monetize applications in the following ways:
  • Paid Search - Paying for volumes of high quality traffic from applications
  • Widgets - Group buying solutions in Canada and "Deal of the Day" widgets
  • Advertising - Platform specific digital ad network provided by Mediative
  • Placement - Premium paid search using mobile placement products
These aren't side opportunities invented just for the API; they represent YPG's core business.

Since YPG sees the API and its ecosystem as an extension of their core business, they are opening up revenue sharing opportunities for approved developers within their community.

If the YellowAPI community is successful in building sustainable business applications, YPG benefits. And because of this forward-thinking API ecosystem strategy, when YPG makes money, the API developer community shares in this revenue.

How do you monetize your API, and where do your developers fit in?

See The Full Blog Post