
API Monetization News

These are the news items I've curated in my monitoring of the API space that have some relevance to the API monetization conversation and I wanted to include in my research. I'm using all of these links to better understand how the space is approaching the business of APIs, going beyond just the technology and understanding the details of how providers are generating revenue from their APIs.

Slow Moving Ransomware As The New Business Model

I was reading about the difficulties the City of New York was having when it comes to migrating off of the Palantir platform, while also reading about the latest cybersecurity drama involving ransomware. I'm spending a lot of time studying cybersecurity lately, partly because it involves APIs, but mostly because it is something that is impacting every aspect of our lives, including our democracy, education, and healthcare. One thing I notice on the cybersecurity stage is that everything is a much more extreme, intense representation of what is going on in the mainstream tech industry.

Ransomware is software that gets installed on your desktop or servers and locks up all your data until you pay the software developer (implementor) a ransom. Ransomware is just a much faster moving version of what many of us in the software industry call vendor lock-in, and it is what you are seeing with Palantir and the City of New York. What tech companies do is get you to install their software on your desktop or servers, or convince you to upload all your data into the cloud, and use their software. This is business 101 in the tech industry. You either develop cloud-based software, something that runs on-premise, or you are a mix of both. Ideally, your customers become dependent on you, and they keep paying your monthly, quarterly, or annual subscriptions (cough, cough, ransom).

Here is where the crafty API part of the scheme comes in. Software providers can also make APIs that allow your desktop and server to integrate with their cloud solutions, allowing for much deeper integration of data, content, and algorithms. The presence of APIs SHOULD also mean that you can more easily get your data, content, and algorithms back, or have kept them in sync the whole time, so that when you are ready to move on, you don't have a problem getting your data and content back. The problem is that APIs "CAN" enable this, but in many situations providers do not actually give you complete access to your data, content, or algorithms via API, or enable the true data portability and sync features you need to continue doing business without them.

This is vendor lock-in. It is a much friendlier, slower moving version of ransomware. As a software vendor you want your software to be baked into a customer's operations so they are dependent on you. How aggressively you pursue this, and how much you limit data portability and interoperability, dictates whether you are just doing business as usual, or engaging in vendor lock-in. One thing I'm hopeful for in all of this is the vendors who see transparency, observability, interoperability, and portability--not just of the technical, but also the business and politics of delivering technology--as a legitimate competitive advantage. This means that they will always be able to outmaneuver, and stay ahead of, software vendors who practice vendor lock-in and ransomware, whether of the slow or fast moving variety.


The General Services Administration API Strategy Considers How To Generate Revenue

The General Services Administration (GSA) has an API strategy, which describes "GSA's future direction for agency-wide API management including design, development, architecture, operations and support, and security." Ok, let's pause there. I want to point out that this isn't just an API design guide. That is a portion of it, but it also touches on some of the most obvious (deployment), and the most critical aspects (security) of API operations--important stuff.

The objectives for the GSA crafting an API strategy are:

  • Harness API management to maximize customer value and technical efficiencies.
  • Adopt industry best practices with API design, development, and management.
  • Consider opportunities for revenue through API offerings.

I always enjoy seeing federal agencies talk about efficiencies and best practices, but what gives me hope that all of this might actually work is seeing a federal agency actually "considering opportunities for revenue through API offerings". Especially the GSA, which provides technical guidance to the other 400+ federal agencies, as well as at the state and municipal levels. I am not excited about our government charging money for digital resources; I am excited about our government exploring how it will generate revenue to sustain itself in the future.

I know there are a lot of open data advocates who can't wrap their minds around this, but this is how the government will generate needed tax revenue to operate in the future--using APIs. Our government currently generates a portion of its revenue from the sale of physical resources like timber, minerals, and agricultural products, so why should things be different when it comes to taxing and regulating digital resources being made available via web, mobile, and device applications? While there is still lots to figure out on this front, I am glad to see the GSA putting some thought into the role APIs will play in the future of funding the government services we all depend on each day.


Key Factors Determining Who Succeeds In The API and ML Marketplace Game

I was having a discussion with an investor today about the potential of algorithm-centered API marketplaces. I'm not talking about API marketplaces like Mashape, I'm talking more about ML API marketplaces like Algorithmia. This conversation spans multiple areas of my API lifecycle research, so I wanted to explore my thoughts on the subject some more.

I really do not get excited about API marketplaces when you think just about API discovery--how do I find an API? We need solutions in this area, but I feel that good implementations will immediately move from useful to commodity, with companies like Amazon already pushing this towards a reality.

There are a handful of key factors for determining who ultimately wins the API Machine Learning (ML) marketplace game:

  • Always Modular - Everything has to be decoupled and deliver micro value. Vendors will be tempted to build in dependency and emphasize relationships and partnerships, but the smaller and more modular will always win out.
  • Easy Multi-Cloud - Whatever is available in a marketplace has to be available on all major platforms. Even if the marketplace is AWS, each unit of compute has to be transferable to the Google or Azure clouds without ANY friction.
  • Enterprise Ready - The biggest failure of API marketplaces has always been being public. On-premise and private cloud API ML marketplaces will always be more successful than their public counterparts. The marketplace that caters to the enterprise will do well.
  • Financial Engine - The key to markets is their financial engines. This is one area where AWS is way ahead of the game--their approach to monetizing digital bits, and their sophisticated market-creating pricing calculators for estimating and predicting costs, give them a significant advantage. Whichever marketplace allows for innovation at the financial engine level will win.
  • Definition Driven - Marketplaces of the future will have to be definition driven. Everything has to have a YAML or JSON definition, from the API interface, and schema defining inputs and outputs, to the pricing, licensing, TOS, and SLA. The technology, business, and politics of the marketplace need to be defined in a machine-readable way that can be measured, exchanged, and syndicated as needed (see the sketch after this list).
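
To make "definition driven" concrete, here is a minimal sketch of what a machine-readable marketplace listing might look like--the field names and values are hypothetical, for illustration only, not an existing standard:

```yaml
# Hypothetical marketplace listing--technology, business, and politics
# of one API, defined in a single machine-readable document.
listing:
  name: sentiment-analysis
  interface: openapi.yaml              # the API surface area
  schema:
    input: schemas/text-input.json     # what the API accepts
    output: schemas/score-output.json  # what the API returns
  pricing:
    unit: api-call
    price: 0.0001                      # USD per call
  licensing: CC-BY-4.0
  tos: legal/terms-of-service.md
  sla:
    uptime: 99.9                       # percent
    support: business-hours
```

With everything defined this way, the pricing, licensing, TOS, and SLA can be measured, exchanged, and syndicated right alongside the interface itself.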

Google has inroads into this realm with their GSuite and Play marketplaces, but their approach feels more fragmented than those of Azure and AWS. None of them are as far along as Algorithmia when it comes to specifically ML-focused APIs. In the coming months I will invest more time into mapping out what is available via marketplaces, trying to better understand their contents--whether application, SaaS, or data, content, and algorithmic APIs.

I feel like many marketplace conversations often get lost in the discovery layer. In my opinion, there are many other contributing factors beyond just finding things. I talked about the retail and wholesale economics of Algorithmia's approach back in January, and I continue to think the economic engine will be one of the biggest factors in any API ML marketplace's success--how it allows marketplace vendors to understand, experiment, and scale the revenue part of things without giving up too big of a slice of the pie.

Beyond revenue, modularity and portability will be equally important as the financial engine, providing vital relief valves for some of the classic silo and walled garden effects we've seen impact the success of previous marketplace efforts. I'll keep studying the approach of smaller providers like Algorithmia, as well as those of the cloud giants, and see where all of this goes. It is natural to default to the AWS lead when it comes to the cloud, but I'm continually impressed with what I'm seeing out of Azure, and I feel that Google has a significant advantage when it comes to TensorFlow, as well as their overall public API experience--we will see.


Generating Revenue From The Remarketing Tags On API Evangelist

I have been going through my entire infrastructure lately, quantifying the products and services that API Evangelist offers, and the partnerships that make everything go round. As I do in my work as the API Evangelist, I'm looking to work through my thoughts here on the blog, and this week I have an interesting topic on the workbench--the API Evangelist remarketing tags.

According to Google, remarketing tags are: "To show ads to people who have visited your desktop or mobile website, add the remarketing tag to your website. The tag is a short snippet of code that adds your website visitors to remarketing lists; you can then target these lists with your ads. If your website has a Google Analytics tag, you can use this tag instead and skip adding the AdWords remarketing tag."

Over the last couple of years, I've had 3Scale's remarketing tags on my site. 3Scale has been my number one supporter for the last three years, and API Evangelist wouldn't even be a thing without them. They have provided me with financial support each month for a variety of reasons. We've never fully itemized what is in exchange for this support--primarily it has been to just invest in me being API Evangelist and to help 3Scale be successful--but one of the things on the table for the last couple of years has been that I publish the 3Scale remarketing tags on my network of sites.

So, if you visited API Evangelist in the last three years, it is likely you saw 3Scale ads wherever you went on the web. As I evaluate all the products and services I offer, quantify my partnerships, and identify the value that API Evangelist brings to the table, I find myself thinking deeply about this practice. Should I be selling my visitors' data like this? If you visit API Evangelist, should I allow my partners to target you with advertising on other sites you visit? As I think about the challenges I face in accepting money from startups, I'm forced to look at myself in the mirror as I think about selling remarketing tag access on my website(s).

There are two levels of this for me: 1) having Google JS on my website tracking you, and 2) selling an additional remarketing tag to one or many of my partners. Is it fair for me to 1) track you, and 2) sell the placement of cookies in your browser while you are on my domain? Part of my brand is being outspoken when it comes to the surveillance economy, and having JavaScript that tracks you, then selling access to placing tracking devices in your browser, seems pretty fucking hypocritical to me. I do not want to be in the business of tracking you and selling your data--it goes against what API Evangelist is about!

Ok, so I take the high road. I don't sell remarketing tags to my partners. I even take the Google tracking off my website. I'm happy tracking my traffic at the DNS level; I do not need to do this. Except for the fact that it brings in revenue, and I need to pay my server bills, rent, and everything else that keeps me doing API Evangelist. It isn't easy making a living being independent in the tech space, and I need every little bit of revenue I possibly can. It's not an insignificant amount of money either. I make a couple thousand dollars a month this way, as the traffic to my website is pretty high quality, at least in the eyes of an API service provider looking to sell you their warez.

I am still thinking on this one. I need to make a decision this month regarding what I am going to do with the remarketing tags for API Evangelist. Do I want to sell them entirely to one partner? Do I want to sell individual ones for each area of my research, like design, deployment, management, monitoring, and the other 70 areas of the API lifecycle I track on? IDK. It is a shitty place to be, with so few options for monetization of my work, beyond just the current advertising industrial complex (AIC)™. I'd love to hear your thoughts, either publicly or privately. It's a tough decision for me because if I choose to not do it, I risk API Evangelist going away, and me having to go get a job. *shudder* *shudder again* I'm exploring other ways of generating revenue, things like creating classes, and other content I can sell access to, so hopefully I can just walk away from remarketing tags without it killing me off. IDK WTF


My Challenges When Taking Money From Startups As The API Evangelist

It is a hustle to do API Evangelist. I've been lucky to have the support of 3Scale since 2013; without them API Evangelist would not have survived. I'm also thankful for the community stepping up last year to keep the site up and running, keeping it a community focused thing, and not just yet another vendor mouthpiece. I make my money by providing four ad slots on the site, by selling guides and white papers, and by consulting and creating content for others. It is a hustle that I enjoy much more than having a regular job, even though it is often more precarious, and unpredictable regarding what the future might hold.

Taking money from companies always creates a paradox for me. People read my stories because they tend to be vendor neutral and focus on ideas and usable API-centric concepts. While I do write about specific companies, products, services, and tooling, I primarily try to talk about the solutions they provide, and the ideas and stories behind them, steering clear of just being a cheerleader for specific vendor solutions. It's hard, and something I'm not always successful at, but I have primarily defined my brand by sticking to this approach.

This approach is at odds with what most people want to give me money for. 3Scale has long supported me and invested in me being me, doing what I do--which is rare. Most companies just want me to write about them, even if they understand the API Evangelist brand. They are giving me money, and in exchange, I should write about them, and their products and services. They often have no regard for the fact that this will hurt my brand, and run my readers off, falling short of actually achieving what they are wanting to achieve. I get the desire to advertise and market your warez, but the ability for companies to be their own worst enemy in this process is always fascinating to me.

I get regular waves of folks who want to give me money. I'm talking with, and fending off, a couple at the moment. So I wanted to think through this paradox (again), and talk about it out loud. It just doesn't make sense for me to take money from a company to write blog posts, white papers, and guides about their products and services. However, I feel like I can take money from companies to write blog posts, white papers, and guides about an industry or topic related to the problems they provide solutions for, or that possesses a significant audience that might be interested in their products and services. I consider this underwriting and sponsorship of my API Evangelist research, where they receive branding, references, and other exposure opportunities along the way.

Ideally, all branding, reference, and exposure elements are measurable through the tracking of impressions and links. What was the reach? What is the scope of exposure? It is difficult to sustain any relationship without measuring success, leaving both parties unable to articulate and justify the financial underwriting and support. In some cases, I get paid a finder's fee for referrals, but this can be very difficult to track on and validate--something that requires a company to be pretty ethical, and truly invested in partnerships with smaller brands like me. I prefer to rely on this, as opposed to any sort of affiliate or reseller style tracking system. I like companies that ask their customers, "how did you learn about us?", as they tend to actually care about their employees, partners, and other stakeholders at a higher level.

Sometimes I take an advisor, or even technology investor role in startups, taking on a larger stake in outcomes, but these arrangements are very rare. I have a manila file folder filled with stock options, and stakes I have in companies that never reached an exit, or when they did I was written out of things--when I do get a check from startup founders, I'm always floored. This does happen, but it is truly a rare occurrence. I wish there were more decoupled, plug and play opportunities to invest in startups as an advisor, researcher, analyst, and storyteller, but alas the system isn't really set up for this type of thinking--people prefer the big money plays over smaller, more meaningful contributions--it's about the money, man.

Anyways, every time I visit this conversation in my head I come back to the same place. I'm happy to take money from companies to generate short form and long form content about an industry or topic. If there are finder's fees for referrals, great! I leave it up to you to track on and come back to me with the details of any specific event--while I stay focused on what I do best, the thing you are investing in me to do. I'm mildly interested in opportunities to become more invested in your company's overall success, but honestly, I just don't trust that the system will deliver in this area, and it is more often just looking to extract value from me. I have seen too much in my 30-year career. However, I always welcome folks who want to prove me wrong! ;-)

In the end, my mission stays the same. I'm interested in studying where the API space has been, where it is at, and where it might be going. Then, I am interested in telling stories from what I learn in this process. If you want to invest in any of this, let me know. I could use your help.


Pricing Tiers Work For SaaS But Not Really For APIs

I get why SaaS and API providers offer a handful of pricing plans and tiers for their platforms, but it isn't something I personally care for as an API consumer. I've studied thousands of plans and pricing for API providers, and have to regularly navigate 50+ plans for my own API operations, and I just prefer having access to a wide range of API resources, across many different companies, with a variety of usage limitations and pricing based upon each individual resource. I really am getting tired of having to choose between bronze, gold, or platinum, and often getting priced out completely because I can't scale to the next tier as a user.

I understand that companies like putting users into buckets, something that makes revenue predictable from month to month, or year to year, but as we consume more APIs from many different providers, it would help reduce the complexity for us API consumers if you flattened the landscape. I really don't want to have to learn the difference between each of my providers' tiers. I just want access to THAT resource via an API, at a fair price--something that scales infinitely if at all possible (I want it all). Ultimately, I do not feel like API plans and tiers will scale to API economy levels. I think as API providers, we are still being pretty self-centered, thinking about pricing as we see it, and we need to open up and think about how our API consumers will view us in a growing landscape of service providers--otherwise, someone else will.

I am picking up my earlier API pricing work, which has two distinct components: 1) all the API resources and pricing available for a platform, and 2) the details of the plans and tiers which a complex list of resources, features, and pricing fit into. It would be much easier to just track resources, the features they have, and the unit price available for each API. Then we could let volume, time-based agreements, and other aspects of the API contract help us quantify the business of APIs, without limiting things to just a handful of API contract plans and tiers, expanding the ways we can do business using APIs.
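
As a rough sketch of what this flattened, resource-level approach could look like--with resource names and prices made up purely for illustration--each API resource gets a unit price, and volume or time-based agreements adjust the total rather than gating access behind tiers:

```python
# Hypothetical master list of API resources and unit prices--no tiers.
UNIT_PRICES = {
    "images.transform": 0.0005,  # USD per call
    "messages.send": 0.0100,     # USD per message
    "search.query": 0.0002,      # USD per query
}

def monthly_bill(usage, volume_discount=0.0):
    """Price each resource at its unit rate; volume or time-based
    agreements adjust the total instead of gating access by tier."""
    subtotal = sum(UNIT_PRICES[resource] * calls
                   for resource, calls in usage.items())
    return subtotal * (1.0 - volume_discount)

# A consumer pays for exactly what they used, at any scale.
print(monthly_bill({"images.transform": 120000, "messages.send": 450}))
# 60.00 + 4.50 = 64.50
```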

As an API provider, I get that a SaaS model has worked well to quantify predictable revenue in a way that makes sense to consumers, but after a decade or more, as we move into a more serverless, microservices, devops world, it seems like we should be thinking in a more modular way when it comes to the business of our APIs. I'm sure all you bean counters can get out of your comfort zone for a bit, and change up how you quantify access to your API resources, following the lead of API pioneers like Amazon, and just provide a master list of API, CLI, and console resources available for a competitive price.


The Value Of Our Digital Bits

I think way too much about the digital bits being transmitted online each day. I study the APIs that are increasingly being used to share these bits via websites, mobile, and other Internet-connected devices. These bits can be as simple as your messages and images, or as complex as the inputs and outputs of algorithms used in self-driving cars. I think about bits one level up from just the 1s and 0s, at the point where they start to become something more meaningful and tangible--as they are sent and received via the Internet, using web technology.

The average person takes these digital bits for granted, and is not burdened with the technical, business, and political concerns surrounding each of them. For many other folks across a variety of sectors, these bits are valuable, and they are looking to get access to as many of them as they can. These folks might work at technology startups or hedge funds, maybe in law enforcement, or just be tech-savvy hackers or activists on the Internet. If you hang out in these circles, data is often the new oil, and you are looking to get your hands on as much of it as you can, eager to mine it everywhere you possibly can.

In 2010, I started mapping out this layer of the web that was emerging, where bits were beginning to be sent and received via mobile devices, expanding the opportunity to make money from these increasingly valuable bits on the web. This move to mobile added a new dimension to each bit, making it even more valuable than it was before--it now possessed a latitude and longitude, telling us where it originated. Soon, this approach to sending and receiving digital bits spread to other Internet-connected devices beyond just our mobile phones, like our automobiles, home thermostats, and even wearables--to name just a few of the emerging areas.

The value of these bits will vary from situation to situation, with much of the value lying in the view of whoever is looking to acquire them. A Facebook wall post is worth a different amount to an advertiser looking for a potential audience than it is to law enforcement looking for clues in an investigation, and let's not forget the value of this bit to the person who is posting it, or maybe their friends who are viewing it. When it comes to business in 2017, it is clear that our digital bits are valuable, even if much of this value is purely based on perception, with very little tangible value in the real world, and many wild claims about the value and benefit of gathering, storing, and selling bits.

Markets are continually working to define the value of bits at a macro level, with many technology companies dominating the list, while APIs are defining the value of bits at the micro level--this is where I pay attention to things, at the individual API transaction level. I enjoy studying the value of individual bits, not because I want to make money off of them, but because I want to understand how those in power positions perceive the value of our bits and are buying and selling our bits at scale. Whether it is compute and storage in the cloud, the television programs we stream, or the pictures and videos we share in our homes, these bits are increasing in value, and I want to understand the process by which everything we do is being reduced to a transaction.


The Value Of Operational Level API Exhaust Systems

When thinking about the revenue generated from APIs it is easy to focus on directly charging for any digital resource being made available via the API. If it's an image, we charge per API call, and maybe by the amount of MB transferred. If it's messaging, we charge per message. There are plenty of existing examples out there regarding how you directly charge for data, content, or algorithms using APIs, and an API way of doing business--look to Amazon, Twilio, and other pioneers.

Where there are fewer examples and less open discussion is around the value of the operational level of APIs, and making this data available via APIs--yes, APIs for APIs. Modern approaches to doing APIs are all about requiring each application to use an API key with each call they make, and logging each request and response along with the identifying key for each application. This is how API providers develop an awareness of who is accessing resources, how they are being put to use, and specific details about each application, and maybe even the users involved.
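
As a minimal sketch of the layer being described--every call carries an API key, and every request and response gets logged against that key--something like the following, with all function names and fields hypothetical rather than any particular gateway's API:

```python
import time

access_log = []  # in practice, a database or log pipeline

def handle_request(api_key, resource, params, backend):
    """Require a key on every call, and log the request and response
    against the application that made it."""
    response = backend(resource, params)  # the actual API resource
    access_log.append({
        "api_key": api_key,      # which application made the call
        "resource": resource,    # which API resource was accessed
        "params": params,        # what was asked for
        "status": response.get("status"),
        "timestamp": time.time(),
    })
    return response
```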

Sometimes the value generated at this layer doesn't exist. Due to restrictive access models, and direct revenue models, there isn't much going on operationally, so there isn't much value generated. However, when there is heavy usage around APIs, the exhaust of the API management layer can become increasingly valuable. What are people searching for? Which applications are most popular? Which geographic regions are the most active? There is a pretty lengthy laundry list of valuable data points being generated across modern API operations that help API providers better understand what is going on, but that aren't often being included as part of the API roadmap, and future revenue models.
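
Continuing the hypothetical sketch above, the exhaust is just aggregation over those per-request records--each of the questions in the previous paragraph maps to a simple query:

```python
from collections import Counter

def top_search_terms(log, n=10):
    """What are people searching for?"""
    terms = (r["params"].get("q") for r in log
             if r["resource"] == "search.query")
    return Counter(t for t in terms if t).most_common(n)

def most_popular_applications(log, n=10):
    """Which applications are most popular?"""
    return Counter(r["api_key"] for r in log).most_common(n)

def most_active_regions(log, n=10):
    """Which geographic regions are the most active? Assumes a region
    was resolved for each caller and stored on the log record."""
    return Counter(r.get("region", "unknown") for r in log).most_common(n)
```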

Ok, let me pause here for a moment. I identify the value being generated at this layer because I see existing providers reaching this realization in their operations, and I want to help other providers see the potential being achieved by successful API providers. I also acknowledge there is a lot of exploitation, abuse, and surveillance going on at this level, which is one of the biggest reasons I'm encouraging more transparency, observability, and discussion about this layer. I want API providers to understand the potential, but I also want API consumers, and the end users of their applications, to understand what is going on at the API operational layer as well.

The model I'm currently looking at through this lens is my Open Referral Human Services Data Specification (HSDS) work, where I'm trying to help define the operational layer of human services APIs, as well as the direct API access to this critical public data. I am asking how stewards of this very important data at the municipal level can leverage APIs to make their valuable resources more accessible, and put them to work where they are most needed, while also being able to generate and harness the valuable particles generated as part of an API exhaust system. What are people searching for? How are demographics evolving in a city, and how can city services shift and evolve too? Making the operational layer available via API makes it available to key decision makers, even if those are private sector decision makers who are required to pay for access to this intelligence--bringing critical new revenue streams for data stewards.

Let's pause again and be honest about the privacy concerns here. Access at this layer needs an additional level of scrutiny and care, over the direct access layers. Examples I'm concerned with can be seen in searches for Muslim religious services, or possibly LGBTQ services, and other information that could be used to violate the privacy and rights of application developers and end users. There are numerous privacy and security concerns at this level, but the inverse of these concerns also highlights the value of a data access exhaust system at this level. This is important information that can provide real time signals for both the public and private sector to consider more deeply.

I am purposely using the word exhaust here, specifically as a noun, as I don't think people are considering this layer, and may often see log files, and other data being generated in this way, as an unusable byproduct and output of web and mobile operations. I want to help people see the potential dangers of exhaust from API-centric data operations, but also understand that when it is captured, it can become valuable, similar to natural gas capture from recycling or landfill operations. There are some toxic aspects of API-driven data operations, but when measured and controlled, and made more observable, the dangerous aspects can be mitigated, and you might also find other forms of reuse and extraction that can occur along the way.


Taxation On Public Data Via The API Management Layer

I'm involved in some very interesting conversations with public data folks who are trying to push forward the conversation around sensible revenue generation by city, county, state, and federal government agencies using public data. I'm learning a lot from these conversations, resulting in the expansion and evolution of my perceptions of how the API layer can help the government develop new revenue streams through making public data more accessible.

I have long been a proponent of using modern API management infrastructure to help government agencies generate revenue using public data. I would also add that I'm supportive of crafting sensible approaches to developing applications on top of public data and APIs in ways that generate a fair profit for private sector actors. I am also in favor of free and unfettered access to data, and observability into platform operations, as well as into ALL commercial interests developing applications on top of public data and APIs. I'm only in favor of this when the right amount of observability is present--otherwise digital good ol' boy networks form, and the public will lose.

API management is the oldest area of my API research, expanding into my other work to eventually define documentation, SDKs, communication, support, monetization, and API plans. This is where you define the business of API operations, organizing APIs into coherent catalogs, where you can then work to begin establishing a wider monetization strategy, as well as tiers and plans that govern access to data, content, and algorithms being made available via APIs. This is the layer of API operations I'm focusing on when helping government agencies better understand how they can get more in tune with their data resources, and identify potential partnerships and other applications that might establish new revenue streams.

A portion of this conversation was sparked by the story from Anthony Williams about how maybe government data shouldn't always be free, where the topic of taxation came up. One possible analogy for public data access and monetization that was brought up is the Vehicle-Miles Traveled (VMT) tax, injecting the concept of taxation into my existing thoughts on revenue generation using API management. I've considered affiliate and reseller aspects of the API management layer before, applying percentage based revenue and payments on top of API access, but never thought about a government taxation layer existing here.

I thought my stance on revenue generation on public data using API management was controversial before; adding concepts of taxation to the discussion is really going to invigorate folks who are in opposition to my argument. I'm sure there is a libertarian free web, open data advocate, smaller government Venn diagram in there somewhere. I'm not too concerned, as the monetization is already going on--I'm simply talking about making it more observable, and in favor of revenue generation for data stewards and government agencies. I'm confident that most folks in opposition won't even read this paragraph, as it's buried in the middle of this post. ;-)

I take no stance on which data, content, or algorithms should be taxed, or what that tax rate should be. I leave this to data stewards and policy makers. My objective is to just introduce folks to the concept, and marry it with existing approaches to using APIs to develop digital products and services in the private sector. However, if I were wearing my policy maker hat I would suggest thinking about this as a digital VAT tax, "that is collected incrementally, based on the surplus value, added to the price on the work at each stage of production, which is usually implemented as a destination-based tax, where the tax rate is based on the location of the customer."
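
Here is a minimal sketch of how such a destination-based digital VAT could sit at the API management layer--the rates, locations, and prices are placeholders, not a policy proposal:

```python
# Hypothetical destination-based VAT rates, keyed by customer location.
VAT_RATES = {"US-CA": 0.0725, "DE": 0.19, "GB": 0.20}

def metered_charge(unit_price, calls, customer_location):
    """Meter commercial usage of a public data API, collecting tax
    incrementally at the rate of the customer's location."""
    subtotal = unit_price * calls
    tax = subtotal * VAT_RATES.get(customer_location, 0.0)
    return subtotal + tax

# 500,000 calls at $0.0002 each by a customer in Germany:
# subtotal 100.00 + tax 19.00 = 119.00
print(metered_charge(0.0002, 500000, "DE"))
```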

My thoughts on a government tax at the API management layer are at an early stage. I am just exploring the concept on my blog--this is what I do as the API Evangelist. I'd love to hear your thoughts, on your blog. I am merely suggesting a digital VAT tax at the API contract layer around public data and APIs when commercial activities are involved. Eventually, I could see the concept spread to other sectors as the API economy becomes a reality, but I feel that public data provides us with a rich test bed for a concept like this. I'm considering reframing my argument about charging for commercial access to public data using APIs as taxing commercial usage of public data using APIs, allowing for tax revenue to fund future investment in public data and API efforts.

As I remove my API Evangelist hat and think about this concept, I'm not 100% sure if I'm in agreement with my argument. It will take a lot more polishing before I'm convinced that taxation should be included in the API management layer. I'll keep exploring, and play with a variety of potential use cases, and see if I can build a case for API taxation when public data is involved, and applications are generating surplus value in the API economy. 


Open Discussions About Funding API Startups

It made me happy to read the Rise of Non "VC compatible" SaaS Companies, and see that there are more sensible discussions going on around how to develop a SaaS business, something that I hope spreads to API-as-a-product startups and API service providers specifically. I know that many of my readers think I'm anti-VC--I am not. Or maybe that I'm anti-startup--I am not. What I am against is VC and startup ideology becoming the dominant religion, pushing out a lot of really good people and ideas that can't play that game.

When it comes to Silicon Valley, if you push back, you get excluded from the club, and there are waves of people who step up to tell you "not all startups are bad" or "not all VCs are bad"--I wish I could help you understand how this response makes you look. Of course, they aren't all bad, but there are bad ones, and there is a lot of rhetoric that this is the ONLY way you can build technology when it isn't. There are plenty of ways you can develop technology, and build a business without the VC approach, or the cult of the startup. 

There are more insights worth reading in the rise of the non-VC compatible SaaS companies story, but the author outlines four types of SaaS companies, which I think apply nicely to API companies, as many of them will be SaaS providers:

  1. Funded SaaS: companies which finance their business with VCs a.k.a equity against money. From early stage startups with no revenue to companies going public with hundreds of millions of dollars of ARR, the range is extremely wide.
  2. Bootstrapped "scaling" SaaS companies: SaaS companies which manage to pass the $10M ARR threshold without VC money. Ex: Mailchimp or Atlassian (which raised VC money, but at a very late stage) have reached hundreds of millions of dollars of ARR without VC money. These "unicorns among unicorns" are very rare.
  3. Bootstrapped SaaS companies: bootstrapped companies which manage to reach the $300k-$10M ARR range without VC money.
  4. Bootstrapped Micro SaaS: "1 to 3" people companies which operate in the $1k-$300k ARR range, without VC money.

There are some ideas that should go VC, but there are even more that should not. I want to see more talk like this about the funding realities of API startups. A sensible discussion about what the goals are, how fast a company should grow, and whether the company is building a business to develop software solutions that it sells to customers, or a business to sell to some large enterprise--hell, maybe even go IPO. These are all viable options for your business; I'm just looking for more discussion about the priorities. One more thing: you should be having this discussion out in the open with the customers you are supposedly selling your products and services to--this is the important part, not just the having of the discussion.

I'm not trying to get in the way of you getting super filthy rich. I'm just trying to strike a balance between you building a company, and the stability and availability of your APIs, and API tools and services, in an industry I depend on to make my living. You see, APIs have gotten a bad rap for not being stable, and going away at any time. This isn't an API thing, this is a VC funded startup thing. When you are telling your customers that you are building a business, with products and services to purchase, and then everyone bakes your solutions into their solutions, and you go away--that sucks. If you tell people the truth from the beginning and are honest and communicative about what your plans are, people can build and plan accordingly--the problem isn't APIs going away, it is startup founders not caring.

I am just trying to strike a balance between your business aspirations, and those of the people I help convince to use your APIs. Many of them aren't trying to get rich. They are just trying to build a business, and get their work done with your service, tool, and API. Let's start having more honest and open conversations about the variety of business models available to use when building API startups, and remember that not everything is a VC-worthy idea--sometimes you are just a viable feature for regular business owners like me.


My New API Vendor Evaluation Checklist

I am helping a customer think through their decision-making process around the adoption of a new API service, and while I am doing this I am spending the time to think through my own API adoption process. I like having checklists to consider when making new purchasing and integration decisions. Sometimes I have an immediate need which is driven by emotion, and it can help to step back and think through a more rational checklist I established for myself at an earlier date.

When I am approaching a new API that I think might move beyond just playing around, and actually have a viable business use case in my operations, I think through the following areas:

  • Define Value - What value is being created by the API I'm looking to use?
  • Define Needs - What needs do I have which using the API will solve?
  • Define Options - What other solutions are there beyond this API?
  • Think About Today - Is this an immediate need I have within days or weeks?
  • Think About Tomorrow - Is this a long term need that will go on for years?
  • Vet Company & People - Gather more detail about the company, people, and investors.
  • Define Partners - What does the partnership landscape look like for the API?
  • What Things Cost - What are things going to cost me for using the API?
  • What You Can Afford - Can I really afford to use this service, or afford not to?
  • Develop Continuity Plan - What is the plan for ensuring stable operations using the API?
  • Develop Exit Plan - How will I sever the relationship and replace or deprecate the need?

Sometimes I do not have everything in order before I pull the trigger. I may not have completely thought through what I will do when an API goes away, but I like at least having some notes written down about my frame of mind when I was integrating with any API. Having the checklist to think about at the time of integration and purchase, as well as my notes from evaluating a vendor, provides me with something I can revisit each month when paying the bill, or maybe as I encounter a new service, or possibly instability or lack of access to an API.

I am using this as part of my own API operations, but I'm also helping a client think through their API purchasing decisions, and hopefully making the process a much healthier and more educated one. I'm hoping this helps put much of the burden of API stability on the shoulders of those of us who are actually making the decisions, allowing us to be more mindful of who we integrate with, and to set more informed and honest expectations around what APIs do, and where some of the current business models in place might help, or hurt, our plans.


FREE Always Seems To Suck The Oxygen Out Of The Room

I closely watch the value of the digital bits being exchanged via the Interwebz--it is what I do. @audreywatters always says that APIs are "reducing everything to a transaction", and I am interested in understanding the value of these bits, what people are buying and selling them for, and how it keeps the Internet machine chugging along--for better or worse. As I watch Audrey battle with folks about the availability of content within her domain, and experience my own shift in what should be made freely available by API providers, I'm left thinking about the damaging effects free has had on our world.

I feel like the seeds of this were set into motion by John Perry Barlow followers imparting their ideology on the web, but it was something that was capitalized on during the Web 2.0 movement by tech giants like Google, Twitter, and Facebook when it came to leveling the playing field, giving them the competitive advantage they needed. It is very difficult to compete with FREE. Only certain companies can operate in this environment. It's a brilliant and cutthroat way of doing business, setting a tone for how business should be done in a way that keeps competitors out of the game. When the free and open Internet armies become wielded by this type of passive aggressive capitalism, the resulting zombie army becomes a pretty effective force for attacking any providers who are left operating in this oxygen-deprived environment.

These free zombie armies think the web should be free and openly accessible for them to do what they want with, most notably build a startup, get funding, and sell that startup to another company. Your detailed website of business listings, research into an industry, and other valuable exhaust that MUST remain free is ripe for the picking, and inclusion in their businesses. The zombies rarely go picketing after the tech giants, telling them that everything must remain free and available--they go after the small service provider who is trying to make a living and build an actual small business. If the tech giants sucking the oxygen out of the space with FREE don't get you, the free and open zombies will pick you clean through a sustained, systematic assault from Reddit and Hacker News.

I'm always amazed at the bipolar conversations I have with folks about how I manage to make a living doing my API research, and how rich and detailed my work is, while also being asked to jump on a phone call to talk through my research so it can be used in their startup, marketing, or infographic. I am never asked if they could pay me, and when I mention getting paid--they often just scatter. This continuous assault on the web has pushed me to shift my views on what should be FREE, and what we publish and openly license on the web, as well as make available at the lowest tiers of our APIs. These are my valuable bits. I've managed to stay alive and make a living in a field where most analysts either burn out or are acquired and co-opted. My bits are how I make a living, so please stop demanding that they always be free. Not everyone can operate like Google, Facebook, or Twitter--sometimes things cost money to do well, and might not need to be done at scale.


Wearing My Tech Vendor Hat When It Comes To Public Data

This is a multipart story on monetizing public data using APIs. I have spent the last seven years studying 75+ aspects of the API delivery lifecycle across companies, organizations, institutions, and government agencies. This project is designed to be a distillation of my work to help drive a conversation around sensible and pragmatic revenue generation using public data--allowing city, county, state, and federal government agencies to think critically about how open data efforts can exist and grow. It lives as a standalone repository, as well as individual stories that are meant to stand on their own, while also contributing to an overall narrative about public data monetization.

While my primary income is not derived from developing software for sale, I have developed commercial software throughout my career, and actively maintain my own API driven technology platform for tracking on the API industry. This is my best attempt to put my technology vendor hat on for a bit to better understand the concerns and perspective of the software vendors involved with the public data sector. There is a wide spectrum of technology vendors servicing the space, making this exercise difficult to generalize, but I wanted to take a shot at defending and jumpstarting the conversation at the commercial vendor level.

Commercial tech vendors are always at the frontline of discussions around the monetization of public data, for better or worse. When open data activists push back on my work to understand how public data can be monetized, my most common response is that public data is already being monetized by commercial vendors, and my work is about shining a light on this, not being in denial that it is already occurring everywhere. Here are some of my thoughts on the public data commercial vendor landscape:

  • Investment In Data - As a technology company I am investing a significant amount of resources into our data, and the data of our customers. While views may greatly vary on how much ownership platform and technology operators have around the public data they touch, it can't be denied that commercial vendors play a significant role--the discussion should be more about how great of a role, and how much ownership there is.
  • Investment in Software - Beyond the data, we are investing a significant amount of resources into software that our customers use, and that we use to manage our business internally. This is where we will keep most of the proprietary value generated around public data, although the door around the availability of open source tooling needs to remain open. Similar to data, the question is how much ownership over software I need as a commercial vendor, and how much I can give back to the community.
  • Lead Generation - I am interested in generating leads for new customers, and joining in new conversations that demonstrate the value of the products and services that my company brings to the table.
  • Sell My Services - I am interested in selling my products and services, and my motivation is going to reflect this. No matter what our mission or marketing may say, I'm interested in generating a profit for my company, and its interests.
  • Premium Services - Our domain expertise, and investment in data and software opens up the opportunity for us to offer premium services on top of public data operations. While our customers may not always pay directly for data storage and management, or even software licenses, the ability to sell premium services is valuable to everyone involved.
  • Protect Intellectual Property - It is important to us that our intellectual property is protected in all conversations, and that the licensing of data and software is respected, and reflected in our partnerships. While perspectives on what is appropriate regarding intellectual property will vary, it is important that IP is always an up-front part of the conversation.
  • Investment in R&D - Commercial vendors are more likely to invest in research and development, helping push forward innovation around public data, something that isn't always possible unless there are clear revenue opportunities for commercial operators and clear communication and understanding with non-commercial stakeholders about what is possible, and being done.
  • Consistent Support - One important thing commercial vendors bring to the table for government agencies and non-commercial stakeholders is the opportunity for consistent support. As seasons change in government, commercial vendors can be there to fill gaps in business and technical support services, keeping important conversations moving forward.

I have to be 100% transparent here and stop to say that while I am advocating for revenue generation around public data, I'm not always a proponent of that revenue benefitting commercial interests. First and foremost, I want revenue to benefit the public, secondarily the non-commercial stakeholders, and thirdly the commercial vendors. Making this argument from the commercial vendor perspective is possible for me, just not something I'm always going to be in full support of, and I will always insist on pushing back on aggressive behavior from commercial vendors to dominate the conversation, in favor of data stewards, and the public.

With that said, I'm a believer that commercial activity can benefit public data efforts. Sensible revenue can be generated from delivering services, products, and tooling developed around public data, while also investing back into data operators, owners, and stewards, and most importantly benefiting those being served. Depending on where you operate in the public data space you will see this argument, or hopefully conversation, differently. This is just an attempt to look at things from the side of commercial vendors, and to be honest and transparent about what the commercial interests are when it comes to public data.

You can keep an eye on my public data monetization research via a separate site--I will be adding each segment here on the blog, as well as the individual project website. You can participate via the Github repository I am using to manage all my work in this area.


My Concerns As A Public Data Steward

This is a multipart story on monetizing public data using APIs. I have spent the last seven years studying 75+ aspects of the API delivery lifecycle across companies, organizations, institutions, and government agencies. This project is designed to be a distillation of my work to help drive a conversation around sensible and pragmatic revenue generation using public data--allowing city, county, state, and federal government agencies to think critically about how open data efforts can exist and grow. It lives as a standalone repository, as well as individual stories that are meant to stand on their own, while also contributing to an overall narrative about public data monetization.

I am a database person. I have had a professional career working with databases since 1987 when I began working with COBOL databases as part of student information systems in the State of Oregon. After the Internet became a thing in 1996 I began to architect a variety of database driven web applications. Ten years later, as the cloud began to form, I began architecting distributed data-driven systems using web technology, aka APIs. I understand data. I understand the challenges of being a data administrator, operator, and steward. I wanted to take this experience and awareness and apply it to helping data operators and stewards become more successful when it comes to achieving their mission. 

As part of this exercise, I wanted to put on my data steward hat for a few, and think about my needs when it comes to monetization of my data. While this project is focused on the monetization of public data, in reality much of the logic can also be applied to any type of data--it just depends on your view of the data landscape. Here is what comes to mind:

  • Hard Work - I have invested a lot of time and resources into my data. This wasn't just a one-time thing. I am perpetually investing in my data, and I would like to see this reflected and respected in all partner engagements I have, wherever my data is used.
  • Hard Costs - There are hard costs involved with managing my data, including storage, compute, and bandwidth charges. These costs are a big concern for me as a small business operator, and something that will only increase as my data expands, evolves, and is consumed by a larger audience.
  • Quality Control - I am extremely concerned about the quality of data, and my process, domain expertise, and overall experience contribute significantly to the quality of my data, and its value to my partners, and the general public.
  • Provenance - Where data comes from is important to me. I keep track of all the sources of where my data comes from. I keep a detailed log of the provenance of data as it is acquired, and ingested into any system. I expect my partners to respect this practice.
  • Accessibility - It is important that my data is accessible to all key stakeholders. I want to have a master list of all stakeholders, and ensure frictionless access to exactly the data I want them to have, and nothing more. I want to be able to allow and revoke access as I see fit.
  • Security - It is vital that all data is secured in storage, in transport, and across partner systems. Security should be built into all aspects of my data life cycle, and something that is easily explained to partners and can be measured and reported upon.
  • Revenue - I want to be able to generate revenue to invest back into the acquisition, development, and management of my data resource. It costs money to do what I do, resulting in an extremely valuable resource that is desirable by a large audience--I should be able to derive revenue from this when it is ethical and makes sense, especially when it comes to engaging with other commercial entities.

I am sure there are a number of concerns for data administrators, operators, and stewards in different sectors, but generally speaking, as a data steward, these are the top concerns for me. Whether I'm working in a small business, large enterprise, organization, institution, or government agency, these are going to be some of my most immediate concerns. No matter how my budget works, I'm going to need a way to quantify the direct and indirect costs, and translate this into some value for someone--being able to generate sensible revenue is an important part of making sure data is 1) accessible, but also 2) continues to deliver value to all stakeholders at the table.

This is just one component in a larger argument about generating revenue from public data using APIs, but when we are talking about publicly available data, most conversations are going to begin with, and continue to focus on, the data operator or steward. I choose to label my role as a data steward, over using the phrase data operator, because I usually have a personal stake in the data I'm managing--it is just the way I do business. This isn't always the case, but in my experience, when someone cares about the data they are managing, the quality is always better, and the value significantly increases, deserving a separate descriptor.

This post is crafted while thinking about my own data projects, ranging from my data driven API Evangelist research to my Adopta Agency work. I depend on data to operate my business. I also depend on other data providers to operate my business. Some of these data providers are government agencies who are severely underfunded, and I want them to be more successful in what they do, and generate the revenue they need to meet their objectives, but also stay in operation for my own selfish needs. Much like minerals, water, and trees on public lands, there is a lot of value in the public data resources out there, and this is just an exercise to open up an honest discussion around how we might possibly generate revenue from public data.

You can keep an eye on my public data monetization research via a separate site--I will be adding each segment here on the blog, as well as to the individual project website. You can participate via the Github repository I am using to manage all my work in this area.


Your Wholesale API For Sale In The Major API Marketplaces

I have been talking about selling wholesale APIs for some time now, allowing your potential customers to pick and choose exactly the API infrastructure they need, and develop their own virtualized API stacks. I'm not talking about publishing your retail API into marketplaces like Mashape, I'm talking about making your API deployable and manageable on all the major cloud providers.

You see this shift in business with a recent AWS email I got telling me about multi-year contracts for SaaS and APIs. Right now there are 70 SaaS products on the AWS Marketplace, but from the email I can tell that Amazon is really trying to expand its API offerings as well. When you deploy an API solution using the AWS Marketplace, and a customer signs up for a one, two, or three year contract, they don't pay for the underlying AWS infrastructure, just for the SaaS or API solution. I will have to explore more to see if this is just absorbed by the API operator, or if AWS is working to incentivize this type of wholesale API deployment in their marketplace, locking in providers and consumers.

I'm still learning about how Amazon is shifting the landscape for deploying and managing APIs in this wholesale, almost API broker type of way. I recently came across the AWS Serverless API Portal, which is meant to augment the delivery of SaaS or API solutions in this way. With this model you could be in the business of deploying API developer portals for companies, and filling the catalog with a variety of wholesale API resources, from a variety of providers--opening up a pretty interesting opportunity for white label APIs, and API brokers.

As I'm studying this new approach to deploying and managing APIs using marketplaces like this, I'm also noticing a shift towards delivering more algorithmic APIs, with machine learning, artificial intelligence, and other voodoo as the engine--resulting in a shift towards machine learning API marketplaces. I really need to carve off time to think about API deployment and management in this way. I've already begun looking at what it takes to deploy bare bones, wholesale APIs using the AWS, Google, Heroku, or Azure clouds, but I really haven't invested much in the business side of all of this--an area where Amazon seems to be slightly ahead of the curve.


Lots Of Talk About Machine Learning Marketplaces

I spent last week in San Francisco listening to Google's very machine learning focused view of the future. In addition to their Google Next conference, I spent Tuesday at the Google Community Summit, getting an analyst look at what they are up to. Machine Learning (ML) was definitely playing a significant role in their strategy, and I heard a lot of talk about machine learning marketplaces.

Beyond their own ML offerings like the video intelligence and vision APIs, Google also provides you with an engine for publishing your own ML models. They also have a machine learning advanced solutions lab, are throwing a machine learning hackathon, and are pushing a machine learning certification program as part of their cloud and data offerings. As the Google machine learning roadmap was being discussed throughout the day, the question of where can I publish my ML models, and begin selling them, came up regularly--something I feel is going to be a common theme of the 2017 ML hype.

I'm guessing we will see a relationship between the Google ML Engine and Google Cloud Endpoints emerge, and eventually some sort of ML marketplace like we have with Algorithmia. We are already seeing this shift in the AWS landscape, between their Lambda, ML, API Gateway, and AWS Marketplace offerings. You see hints of the future in the AWS serverless API portal I wrote about previously. The technology, business, and politics of providing retail and wholesale access to algorithms and machine learning models in this way fascinates me, but as with every other successful area of the API economy, about 90% of this will be shit, and 10% will be actually doing anything interesting with compute and APIs.

I'm doing all my image and video texture transfer machine learning model training using AWS and Algorithmia. I then use Algorithmia to get access to the models I've trained, and if I ever want to open up partner level (wholesale), or public (retail) access to my ML models I will use Algorithmia, or an API facade on top of their API, to open up access, and make them available in the Algorithmia ML marketplace. I'm guessing at some point I will want to syndicate my models into other marketplace environments, with giants like Google and AWS, but also other more niche, specialty ML marketplaces, where I can reach exactly the audience I want.


Dreaming Of A More Modular Event Driven API Monetization

I was learning about the approach Amazon has taken with their serverless API developer portal, highlighting their approach to API plans, and I couldn't help but think there was more to it all than just rate limiting your API. Amazon's approach to API plans is in alignment with other API management providers, allowing you to deploy your APIs, meter, rate limit, and charge for access to your API--standard business of APIs stuff.

Controlling access to a variety of API resources is something that has been well-defined over the last decade by API management providers like 3Scale, and now Tyk and DreamFactory. They provide you with all the tools you need to define access to APIs, and meter access based upon a wide variety of parameters. While I haven't seen the type of growth I expected in this area, we have seen a significant amount of standardization because API management providers are helping to define things--something that will grow significantly because of availability from cloud providers like AWS, Microsoft, and Google.

We have a lot of work ahead of us, standardizing how we charge for API consumption at scale. We have even more work ahead of us to realize that we can turn all of this on its head, and start paying for API consumption at scale. I do not understand how we've gotten so hung up on the click and the view, when there are so many other richer, and more meaningful actions already occurring every second of each day online. We should be identifying these opportunities, then paying and incentivizing developers to consume APIs in the most valuable way possible.

With modern approaches to API management, we already have the infrastructure in place. We need to just start thinking about our APIs differently. We also need to get better at leveraging POST, PUT, and PATCH, as well as GET, when it comes to paying for consumption. Imagine a sort of event driven API affiliate layer to the web, mobile, device, and even conversational interfaces--where developers get paid for making the most meaningful events occur. It makes the notion of paying per view or click seem really, really, shit simple.
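To make this idea a bit more concrete, here is a minimal sketch of what an event driven metering layer could look like, where meaningful write events earn developers money instead of costing them. Everything here--the rate card, the amounts, the resource names--is hypothetical, just to illustrate flipping the accounting:

    # Hypothetical rate card: reads cost the consumer, meaningful writes pay the developer.
    RATE_CARD = {
        ("GET", "listings"): -0.001,    # consumer pays per read
        ("POST", "listings"): 0.010,    # developer earns for adding a new record
        ("PUT", "listings"): 0.005,     # developer earns for improving a record
        ("PATCH", "listings"): 0.002,   # developer earns for a small fix
    }

    def meter_event(ledger, api_key, method, resource):
        """Credit or debit a developer's balance for a single API event."""
        amount = RATE_CARD.get((method, resource), 0.0)
        ledger[api_key] = ledger.get(api_key, 0.0) + amount
        return amount

    ledger = {}
    meter_event(ledger, "key-123", "POST", "listings")  # earns $0.01
    meter_event(ledger, "key-123", "GET", "listings")   # pays $0.001
    print(ledger)  # {'key-123': 0.009}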

Anyways, just a thought I needed to get out. The lack of innovation, and abundance of greed when it comes to API monetization and planning always leaves me depressed. I wish someone would move the needle forward with some sort of modular, event-driven API monetization framework--allowing some different dimensions to be added to the API economy.


Including End Users In the Conversation About Their Bits Being Sold

Fitbit recently announced a program to pay their wearable users up to $1500 for integrating their Charge 2 into the UnitedHealthcare Motion program powered by Qualcomm Life’s 2net Platform. The "UnitedHealthcare Motion is an employer-sponsored wearable device wellness program that offers financial incentives for enrollees who meet daily step goals"--pulling back the curtain, just a little bit, on the value of your Internet of Things data, and specifically the devices you strap to your body.

I am not a fan of corporations strapping devices to their employees as part of these wellness programs (or for any reason), and using cash incentives to achieve the desired behavior. I feel this is a doorway to some pretty dark human resources strategies, but I do think these events pull back the curtains on what is going on, even just a little bit, for users. I am sure it's not Fitbit's intention to include end-users in all of the monetization of their data, but I see this as an opportunity to educate end-users in these situations.

Most Internet users are not aware of the amount of information being gathered, bought, and sold when they use the Internet, and their mobile phones. This lack of awareness is translating pretty nicely to the world of connected devices, adding some valuable demographic dimensions to an already valuable user profile. While insurance companies are interested in improving their margins with this data, they are also interested in the new revenue streams they can create by selling data to other brokers, hedge funds, and more.

One of the only hopes I have in this area is that startups will continue to pull back the curtain on this behavior, either intentionally or unintentionally, with products, services, and programs like the wellness program that Fitbit is offering. Showing users that there is value in their data can be a positive first step in educating them about what is happening in the tech world. Once they get a taste of making some actual cash from their data, I'm hoping that more users establish an appetite for understanding the value of their information.

I'm counting on future waves of startups blindly disrupting industries by continuing to pull back the curtain, as well as existing companies looking to gain a competitive advantage by operating this way. Leveraging market forces against industry leaders is one of the most important tools we have in our toolbox to combat the exploitation of our data. You will find me encouraging companies to do this as a disruptive tactic, and as a competitive edge, not because I want to help them, but because I want to help end-users with each wave. It is my way of using their quest for startup success against the practices of the wider industry, and leveraging it to help end-users in any way that I can.


Thinking About The Monetization Layer For Public Data

This is my walk-through of the concepts involved with the monetization of public data using APIs. In this work I am not advocating that companies should be mindlessly profiting from publicly available data; my intent is to provide a framework for organizations to think through the process of generating revenue from commercial access to public data, acknowledging that it costs money to aggregate, serve up, and keep data up to date and usable for the greater public good--if public data is not accessible, accurate, and up to date, it is of no use to anyone.

I have long argued that companies and even government agencies should be able to charge for commercial access to public data and be able to generate revenue to cover operational costs, and even produce much-needed funds that can be applied to the road map. My work in this has been referenced in existing projects, such as the Department of Interior and Forest Service looking to generate revenue from commercial access and usage of public data generated by the national parks systems. In my opinion, this conversation around generating revenue from publicly available digital assets should be occurring right alongside the existing conversations that already are going on around publicly available physical assets.

Building Upon The Monetization Strategies Of Leading Cloud Providers
My thoughts around generating revenue from public open data are built upon monitoring the strategies of leading online platforms like Amazon Web Services, Google, and others. In 2001 a new approach to providing access to digital resources began to emerge from Internet companies like Amazon and Salesforce, and by 2016 it had become a common way for companies to do business online, providing metered, secure access to valuable corporate and end-user data, content, and other algorithmic resources. This research looks to combine these practices into a single guide that public data stewards can consider as they look to fund their important work.

Do not get me wrong, there are many practices of leading tech providers that I am not looking to replicate when it comes to providing access to public data, let alone generating revenue. Much of the illness in the tech space right now is not due to the usage of APIs, it is due to a lack of creative approaches to monetizing digital assets like data and content, and terms of service that do not protect the interests of users. My vantage point is the result of six years studying the technology, business, and politics of the API sector, while also working actively on open data projects within city, state, and federal government--I'm looking to strike a balance between these two worlds.

Using Common Government Services As A Target For Generating Much-Needed Revenue
For this research, I am going to use a common example of public data: public services. I am focusing in this area specifically to help develop a strategy for Open Referral, but it is also a potential model that I can see working beyond just public services. I am looking to leverage my existing Open Referral work to help push this concept forward, but at the same time, I am hoping it will also provide me with some public data examples that are familiar to all of my readers, giving me some relevant ways to explain some potentially abstract concepts like APIs to the average folks we need to convince.

For the sake of this discussion, let's narrow things down and focus on three areas of public data, which could be put to work in any city across the country:

  • Organizations - The business listings and details for public and private sector agencies, businesses, organizations, and institutions.
  • Locations - The details of specific locations which organizations are providing access to public services.
  • Services - The listings and details of public services offered at the municipal, county, state, or federal levels of government.

Open Referral is a common definition for describing public services organizations, locations, and services, allowing the government, organizations, institutions, and companies to share data in a common way, which focuses on helping them better serve their constituents--this is what public data is all about, right? The trick is getting all players at the table to speak a common language, one that serves their constituents, and allows them to also make money.

While some open data people may snicker at me suggesting that revenue should be generated on top of open data describing public services, the reality is that this is already occurring--there are numerous companies in this space. The big difference is it is currently being done within silos, locked up in databases, and only accessible to those with the privileges and the technical expertise required. I am looking to bring the data, and the monetization out of the shadows, and expand on it in a transparent way that benefits everybody involved.

Using APIs To Make Public Data More Accessible and Usable In A Collaborative Way
Publicly available data plays a central role in driving websites, mobile applications, and system to system integrations, but simply making this data available for download only serves a small portion of these needs, and often does so in a very disconnected way, establishing data silos where data is duplicated, and the accuracy of data is often in question. Web APIs are increasingly being used to make data not just available for downloading, but to also allow it to be updated, and deleted in a secure way, by trusted parties.

For this example I am looking to provide three separate API paths, which will give access to our public services data:

  • http://example.com/organizations/ - Returns JSON or XML listing and details of organizations for use in other applications.
  • http://example.com/locations/ - Returns JSON or XML listing and details of organizational locations for use in other applications.
  • http://example.com/services/ - Returns JSON or XML listing and details of public services for use in other applications.

A website provides HTML information for humans, and web APIs provide machine readable representations of the same data, making it open for use in a single website, but also potentially multiple websites, mobile applications, visualizations, and other important use cases. The mandate for public data should ensure it isn't just available on a single website, but in as many scenarios that empower end-users as possible. This is what APIs excel at, but it is also something that takes resources to do properly, making the case for generating revenue to properly fund the operations of APIs in the service of the public good.
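As a rough sketch of what one of these paths might look like under the hood, here is the /organizations/ endpoint implemented in Python with Flask. The framework choice and the sample record are my own assumptions for illustration, not part of any specification:

    # A minimal sketch of the /organizations/ path described above.
    from flask import Flask, jsonify

    app = Flask(__name__)

    # In a real deployment this listing would come from the platform database.
    ORGANIZATIONS = [
        {"id": 1, "name": "Example Food Bank", "description": "Emergency food services"},
    ]

    @app.route("/organizations/")
    def list_organizations():
        # Return a machine readable JSON listing for use in other applications.
        return jsonify(ORGANIZATIONS)

    if __name__ == "__main__":
        app.run()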

The Business of Public Data Using Modern Approaches to API Management
One of the common misconceptions about public web APIs is that they are open to anyone with access to the Internet, with no restrictions. This might be the case for some APIs, but increasingly government agencies, organizations, and institutions are making public data available securely using common API management practices, defined by Internet pioneers like Salesforce, Amazon, and Google over the last decade.

API management practices provide some important layers on top of public data resources, allowing for a greater understanding and control over how data is accessed and put to use. I want to provide an overview of how this works before I dive into the details of this approach by outlining some of the tenets of an API management layer:

  • Users - Requiring users to register, establishing a unique account for associating all API and public data activity.
  • Applications - Requiring users to define the application (web, mobile, visualization, etc.) and other relevant information regarding their access to the public data.
  • Keys - Issuing of unique API keys for each application, requiring their inclusion in all consumption of public data via the API.
  • Service Composition - Placement of public data resource (organizations, locations, services) into tiers, defining which APIs different users have access to and the scope of that access.
    • Resource Specific - Allowing access to specific data resources to a select audience.
    • Read / Write - Restricting write access to select users and applications. 
    • Data Specific - Limiting which data is returned, filtering based on who is requesting it.
  • Rate Limits - All APIs are rate limited, allowing for different levels of access to public data resources, which can be defined in alignment with the costs associated with operations.
  • Logging - Each API call is logged, with the required user and application keys, as well as the details of the request and response associated with each API call.
  • Analytics - The presence of a variety of visual layers that establish an awareness of who is accessing public data APIs, what they are accessing, and details on how and where it is being applied.

These seven areas provide some very flexible variables which can be applied to the technical, business, and politics of providing access to public data using the Internet. Before you can access the organizations, locations, and service information via this example public services API you will need to be a registered user, with an approved application, possessing valid API keys. Each call to the API will contain these keys, identifying which tier of access an application is operating within, which API paths are available, the rate limits in existence, and logging of everything you consume and add so it can be included as part of any operational analytics. 
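Here is a bare bones sketch of how that request flow could be enforced in code--key lookup, tier resolution, rate limiting, and logging in one place. The keys, tiers, and limits are all hypothetical placeholders:

    # A sketch of the management layer described above: each request must
    # carry a valid key, the key maps to a tier, and the tier dictates limits.
    import time

    TIERS = {"public": 1000, "partner": 10000}        # calls per day
    KEYS = {"abc123": "public", "def456": "partner"}  # key -> tier
    USAGE = {}                                        # key -> [timestamps]

    def authorize(api_key):
        """Check the key, enforce the tier's daily rate limit, and log the call."""
        tier = KEYS.get(api_key)
        if tier is None:
            return False  # unregistered application
        day_ago = time.time() - 86400
        calls = [t for t in USAGE.get(api_key, []) if t > day_ago]
        if len(calls) >= TIERS[tier]:
            return False  # rate limit exceeded for this tier
        calls.append(time.time())
        USAGE[api_key] = calls  # this log also feeds analytics and billing
        return True

    print(authorize("abc123"))   # True
    print(authorize("nokey"))    # False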

This layer enables more control over public data assets, while also ensuring data is available and accessible. When done thoughtfully, this can open up entirely new approaches to monetization of commercial usage by allowing for increased rate limits, performance, and service level agreements, which can be used to help fund the public data's mandate to be accessible by the public, researchers, and auditors.

Providing The Required Level Of Control Over Public Data Access
Understandably, there are concerns when it comes to publishing data on the Internet. Unless you have experience working with modern approaches to delivering APIs, it can be easy to focus on losing control over your data when publishing on the web--when in reality, stewards of public data can gain more control over their data by using APIs than by just publishing it for complete download. There are some distinct ways that API providers are leveraging modern API management practices to evolve greater levels of control over who accesses data, and how it is put to use.

I wanted to highlight what can be brought to the table by employing APIs in service of public data, to help anyone make the argument for why providing machine readable data via APIs is just as important as having a website in 2016:

  • Awareness - Requiring all data to be accessed via APIs which require keys to be used for ALL applications, combined with a comprehensive logging strategy, brings a new level of awareness regarding which data is accessed, and how it is being used, or not used.
  • Security - While API keys are primarily used for logging and analytics, they also ensure that all public data resources are secured, providing tiered levels of access to 3rd parties based upon trust, contributing value to the data, and overall platform involvement--allowing data to be securely made available on the open web.
  • Quality Control - APIs act as a central gatekeeper regarding how data is updated, evolved, and deleted, allowing for a centralized, yet also potentially distributed, and collaborative approach to ensuring public data is accurate, possessing a high level of quality control.
  • Terms of Service - All API usage is governed by the legal terms of service laid out as part of platform operations, requiring all users to respect and follow terms of service if they expect to maintain their public data API keys.
  • Governance - Opening up the overall management of the availability, usability, integrity, and security of the public data, which may include oversight from a governing body or council, a defined set of procedures, and a plan to execute those procedures.
  • Provenance - Enabling visibility into the origins, history, and oversight of public data, helping establish the chain of custody regarding shared use of valuable data across platform operations.
  • Observability - Allowing for the observability of data resources, and their contributors and consumers, using existing platform outputs and mechanisms, enabling high levels of awareness through the API management framework employed as part of platform operations, meeting service level agreements, and expected availability.

It is important to discuss, and quantify this control layer of any public data being made available via APIs if we are going to talk about monetization. Having APIs is not enough to ensure platform success, and sometimes too strict of control can suffocate consumption and contribution, but a lack of some control elements can also have a similar effect, encouraging the type of consumption and contribution that might not benefit a platform's success. A balanced approach to control, with a sensible approach to management and monetization, has helped API pioneers like Amazon achieve new levels of innovation, and domination using APIs--some of this way of thinking can be applied to public data by other organizations.
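Much of the control described above comes down to logging every API call in a structured way. A sketch of what a single log entry might capture, with hypothetical field names, could look like this:

    # The kind of structured log entry that makes the awareness, provenance,
    # and observability tenets above possible--every request gets recorded.
    import json, time

    def log_api_call(api_key, app_id, path, status, source):
        entry = {
            "timestamp": time.time(),
            "api_key": api_key,      # who accessed the data
            "application": app_id,   # which application it powers
            "path": path,            # which public data was touched
            "status": status,        # whether the request succeeded
            "data_source": source,   # provenance of the underlying record
        }
        return json.dumps(entry)

    print(log_api_call("abc123", "city-services-app", "/services/", 200, "county-hhs-2016"))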

Enabling and Ensuring Access To Public Data For Everyone It Touches
Providing access to data through a variety of channels for commercial and non-commercial purposes is what modern API management infrastructure is all about. Shortly after possessing a website became normal operating procedure for companies, organizations, institutions, and government agencies, web APIs began to emerge to power networks of distributed websites, embeddable widgets, and then mobile applications for many different service providers. APIs can provide access to public data, while modern API management practices ensure that access is balanced and in alignment with platform objectives--resulting in the desired level of control discussed above.

There are a number of areas of access that can be opened up by employing APIs in the service of public data:

  • Internal - APIs can be used by all internal efforts, powering websites, mobile applications, and other systems. The awareness, logging, and other benefits can just as easily be applied to internal groups, helping understand how resources are used (or not used) across internal groups.
  • Partner - After internal access to data resources, access levels can be crafted to support partner tiers of access, which might include access to special APIs, read and write capabilities, and relaxing of rate limits. These levels often include service level agreements, additional support options, as well as other benefits.
  • Public - Data can be made publicly available using the web, while also maintaining the quality and security of the data, keeping the access as frictionless as possible, while ensuring things stay up and running, and of expected quality and availability.
  • Privacy - Even with publicly available data there is a lot to consider when it comes to the privacy of organizations, locations, and services involved, but also the logging, and tracking associated with platform operations.
  • Transparency - One important characteristic of an API platform is transparency in the API management layer, being public with the access tiers, resources available, and how the platform operates--without the necessary transparency, consumers can become distrustful of the data.
  • Licensing - Ideally all data and all schema in this scenario would be licensed as CC0, putting them into the public domain, but if there are license considerations, these requirements can be included along with each API response, as well as in platform documentation.
  • Platform Meta API - APIs do not just provide access to the public data, they also provide access to the API management layer for the public data. Modern API management allows for API access to the platform in several important ways:
    • Users - Providing API access to user's data and usage.
    • Apps - Providing API access to application level data and usage.
    • Tiers - Providing API access to platform tiers and details.
    • Logs - Providing API access to the platform log files.
    • Billing - Providing API access to the platform billing for access.
    • Analytics - Providing API access to the analytics derived from logs, billing, and usage.
  • Observability - An API management layer on top of public data makes data access observable, allowing platform operators, government agencies, and potentially 3rd party and independent auditors to see how data is being accessed and used--observability will define both the control, as well as the access to, vital public data resources.

In a world that is increasingly defined by data, access to quality data is important, and easy, secure access via the Internet is part of the DNA of public data in this world. API management provides a coherent way to define access to public data, adhering to the mandate that the data is accessible, while also striking a balance to ensure the quality, reliability, and completeness of the public data.
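The platform meta API mentioned in the list above can be sketched out as just another set of endpoints, exposing the management layer itself. The paths and payload here are hypothetical:

    # A hypothetical meta API route exposing a consumer's own usage and billing,
    # making the management layer itself observable via the API.
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/meta/usage/<api_key>")
    def usage(api_key):
        # In practice this would be assembled from the logging and billing layers.
        return jsonify({
            "api_key": api_key,
            "tier": "public",
            "calls_this_month": 4200,
            "amount_due": 0.00,
        })

    if __name__ == "__main__":
        app.run()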

There have been a lot of promises made in the past about what open or public data can do by default, when in reality opening up data is not a silver bullet for public services, and there is a lot more involved in successfully operating a sustained public data operation. APIs help ensure data resources are made available publicly, while also opening up some new revenue generation opportunities, helping ensure access is sustainable and continues to provide value--hopefully finding a balance between the public good and any sensible commercial aspirations that may exist.

APIs Open Up Many Potential Applications That Support the Mission
As doing business on the web became commonplace in the early 21st century, Amazon was realizing that they could enable the sales of their books and other products on the websites of their affiliate partners by using APIs. In 2016 there are many additional applications being developed on top of APIs, with delivering public data to multiple web sites being just the beginning.

  • Web - It is common for websites to pull from a database. Increasingly APIs are being used to drive not just a single website, but networks, and distributed websites that draw data and content from a variety of sources.
  • Mobile - APIs are used to make data and content available across a variety of mobile applications, on different platforms.
  • Embeddable - Delivering data to buttons, badges, bookmarklets, and widgets that can be embedded across a variety of websites, and applications.
  • Conversational - Using data in conversational interfaces like bots, messaging, and voice-enabled applications.
  • Visualizations - Including data in visualizations, showing API consumption, and platform usage around public data.
  • iPaaS / ETL - Enabling the migration of public data to and from other external 3rd party platforms using traditional ETL, or more modern iPaaS solutions powered via the API.
  • Webhooks - Notifying external systems of relevant events (location or service update) by pushing to URLs via what is called a webhook.
  • Spreadsheets - Publishing of data to Microsoft Excel or Google Spreadsheet using the public data APIs, as well as spreadsheet APIs.

This is just an overview of the number of ways in which a single API, or multiple APIs, can be used to deliver public data to many different endpoints, all in service of a single mission. When you consider this in support of public services, a bigger picture emerges of how APIs and public data can be used to better serve the population--the first step always involves a standardized, well-planned set of APIs being made available.

The Monetization Requirements Around Public Data API Operations
This is where we get to the portion of this discussion that is specifically about monetization of the operations around publishing and maintaining high-quality sources of public data. Before a sensible monetization strategy can be laid out, we need to be able to quantify what it costs to operate the platform and generate the expected value from everyone at the table.

What are the hard costs that should be considered when operating a public data platform and looking to establish some reasonable monetization objectives?

  • Data Acquisition - What one-time, and recurring costs are associated with acquiring data. This might include ongoing payouts to API contributors who are adding, updating, and validating data via the API.
    • Discover - What was spent to discover data, and identify its use on the platform.
    • Negotiate - What time do I have invested in actually getting access to something.
    • Licensing - Are there licensing costs or fees involved in the acquisition of data.
  • Development - What one-time, and recurring costs are associated with platform development.
    • Normalization - What does it take to clean up, and normalize a data set, or across content. This is usually the tedious janitorial work necessary.
    • Validation - What is involved with validating that data is accurate and correct, providing sources, and following up on references.
    • Database - How much work is being put into setting up the database, maintaining it, backing it up, and delivering optimal levels of performance.
    • Server - Defining the amount of work put into setting up, and configuring the server(s) to operate an API, including where it goes in the overall operations plan.
    • Coding - How much work goes into actually coding an API. Ideally, open source frameworks are employed to reduce overhead, maintenance, and the resources needed to launch new endpoints.
  • Operational - What one-time, and recurring costs are associated with platform operations.
    • Compute - What costs are associated with providing server compute capacity to process and deliver public data via APIs.
    • Storage - What costs are associated with on-disk storage, for both the database and other images, video, and related objects.
    • Network - How much bandwidth in / out is an API using to get the job done, as well as any other network overhead.
    • Management - What percentage of API management resources is dedicated to the API. A flat percentage of API management overhead until usage history exists.
    • Monitoring - What percentage of the API monitoring, testing, and performance service budget is dedicated to this API. How large is the surface area for monitoring?
    • Security - What does it cost to secure a single API, as part of the larger overall operations? Do internal resources spend time on this, or is it a 3rd party service?

Understand The Value Being Generated By Public Data
Now that we understand some of our hard costs, let's have an honest conversation about what value is being generated. First, public data has to offer value, or why are we doing all this hard work? Second, nobody is going to pay for anything if it doesn't offer any value. Let's stop for a moment and think about why we are doing all of this in the first place, and what value is worthy of carving off to drive monetization efforts.

  • Direct Value Generation - What direct value is being generated by the public data platform operations.
    • Usage - How is usage wielded as part of value generation? Is value about the increased usage of a resource, or possible value generated by a minimum usage of a resource? Usage is an important dimension of determining how value is generated as part of API operations.
    • Users - How is the value generated on a per user level? Are more users valuable? or possibly more targeted users? Teams, groups, and many other ways to determine how users impact positively or negatively the value generated from platform usage.
    • Relationships - How can relationships between users, or companies be applied to value generated? Will access to more relationships positively or negatively impact how value is generated for the platform and consumers?
    • Data Acquisition - Is the acquisition of data part of the generation of value via the public data platform, encouraging the growth of data.
    • Applications - Is value generated looked at on a per application basis? Does having multiple applications impact the value generated? Coming up with interesting ways to understand how applications impact platform value for everyone.
    • Integrations - What external integrations are available? How can these be leveraged to enhance the value for consumers? Are some integrations part of base operations, where others are accessible at higher levels, or on a one-off basis.
    • Support - Is access to support something that impacts the value being generated? Does access to support resources introduce the optimal levels of value consumers are looking for? How is support wielded within overall platform monetization?
    • Service Level Agreements - Are we measuring the number of contracts signed, and partner agreements in place? And how we are delivering against those agreements?
    • Revenue - What revenue opportunities exist for the ecosystem around an API and its operation, sharing in the money made from platform operations? Obviously, anything beyond operating costs should be applied to the expansion of efforts.
  • Indirect Value - What indirect value is being generated by the public data platform operations.
    • Marketing Vehicle - Having an API is cool these days, and some APIs are worth just having because of the PR value, and discussion.
    • Traffic Generation - The API exists solely for distributing links to the web and mobile applications, driving traffic to specific properties - is this tracked as part of other analytics?
    • Brand Awareness - Applying a brand strategy, and using the API to incentivize publishers to extend the reach of the brand and ultimately the platform - can we quantify this?
    • Analysis - How can analytics be leveraged as part of API value generation? Are analytics part of the base of operations, or are they an added value incentive for consumers, and platform operators.
    • Competitiveness - Is the public data effort more agile, flexible, and competitive because it has an API and can deliver on new integrations, partnerships, and to new platforms easier, and more cost effectively?
    • Public Service - Making data available for use on many different web, mobile, and other applications demonstrates a commitment to public service, and the good public data can do.

While there may be other hard costs associated, as well as other areas of value being generated, this should provide a simple checklist that any open data provider can use as a starting blueprint. Additional costs can be included in these existing areas, or added on as new areas as deemed relevant--this is just about getting the monetization conversation going.

There are two main objectives in this exercise: 1) understanding the hard costs and value associated with operations, and 2) assembling it all into a coherent list so that we can explain it to others as part of transparency efforts. When it comes to the business of public data, it is about more than just generating revenue, it is about being upfront and honest about why we are doing this, and how it is done--mitigating the political risk involved with doing business with public resources out in the open.

Putting Together A Working Plan Involving Public Data
With an understanding of the hard costs of providing a public data platform and an awareness of the intended value to be generated via operations, we can now look at what details would be involved in a plan for executing this monetization strategy. API management practices are architected for metering, measuring, and limiting access to data, content, and algorithmic resources in service of a coherent, transparent public data monetization strategy. 

Here is a core framework of API management that can be applied to public data that can be used to drive monetization efforts:

  • Access - What are the default access levels for public data access via the API.
    • Self-Service - Public access to the platform via user registration, or 3rd party authentication like Twitter, Facebook, or Github.
    • Approval - Access level(s) that require the approval of user or application before they are able to access platform resources.
  • Tiers - What are the multiple tiers of access to all data resources available via API.
    • Public - Define the default public access for the platform, with a free, limited access tier that is obtainable via a self-service registration without approval.
    • Contributor - Providing a tier of access to contribute content, and validate and manage data on the platform.
    • Service Provider - Providing a tier of access for service providers involved with public data operations.
    • Internal - Access tier for internal groups, used by all websites, mobile applications, and system integrations.
    • Partner - Tier(s) of access designed for data, and other partners involved in the management of public data resources.
    • Commercial - Access tier(s) for commercial usage of public data resources with higher levels of access for set fees.
    • Non-Commercial - Access tier(s) for non-commercial usage of public data resources, with fees waived for specific access.
    • Government - A set of API resources is available specifically for government access.
    • Auditor - Access across APIs specifically designed for external 3rd party auditors.
  • Elements - What are the core elements that make up the service composition for the monetization plan(s).
    • Paths - Establishing plans based upon the availability of, and access to, different API endpoints, including the platform meta APIs.
    • Read / Write - Restricting read and write access to public data to specific tiers, limiting who writes data to only trusted applications.
  • Time Frames - What are the timeframes that impact the public data / API monetization plan(s) and consumption.
    • Daily - What are the details for managing, guiding, and restricting plan entries each day.
    • Weekly - What are the details for managing, guiding, and restricting plan entries in weekly timeframes.
    • Monthly - What are the details for managing, guiding, and restricting plan entries on a monthly basis.
  • Metrics - What is being measured to quantify value generated, providing a framework to understand monetization possibilities.
    • API Calls - Each call to the API is measured, providing the cornerstone of monetizing access and contribution to public data--remember not all calls will cost, some will add value with contributions.
    • URL Clicks - Each click on a URL served up via API-driven data and content is measured, providing details on the value delivered to internal and external websites--a URL shortener is required for this.
    • Searches - All searches conducted via the API are measured, providing details on what users are looking for.
    • Users - Measure user acquisitions and history to keep track of the value of each platform user.
    • Applications - Measure the number of applications added, with details of activity to understand value generated.
  • Limits - What are the limitations imposed across all tiers of access as part of the API monetization plan.
    • API Calls - How many API calls any single application can make during a specific time frame.
    • IP Address - Which IP addresses you can request data from, limiting the scope of serving data.
    • Spend - How much any user can spend during a given time period, for each user or application.
  • Pricing - What prices are set for the different aspects of monetizing the platform.
    • Tiers - What are the base prices established for each tier of API access.
    • Unit - What are the default unit prices of per API call access for each tier.
    • Support - What charges are in place for receiving support for platform applications.
    • SLA - What costs are associated with delivering specific quality or levels of service and availability?

These are the moving parts of a public data monetization strategy. It allows any public data resource to be made available on the web, enabling self-service access to data 24/7. However, it does this in a way that requires accountability from ALL consumers, whether they are internal, partner, or the public at large. This API management scaffolding allows for frictionless access to public data resources by the users and applications that are identified as worthwhile, while imposing limits, and fees, for higher volume and commercial levels of access.
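Pulled together, the framework above is really just configuration for the API management layer. A sketch of what a couple of these plans could look like as code, with hypothetical tier names, limits, and prices:

    # Hypothetical service composition expressing the plan framework above.
    PLANS = {
        "public": {
            "self_service": True, "approval": False,
            "paths": ["/organizations/", "/locations/", "/services/"],
            "write": False, "rate_limit_per_day": 1000, "price_per_call": 0.0,
        },
        "commercial": {
            "self_service": False, "approval": True,
            "paths": ["/organizations/", "/locations/", "/services/", "/meta/"],
            "write": True, "rate_limit_per_day": 100000, "price_per_call": 0.001,
        },
    }

    def allowed(plan, path, write=False):
        """Check whether a plan grants access to a path, and write access if needed."""
        p = PLANS[plan]
        return path in p["paths"] and (p["write"] or not write)

    print(allowed("public", "/services/"))               # True
    print(allowed("public", "/services/", write=True))   # False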

Speaking To A Wide Audience With This Public Data Monetization Research
I purposely wrote this document to speak to as wide an audience as possible. In my experience working with public data across numerous different industries, there can be a wide variety of actors involved in the public data stewardship pipeline. My objective is to get more public data accessible via web APIs, and generating the revenue to help fund this is one of my biggest concerns. I am not looking to incentivize people to make unwarranted profits on top of public data--this is already going on. My goal is to open up the silos of public data out there right now, make them more accessible, while opening up the opportunity for delivery to a variety of applications, while also funding this important process.

I wanted to help anyone reading this to craft a coherent argument for generating much-needed revenue from public data, whether they are trying to convince a government agency, non-profit organization, institution, or a commercial company. Public data needs to be available in a machine-readable way for use in a variety of applications in 2016--something that takes resources and collaboration. APIs are not another vendor solution, they are the next step in the evolution of the web, where we don't just make data available for humans by publishing as HTML--we need the raw data available for use in many different applications. 


Exploring The Economics of Wholesale and Retail Algorithmic APIs

I got sucked into a month long project applying machine learning filters to video over the holidays. The project began with me doing the research on the economics behind Algorithmia's machine learning services, specifically the DeepFilter algorithm in their catalog. My algorithmic rotoscope work applying Algorithmia's Deep Filters to images and drone videos has given me a hands-on view of Algorithmia's approach to algorithms, and APIs, and the opportunity to think pretty deeply about the economics of all of this. I think Algorithmia's vision of all of this has a lot of potential for not just image filters, but any sort of algorithmic and machine learning API.

Retail Algorithmic and Machine Learning APIs
Using Algorithmia is pretty straightforward. With their API or CLI you can make calls to a variety of algorithms in their catalog, in this case their DeepFilter solution. All I do is pass them the URL of an image, what I want the new filtered image to be called, and the name of the filter that I want to be applied. Algorithmia provides an API explorer you can copy & paste the required JSON into, or they also provide a demo application for you to use--no JSON required. 
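For those wanting to see what this looks like in code, here is a sketch of the DeepFilter call using Algorithmia's Python client. The exact algorithm version and payload fields may have changed since I did this work, so treat it as illustrative rather than authoritative:

    # A sketch of calling Algorithmia's DeepFilter from Python.
    import Algorithmia

    client = Algorithmia.client("YOUR_API_KEY")  # hypothetical placeholder key

    payload = {
        "images": ["http://example.com/photo.jpg"],             # source image URL
        "savePaths": ["data://.algo/temp/photo_filtered.jpg"],  # where to save the result
        "filterName": "far_away",                               # which style filter to apply
    }

    # The algorithm path and payload shape here reflect their docs at the time.
    algo = client.algo("deeplearning/DeepFilter")
    result = algo.pipe(payload).result
    print(result)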

Training Your Own Style Transfer Models Using Their AWS AMI
The first "rabbit hole" concept I fell into when doing the research on Algorithmia's model was their story on creating your own style transfer models, providing you step by step details on how to train them, including a ready to go AWS AMI that you can run as a GPU instance. At first, I thought they were just cannibalizing their own service, but then I realized it was much more savvier than that. They were offloading much of the costly compute resources needed to create the models, but the end product still resulted in using their Deep Filter APIs. 

Developing My Own API Layer For Working With Images and Videos
Once I had experience using Algorithmia's deep filter via their API, and had produced a handful of my own style transfer models, I got to work designing my own process for uploading and applying the filters to images, then eventually separating out videos into individual images, applying the filters, then reassembling them into videos. The entire process, start to finish is a set of APIs, with a couple of them simply acting as a facade for Algorithmia's file upload, download, and DeepFilter APIs. It provided me with a perfect hypothetical business for thinking through the economics of building on top of Algorithmia's platform.

Defining My Hard Costs of Algorithmia's Service and the AWS Compute Needed
Algorithmia provides a pricing calculator along with each of their algorithms, allowing you to easily predict your costs. They charge you per API call, and for compute usage by the second. Each API has its own calculator, and average runtime duration costs, so I'm easily able to calculate a per image cost to apply filters--something that grows rapidly when you are applying it to 60 frames (images) per second of video. Similarly, when it comes to training filter models using an AWS EC2 GPU instance, I have a per hour charge for compute, storage costs, and (now) a pretty good idea of how many hours it takes to make a single filter.
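The frame math is what makes video so much more expensive than single images. A quick sketch of the arithmetic, using a hypothetical per image price:

    price_per_image = 0.002   # hypothetical per image filter cost from the calculator
    frames_per_second = 60
    video_seconds = 30

    frames = frames_per_second * video_seconds   # 1,800 frames
    video_cost = frames * price_per_image        # $3.60
    print(f"{frames} frames -> ${video_cost:.2f} to filter a {video_seconds} second clip")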

All of this gives me some pretty solid numbers to work with when trying to build a viable business on top of Algorithmia. In theory, when my customers use my algorithmic rotoscope image or video interface, as well as the API, I can cover my operating costs, and generate a healthy profit by charging a per image cost for applying a machine learning texture filter. What I really think is innovative about Algorithmia's approach is that they are providing an AWS AMI to offload much of the "heavy compute lifting", with all roads still leading back to using their service. It is a model that could quickly shift algorithmic API consumers from being just retail level API consumers to being more wholesale / volume consumers.

My example of this focuses on images and video, but this model can be applied to any type of algorithmically fueled API. It provides me with a model of how you can safely open source the process behind your algorithms as an AWS AMI, and actually drive more business to your APIs by evolving your API consumers into wholesale API consumers. In my experience, many API providers are very concerned with malicious users reverse engineering their algorithms via their APIs, when in reality, in true API fashion, there are ways you can actually open up your algorithms, make them more accessible, and deployable, while still helping contribute significantly to your bottom line.


Investing In Your API Community Like Amazon And Slack

The messaging platform Slack made waves when they launched their Slack Fund as part of their API release, putting up $80M to invest in developers who were interested in building cool things on the API. Slack has continued telling their story, talking about how they have invested some of the fund and are being pretty transparent about which API integrations made the cut. 

After reading about the Slack Fund I wanted to see which other APIs were investing in their communities like this. Next, I came across the Alexa Fund, where Amazon is investing in developers building voice-enabled apps, giving their Echo platform the boost of applications it will need to find success. After poking around the AWS platform, you also find they have had their AWS Grants program, investing in the best of breed projects that use the AWS platform to give back to the world.

I came across a number of other announcements about available funds, only to find the pages gone, but MailChimp had an interesting accounting of their fund, which they started in November of 2010, pledging $1M to "provide developers with funding assistance to build integrations on our email platform". I also found another developer fund out of Cisco, to encourage development with Cisco Spark and Tropo, and the other APIs they provide at Cisco DevNet, that was noteworthy.

There are not enough examples of API investment funds out there for me to add the concept to what I'd consider the common API building blocks, but there are enough out there, especially from leading pioneers, to keep me paying attention. API developer funds don't seem like something where you have to go as big as AWS or Slack did, but it seems to me, if you provide a small fund like MailChimp did, it could incentivize development significantly--at least ensuring your launch isn't met with crickets.


Adding A 3rd Dimension To My API Monetization Thinking

When it comes to the API space, it always takes numerous conversations with API providers and practitioners before something comes into focus for me. I've spent five years having API management conversations, an area that is very much in focus for me when it comes to my own infrastructure, as well as a metric I use when reviewing the other public and private APIs that I look at regularly.

While I have been paying attention to API monetization for a couple years now (thank you @johnmusser), in 2015 I find myself involved in 20+ conversations, forcing the topic to slowly come into focus for me, whether I like it or not. When talking to companies and organizations about how they can generate revenue from their APIs, I generally find the conversation going in one of two directions:

  • Resource - We will be directly monetizing whatever resource we are making available via the API. Charging for access to the resource, and composing of multiple access tiers depending on volume, and partnerships.
  • Technology - We believe the technology behind the API platform is where the money is, and will be charging for others to use this technology. Resulting in a growing wholesale / private label layer to the API economy. 

90% of the conversations I engage in are focused in the first area, and how to make money off API access to a resource. The discussion is almost always about what someone will pay for a resource, something that is more art than science--even in 2015. The answer is, we don't know until there is a precedent, resulting in an imbalance where developers expect things for free, and providers freak the fuck out--then call me. ;-)

As more of my thoughts around API monetization solidify, a third dimension is slowly coming into focus, one that won't exist for all API providers (especially those without consumers), but is something I think should be considered as part of a long term roadmap.

  • Exhaust - Capture, and make accessible the logs, usage, tooling, and other resources that are developed and captured through the course of API operations, and make available in a self-service, pay as you go approach.

There are many ways you can capture the exhaust around API operations, and sell access to it. This is where the ethics of APIs come into play--you either do this right, or you do it in a way that exploits everything along the way. This could be as simple as providing an API endpoint for accessing search terms executed against an API, all the way to providing a franchise model around the underlying technology behind an API, with all the resources someone needs to get up and running with their own version of an API. If you are very short-sighted this could be just about selling all your exhaust, behind the scenes to your partners and investors.
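The search terms example is about as simple as exhaust gets. Here is a sketch of how those terms might be aggregated out of the API management logs into something you could serve up via an endpoint, using made up data:

    from collections import Counter

    # In practice these would be pulled from the API management logging layer.
    raw_queries = ["food bank", "shelter", "food bank", "job training", "food bank"]

    def search_term_exhaust(queries):
        """Aggregate raw search queries into the payload an exhaust API could serve."""
        return Counter(queries).most_common()

    print(search_term_exhaust(raw_queries))
    # [('food bank', 3), ('shelter', 1), ('job training', 1)]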

To me this is all about stepping back, and looking at the big picture. If you can't figure out a theoretical, 3rd dimension strategy for making money off the exhaust generated by the resource you are making available via an API, and the underlying technology used to do so--there probably isn't a thing there to begin with. And if you can't do this in an ethical way, one that you will want to talk about publicly, and with your grandmother, you probably shouldn't be doing it in the first place. I'm not saying there isn't money to be made, I'm saying there isn't real value, and money to be made, that also lets you sleep at night.

This opens up a number of ethical benchmarks for me. If you are looking at selling the exhaust from everything to your VC partners, and never open it up via your public APIs, you probably are going to do very well in the current venture capital (VC) driven climate. What I'm talking about is how you generate a new layer of revenue based upon the legitimate exhaust that is generated from the valuable resource you are making available, and the solid technological approach that is behind it. If there is really something there, and you are willing to talk about it, and share publicly, the chances I'm going to care, and talk about it on my blog, increase dramatically.

If you do not have a clue what I'm talking about, you probably aren't that far along in your API journey. That is fine. This isn't a negative. Just get going as soon as you can. If you are further along, and have people and companies actually using your API, there is probably a lot of value already being generated. If you partner with your ecosystem, and educate, as well as communicate with end-users properly--I am just highlighting that there is a lot of opportunity to be captured in this 3rd dimension.


Are There Really Any Monetization Opportunities Around Open Data And APIs?

One of my readers recently reached out to me, in response to some of my recent stories of monetization opportunities around government and scientific open data and APIs. I'm not going to publish his full email, but he brought up a couple of key, and very important realities of open data and APIs that I don't think get discussed enough, so I wanted to craft a story around them, to bring front and center in my work.

  • Most open data published from government is crap, and requires extra work before you can do anything with it
  • There currently are very few healthy models for developers to follow when it comes to building a business around open data
  • Business people and developers have zero imagination when it comes to monetization -- aka ads, ads, ads, and more ads.

My reader sums it all up well with:

I don't dispute that some pieces of government data can be integrated into existing businesses, like real estate, allowing a company to add value. But the startup space leveraging raw open, or paid government data is a lot harder. Part of my business does use paid government data, accessible via an API, but these opportunities the world over are few and far between in my mind.

I think his statement reflects the often unrealized challenges around working with open data, but in my opinion it also reflects the opportunity of the API journey, when applied to this world of open data.

APIs do not equal good, and if you build a simple API on top of open government data, it does not equal instant monetization opportunity as an API provider. It will take domain experts (or aspiring ones) to really get to know the data, make it accessible via simple web APIs, and begin iterating on new approaches to using the open data to enrich web and mobile applications in ways that someone is willing to pay for.

Taking an open data set, cleaning it up, and then monetizing access to it directly via an API is simply not a reality, and is something that will work in probably less than 5% of the scenarios where it is applied. However this doesn't mean that there aren't opportunities out there when it comes to monetizing adjacent to, and in relationship to the open data.

Before you can develop any APIs that any business or organization would want to pay for, you have to add value. You do this by adding more meaningful endpoints that do not just reflect the original data or database resources, and provide actual value to end users of the web and mobile applications being built--this is the API journey you hear me speak of so often.

You can also do this by connecting the dots between disparate data-sets, in the form of crosswalks, and by establishing common data formats that can be applied across local, and regional governments, or possibly an industry. Once common data formats and interface models are established, and a critical mass of high value open data exists, common tooling can begin to evolve, creating opportunities for further software, service, and partnership revenue models.
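
To make the crosswalk idea a little more concrete, here is a minimal sketch in Python, using two hypothetical city inspection datasets--all of the field names, sources, and records are made up for illustration, not pulled from any real government data.

    # A crosswalk maps each source's local field names onto one shared,
    # common format, so records from different cities become comparable.
    CROSSWALKS = {
        "city_a": {"biz_name": "name", "insp_date": "inspected_at", "score": "score"},
        "city_b": {"establishment": "name", "date_of_inspection": "inspected_at", "grade": "score"},
    }

    def to_common_format(record, source):
        # Rename the source's fields to the shared schema's fields.
        mapping = CROSSWALKS[source]
        return {common: record[local] for local, common in mapping.items()}

    # Two differently shaped records collapse into one comparable shape.
    print(to_common_format({"biz_name": "Joe's", "insp_date": "2015-06-01", "score": 92}, "city_a"))
    print(to_common_format({"establishment": "Al's", "date_of_inspection": "2015-06-02", "grade": 88}, "city_b"))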

The illness that exists when it comes to the current state of open data is something partly shared between early open data advocates, who over-promised the potential of open data and under-delivered themselves, and governments, who under-delivered when it came to the actual roll-out and execution of their open data efforts. Most of the data published cannot be readily put to work, requiring several additional steps before the API journey even begins--making more work for anyone looking to develop around it, putting up obstacles, instead of removing them.

There is opportunity to generate revenue from open data published by government, but it isn't easy, and it definitely isn't a VC scale opportunity. But for companies like HDScores, when it comes to selling aggregate restaurant inspection data to companies like Yelp, there is a viable business model. Companies that are looking to build business models around open data need to temper their expectations of being the next Twitter, and open data advocates need to stop trumpeting that open data and APIs will fix all that is wrong with government. We need to lower the bar, and just get to work doing the dirty work of exposing, cleaning up, and evolving how open data is put to work.

It will take a lot of work to find more of the profitable scenarios, and it will take years and years of hard work to get government open data to where it is the default, and the cleanliness and usefulness levels are high enough, before we see the real potential of open data and APIs. All this hard work, and the shortage of successful models, doesn't mean we shouldn't do it. For example, just because we can't make money providing direct access to the Recreation Information Database (RIDB), doesn't mean there aren't potentially valuable APIs when it comes to understanding how people plan their summer vacations at our national parks--it will just take time to get there.

My Adopta.Agency project is purely about the next steps in this evolution, and making valuable government "open data" that has been published as CSVs and Excel files more accessible and usable, by cleaning it up and publishing it as JSON and / or APIs. I am just acknowledging how much work there is ahead of us when it comes to making the currently available open data accessible and usable, so we can begin the conversation about how we make it better, as well as how we generate revenue to fund this journey.
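
For a sense of how simple, and how janitorial, this first step usually is, here is a minimal sketch of the CSV-to-JSON cleanup pass in Python--the file name and the header quirks are hypothetical, and every real dataset needs its own version of this work.

    # Read a published CSV, normalize its ragged headers, and write JSON.
    import csv
    import json

    def csv_to_json(csv_path, json_path):
        with open(csv_path, newline="", encoding="utf-8") as f:
            rows = []
            for row in csv.DictReader(f):
                # Lowercase headers, swap spaces for underscores, trim values.
                clean = {k.strip().lower().replace(" ", "_"): (v or "").strip()
                         for k, v in row.items()}
                rows.append(clean)
        with open(json_path, "w", encoding="utf-8") as f:
            json.dump(rows, f, indent=2)

    csv_to_json("agency_data.csv", "agency_data.json")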


Catching My Breath On My API Monetization Ramblings Before I Enter Into Some New Conversations

I have two more conversations kicking off on the topic of API monetization, so I just needed to take a moment to gather up the last wave of posts on the subject, catch my breath, and refresh my overall thoughts in the area. What I really like about this latest wave is that these conversations are about providing much needed funding for some potentially very important API driven resources. Another thing is that they involve pretty complicated, unproven approaches to monetizing APIs--breaking ground!!

Over the last couple weeks, I have been engaged in four specific conversations that have shifted my gaze to the area of API monetization:

  • Audiosear.ch - Talking with the PopupArchive team about making money around podcast search APIs.
  • Department of Interior - Providing feedback on the Recreation Information Database (RIDB) API initiative.
  • Caltech Wormbase - Helping organize a grant effort to fund the next generation of research from Wormbase, and other scientific databases.
  • HDScores - Mapping out how HDScores is funding the efforts around aggregating restaurant inspection data into a single, clean API.

As I think through the approaches above, I'm pushed to apply what I can from these discussions to my own infrastructure:

  • My API Monetization - As I look to add more APIs to my stack, I'm being forced to clearly define all the moving parts of my API monetization strategy.
  • Blockchain Delusions - While thinking again about my API pricing and credit layer, I'm left thinking about how the blockchain can be applied to API monetization.

The API Evangelist network is my research notebook. I search, re-read, and refine all the thoughts curated, and published here. It helps me to aggregate areas of my research, especially in the fast moving areas, where I am receiving the most requests for assistance. Not only does it refresh my memory of what the hell I've written in the last couple weeks, I also hope it gives you a nice executive summary in case you missed anything.

If you are looking for assistance in developing your API monetization strategy, or have your own stories you'd like to share, let me know. If you have any feedback on my stories, including feedback for the folks I'm talking to, as well as items missing from my own API monetization approach, or blockchain delusions--let me know!


Expanding On My API Monetization Strategy And Research

This is a full walk-through of me trying to distill down my approach to API monetization, in a way that can be applied across not just 30 APIs, but potentially 300, or 3000. There are several things converging for me right now, which includes the maturing of my own infrastructure, as well as conversations I'm having with startups, enterprise groups, federal government agencies, and my own partner(s).

I need to keep hammering on this to help me operate my own infrastructure, but I am also partnering with APIWare to help me deliver on much of the API design, deployment, and management, so I need to have a good handle on my costs. As with all of my areas of research, within the area of API monetization I am just trying to get a handle on the common building blocks, and provide a checklist of considerations to be employed when I'm planning and operating my API infrastructure.

To help me build a base, let's walk through some of the building blocks of my own API monetization strategy.

Acquisition
What do I have invested in any single API? Even if I am building something from scratch, what went into it? Every API I possess has some sort of acquisition cost, even if it is just $14.00 for the two pints of beer I bought while I was brainstorming the idea.

  • Discover - What did I spend to find this? I may have had to buy someone dinner or beer to find it, as well as spend time on the Internet searching, and connecting the dots.
  • Negotiate - What time do I have invested in actually getting access to something? Most of the time it happens on the Internet, but other times it requires travel, and meeting with folks.
  • Licensing - There is a chance I would license a database from a company or institution, so I want to have this option in here. Even if this is open source, I want the license referenced as part of acquisition.
  • Purchase - There is also the chance I may buy a database from someone outright, or pay them to put the database together, resulting in a one-time fee, which I'm going to call "purchase".

Having a framework for thinking about the acquisition of each API resource I possess makes it easier to think things through when I am brainstorming new API ideas. It makes sure I am tracking all the details from the moment of inception, to when I commit to actually making a resource available via an API on my platform.
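
Something like this can live as a simple record alongside each API. Here is a minimal sketch in Python, with a hypothetical API name and made-up numbers, showing the four acquisition building blocks above rolled up into a single cost:

    # One acquisition record per API, itemized by building block.
    ACQUISITION = {
        "screen-capture-api": {
            "discover": 14.00,   # the two pints of beer while brainstorming
            "negotiate": 0.00,
            "licensing": 0.00,   # open source, but the license is still referenced
            "purchase": 0.00,
        },
    }

    def acquisition_cost(api):
        # Total one-time acquisition cost for a single API resource.
        return sum(ACQUISITION[api].values())

    print(acquisition_cost("screen-capture-api"))  # 14.0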

Development
What does it actually take to stand up an API? There are a lot of moving parts in making an API happen, and not all of them are technical. Am I willing to invest the time necessary to stand up an API, or will it require outside investment, as well as resources? What is needed to take an API from acquisition to actual operation?

  • Investment - Who put up the money to support the development of this API resource? Was it internal, or did we have to take external investment?
  • Grant - Was the development of this API rolled up in a larger grant, or was there a grant specifically for its development, covering the costs involved?
  • Normalization - What does it take me to clean up, and normalize a dataset, or across content? This is usually the busy janitorial work necessary.
  • Design - What does it take me to generate a Swagger and API Blueprint definition, something that isn't just auto-generated, but also has the hand polish it requires?
  • Database - How much work am I putting into setting up the database? A lot of this I can automate, but there is always a setup cost involved.
  • Server - Defining the amount of work I put into setting up, and configuring the server to run a new API, including where it goes in my overall operations plan.
  • Coding - How much work do I put into actually coding an API? I use the Slim PHP framework, and using automation scripts I can usually generate 75% of it, but there is always finish work.
  • DNS - What was the overhead in defining, and configuring the DNS for any API, setting up the endpoint domain, as well as potentially a portal to house operations?

Historically when it came to APIs, I just dove into writing code with little consideration for what went into it. One by-product of the microservices way of thinking is that I have decoupled the moving parts of each of my APIs, allowing me to approach development in this way. I'm sure I will keep slicing off other elements within the development process as I progress.
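
The development costs can hang off the same record as acquisition. Here is a minimal sketch, again with a hypothetical API and made-up numbers (and in Python, purely for illustration, rather than the PHP I actually run):

    # One development record per API, itemized by the blocks above.
    DEVELOPMENT = {
        "screen-capture-api": {
            "investment": 0.00,       # internally funded, no outside money
            "grant": 0.00,
            "normalization": 250.00,  # the busy janitorial cleanup work
            "design": 100.00,         # hand polish on the Swagger / API Blueprint
            "database": 50.00,
            "server": 50.00,
            "coding": 150.00,         # the ~25% finish work automation doesn't cover
            "dns": 25.00,
        },
    }

    def development_cost(api):
        # Total one-time development cost for a single API resource.
        return sum(DEVELOPMENT[api].values())

    print(development_cost("screen-capture-api"))  # 625.0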

Operation
What goes into keeping an API operational, reliable, and available? How much do I spend on all aspects of an existing API's lifecycle to make sure it meets the standards of API consumers? Ideally operational costs go down the more efficient the platform gets with overall operations, reducing overhead, and streamlining across everything.

  • Definition - How many resources am I applying to creating and maintaining APIs.json, Swagger, and API Blueprint definitions for my APIs?
  • Compute - What percentage of my AWS compute is dedicated to an API? A flat percentage of the server it's on, until usage history exists.
  • Storage - How much on-disk storage am I using to operate an API? This could fluctuate from month to month, and increase exponentially for some.
  • Bandwidth - How much bandwidth in / out is an API using to get the job done?
  • Management - What percentage of API management resources is dedicated to the API? A flat percentage of API management overhead until usage history exists.
  • Code - What does it cost me to maintain my code samples, libraries, and SDKs for each individual API, or possibly across multiple APIs and endpoints?
  • Evangelism - How much energy do I put into evangelizing any single API? Did I write a blog post, or am I buying Twitter or Google Ads? How is the word getting out?
  • Monitoring - What percentage of the API monitoring, testing, and performance service budget is dedicated to this API? How large is the surface area for monitoring?
  • Security - What does it cost for me to secure a single API, as part of the larger overall operations? Do internal resources spend time on it, or is this a 3rd party service?
  • Virtualization - What am I spending on virtualization for an API, as part of the QA process, for retail sandbox and simulation environments, or for specific partner needs?

Ideally, the more APIs you operate, the more detail you will get about each of these areas, and in some of them you should get better deals the more volume you run through each area listed above. An example of this would be compute and storage costs going down as we do more business. The more we understand the details of operations, the more we can optimize them.
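
The flat percentage allocation I mention above is easy to sketch out. Assuming some hypothetical shared monthly bills, until an API has real usage history each one just gets an even share:

    # Shared monthly platform costs, split evenly across all APIs until
    # per-API usage history exists to allocate them more precisely.
    SHARED_MONTHLY = {
        "compute": 160.00,    # AWS instances
        "storage": 20.00,
        "bandwidth": 35.00,
        "management": 50.00,  # API management layer
        "monitoring": 25.00,
    }

    def monthly_operating_cost(api_count):
        # Even split of the shared costs across every API on the platform.
        return sum(SHARED_MONTHLY.values()) / api_count

    print(round(monthly_operating_cost(30), 2))  # 9.67 per API, across 30 APIs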

Access Levels
What sort of access levels are we going to provide across ALL APIs? Not all APIs will use all areas, but we should be ready for as many scenarios as we possibly can. We need to be clear about what access the free layer provides, as well as the tiers of access, and any wholesale, partner, or re-seller aspects.

  • Free (unlimited) - This is just a free API; I won't be rate limiting the usage of it. It will act similar to any website I put out there, but instead of HTML it is JSON.
  • Free Trial - I am only going to offer limited usage, or a limited time period for accessing a resource, giving just a taste test, but it won't be in the main pool of APIs available.
  • Not For Profit - This API is being subsidized somehow. Either there is direct investment from internal or external resources to subsidize it, or there is a grant involved.
  • Educational Access - Is this API available as an educational resource, with special pricing for students and teachers? This will usually be reflected in the tier, and credit availability.
  • Tier(s) - Which of these service tiers is an API available in, and which endpoint paths + verbs are accessible in the tier (api-pricing definition).
    • Public - General access, you usually don't even need a key. Only limited to specific APIs.
    • Retail - This is the pay as you go level for public access to all APIs. This is where the retail side of business operations will occur.
    • Trusted - These are just a handful of trusted individuals or companies, who may have write access to some endpoints.
    • Education - Providing a specific access tier for education partners, including students, teachers, and institutions. Providing higher levels of free access, and lower price points.
    • Partner - These are partners I have prearranged agreements with, something I will be transparent about, showcasing them on a partner page.
    • Wholesale - The wholesale, often non-rate limited portion of my platform, where I deploy APIs in other people's infrastructure, or our own, for flat fees.
    • Platform - This is all internal access by applications I build for my own usage. I still rate limit, and manage this access, I just give myself different privileges.
  • Partner Program - A structured program allowing API consumers to achieve higher levels of access, with reduced pricing levels, flat rate access, and other benefits.
  • Reseller Program - A structured program for allowing API consumers to prove themselves, and share in revenues from API usage, affiliate links, and revenue share.

My intent around access levels is to be as transparent as possible. Not all users will access at all levels, and not all APIs, and their endpoints, will be available at all access levels. The goal is to optimize access, and remain as open as makes sense, while also sensibly monetizing resources to cover costs, and make a fair profit.
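
Tiers like these can be expressed as data, right down to which endpoint paths + verbs each tier can touch, echoing the api-pricing definition idea above. A minimal sketch, with hypothetical tier contents and paths:

    # Each tier lists whether keys are required, and which verb + path
    # combinations it unlocks. All paths here are hypothetical.
    TIERS = {
        "public":  {"keys_required": False, "paths": {"GET /images"}},
        "retail":  {"keys_required": True,  "paths": {"GET /images", "GET /definitions"}},
        "trusted": {"keys_required": True,  "paths": {"GET /images", "POST /images"}},
    }

    def can_access(tier, verb, path):
        # Check a single call against the tier's allowed verb + path list.
        return f"{verb} {path}" in TIERS[tier]["paths"]

    print(can_access("retail", "POST", "/images"))   # False
    print(can_access("trusted", "POST", "/images"))  # True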

Pricing & Credits
I am employing a universal credit system that will be used by all APIs. The goal is to expand the units of currency I employ beyond just API calls, and attach a universal unit of value that can be applied across all APIs. API consumers will be given a certain amount of API credits to be used each day, as well as be able to buy and sell credits at various rates.

  • API Value - Each API will have its own credit rate set, where some will be one credit, while others may be 100 credits to make a single call; it can be defined per API, or per endpoint.
  • Daily Limit - The daily allowed credit limit will be determined by the access level tier a consumer is registered at, starting with daily free public access, up through retail, trusted, and potentially custom tiers.
  • Usage - How many credits does any one user use during a day, week, or month, across all APIs? When each API is used, it will apply the defined credit value for the single API call.
  • Incentive - How can the platform give credits as an incentive for use, or even pay credits for writing to certain APIs, enriching the system, or driving traffic?
  • Purchase - What does it cost to buy a credit, something that could fluctuate from day to day, week to week, or month to month.
  • Buyout - Allow API consumers to get paid for the credits on their account; preferably all users are encouraged to spend credits, but buyout is an option.
  • Discounts - Can we give discounts when you buy credits through specific channels, promotions, or other types of planned deals?
  • Volume - Are there volume discounts for buying credits, allowing consumers to purchase credits in bulk when they need to, and apply them when they desire?
  • Applying - Can you wait to apply credits you have accumulated? You are given the option with each billing cycle to apply them, or you may want to wait and use them at a future date.

I envision credits being the lifeblood of the API monetization strategy for my platform, and would love to see the approach spread beyond any single API ecosystem, and be something that all providers could put to work. The benefits of this would be seen by both API providers, as well as consumers, helping us establish a currency for the API economy.
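
The mechanics of the credit layer are simple enough to sketch. Assuming hypothetical per-API credit rates and per-tier daily limits, metering a single call looks something like this:

    # Hypothetical credit rates per call, and daily limits per tier.
    CREDIT_RATE = {"image-api": 1, "search-api": 100}   # credits per call
    DAILY_LIMIT = {"retail": 1000, "education": 5000}   # credits per day

    def charge(credits_used_today, tier, api):
        # Apply one call's credit cost; refuse once the daily limit is hit.
        cost = CREDIT_RATE[api]
        if credits_used_today + cost > DAILY_LIMIT[tier]:
            raise RuntimeError("daily credit limit reached")
        return credits_used_today + cost

    used = 0
    used = charge(used, "retail", "search-api")  # one search call burns 100 credits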

Indirect Value Generation
What value is generated via API operations that isn't directly monetized, but drives value in other ways? These indirect value generators are often overlooked, and under-showcased areas of operation, which often results in API failure--always showcase the buzz.

  • Marketing Vehicle - Having an API is cool these days, and some APIs are worth just having for the PR value, and extending the overall brand of the platform.
  • Web or Mobile Traffic - The API exists solely for distributing links to web and mobile applications, driving traffic to specific properties - is this tracked as part of other analytics?
  • Brand Awareness - Applying a brand strategy, and using the API to incentivize publishers to extend the reach of the brand and ultimately the platform - can we quantify this?
  • Data & Content Acquisition - Using the API, and the applications built on top as part of a larger data and content acquisition strategy--can we quantify this?

I could see data and content acquisition grow into an area we can better quantify soon. We could put a base value on each resource in the system, and figure out how much each resource grows in size, and quality over time. Applying value to these indirect areas is something I'd like to expand upon in future iterations.
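
As a starting point for that quantification, here is one very rough sketch--the base value, growth figure, and quality score are all made up, and the formula itself is just an illustration of the idea, not a proven model:

    # Value a resource by its base value, its growth in records, and a
    # 0-1 quality score. Purely illustrative numbers and weighting.
    def resource_value(base_value, record_growth, quality_score):
        return base_value * (1 + record_growth) * quality_score

    # A $100 base resource that grew 25% in size, at 0.9 quality.
    print(resource_value(100.00, 0.25, 0.9))  # 112.5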

Partner Revenue Generation
Ideally any platform should be sharing the revenue and value exhaust generated via the ecosystem, providing revenue opportunities for web and mobile application developers. There are a handful of ways revenue can be shared via API operations.

  • Link Affiliate - What revenue is generated and paid out via links that are made available via the API, with potentially external affiliate links embedded.
  • Revenue Share - What portion of API revenue is paid out to potential re-sellers who drive API usage. Revenue is a percentage of overall credit purchases / usage.
  • Credits to Bill - All revenue is paid in credits on account, and users can decide to get bought out of credits at any time, or possibly use them in other aspects of system operation.

I will be expanding on these areas in the future, as I play with ways to incentivize content or data creation, or just drive API consumption well into the paid tiers. Right now many API platforms I study are essentially sharecropping plantations, reaping the value generated from developer activity. In the future, developers should be incentivized with cash and credit to help achieve platform monetization goals, which is something I want to explore via my own API resources when I have the bandwidth.
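
The revenue share math itself is trivial--here is a sketch with a hypothetical 10% rate, just to show where the number comes from:

    # A re-seller earns a cut of the credit revenue their referred
    # consumers generate. The share rate and figures are hypothetical.
    def reseller_payout(credit_revenue, share_rate=0.10):
        return round(credit_revenue * share_rate, 2)

    print(reseller_payout(450.00))  # 45.0 paid out on $450 of credit revenue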

Internal Revenue Generation
Where are we making money? What revenue is generated across the platform, and then what are the breakdowns? I want to understand who my rockstar users and applications are, something that isn't isolated to external users. I am looking to craft all of my applications as individual citizens within the API ecosystem, measuring and limiting what type of access they have, and treating them like any other consumer on the platform.

  • Monthly - How much revenue is being brought in on a monthly basis for an API and all of its endpoints.
  • Users - How much revenue is being brought in on a monthly basis for a specific user, for an API and all of its endpoints.
  • Applications - How much revenue is being brought in on a monthly basis for a specific application, for an API and all of its endpoints.
  • Tiers - Which tiers generate the most usage and revenue? This should apply just as easily to aspects of platform / internal usage as well.
  • Affiliate Revenue - What was generated from affiliate links made available via APIs, minus what percentage was paid out to API consumers.
  • Advertising Revenue - What advertising revenue was derived from web or mobile application traffic resulting from the API, minus whatever was paid out as rev share to API consumers.

The goal of my platform is not simply to make money. Sure I like making money, but I'm looking to flesh out a reproducible framework to hang each API on, and make sense of it as part of my larger API platform operations. Not all APIs will be created equally, but I should be able to equally measure what it costs to develop and operate each one, and apply consistent ways of generating revenue around its use.
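
The breakdowns above all roll up from the same raw usage records. A minimal sketch, with hypothetical APIs, users, applications, and revenue figures:

    # Roll raw revenue records up by any dimension: api, user, or app.
    from collections import defaultdict

    RECORDS = [
        {"api": "image-api",  "user": "alice", "app": "gallery", "revenue": 12.50},
        {"api": "image-api",  "user": "bob",   "app": "gallery", "revenue": 3.25},
        {"api": "search-api", "user": "alice", "app": "finder",  "revenue": 40.00},
    ]

    def monthly_breakdown(records, key):
        totals = defaultdict(float)
        for record in records:
            totals[record[key]] += record["revenue"]
        return dict(totals)

    print(monthly_breakdown(RECORDS, "user"))  # {'alice': 52.5, 'bob': 3.25}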

All of this looks intimidating when you scroll back through. However, my goal is to produce a standardized pricing page that can exist across all of my API ecosystem(s), which are growing in number, and prompting me to think in this way. I need a better handle on my costs, and ultimately to be able to generate more revenue to keep a roof over my head, food on the table, and my AWS bill paid.

While I only have a single API portal right now, I'm preparing to deploy a specific collection using APIs.json, and publish it as version 2.0 of my API Evangelist developer portal. I'm also looking to immediately publish a few other API portals, designed to support various collections or stacks of APIs available in my network (images, API definitions, etc.). I need a standard way to deliver on-boarding and pricing for the APIs, and this backend framework gives me the v1 approach to that.

Each API that I launch will have a pricing page, with each of the available service tiers as a box, and within each box it will list how many free credits you get each day, and other available features like the per credit rate beyond the daily allowed limit, support elements, and other details relevant to that tier. There should also be links to more detail about partner, re-seller, and wholesale options for each API portal I launch. The API consumer never sees all of this. This framework is for me to hang each API upon, and think it through in the context of the overall API lifecycle and platform operations.
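
Since every pricing page draws from the same tier definitions, generating those boxes can be automated. A minimal sketch, with hypothetical free credit counts and per credit rates:

    # Render one pricing box per tier from shared tier data, so every
    # portal's pricing page stays consistent. Numbers are hypothetical.
    TIER_BOXES = [
        {"tier": "retail",  "free_daily_credits": 100,  "per_credit_beyond": 0.010},
        {"tier": "trusted", "free_daily_credits": 1000, "per_credit_beyond": 0.005},
    ]

    def render_pricing(boxes):
        lines = []
        for box in boxes:
            lines.append(f"{box['tier'].title()}: {box['free_daily_credits']} "
                         f"free credits / day, ${box['per_credit_beyond']:.3f} per credit after")
        return "\n".join(lines)

    print(render_pricing(TIER_BOXES))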

I'm applying this outline to the 30 APIs I have in my stack, and then also applying it to a handful of new data APIs I'm working on. Along the way I will flesh it out a little more, before I get to work on some of the more advanced pieces like partner and re-seller programs. I'm not a big fan of advertising, but I do have some single page apps that perform pretty well, and it wouldn't be too intrusive to have some advertising on them. All of these SPAs are driven by my APIs, and they often exist as tools across my API driven content network as well.

This post will be published to my API monetization research, and this list will be published as common building blocks that can be considered as part of any API monetization strategy. It makes me happy to see this portion of my research finally move forward, and evolve, especially since it's based upon my own platform goals, as well as my wider monitoring and review of the space.


If you think there is a link I should have listed here, feel free to tweet it at me, or submit it as a GitHub issue. Even though I do this full time, I'm still a one person show, and I miss quite a bit, and depend on my network to help me know what is going on.