I've spent the better part of the last six years thinking about where web standards come from. Before joining USV, I was at the (now retired) urban tech incubator OpenPlans, where, among other things, we worked to further "open" technology solutions, including open data formats and web protocols. The two biggest standards we worked on were GTFS, the now ubiquitous format for transit data, including routes, schedules and real-time data for buses and trains; and Open311, an open protocol for reporting problems to cities (broken streetlights, potholes, etc) and asking questions (how do I dispose of paint cans?). Each has its own origin story, which I'll get into a little bit below.

Last week, I wrote about "venture capital vs. community capital" (i.e., the "cycle of domination and disruption") -- and really the point of that talk was the relationship between proprietary platforms and open protocols. My point in that post was that this tension is nothing new; in fact it is a regular part of the continuous cycle of the bundling and unbundling of technologies, dating back, well, forever.

Given the emergence of bitcoin and the blockchain as an application platform, it feels like we are in the midst of another wave of energy and effort around the development and deployment of web standards. So we are seeing a ton of new open standards and protocols being imagined, proposed, and developed. The key question to be asking at this moment is not "what is the perfect open standard?", but rather, "how do these things come to be, anyway?"

Joi Ito talks about the Internet as "a belief system" as much as a technology, and part of how I interpret that is the fact that it rests on the idea of everyone just agreeing to do things kind of the same way. So, we don't all need to run the same computers, use the same ISP, or be members of a common club (social network) -- rather, all we need to do is adhere to some common protocols (HTTP, SMTP, etc).
No one owns the protocols (by and large) -- they are more like "customs" than anything else. It works because we all agree to do more or less the same thing. So when we're looking at all these new protocols appearing (from openname, to ethereum, to whatever), the question is not just "is this a good idea?" but rather "how might everyone agree to do this?". It's a political and social problem as much as a technical problem. And more often than not, there is some sort of "magic" involved that is the difference between "cool idea" or "nice whitepaper" and "everyone does it this way".

Here is a crack at bucketing a few of the major strategies I've observed for bringing standards to market. (These are not necessarily mutually exclusive, and are certainly not complete -- I would love to find other patterns and examples.)

Update: The Old-Fashioned Way

Max Bulger makes a good point on Twitter that I have neglected to include the traditional, formal methods of developing web standards -- through standards bodies like the W3C and the IETF. That's how many standards get made, but not all. For this post, I want to focus on hacks to that traditional process.

The Brute Force Approach

One way to bring a standard to market is to simply force it in, using your market position as leverage. Apple has been doing this for decades, most recently with USB-C, and two decades ago with the original USB.
Word on the street is that USB-C was less of a consensus-driven standards body project and more of an Apple hand-off. Time will tell, but now that USB-C is the port to beat all ports in the 12-inch MacBook, it could become the single standard for laptop and mobile/tablet ports. You can do this if you're huge (see also: Microsoft and .doc, Adobe and .pdf).

The Happy Magnet Approach

I mentioned the GTFS standard, which is now the primary way transit agencies publish route, schedule, and real-time data. GTFS came to be through work between Google and Portland's TriMet back in 2005, as a collaboration to get Portland's transit data into Google Maps -- so they created a lightweight standard as part of that. Then, Google used "hey, don't you want your data in Google Maps?" as the happy magnet to draw other agencies (often VERY reluctantly) into publishing their data in GTFS as well. Here's a diagram I made back in 2010 to tell this story:
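Part of why GTFS spread is that it is deliberately lightweight: a feed is just a zip of plain CSV files (stops.txt, trips.txt, stop_times.txt, and so on) that anyone can produce from a spreadsheet. A minimal sketch of reading one of those files -- the trip ID, stop IDs, and times below are invented for illustration, not real TriMet data:

```python
import csv
import io

# A GTFS feed is a zip of CSV files; stop_times.txt links trips to stops.
# The trip_id, stop_ids, and times here are made up for illustration.
stop_times_txt = """trip_id,arrival_time,departure_time,stop_id,stop_sequence
4501,08:00:00,08:00:00,PIONEER_SQ,1
4501,08:04:00,08:04:30,GALLERIA,2
4501,08:09:00,08:09:00,PGE_PARK,3
"""

rows = list(csv.DictReader(io.StringIO(stop_times_txt)))

# Reconstruct the ordered list of stops for one trip
trip = sorted((r for r in rows if r["trip_id"] == "4501"),
              key=lambda r: int(r["stop_sequence"]))
stops_in_order = [r["stop_id"] for r in trip]
print(stops_in_order)  # ['PIONEER_SQ', 'GALLERIA', 'PGE_PARK']
```

That low barrier to entry mattered: an agency could publish a valid feed without buying new software, which made the "happy magnet" pull much easier to follow.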
This approach includes elements of the Brute Force approach -- you need to have outsized leverage / distribution to pull it off. It's also worth noting that GTFS won the day (handily) vs. a number of similar formats that were being developed by the formal consortia of transit operators. I remember talking to folks at the time who had been working on these other standards, who were pissed that Google just swept in and helped bring GTFS to market. But that's exactly the point I want to make here: a path to market is often more important than a perfect design.

The Awesome Partner Approach

Not knowing the whole story behind Creative Commons, it seems to me that one of the huge moments for that project was its partnership with Flickr to bring CC-licensed photos to market -- giving photographers the ability to tag with CC licenses, and giving users the ability to search by CC. CC was a small org, but it was able to partner with a large player to get reach and distribution.

The Make-them-an-offer-they-can't-refuse Approach

Blockchain hacker Matan Field recently described the two big innovations of bitcoin as 1) the ledger and 2) the incentive mechanism. The incentive mechanism is the key -- bitcoin and similar cryptoequity projects have a built-in incentive to participate: give (compute cycles) and get (coins/tokens). While the Bitcoin whitepaper could have been "just another whitepaper" (a future blog post is needed on that -- aka the open standards graveyard), it had a powerful built-in incentive model that drew people in.

The Bottom-up Approach

At our team meeting on Monday, we got to discussing how OAuth came to be. (For those not familiar, OAuth is the standard protocol for allowing one app to perform actions for you in a different app -- e.g., allow this app to post to Twitter for me, etc.)
According to the history on Wikipedia, OAuth started with the desire to delegate API access between Twitter and Ma.gnolia, using OpenID, and from there a group of open web hackers took the project on -- first as an informal collaboration, then as a more organized discussion group, and finally as a formal proposal and working group at the IETF. From being around the folks working on this at the time, it felt like a very organic, bottom-up situation: less a theoretical, top-down need and more a simple, practical solution to a point-to-point problem that grew into something bigger.
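The core idea OAuth standardized -- let an app act for a user without ever holding the user's password -- can be sketched as a toy, in-memory flow. This is a simplification under stated assumptions (all the class names, stores, and tokens below are invented; real OAuth adds signatures or redirects, token expiry, and a proper wire protocol):

```python
import secrets

# Toy model of OAuth-style delegation: a third-party app receives a token
# scoped to act for a user, without ever seeing the user's credentials.
# Everything here is a hypothetical sketch, not the real OAuth protocol.

class AuthServer:
    def __init__(self):
        self.codes = {}    # one-time authorization codes -> (user, scope)
        self.tokens = {}   # access tokens -> (user, scope)

    def authorize(self, user, scope):
        """User consents; server issues a short-lived code for the app."""
        code = secrets.token_hex(8)
        self.codes[code] = (user, scope)
        return code

    def exchange(self, code):
        """App trades the one-time code for an access token."""
        user, scope = self.codes.pop(code)   # pop: the code is single-use
        token = secrets.token_hex(16)
        self.tokens[token] = (user, scope)
        return token

    def act(self, token, action):
        """An API call made with the token, limited to the granted scope."""
        user, scope = self.tokens[token]
        if action != scope:
            raise PermissionError("token not scoped for " + action)
        return f"{action} as {user}"

server = AuthServer()
code = server.authorize("alice", "post_tweet")   # user clicks "allow"
token = server.exchange(code)                    # app redeems the code
result = server.act(token, "post_tweet")         # app acts for alice
print(result)  # post_tweet as alice
```

The point-to-point problem (Twitter wants to let another app post for a user) and the general solution (scoped, revocable delegated credentials) are visible even in this tiny sketch -- which is roughly the shape of the practical need that grew into the standard.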