Standards
It's often assumed that standards work is inherently competitive. This post examines why Internet standards are often more collaborative than competitive, and outlines some implications of this approach.
The phrase 'Open Standards' is widely used but not well understood. Let's take a look at what openness in standards means, with a focus on whether and how it helps to legitimise the design and maintenance of the Internet.
It’s common for voluntary technical standards developing organisations (SDOs such as the IETF and W3C) to make decisions by consensus, rather than (for example) voting. This post explores why we use consensus, what it is, how it works in Internet standards and when its use can become problematic.
No one requires tech companies or open source projects to use most Internet standards, and no one requires people to use them either. This post explains why this voluntary nature is critical to the Internet's health.
RFC 9518: Centralization, Decentralization, and Internet Standards has been published after more than two years of review, discussion, and revision.
There are lots of ways to view what Internet standards bodies like the IETF and W3C do. This post examines them as a type of regulator and explores what that means for how they operate.
A big change in how the Internet is defined - and who defines it - is underway.
The Internet Architecture Board (IAB) has published RFC 8890, The Internet is for End Users, arguing that the Internet Engineering Task Force (IETF) should ground its decisions in what’s good for people who use the Internet, and that it should take positive steps to achieve that.
It’s become common for Web sites – particularly those that host third-party or user-generated content – to make a “safe” mode available, where content that might be objectionable is hidden. For example, a parent who wants to steer their child away from the rougher corners of the Internet might go to their search engine and put it in “safe” mode.
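A minimal sketch of how a client might signal such a preference in HTTP, assuming the “safe” preference later published as RFC 8674 (the URL here is a placeholder):

```python
import urllib.request

# Ask for the site's "safe" mode by sending the preference on the
# request (the "safe" preference, RFC 8674).
req = urllib.request.Request(
    "https://www.example.com/",
    headers={"Prefer": "safe"},
)
with urllib.request.urlopen(req) as resp:
    # A server that honours it echoes the preference back (RFC 7240).
    print(resp.status, resp.headers.get("Preference-Applied"))
```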
For better or worse, Requests for Comments (RFCs) are how we specify many protocols on the Internet. These documents are alternately treated as holy texts by developers who parse them for hidden meanings, and shunned as irrelevant because they can’t be understood. This often leads to frustration and – more significantly – interoperability and security issues.
The IESG has approved “HTTP Alternative Services” for publication as a Proposed Standard.
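As a rough illustration of what the mechanism does: an origin can advertise other endpoints that serve it, and clients may read and cache that advice. A minimal sketch of reading it in Python (the header value in the comment is hypothetical):

```python
import urllib.request

# Alternative Services (published as RFC 7838) lets an origin advertise
# another endpoint for itself in a response header, e.g.
#   Alt-Svc: h2="alt.example.com:443"; ma=86400
# i.e. also reachable over HTTP/2 at that host/port for 86400 seconds.
with urllib.request.urlopen("https://www.example.com/") as resp:
    print(resp.headers.get("Alt-Svc"))
```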
Last night, we had a screening of CITIZENFOUR at the IETF meeting in Prague, and about 170 people showed up to see the movie about Edward Snowden’s revelations — the information that led the IETF to declare pervasive monitoring an attack on the Internet itself.
Yesterday at IETF92 in Dallas, we had a “Bar BoF” (i.e., informal meeting) about improving the behaviour and handling of Captive Portals — those login pages that you have to click through to get onto networks in hotels, airports, and many other places.
A few months ago I went to the Internet Governance Forum, looking to understand more about the IGF and its attendees. One of the things I learned there was a different definition of “intermediary” — one that I think the standards community should pay close attention to.
Recently, one of the hottest topics in the Internet protocol community has been whether the newest version of the Web’s protocol, HTTP/2, will require, encourage or indeed say anything about the use of encryption in response to the pervasive monitoring attacks revealed to the world by Edward Snowden.
The NSA PRISM story broke while I was on the road; last week I was in Tokyo for W3C meetings, then moved on to San Francisco for an HTTP meeting and Velocity.
It used to be that when you registered a media type, a URI scheme, an HTTP header or another protocol element on the Internet, it was an opaque string that served as a unique identifier, nothing more.
The Stockholm IETF meeting is shaping up to be an interesting one (and not just because it’s in such a beautiful city).
Over the past few weeks the Free Software Foundation has had its knickers in a twist about TLS authentication — specifically, its patent encumbrance;
It’s become quite fashionable for large IT shops to give blanket Royalty-Free licenses for implementation of “core” technologies, such as XML, Web Services and Atom. I’ll refrain from linking to any of them, as the purpose of this post* is not to pick on any single one**.
Everyone seems to be gushing about Microsoft’s Open Specification Promise. While any headway is good in the horrible landscape that is Intellectual Property, my initial reaction is that it — like most such vendor promises — is too little, too late.
Don Box (whose blog doesn’t seem to be taking comments any more, so I’ll comment over here) points out some very cool technology he’s using, Microsoft’s Office Communicator. Sounds very slick, I’m jealous (with my old-tech phone line and last year’s GSM mobile)!
In his blog, Sean McGrath wonders about two potentially competing faces of standards: extensibility and interoperability.
I’ve had a fairly large and annoying bee in my bonnet for the past few months, regarding media type registration. It started buzzing when I tried (and failed) to register a media type for RSS, and has continued to grow as I attempt to do the same for SOAP, on behalf of the XML Protocol Working Group.
Looks like a good to-read list:
John Beatty: Economics of Standards
I agree with just about everything that Jim Waldo says here (at least for protocol standards). Well said!
Finally, the IESG puts its money where its mouth is; this tool allows you to see the status and individual Area Directors’ comments on a particular I-D. It’s only a start, but at least you have some idea of what’s going on, instead of being left out in the cold.