I'm sitting here in Derek Featherstone's amazing a11y talk at Fronteers and I feel like I need to follow up the last post with a quick primer on the zen of function for (both of) the spec authors who read this blog.
The reason it's offensive to the JS hacker for WebIDL to disallow new against DOM types -- any of them -- is that it means WebIDL is no longer specifying how you'd describe these types in the primitive we use over here in JS for this sort of thing: functions. This might sound nuts to folks who come from C++ or who spend their time in spec-ese, but in JS semantics there's no difference between a plain-old function, a "constructor", and a "partial" or "mixin". We use functions for all of them. You can say new function(){ ... } and create an instance of an anonymous "class" in JS. You can take the same function and invoke it as a "regular function" -- (function(){ ... })(); -- and you can use that sort of function as a mixin or partial interface too: new function(){ (function(){ ... }).call(this); ... }. The exact same function object can even act in all of these capacities (although that's rare). People use them as they need to, but they all boil down to functions.
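Lest that sound abstract, here's the whole shtick in a handful of lines -- a sketch, with Point as an invented name:

// One function object, three capacities:
function Point(x, y) { this.x = x; this.y = y; }

var p = new Point(1, 2);   // 1. as a constructor: p instanceof Point === true

var obj = {};
Point.call(obj, 3, 4);     // 2. as a mixin/partial: obj now has x and y

Point(5, 6);               // 3. as a plain call; in sloppy mode this writes to
                           //    the global object -- legal, if unwise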
What, then, does it mean for something to disallow new against a type for which you can otherwise get an instance in JS? The same thing as when you can't .call() it: it's alien. It's not a class as we know it, which means that it's not a function, and if it's not a function...well, it doesn't belong. Fundamentally, it's smuggling static semantics into a language that has perfectly good dynamic semantics for the same thing. This strikes at the very heart of what WebIDL is supposed to be for: describing JS types for things implemented somewhere else. By not allowing new and .call(), WebIDL is giving JS semantics the bird, asserting that the fact that these things aren't JS types makes them better in some way...and that is a bug, either in the perspective of the spec authors or of the specs themselves.
Luckily, the fix is easy: all WebIDL types should de-sugar to functions. All of them. All the time. No questions asked. That you will be able to use new and .call() and all the rest isn't a bug, and it's not something to guard against. It's just how JavaScript rolls...and how JavaScript's largest, most important library should roll too.
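What would that de-sugaring look like? Something like this sketch -- browser globals assumed, native allocation hand-waved, and (to be clear) not how any current engine actually exposes things:

// Sketch: a function-shaped HTMLDivElement. This shadows the real
// binding; a real engine would still allocate native backing state.
function HTMLDivElement() { /* native element setup elided */ }
HTMLDivElement.prototype = Object.create(HTMLElement.prototype);

var d = new HTMLDivElement();   // construction: just works
var divLike = {};
HTMLDivElement.call(divLike);   // mixin-style invocation: also works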
For those who haven't been following the progress of WebIDL -- and really, how could you not? An IDL? For the web? I'd like to subscribe to your newsletter... -- the standard is now in last call, which is W3C-speak for "alllllllllllmost done".
Which it is not.
Before I get to why, let me first say some nice, well-earned things about WebIDL: first, it has pulled us out of the ad-hoc IDL sludge in which APIs for JavaScript used to be exposed. It has shaved off many sharp edges and gives spec authors a single dialect in which to write their API descriptions. From a browser perspective, this is a Very Good Thing (TM). Next, the draft in question contains some wonderful changes from the status quo, particularly the addition of a sane prototype to all WebIDL-specified objects.
That all sounds good, so what's missing?
In a word, constructors.
Well, a lot more than that, but I'd settle for constructors. Functionally speaking, it boils down to the fact that WebIDL makes spec authors do extra work to make something like this sane:
new HTMLDivElement();
Why doesn't this work today? Funny story...see, HTML defines HTMLDivElement as a regular WebIDL interface. WebIDL doesn't really have the notion of concrete classes, just interfaces with and without constructors. Since the HTML spec is just doing what most specs will do -- adding the smallest IDL you can get away with -- the JS usability of this API is left in limbo; neither clearly HTML5's responsibility nor WebIDL's.
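Concretely -- and hedging on the exact error text, which varies by engine:

var div = document.createElement('div');  // the blessed path: works everywhere
div instanceof HTMLDivElement;            // true, so instances clearly exist

var div2 = new HTMLDivElement();          // throws in today's browsers
                                          // (e.g., "Illegal constructor")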
So what should a conscientious version of HTML5 do? One answer is to specify a constructor, turning the IDL from this:
interface HTMLDivElement : HTMLElement {};
to this:
[Constructor]
interface HTMLDivElement : HTMLElement {};
Repeat ad infinitum for each and every interface that should be constructable in every single spec that browser vendors ever implement. Don't miss any! And please make sure that all your spec editors are on board with good JS APIs as a goal! As it stands today, WebIDL doesn't even force most spec authors to consider the question "do I need a constructor here?" -- spoiler: yes -- let alone the obvious follow-ups like "what arguments should it take?".
The obvious better answer here is to flip the default on interfaces, causing them to generate constructors by default unless turned off with [NoConstructor] attributes or specified as partial interfaces (i.e., mixins or traits).
Cameron McCormack, who is heading up the WebIDL effort, tweeted in response to my exasperation that:
I think a "W3C Web API design guidelines" document would be a perfect place for such a recommendation.
For serious? Such a document might be useful (and I'm working on something that might pass as a first draft), but what's the argument against flipping the default here? There's no dissent on the facts of the situation: most WebIDL "interfaces" that are exposed to JS are things that could easily be new'd up to useful ends. Most specs flub this in spectacular style. Most spec authors seem entirely ignorant of the problem, and the design language of WebIDL continues to lead down a primrose copy-and-paste path that has little overlap with sanity. So why punt the decision? And why did it take an act of coordination with TC39 to get the prototype thing fixed?
And Why Are We Having This Discussion Anyway?
WebIDL, for all of its virtues, is deeply confused.
Reading the parts of the HTML5 spec that describe its API this way, it's hard to see how any of it is meant to have a sane relationship to JavaScript. Sure, you could argue that there might be other languages that matter, other languages for which you'd need to be able to generate some API, but none of them rise to anything like the importance of JavaScript. It is the programming language of the web, so if WebIDL has any animating force at all, it's JS. Then there's the "accident of history" aspect. Early DOM was specified as a form of IDL in part because there was some expectation that other languages would need to consume it, and IDL was how C++ hackers (who still make up the entire set of people working on browser engines) are/were comfortable describing their FFIs, thanks to the legacy of COM/CORBA. Hilarious examples of multi-language-ism still persist in the WebIDL spec for no apparent reason whatsoever, warping the entire design around the altar of an ideal that is either quixotic or vestigial, depending on which argument you give more weight.
Since the issue was re-kindled by a debate at a TC39 meeting in July, I've been on the receiving end of more than one webdev's rant about DOM's JS incoherence, generally taking the form:
Why the *#!*?^@$ isn't DOM just #@!*@ing specified in JavaScript?
To be honest, I have no answer, aside from pointing to the IDL history, noting that browser hackers don't tend to write JS (so they don't feel the pain), and granting that WebIDL is better in some important ways. Certainly these interfaces could be specified in a subset of JS with annotations for where native behavior is required -- see the sketch below. But the larger lament has merit too: seamlessness with JS is the bar WebIDL should be judged by. I.e., does it help spec authors do the right thing by JS devs? Or does it lead them down paths that make their generated APIs stick out like sore thumbs, full of anti-social/alien behavior, such that you can't think of them as "regular" JS?
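Here's a hedged sketch of what that annotated-subset style might look like; the @native marker and every detail here are invented for illustration, not drawn from any existing proposal:

// Hypothetical spec-in-JS style. "@native" is an invented annotation
// marking where host behavior is required.
function HTMLDivElement() {
  HTMLElement.call(this);  // inheritance stated in plain JS terms
  // @native: bind this object to the engine's internal div representation
}
HTMLDivElement.prototype = Object.create(HTMLElement.prototype);
HTMLDivElement.prototype.constructor = HTMLDivElement;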
Yes, constructors are only one minor step toward this aspiration, but the fact that WebIDL has gotten to last call without a reasonable solution for them speaks volumes. If WebIDL isn't animated by the needs of JS developers, it would be good if that could be loudly called out somewhere so that the community can have the spirited debate this point demands. If it is, can we please get on with discussing how best to ensure that most "interfaces" generate constructors and stop punting?
Either way, WebIDL isn't done yet.
Update: It occurred to me, as part of the discussion in the comments, that the provision against new for any class or type is completely nonsensical in JS, as is the lack of call() and apply() methods on them. Idiomatic subclassing requires that the mixin style be available, which uses ClassName.call(this). This is what you'd do with things that are "virtual" or "partial" interfaces if you were describing them in actual JS. And there's no real problem with folks new-ing them up. Doesn't happen, doesn't matter. Strong restrictions against it are, to quote Andrew DuPont, anti-social. Long story short: there is absolutely no reason whatsoever to disable construction on any WebIDL interface. It's deeply nonsensical from the JavaScript perspective.
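For the record, the idiom in question -- a minimal sketch with invented names:

// Pre-ES6 subclassing; the superclass must support .call() for this to work.
function Base(name) { this.name = name; }

function Derived(name, size) {
  Base.call(this, name);  // mixin-style super invocation -- exactly what
                          // WebIDL's restrictions forbid for DOM types
  this.size = size;
}
Derived.prototype = Object.create(Base.prototype);
Derived.prototype.constructor = Derived;

var d = new Derived("div-ish", 2);  // d.name === "div-ish", d.size === 2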
I keep getting distracted from writing this long thing by responding to the discussion created by the last short-ish thing, but I wanted to explicitly call out one aspect, namely that standards are a form of insurance.
More correctly -- and apologies if this sounds like a Planet Money episode -- vendors sell derivatives contracts (insurance) against the proprietary nature of technologies; i.e., a hedge against the winner-take-all dynamics of network effects and the potential for monopoly rent extraction. Adopters of a technology often refuse to buy it at all without the availability of a derivative to cover their risk of lock-in.
The nice bit about these contracts is that the results (the price) are freely available to the public -- ITU and other old-skool orgs are exceptions -- meaning anyone who wants to implement can assess what it'll cost, and if they can provide a solution at a lower price to end-users, they can compete under the same terms. The public nature of the derivative contract can have the effect of lowering the price of the good itself in markets that clear and have many participants.
Standards organizations are correctly understood as middle-men or market-makers for these contracts.
Update: It may seem strange to some that derivatives contracts (insurance) can drive prices down, but it happens in many industries. E.g., satellite insurance has done wonders for the commercial launch vehicle industry. And who would fund the construction of new cargo ships if you couldn't convince enough people to ship their precious goods on them? It's insurance that does the convincing. Insurance matters.
Karl Dubost asked what a plan would look like for a W3C split along the lines I proposed in my last post. It's a fair question, so let me very quickly sketch out straw men, while noting that I would support many alternative plans as well. The shape of the details might matter, but not as much as movement in the right direction, and I have faith that the W3C members and staff would do a good job of executing on any such plan. Here are two general forms it could take, both of which seem workable to me:
Spin-Off With Cross-licensing
Taking many of the W3C's activities to a different organization might be traumatic for some members due to IPR concerns. If the members agree that this is the most important consideration, a pure split could be effected, leaving smaller, parallel organizations with identical legal arrangements and identical IPR policies in place. Member organizations would, in a transition period, decide to move their membership to the spin-off, stay with the W3C, or join both (perhaps at a discount?). A contract between the new organizations would allow perpetual cross-licensing of applicable patent rights.
The risk to this plan is that it may not be clear that a single new organization would be appropriate -- remember, we're talking about many strange bedfellows under the current W3C umbrella -- so perhaps this would only work for whatever center of gravity exists in the spun-off concerns. Smaller efforts (Java-centric APIs, etc.) may want to find other homes.
Merger
One or more of the sub-efforts on the block may wish to find a new home in an existing standards organization. For these efforts there is likely to be a larger mismatch between processes, IPR policies, etc. than a spin-off would entail, but the benefits to both receivers and members are clear: a viable long-term home is hard to build, but coming home to "your people" isn't nearly as difficult. XML standards, for instance, have a long history at OASIS, and it's unlikely that many members who care primarily about XML at the W3C aren't also active there. Transfer of IPR in this case is likely to be more fraught, so contracts licensing from the W3C to these new organizations would also need to include clauses requiring the receiving orgs to exercise processes similar to the ones currently in place for the standards whose ownership they receive. The contracts would also need to stipulate that royalty-free (RF) licensing never be endangered for new versions of divested specs. I haven't thought hard about transfer of membership under this scenario, but pro-rated and discounted membership terms at all merging organizations -- but only for divested activities -- might work.
Whatever path a breakup takes, the W3C should listen to each of the non-web communities it's spinning off and work to ensure that their needs are met in the process. There are lots of details I don't have time to think through right now, but none of the ones I can identify seem like deal-breakers. The biggest missing piece is the political will to make something happen in the first place.
This is a quick thought as I'm working on something much longer (and hopefully more interesting) to be published in the next week or so. Also, I just want to re-iterate that what's said here are my own thoughts and not those of Google unless expressly stated otherwise.
I've been arguing -- mostly over beers -- for the last year or so that the W3C needs to find a way to re-focus on the needs of the constituencies that give it credibility; namely, web developers and browser vendors. This tends to fall on frustrated W3C staffer ears: individuals might agree, but the organization feels the effects of its failures not as a question about how to re-focus around a credible mission but as a short-run funding shortfall. I'm excited that the new Community Groups have the potential to help fix some of this by allowing a freer flow of ideas and more representative expressions of demand, but there's the lingering pay-for-play issue that I think is liable to drag the organization back to the status quo as soon as the excitement and newness of Community Groups wear off.
Now, let me say clearly up front that I don't think that standards bodies are charities, nor should they be. People join (in the case of the W3C, "people" == "companies", an equivalence that works as US legal fiction but never in the real world) because they're expressing their economic interests in agreement on thorny, market-focused issues. The W3C, as an organ, has grown out of an IETF alternative into a critical mediator in the flow of ideas and technology in the web ecosystem. When it fails, we're starved of progress because nothing else can easily grow in its place. Bless Hixie and the WHATWG for trying, but the W3C sits on fertile, spec- and patent-rich land. It's tough growing in the rocky crags, particularly without a patent policy.
So, in the spirit of reform, not revolution, I have a proposal, complete with Swiftian modesty:
At this year's TPAC, it should be agreed that the W3C will divest itself of any and all Semantic Web, RDF, XML, Web Services, and Java related activities. SVG can be saved, but only if it re-charters to drop all XML dependencies in the next version.
Where should these specs and member organizations go to get their itches scratched? I'm not sure it matters, although it seems likely that OASIS wouldn't mind being the go-to place for XML. And there's always ECMA, or IETF, or ISO, or...you get the idea.
Why do such a drastic thing?
Well, it's not that drastic, really. The SemWeb folks use the term "web" in a hopeful, forward-looking way that nobody else does. Their technology is showing promise inside firewalls, but their work has near zero relationship to the actual, messy web we swim in every day. The result is that their narrow constituency isn't the one that animates whatever legitimacy the W3C might enjoy, and therefore their influence in the W3C is out of all proportion to their organizational and web-wide value. Sir Tim's belief in their goal is noble, but to use the suffix "web" on it today is simply wishful thinking. And things that don't have anything to do with the web probably don't belong at the W3C. Besides, the W3C imprimatur doesn't help these specs and groups as much as their advocates might think -- standards bodies aren't venues for creating the future, after all, only for cleaning it up in response to the messy, explosive process of market-driven evolution. Anyway, being endlessly frustrated that the entire web hasn't changed to accommodate some new model just because it has the phrase "web" in the name can't be fun. Technologies need to find their success where they really do fit; distributed extensibility and semantics might be valuable...but they haven't been to the web yet. It needs to go.
As for XML, well, it won the enterprise and lost the web. That's still success, but it's not the web.
Rinse, repeat for RDF, Web Services, and Java.
What would that leave us with? CSS, DOM & JS APIs, HTML, a11y, i18n, and all the other stuff that has legs out on the public web. More to the point, it would have the beneficial effect of re-focusing the organization around getting HTML5 done, getting DOM APIs done that don't sacrifice JavaScript usability on the altar of IDL "language neutrality", etc.
Organizationally, it would leave the W3C in a much leaner, more focused place. It's much easier to build a focused constituency and membership, and the lower cost structure would only help the organization weather hard times. I have high hopes that it can be brutally effective again...but to get there, it needs to focus, and focus is the process of ignoring things. The time has come for the W3C to grab the mantle of the web, shake off its self-doubt, and move to a place where doing good isn't measured by the number of specs and activities, but by impact for web developers.