Tuesday 11 May 2010
Discontent On The Web
I just read Sachin Agarwal's post titled "The web sucks. Browsers need to innovate" and my head exploded. But I can see that he's not alone in his views.
His basic thesis --- at least, the part that makes my head explode --- is that standardization efforts are slowing down innovation on the Web and therefore browsers should just provide whatever APIs they want and make no effort to standardize. Web authors should target particular browsers and then market pressure will force other browsers to implement those APIs.
Well, we tried that and it didn't work. It was called IE6. There were several problems, but the main one is that the last step --- post-facto cloning of APIs based on reverse-engineering the dominant browser --- is absolutely horrible in every way. It's expensive, slow and error-prone, and it leads to a crappy platform because the dominant browser's bugs *are* the standard, so you're stuck with whatever insane behaviour the dominant browser happens to have.
It's also highly prone to creating monopolies due to network effects. Agarwal supposes that users will just install several browsers and view each site using the browser it works in. But that won't happen, due to corporate policies, platform limitations (how many browsers can you use on the iPad, again?), and sheer inconvenience. Inevitably one dominant browser will rise, the others will find it impossible to keep up with the reverse-engineering, and it will be IE6 all over again. Keep in mind that if Firefox --- and standards --- hadn't broken down the IE6 hegemony, Safari on the iPhone would be useless and the iPhone probably wouldn't be where it is today.
This sentiment is ill-timed. For a long time we had to burn a lot of energy to recover from the damage of the IE6 era. Now we are really in a new era of innovation on the Web. There is far more investment in browsers than there ever was before, and we're expanding what Web apps can do faster than ever. Agarwal complains
Web applications don't have threading, GPU acceleration, drag and drop, copy and paste of rich media, true offline access, or persistence.
but we've already shipped HTML5 offline apps, drag and drop, and Web Workers in Firefox; we've got WebGL and GPU-accelerated everything in the works (some already in nightly builds); and WebSimpleDB is coming along for persistent storage.
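To make the threading point concrete, here is a minimal sketch of my own (not something from Agarwal's post) of the Web Workers API; the worker script name and the message contents are made up for illustration:

    // Main page: spin up a background thread. 'prime-finder.js' is a hypothetical script.
    var worker = new Worker('prime-finder.js');

    // Results arrive asynchronously, so the UI never blocks on the computation.
    worker.onmessage = function (event) {
      console.log('Largest prime found: ' + event.data);
    };

    // Hand the worker its input.
    worker.postMessage(1000000);

    // prime-finder.js would run entirely off the main thread, e.g.:
    //   onmessage = function (event) {
    //     var limit = event.data, best = 2;
    //     for (var n = 3; n <= limit; n += 2) {
    //       var isPrime = true;
    //       for (var d = 3; d * d <= n; d += 2) {
    //         if (n % d === 0) { isPrime = false; break; }
    //       }
    //       if (isPrime) best = n;
    //     }
    //     postMessage(best);
    //   };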
One area where we do have a problem is ease of development, especially tools. Agarwal is right about that.
Parting shots:
Right now browser updates fix bugs and add application features, but can't enhance the functionality of the web. This is only done by standards boards.
This is just incorrect. Browsers add functionality on their own initiative all the time --- hopefully prefixed so that developers know it's non-standard.
Browsers are forced to implement every "standard" that is agreed on, even if it's not the best decision for the platform.
Not at all true. Right now I'm explaining to people that we don't want to implement SVG fonts because they're almost entirely useless.
Browsers don't add functionality outside of standards because developers wouldn't utilize them. This means they can't innovate.
Untrue; see the first point. But we get the best results when standards are developed in parallel with implementations.
Browsers don't even comply with standards well. Developing for the web is a disaster because every browser has its own quirks and issues. They can't even do one thing right.
But it's constantly getting better, especially as old Microsoft browsers go away. Plus you can't complain about cross-browser compatibility and advocate targeting one browser at the same time.
When GMail launched in 2004, it took one step forward and 10 steps backwards from the mail application I was using. Even today, the major features GMail is releasing are simply trying to match the features I've had on the desktop for years.
And yet, people have migrated to Web-based email en masse. Why is that?
I think this is the tipping point for the web. The modern web had over 10 years to reach parity with desktop applications, and it couldn't even hit that. Now it faces extinction as innovation in native applications accelerates.
The Web didn't exactly fail while it was miles behind native apps. Now it's a lot closer and moving a lot faster.
Comments
Ah! So that's why Firefox doesn't pass Acid3. Good to know :)
Although it's a bit of bad publicity, to be honest. Stupid people who think Google Chrome is lighter than Firefox care about blunt Acid3 scores, and ignore that Firefox is the browser that best shows the content of the web... But I'm not sure we want that kind of people using Firefox... After all, higher market share isn't necessarily beneficial...
The other obvious problem with the thesis is that there is no longer one single browser that could sway website authors to use browser-specific technologies.
Sachin wants browsers to create standards. That is not a bad idea; it is something that was happening, is happening, and may yield very useful ideas.
Standardization really does slow down progress, but its advantage is that it makes new things available to everybody, not only to an elite. Slower progress, but wider penetration. He is right. You are right. It depends on the problem you are solving.
BTW, half of your article is about proving that browsers do take plenty of initiative outside of the standards and also don't follow every standard (SVG fonts).
I think that the truth is exactly in the middle. We cannot allow one or two browsers to dominate, and we need healthy progress that bubbles up from the bottom into the standards. (We've learned from the XHTML failure, haven't we?)
roc, the issue is that they are useless, but standardized. Other people have their own big lists of "useless" things that you (and web developers) probably find useful. It's difficult to satisfy everyone.
There are dozens of -moz-* and -webkit-* selectors that are effectively innovations that have not gone through a standards process. One would hope that they eventually would become standard.
Innovation needs to start somewhere, and it needs to start with a prototype. You can't really start with an idea and have people agree on it before you write a line of code. The bike shed and all that.
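To illustrate how pages typically cope with the -moz-* and -webkit-* properties mentioned above while they are still non-standard, here is a small sketch of mine (the helper name is made up; CSS transforms are just one example of a feature that shipped prefixed before it was standardized):

    // Hypothetical helper: find whichever name this browser uses for the CSS
    // transform property. MozTransform, WebkitTransform, etc. are the prefixed
    // forms vendors shipped before the unprefixed 'transform' was standardized.
    function findTransformProperty(element) {
      var names = ['transform', 'MozTransform', 'WebkitTransform', 'msTransform', 'OTransform'];
      for (var i = 0; i < names.length; i++) {
        if (names[i] in element.style) {
          return names[i];
        }
      }
      return null; // not supported at all
    }

    // Usage: rotate an element with whatever the current browser understands.
    var box = document.getElementById('box');
    var transformProperty = findTransformProperty(box);
    if (transformProperty) {
      box.style[transformProperty] = 'rotate(10deg)';
    }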
I think the main problem with IE6 wasn't that it ignored standards. Most early browsers had bad standards compliance. The main problem with IE6 is that people are still using it.
Browsers today include all kinds of non-standard ways of doing things as well. Canvas was just a proprietary Apple extension to Safari that exposed their drawing API to JavaScript, and it only became a standard after the fact. The same thing is true of XMLHttpRequest, which was originally an *ActiveX* extension for IE to support Outlook Web Access.
I think the mistake is to assume there is some kind of conflict between standards and proprietary extensions. If a standard codifies and fixes something that's already been implemented and used in the real world, the results tend to be better than something that comes out of design by committee that has never been 100% implemented.
I do think implementation should lead the way, and standardisation should come second.
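As a concrete reminder of that XMLHttpRequest history, here is roughly the compatibility shim pages needed before the standardized constructor was available everywhere (my sketch, not part of the comment; the fetched URL is hypothetical):

    // Create an XHR object in both standards browsers and old IE.
    function createXHR() {
      if (window.XMLHttpRequest) {
        // The constructor that was eventually standardized.
        return new XMLHttpRequest();
      }
      // IE5/IE6 exposed the same functionality only through ActiveX,
      // the form originally added for Outlook Web Access.
      return new ActiveXObject('Microsoft.XMLHTTP');
    }

    // Usage: fetch a resource asynchronously.
    var xhr = createXHR();
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        console.log(xhr.responseText);
      }
    };
    xhr.open('GET', '/inbox.json', true);
    xhr.send(null);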
<canvas> does not seem to me to be an example of what Sachin is proposing.
It really seems to me that Sachin is proposing a mix of what we already do and stuff that makes no sense. Exactly how much of each is unclear.
It's certainly true that a lot of what holds Web apps back is legacy IE users. To the extent he's suggesting apps and users ditch legacy IE, I'm on board with that!
Just a minor correction - I believe it's "WebSimpleDB", the almighty Google knows of no WebIndexedDB.
In fact, on some of the Google Groups to which I'm subscribed, there are so many people (falsely) asserting "I cannot bottom-post because of the Google web interface" that every day I'm a little more convinced that reading these groups in some client like Sm-Mail or Thunderbird -- or even bad old Outlook Express -- (either as NNTP newsgroups or as mailing lists sent to a POP3 or maybe IMAP4 mail account) is the way to go.
Specification is documentation that allows other browser developers to replicate the innovation. (Which is a bit like experiment replication in science ... or not.) An innovation that is missing a specification has to be reverse-engineered and second-guessed, can mask security and other bugs and issues, and just generally causes everyone a huge headache.
Discussion, on the other hand, is equivalent to peer review in science. It won't identify all the problems with an innovation, but it can avoid the obvious issues.
Ideally, a browser developer will hit upon an innovation, create a decently replicable specification for it, put it up for discussion for a short period (a month or even less would be enough) and then go ahead and implement it if there are no major issues identified. If the innovation catches on, it can then be standardised. (Lots of variations on this process would also work.)
This seems to me pretty close to what Mozilla already does. If this process is considered slow (and I don't think anyone believes it is), then I think people are being short-sighted, petty, selfish and impatient.