Wednesday 23 January 2008
Slipping The Ball And Chain
I argued in my last post that implementing IE's <meta> tag for opt-in engine selection puts an extremely heavy burden on browser development in the long term. Furthermore, I just don't see the need for it in Firefox. I meet up with Web developers a few times a year, plus I am exposed to a lot of bug traffic, and I always ask the developers I meet whether they have problems with Firefox breaking their sites. So far I've not met one who rated this as an important issue. I'm not saying we never break sites, or that breakage is unimportant; I work very hard to fix reported regressions. I do think, though, that our users don't clamour for cast-iron compatibility the way IE users apparently do. There are a few possible reasons:
- Lack of intranet penetration. Anecdotally, intranets are full of unmaintained, hairy Web content. Public sites with lots of users have high traffic and can justify maintenance; no-one cares if unmaintained low-traffic sites drop out of sight. Not so with intranet sites. Since we have pretty low market share in intranets, we don't see the problems there.
- Setting developer expectations. We have always revved our engine on a regular basis and never promised, nor delivered, total compatibility. Developers understand this and have set their expectations accordingly.
- Better historical adherence to standards. I think it's fair to say that IE's standards-breaking bugs have historically been a lot more severe than ours, at least since the Firefox resurrection. So when we fix our bugs to become more standards-compliant, the effect on Web sites is much smaller.
What's remarkable is that we've not been hit by compatibility concerns even though up to and including our latest shipping product, we had no serious test automation! Thanks to all the test automation work during the Firefox 3 cycle, we should be even better at compatibility in the future.
It seems clear that for now we have no market need for drastic multi-engine compatibility, and therefore there's no need to even consider the pain it would cause. One could argue that by slaving itself to the needs of the corporate intranet, IE is actually hobbling itself for the mass market.
People have raised the "archival format" issue ... how do archaeologists decipher the late-90s Web far in the future? I honestly think that for total compatibility the best approach is virtual machines running the software of the age. As I mentioned in my last post, even the best side-by-side-engine efforts can't actually guarantee total compatibility. I don't think this should be a goal for Firefox. Maybe if there was nothing else left to do...
Comments
Not actually that remarkable. Firefox is the "new Netscape", remember. Most web developers know (knew) about Netscape. When Netscape 6 was shipped, millions of sites were (re)designed with NS6 compatibility in mind.
And since Netscape used Gecko, Firefox benefited from that and got compatibility for free.
You can have a 'view this page in legacy Firefox' option. It gives control of that space to a Firefox that is a completely different application: that Firefox has a different install path and runs as a separate application.
An optimization of that would be having that remote process able to handle multiple pages, but in this theoretical future one would probably only have a single such tab open, for some intranet page.
This would also be the best method for IE as well.
I could see it working like this: for an intranet page, an administrator sets up some legacy whitelist, no user knows or cares, and, more importantly, the vast majority of the web doesn't need a tag it doesn't use.
There's some horrific HTML/JS out there which seems to often be written by Java developers who know only compiled code but end up having to fake knowing front-end tech, due to the lack of a developer with better knowledge of the stuff.
These apps exist for years on intranets, B2B sites, and even the general websites we all occasionally hit.
Nobody dares touch that code since it's pretty much duct tape and bubble gum holding it together. Hence the gripes about breaking compatibility.
I can't say sites I've worked on are perfect, but I really make an effort to keep them clean and understandable in the markup. Fixing an issue that someone notices with some browser/OS combo is generally trivial.
As for "archival" purposes, I don't think it's an issue. HTML itself has always been pretty easy to extract the actual info from. It's the mid-late 2000's ajaxcentric sites that I'd be more worried about. How do you scrape all that JSON and xmlHttpRequest responses and recreate what JS does? Look at archive.org. Many/most sites have a few imperfections in that archive, but you can still get the info you need from pretty much any site dating back to the beginning of the archive.
http://blogs.msdn.com/cwilso/
;)
Definitely. The version-target system is a solution to a problem that exists only in IE.
If they hadn't left IE6 to rot for so long, this problem never would have existed.
The best solution proposed so far (in the various discussions) is to set the "real" standards mode in IE8 as default, and have a switch in the options to allow "IE7 mode" or "IE6 mode" for companies with ancient intranet apps.
I should have added to the "archival format" article that not only do you need the old browser engine, you need the old fonts, the old Flash player, the old version of Uniscribe, and goodness knows what else.
Why bother testing your site with FF or Opera, or looking at the internet standards, when you can force it to render using an IE7 engine?
(And those of us not on Windows will continue to be locked out of intranets)
- Colin
Toe: If the intranet application is unmaintained, how exactly are they supposed to add a special IE6/IE7 tag to it? It's unmaintained!
Robert: I think part of the point of this change is that it's an admission that the majority of web developers do not test their sites on multiple browsers, or multiple versions of a browser, or against the standards. Ever. Nor will they ever.
Any solution that requires web developers to change is going to fail for the same reason that DOCTYPE switching failed, because of this class of web developers. (They just saw the doctype tag as a "magic word" to add at the beginning of the code, and started pasting it on everything. Or their CMS, meaning well, added it despite the other code in the CMS being completely contrary to the DOCTYPE used.) Software can't change human behavior, it can only change its own behavior.
Now the type of developer who does care, they can still benefit from this situation. They can test their web app in the latest browsers and update their tag as needed, or simply set it to "edge" and keep up as browser developments come down the line.
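For the record, the two "magic words" in question look something like this. The version-targeting syntax follows the IE8 proposal as published; the details were still being debated at the time, so treat it as a sketch rather than final syntax.

    <!-- the old switch: a strict DOCTYPE opts the page into standards mode -->
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
        "http://www.w3.org/TR/html4/strict.dtd">

    <!-- the proposed switch: freeze the page at a particular engine
         version, or opt into the latest engine with "edge" -->
    <meta http-equiv="X-UA-Compatible" content="IE=8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />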
Alan: If the intranet app is unmaintained, who exactly is going to port it into this fictional Microsoft CMS?
Robert: Your "Preview" button for comments on this blog is broken.
> developers do not test their sites on multiple
> browsers, or multiple versions of a browser, or
> against the standards. Ever. Nor will they ever.
That may be true for the intranet, but I don't think it's true for the public Web. If it was, Firefox wouldn't work on most Web sites and we wouldn't have the market share we have.
In fact, the picture I get from Web developers for public Web sites is that most of them primarily test with Firefox and when they've finalized the site design they then backport it to IE's quirks.
For years Microsoft has delivered non-standard solutions and used all sorts of workarounds to sweep the consequences under the carpet, delaying the day of reckoning as much as possible. But this can only go on for so long, and corporations are beginning to realise that the critical-mass point is not far off.
- The .doc binary format's relative stability is replaced by .docx experimentation hell.
- The Office UI is rewritten by mad squirrels.
- W2k & WinXP are replaced by Vista.
- The last traces of the MS JVM need to be hunted down.
- Old software stacks are deprecated in favour of .NET.
- IE6 gets superseded by the incompatible IE7, then (soon) IE8.
- Win32 will have to be migrated to 64-bit soon (mainly for memory-handling reasons).
All those changes have huge costs for the conservative corporate users that froze their tech in the win2k/ie6 era. They're ripe for a major wintel house clean-up, and proposing a new hack now is not going to sell well.
This is about saying "I tested this page on the following versions of the following browsers" as a HINT to the browsers that _IF_ they have specific compatibility functionality to better render pages intended for one of those browser/version combinations, then the page is less likely to break in unexpected ways.
Standards evolve. Products have bugs. People can't test everything. There's no way ANY page will continue to look exactly the same, and while the information may still be accessible if you've done a good job, you will still miss something, and your site will break or not look as good as it could everywhere.
Nothing forces browsers to keep rendering pages 100% the same. Nothing forces browsers to keep supporting really old pages.
But this provides a mechanism that allows browser innovation to speed up and also make it easier to deprecate outdated functionality, because you have an explicit mechanism to prevent the worst breakage from immediately affecting users until web designers can catch up.
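As a sketch of what that hint looks like under the proposal being discussed, a page could declare every browser version it was tested against, and a browser without a matching legacy mode would simply ignore the entries it doesn't recognise:

    <meta http-equiv="X-UA-Compatible" content="IE=8;FF=3;OtherUA=4" />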
If you're concerned about web designers becoming complacent about fixing broken sites, then set explicit schedules: state up front that only the rendering quirks of the last one or two versions will remain supported.
Or leave it to extensions to support the quirks, and it's entirely in the hand of users. That would allow users/developers to add the option of better compatibility with other browsers too, even when that other browser is "broken" and you really don't want to change the main codebase.
It seems to be very much the same problem you face when needing to be able to build old versions of software. You might try putting the third-party library in source control, later adding the compiler, but in the end you really need to save the entire machine, and that is of course easiest done when it's virtual.
Firefox blocks load events from propagating to the window object, to prevent breaking too many sites that were coded before a different bug was fixed.
https://bugzilla.mozilla.org/show_bug.cgi?id=335251
Now, the Firefox developers did go to the W3C working group to try to get the spec rewritten so that their fix was spec-compliant.
But the upshot is that there is no spec either requiring or prohibiting Gecko's behavior with regard to load events on Window.
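A minimal sketch of the pattern at issue, with the behaviour as described in the bug: a capturing 'load' listener on the window, which older pages expected to fire for every subresource.

    <img src="photo.jpg">
    <script type="text/javascript">
      // Pages coded against the old behaviour expected this capturing
      // listener to fire for image and subframe loads too. Per bug
      // 335251, Gecko stops those load events at the document, so the
      // listener fires only once, for the page's own load.
      window.addEventListener("load", function (event) {
        alert("load fired; target is " + event.target);
      }, true); // capture phase, since load events don't bubble
    </script>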
OTOH, I cannot think of a single piece of content for which I would care, ten years later, exactly what the layout originally looked like. For long-term archive purposes, all that is really necessary, in vanishingly close to 100% of all cases, is to be able to read the words, and maybe follow the links. Nothing else matters on stuff that old. It doesn't have to look totally perfect, because you're not trying to impress anyone with it ten years later. It is, at that point, of *historical* interest, so only the facts really matter, not visual impressions.
Of course, being able to follow links is something that's especially likely to break when sites targeted at IE6 are viewed in a newer browser, especially if there's significant client-side scripting. But I consider that to be a design flaw in the first place. Sites should always be designed to be at least minimally navigable without client-side scripting, for a long list of very good reasons.
Why doesn't IE just provide a similar crutch mechanism for hotpatching old broken content? That would provide a way for old crap content to be hacked into working without even editing the original pages.
Even corporations can hire people to mess with css/js modifications.
Of course such features are a magnet for hackers. I know that I just expect MS products to be insecure these days.
I'm still cynical about IE, and I'd say they aren't fixing the non-standard crap because of fears about losing market share. It seems hypocritical that Office 2003 SP3 won't open legacy Office documents, but god forbid that IE.next would do the same.
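For what it's worth, a sketch of such a "hotpatch" in the spirit of a Greasemonkey-style user script; the intranet URL and the browser-sniffing function it patches are invented for illustration:

    // ==UserScript==
    // @name      Legacy intranet hotpatch (hypothetical)
    // @include   http://intranet.example.com/legacyapp/*
    // ==/UserScript==

    // Neutralise a (hypothetical) sniffing routine the old page uses to
    // lock out anything that isn't IE6, without editing the page itself.
    if (unsafeWindow.checkBrowserVersion) {
      unsafeWindow.checkBrowserVersion = function () { return true; };
    }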
http://www.w3.org/TR/2000/REC-DOM-Level-2-Events-20001113/events.html#Events-flow-bubbling
is indeed vague on what to do with the parent of the document object. Sorry for the derail.
Now let's get back on topic and wish that the IE8 team would join the ranks of standards-compliant browser developers. Wouldn't it be great if they worked with everyone else to fix the pages that break in anything but IE6/7, instead of breaking IE8 rendering by default? :)
Unfortunately, we then have to spend 10-30% of the development budget trying to find standards-compliant workarounds for IE's many failings. It's time and money that could be spent much better elsewhere...
Writing no browser specific code at all is our goal, so there is no way we'd ever use an IE-only meta tag -- even though we're gagging for a more standards-compliant IE.
Also, because IE has security zones, they could turn off older engines at different times in each zone. So sites on an intranet could get extra time to upgrade.
As I was saying though, that was my initial thought, but now I'm wondering just how many web pages are put up and forgotten about and will never be updated. The web is growing all the time so eventually they will be outnumbered by newer pages for newer browsers but most of that content will still be interesting to someone.
Will Microsoft have to support IE8 indefinitely?
What will be the half life of an X-UA-Compatible: IE=8 web page?
And, it doesn't prevent it from breaking in IE8 (with IE8 mode). In IE8, the bug fixes you added for IE6/7 might break the page.
Remember which type of website did break with the launch of IE7:
http://www.thinkvitamin.com/features/design/internet-explorer-7-were-you-ready
This argument over opt-in rendering modes for IE wouldn't be happening if they had played the same game as everyone else, instead of being left behind for years. Let's not forget that IE 5.0 was really the good guy in the Netscape 4 days, simply because of innovation and bringing out a better product that let developers build more advanced websites. If IE updated several times a year then intranet developers wouldn't be so complacent about building IE-specific products, and the resulting sites wouldn't deviate so far from working in other browsers. Even I went down the path of making IE-only projects between 2000 and 2004; and while IE continues to be off in its own neighbourhood it will be easy for developers to be led astray. To be honest, I think more years just need to go by with Firefox & Safari continuing to outperform IE, until such time as Microsoft puts in much more resource and keeps up. I'm not holding my breath.
The same sort of problem hits the server side too. Look at how many projects and web hosts are stuck on PHP4, and how that has given Rails and Django the reputation of being far more elegant than anything that can be done with PHP5 (when in actual fact there's a bunch of interesting frameworks, like our SilverStripe, symfony, kohanaphp, et al, that show the benefit of continually living on recent technology, i.e. PHP 5.2+).
What I actually want to see is an effort to abolish quirks modes of all stripes. If I were to write a web browser, I'd implement the specs exactly as written. Of course, that'd "break the web"...
Microsoft is corporate scum and this is just the latest proof.
I never develop a site outside of standards. I simply don't want to rewrite them, period!
So, will this work for all those who don't like standards, or only "half of it"?
If there were no standards, you couldn't buy a new oven for your kitchen and be sure it would fit!
Sara