The transition to robotic warfare scares me a lot. Soon, war will *only* kill civilians. Making war cheap for the aggressor can't be good. Most troublesome of all is the obvious problem that I rarely see discussed: computers, software and people being what they are, it's inevitable that drone command and control will be compromised by enemy agents. It will be very exciting when the US drone fleet turns around and flies back to flatten Washington, and no-one has any idea who did it. People think drones accentuate US military superiority, but it's not true; they neutralize it. This is going to be a tragic example of common sense being trumped by overconfidence and dazzling technology.
Tuesday, 28 July 2009
Justifying Evil
"The slogan products aren't for everyone, but there's definitely a place in our society for provocative humour that pushes the boundaries." Translation: if we think it's funny and it shocks people, then that in itself is a good thing --- no matter who gets hurt.
"I've seen lots of T-shirts like that overseas that are a lot more out there." Translation: because this isn't the worst possible example, it's OK.
You hear the same sort of irrational nonsense used to defend absurdly violent video games and movies, and anything else that degrades human beings for profit.
Wednesday, 15 July 2009
Approved For The Intelligence Community
I hate to link and run, but I have to share this with my friends who aren't on Mozilla internal mailing lists: State Department workers beg Hillary Clinton for Firefox.
"It was approved for the entire intelligence community, so I don’t understand why State can’t use it."
I need a T-shirt that says "Approved for the intelligence community".
Tuesday, 14 July 2009
On Not Being Evil
About ten years ago I happened to have lunch with Eric Schmidt. The CMU board of trustees was visiting the School of Computer Science and they had lunch with some of the CS grad students. Our table was just Mr Schmidt, me, and a couple of other students. At the time he was CEO of Novell. I was feeling cantankerous and took the opportunity to ask Mr Schmidt how he managed when the imperative to maximise shareholder returns required unethical (but legal) behaviour. His answer was that this never occurs. He suggested that ethical behaviour always, in the long run, maximises returns.
I thought (and still think) this was grossly wishful thinking. It would require extraordinary luck or divine providence, and even as a theist I think it's clear that God has not always arranged for profit and ethics to be perfectly aligned (end sarcasm). A company truly focused on shareholder returns will sometimes, probably often, be obliged to take advantage of unethical opportunities. Fortunately, Mr Schmidt's wishful thinking seems very common. People want to do the right thing, and convince themselves and each other that it's going to be best for the company too. They cheat the shareholders and do the rest of humanity a favour. Bravo!
I saw a lot more of this at IBM Research and (from outside) other labs --- corporate research labs that publish depend on corporate vanity mixed with altruism, and a desperate wish to believe against all evidence that they're an acceptable return on investment. It's a relief to work for a non-profit where we can be honest about such things.
Of course back at that lunch at CMU I had no idea that Mr Schmidt would go on to become CEO of the world's newest corporate superpower. Keep the faith, Mr Schmidt, and may the scales never fall from your eyes!
Saturday, 4 July 2009
Faces Of The Web Video Revolution
There's a lot of press these days about the HTML5 <video> tag and the struggle for universal unencumbered video and audio codecs --- much of it associated with the Firefox 3.5 launch. I wonder how many people know that the Firefox video implementation is almost entirely due to just a few people in the Mozilla office in Newmarket --- Chris Double, Matthew Gregan, Chris Pearce, and to a lesser extent, me. (Justin Dolske did the controls UI, but I'm not sure where he lives!) I'm proud that we managed to get considerably more done for the 3.5 release than I expected.
It's a great privilege to have the opportunity to really shake up the world for the better, with a very small team, in a relatively small amount of time, and do it right here in Auckland. I'm very thankful.
Thursday, 2 July 2009
Progress
I've submitted all of my compositor phase 1 patches for review. The patches are also published here. 111 files changed, 2573 insertions, 1448 deletions, divided into 39 separate patches over multiple Bugzilla bugs. In theory the tree should build and pass tests after each of those 39 patches is applied in order, although I haven't actually verified that for all of them. I managed to break things up pretty well --- the largest patch is only 536 insertions and 75 deletions --- so hopefully that will make reviewing easier.
MQ has been working pretty well for me. I get into a routine of applying all the patches, doing some testing, fixing a number of bugs, and then redistributing the changes across the patches that they logically belong to. I'm not 100% sure this is the most efficient way to work --- sometimes I burn quite a bit of time putting all the changes in just the right places --- but at least it's now possible.
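For the curious, the cycle looks something like this. This is a sketch using standard MQ commands; the patch name "part3" is made up:

```
hg qpush -a          # apply the entire patch queue
# ...build, test, fix bugs in the working directory...
hg qnew -f fixes     # capture the working-directory fixes as a new patch on top
hg qgoto part3       # pop back to the patch the fixes logically belong in
hg qfold fixes       # fold the (now unapplied) fixes into that patch
hg qpush -a          # reapply everything and re-test
```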
Now it's time to start working on something else. My immediate next task is to restructure the media tests so we can generalize tests across file types and backends; for example, right now we have one set of seeking tests for Ogg and another for Wave, but we should just have a single set of tests parametrized by test files of different types.
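To illustrate what I mean, here's a rough sketch: one seek test driven by a table of media files of different types. It assumes the usual mochitest helpers (ok(), SimpleTest), and the file names are made up; the real harness glue would be more involved.

```js
// Sketch only: one parametrized seek test run over several file types.
var tests = [
  { src: "seek.ogv", tag: "video" },
  { src: "seek.wav", tag: "audio" }
];
var remaining = tests.length;

function runSeekTest(test) {
  var element = document.createElement(test.tag);
  element.addEventListener("loadedmetadata", function() {
    // Seek to the middle of the resource, whatever its duration.
    element.currentTime = element.duration / 2;
  }, false);
  element.addEventListener("seeked", function() {
    // Pass if we ended up close to where we asked to go.
    ok(Math.abs(element.currentTime - element.duration / 2) < 0.5,
       test.src + " seeked to the middle");
    if (--remaining == 0) {
      SimpleTest.finish();
    }
  }, false);
  element.src = test.src;
  element.load();
}

SimpleTest.waitForExplicitFinish();
tests.forEach(function(test) { runSeekTest(test); });
```

Adding a new backend would then just mean adding rows to the table, instead of cloning a whole test file.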
After that, I plan to do some cleanup that's enabled by compositor phase 1. In particular, we can move scrolling out of the view system and integrate it all directly into the scrollframes in layout.
After that I plan to work on compositor phase 2. Right now in Gecko, whenever something needs to be repainted we make platform-level invalidation requests, the platform dispatches paint events, and we paint. This often leads to over-frequent painting. For example, if there's a script changing the DOM 100 times a second, we'll try to paint 100 times a second if we can keep up, which is a waste of time since most screens only refresh at 60Hz or so. Even worse, if you have that script, plus an animated image painting 20 times a second, plus a video playing at 25 frames per second, we invalidate them all independently, and if your computer is fast enough we'll paint 145 times a second. We need to fix this, and I have a plan.
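At its core the fix is to coalesce: accumulate invalidations and paint at most once per display refresh. Here's a JS-flavoured sketch of the shape of the idea; the real code would live in Gecko's C++, and the helper names here are invented:

```js
// Sketch of the idea, not Gecko code: no matter how many invalidations
// arrive, schedule at most one paint per display refresh interval.
var REFRESH_INTERVAL_MS = 1000 / 60;  // assume a 60Hz display
var paintPending = false;
var dirtyRegion = [];  // accumulated rectangles needing repaint

function invalidate(rect) {
  dirtyRegion.push(rect);
  if (!paintPending) {
    // First invalidation since the last paint: schedule exactly one paint.
    paintPending = true;
    setTimeout(paintDirtyRegion, REFRESH_INTERVAL_MS);
  }
  // Otherwise a paint is already scheduled; this request just rides along.
}

function paintDirtyRegion() {
  paintPending = false;
  var region = dirtyRegion;
  dirtyRegion = [];
  // ...repaint the union of 'region' in a single pass...
}
```

With something like that in place, the script, the image and the video from the example above collapse into at most about 60 paints a second on a 60Hz display.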
Part of that plan is to create an internal animation API that various Gecko components (animated images, video, smooth scrolling, SMIL, CSS Transitions, etc) can plug into. But we also recognize that declarative animations will never be expressive enough for all use cases, and there's a lot of existing scripted animation libraries out there, so I have an idea for tying in scripted animations as well. Basically, I would expose the following API:
- window.mozRequestAnimationFrame(): Signals that an animation is in progress, and requests that the browser schedule a repaint of the window for the next animation frame, if the window is visible.
- The browser will fire a mozBeforePaint event at the window before we repaint it. Animation libraries should register an event handler that checks the current time, and updates the DOM/CSS state to show that point in the animation. If the animation has not ended, the event handler should call window.mozRequestAnimationFrame() again to ensure another frame will be scheduled in a timely manner.
That's it! That API gives the browser control over the frame rate, while allowing JS to do anything it wants in each frame.
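To make that concrete, here's how a simple scripted animation might use the proposed API. This is just a sketch: the element and timings are made up, and the final API details could still change.

```js
// Move a box 300px to the right over two seconds using the proposed API.
var box = document.getElementById("box");  // any absolutely-positioned element
var start = Date.now();
var DURATION_MS = 2000;

window.addEventListener("mozBeforePaint", function() {
  var progress = Math.min((Date.now() - start) / DURATION_MS, 1);
  box.style.left = (progress * 300) + "px";
  if (progress < 1) {
    // The animation isn't finished: ask for another frame.
    window.mozRequestAnimationFrame();
  }
}, false);

// Kick off the first frame.
window.mozRequestAnimationFrame();
```

Because the browser decides when to fire mozBeforePaint, it can throttle everything to the display's refresh rate, and skip frames entirely when the window isn't visible.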