Monday, 31 December 2018

Vox On Nietzsche

When I was thinking of becoming a Christian I wanted to read some anti-Christian books. I'd heard Nietzsche was worth reading so I read The Anti-Christ and Twilight Of The Idols. If anything they pushed me towards Christ: rather than presenting arguments against Christianity, they assume it's false and then rant about the implications of that — implications which are wholly unattractive to anyone reluctant to give up on morality. So I can recommend those books to anyone :-).

I was reminded of that by this Vox piece. The author tries to put some distance between Nietzsche and the "alt-right" but only partially succeeds. It's certainly true that atheist alt-righters, in rejecting Jesus but idolizing secular Christendom, have it exactly the wrong way around (though I'm glad they understand Jesus is incompatible with their ideology). It's also correct that Nietzsche argued for demolishing the trappings of Christianity that people hold onto after rejecting Jesus. Unfortunately for the Vox thesis, as far as I read, Nietzsche focused his contempt not on the geopolitics of "Christendom", but (quoting Vox) "egalitarianism, community, humility, charity, and pity". In this, Nietzsche is on the side of Nazis and against progressives and other decent human beings.

The Vox author points out that Nietzsche himself was against racism and anti-Semitism, but those who embrace his philosophy, who "reckon with a world in which there is no foundation for our highest values", can end up anywhere. If you see "egalitarianism, community, humility, charity, and pity" as non-obligatory or contemptible, your prejudices are likely to blossom into racism and worse. Fortunately Nietzsche's philosophy is incompatible with human nature, our imago Dei; intellectuals (both actual and aspiring) pay lip service to "a world in which there is no foundation for our highest values", but they do not and cannot live that way.

Friday, 21 December 2018

Hollyford Track

Previously I recounted our Milford Track trip up to the point where the rest of our group departed, leaving my children and me in Milford. On the morning of December 12 we flew in a light plane from Milford up the coast to Martins Bay; from there we walked inland over the following four days up the Hollyford Valley until we reached the lower end of the Hollyford road.

The flight itself was a great experience. We flew down the Milford Sound to the ocean and turned north to fly up the coast to Martins Bay. We were flying pretty low and got a great view of the Sound, the rugged and relatively inaccessible Fiordland coast, and the bottom end of the Hollyford Valley. Our pilot didn't have other passengers that day, so he brought along his dive gear and went diving at Martins Bay after he dropped us off, leaving his plane parked beside the tiny gravel airstrip.

We walked for about an hour from the airstrip to Martins Bay Hut and spent the rest of the day based there. Probably my best moment of the trip happened almost right away! I thought I'd try swimming across the Hollyford River to the sandspit, but as soon as I got into the water four dolphins appeared and swam around me for a couple of minutes until, presumably, they got bored. That was an amazing experience and completely unexpected. I felt blessed and privileged. Apparently dolphins and seals often swim from the ocean up the Hollyford River all the way to the head of Lake Mckerrow, which must be around 15km inland.

That day we also visited the Long Reef seal colony, about 20 minutes' walk from Martins Bay Hut. We were a bit nervous since December is pupping time for the seals, and indeed we met a seal on the track who barked at us, sending us running the other way! I also saw, from a distance, a Fiordland crested penguin.

By the evening of that day five other trampers had arrived at Martins Bay Hut, but it's a large hut with plenty of room for up to 24 so it still felt very spacious.

The following day we walked to Hokuri Hut along the shore of Lake Mckerrow and had a relaxing afternoon. It rained, but only after we'd arrived at the hut. (In fact we didn't use our rain jackets at all on the Hollyford Track.) A couple of the trampers from Martins Bay Hut joined us, and we also had a couple coming south from Demon Hut. A group of four visited the hut; they had rafted down the Pyke River and the Hollyford River to Lake Mckerrow and were planning to fly out once they reached Martins Bay. Rather than stay in the hut they camped by the lake. Apparently they saw seals catching fish down there.

On the third day we walked the infamous Demon Trail along Lake Mckerrow to Mckerrow Island Hut. It's several hours of picking one's way over piles of large, slippery rocks. We took it slowly and it didn't bother us, but we were glad to reach the end. We crossed "3-wire bridges" for the first time and mostly enjoyed them.

We'd been warned that Mckerrow Island Hut was dirty and rodent-infested, but despite the hut being a bit old (built in the 1960s) it seemed fine and the location is wonderful — a very short track to a beach with great views down Lake Mckerrow. We saw no sign of rodents, though they may have been deterred because we had six people in the hut that night. Two of them were pack-rafting from the Hollyford road end, down the Hollyford River, out to Martins Bay, then carrying their rafts to Big Bay, over to the Pyke River, and back to the Hollyford confluence.

Our fourth day was pretty easy, about six hours of walking to get to the Hidden Falls Hut. On the fifth day we walked for just two and a half hours to reach the Hollyford Road end, a fine riverside spot to wait for a couple of hours for a shuttle to pick us up.

The Hollyford was a harder walk than a Great Walk, and would have been harder still with less perfect weather, but it was a bit quieter and the Hollyford Valley is just as stunning, so it was well worth doing. As you'd expect the trampers we met were, on average, a lot more hard-core. Apparently we just missed meeting a couple of Chileans who walked from the road to the ocean and back carrying surfboards, which sounds crazy. We met a few guys who had done the pack-rafting round trip from the Hollyford Road end to Martins Bay to Big Bay and back down the Pyke River in just over 24 hours, which is also crazy. We took it relatively easy and I'm happy with that.

Thursday, 20 December 2018

Milford Track 2018

Earlier this month I spent 11 days in the South Island walking the Milford Track and then, after a short break in Milford, the Hollyford Track.

It was my second time on the famous Milford Track. I took my kids again, and this time went with some friends from Auckland Chinese Presbyterian Church. We booked back in June, in the first hour or two after bookings opened for this summer; it's the most popular track in New Zealand and books up very fast. Note that despite its popularity, the track itself is not actually busy, because numbers are limited by the booking system: only 40 unguided walkers are allowed per day on each section of the track. There are another 40 or so guided walkers staying at the Ultimate Hikes lodges, but they start an hour or two behind the unguided walkers each day, so you seldom see many of them.

Once again we were lucky to have mostly good weather. Unlike last time, the weather on our first day (December 7) was excellent. The boat trip up to the end of Lake Te Anau to the trailhead is a wonderful start to the experience; you feel yourself leaving civilization behind as you enter the Fiordland mountains via the fjords of Lake Te Anau.

Our only rainy day was the third day (out of four), when we crossed Mckinnon Pass. Unfortunately this meant that once again I could not see the view at the pass, which is apparently spectacular on a good day. I guess I'll have to try again sometime! Next time, if the weather's good on day two, I should go as fast as possible up the Clinton Valley to Mintaro Hut, drop my gear there and carry on up to the pass for a look around before returning to Mintaro. I guess a reasonably fit person without a pack can probably get to the top from the hut in an hour and a half.

Bad weather days on these trips don't bother me that much since I will probably be able to go again if I really want to. I feel bad for foreign visitors who are much less likely to have that chance!

I did get a chance to explore Lake Mintaro and its streams this time. It's very close to the hut and well worth a walk around.

I'm not very good at identifying wildlife but I think we saw a number of whio (blue ducks). They're still endangered but it appears their numbers are rebounding thanks to the intensive predator trapping going on in the Clinton and Arthur valleys and elsewhere. Apparently it is now quite rare for the trappers to catch stoats there. There is a beech mast this season which will probably mean large-scale aerial poison drops will be needed this winter to keep rats down.

Overall I really enjoyed the time with family and friends, met some interesting people, and thanked God for the beauty of Fiordland both in the sun and in the wet. We had a particularly good time stopping for over an hour at Giant's Gate Falls near the end of the track, where the warmth of the sun and the spray from the falls mostly keep the sandflies at bay.

After we got to Milford on the last day most of our group checked into Milford Lodge and cleaned up. The next day we did a Milford Sound cruise with some kayaking, which was lots of fun. Then the rest of our group bussed out to Te Anau while the kids and I stayed another night before starting the Hollyford Track on December 12. That deserves its own blog post.

Wednesday, 28 November 2018

Capitalism, Competition And Microsoft Antitrust Action

Kevin Williamson writes an ode to the benefits of competition and capitalism, one of his themes being the changing fortunes of Apple and Microsoft over the last two decades. I'm mostly sympathetic, but in a hurry to decry "government intervention in and regulation of the part of our economy that is, at the moment, working best", he forgets or neglects to mention the antitrust actions brought by the US government against Microsoft in the mid-to-late 1990s. Without those actions, there is a high chance things could have turned out very differently for Apple. At the very least, we do not know what would have happened without those actions, and no-one should use the Apple/Microsoft rivalry as an example of glorious laissez-faire capitalism that negates the arguments of those calling for antitrust action today.

Would Microsoft have invested $150M to save Apple in 1997 if they hadn't been under antitrust pressure since 1992? In 1994 Microsoft settled with the Department of Justice, agreeing to refrain from tying the sale of other Microsoft products to the sale of Windows. It is reasonable to assume that the demise of Apple, Microsoft's only significant competitor in desktop computer operating systems, would have increased the antitrust scrutiny on Microsoft. At that point Microsoft's market cap was $150B vs Apple's $2B, so $150M seems like a cheap and low-risk investment by Gates to keep the US government off his back. I do not know of any other rational justification for that investment. Without it, Apple would very likely have gone bankrupt.

In a world where the United States v. Microsoft Corporation (2001) antitrust lawsuit didn't happen, would the iPhone have been as successful? In 1999 I was so concerned about the potential domination of Microsoft over the World Wide Web that I started making volunteer contributions to (what became) Firefox (which drew me into working for Mozilla until 2016). At that time Microsoft was crushing Netscape with superior engineering, lowering the price of the browser to zero, bundling IE with Windows and other hardball tactics that had conquered all previous would-be Microsoft competitors. With total domination of the browser market, Microsoft would be able to take control of Web standards and lead Web developers to rely on Microsoft-only features like ActiveX (or later Avalon/WPF), making it practically impossible for anyone but Microsoft to create a browser that could view the bulk of the Web. Web browsing was an important feature for the first release of the iPhone in 2007; indeed for the first year, before the App Store launched, it was the only way to do anything on the phone other than use the built-in apps. We'll never know how successful the iPhone would have been without a viable Web browser, but it might have changed the competitive landscape significantly. Thankfully Mozilla managed to turn the tide to prevent Microsoft's total browser domination. As a participant in that battle, I'm convinced that the 2001 antitrust lawsuit played a big part in restraining Microsoft's worst behavior, creating space (along with Microsoft blunders) for Firefox to compete successfully during a narrow window of opportunity when creating a viable alternative browser was still possible. (It's also interesting to consider what Microsoft could have done to Google with complete browser domination and no antitrust concerns.)

We can't be sure what the no-antitrust world would have been like, but those who argue that Apple/Microsoft shows antitrust action was not needed bear the burden of showing that their counterfactual world is compelling.

Sunday, 25 November 2018

Raglan

We spent a couple of days in Raglan celebrating our wedding anniversary. It's a small coastal town a couple of hours' drive south of Auckland, famous for surfing; it's quiet at this time of year, though I hear it gets very busy in the summer holidays. We had a relaxing time, exploring the town a little and driving down the coast to Te Toto Gorge to climb Mt Karioi. That's a smaller version of the nearby Mt Pirongia — the summit area comprises many steep volcanic hillocks and ridges, all covered in dense bush. The track is rough, and in one place there are chains to help climb up and down. After two and a half hours we got as far as the lookout, took in some fabulous views over Raglan and the coast further north, and decided not to bother with the extra hour to the true summit. The views over the ocean on the way up were quite spectacular: we could see the snowy peaks of Mt Taranaki and, from the lookout, Mt Ruapehu, each about 170km away.

Tuesday, 13 November 2018

Comparing The Quality Of Debug Information Produced By Clang And Gcc

I've had an intuition that clang produces generally worse debuginfo than gcc for optimized C++ code. It seems that clang builds have more variables "optimized out" — i.e. when stopped inside a function where a variable is in scope, the compiler's generated debuginfo does not describe the value of the variable. This makes debuggers less effective, so I've attempted some quantitative analysis of the issue.

I chose to measure, for each parameter and local variable, the range of instruction bytes within its function over which the debuginfo can produce a value for this variable, and also the range of instruction bytes over which the debuginfo says the variable is in scope (i.e. the number of instruction bytes in the enclosing lexical block or function). I add those up over all variables, and compute the ratio of variable-defined-bytes to variable-in-scope-bytes. The higher this "definition coverage" ratio, the better.
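The aggregation step can be sketched as follows. This is a minimal sketch, not the actual debuginfo-quality code: the type and function names are mine, and extracting the per-variable byte counts from DWARF location lists and lexical blocks is omitted.

```rust
/// Byte counts for one variable, as extracted from the debuginfo: how many
/// instruction bytes of its function carry a value definition for it, and
/// how many bytes it is in scope (its enclosing lexical block or function).
/// (Hypothetical type; the real tool derives these from DWARF.)
struct VarCoverage {
    defined_bytes: u64,
    scope_bytes: u64,
}

/// Sum defined-bytes and in-scope-bytes over all variables and return the
/// "definition coverage" ratio. Higher is better; 1.0 would mean every
/// variable has a value everywhere it is in scope.
fn definition_coverage(vars: &[VarCoverage]) -> f64 {
    let defined: u64 = vars.iter().map(|v| v.defined_bytes).sum();
    let in_scope: u64 = vars.iter().map(|v| v.scope_bytes).sum();
    if in_scope == 0 {
        0.0
    } else {
        defined as f64 / in_scope as f64
    }
}
```

Because the sums run over all variables in whatever set you feed in, the same function works at whole-binary, per-function, or per-variable granularity.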

This metric has some weaknesses:
- DWARF debuginfo doesn't give us accurate scopes for local variables; the defined-bytes for a variable defined halfway through its lexical scope will be about half of its in-scope-bytes even if the debuginfo is perfect, so the ideal ratio is less than 1 (and unfortunately we can't compute it).
- In debug builds, and sometimes in optimized builds, compilers may give a single definition for the variable value that applies to the entire scope; this improves our metric even though the results are arguably worse.
- Sometimes compilers produce debuginfo that is simply incorrect; our metric doesn't account for that.
- Not all variables and functions are equally interesting for debugging, but this metric weighs them all equally.
- The metric assumes that the points of interest for a debugger are equally distributed over instruction bytes.

On the other hand, the metric is relatively simple, it focuses on what we care about, and it depends only on the debuginfo, not on the generated code or actual program executions. It's robust to constant scaling of code size. We can calculate it for any function or variable, which makes it easy to drill down into the results, lets us rank all functions by the quality of their debuginfo, and lets us compare the quality of debuginfo between different builds of the same binary at function granularity. The metric is sensitive to optimization decisions such as inlining; that's OK.

I built a debuginfo-quality tool in Rust to calculate this metric for an arbitrary ELF binary containing DWARF debuginfo. I applied it to the main Firefox binary built with clang 8 (8.0.0-svn346538-1~exp1+0~20181109191347.1890~1.gbp6afd8e) and gcc 8 (8.2.1 20181105 (Red Hat 8.2.1-5)) using the default Mozilla build settings plus ac_add_options --enable-debug; for both compilers that sets the most relevant options to -g -Os -fno-omit-frame-pointer. I ignored the Rust compilation units in libxul since they use LLVM in both builds.

In our somewhat arbitrary metric, gcc is significantly ahead of clang for both parameters and local variables. "Parameters" includes the parameters of inlined functions. As mentioned above, the ideal ratio for local variables is actually less than 1, which explains at least part of the difference between parameters and local variables here.

gcc uses some debuginfo features that clang doesn't know about yet. An important one is DW_OP_GNU_entry_value (standardized as DW_OP_entry_value in DWARF 5). This defines a variable (usually a parameter) in terms of an expression to be evaluated at the moment the function was entered. A traditional debugger can often evaluate such expressions after entering the function, by inspecting the caller's stack frame; our Pernosco debugger has easy access to all program states, so such expressions are no problem at all. I evaluated the impact of DW_OP_GNU_entry_value and the related DW_OP_GNU_parameter_ref by configuring debuginfo-quality to treat definitions using those features as missing. (I'm assuming that gcc only uses those features when a variable value is not otherwise available.)

DW_OP_GNU_entry_value has a big impact on parameters but almost no impact on local variables. It accounts for the majority, but not all, of gcc's advantage over clang for parameters. DW_OP_GNU_parameter_ref has almost no impact at all. However, in most cases where DW_OP_GNU_entry_value would be useful, users can work around its absence by manually inspecting earlier stack frames, especially when time-travel is available. Therefore implementing DW_OP_GNU_entry_value may not be as high a priority as these numbers would suggest.
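The "treat those definitions as missing" experiment can be modelled roughly like this. It's a sketch under invented types: the real tool classifies each definition by inspecting the DWARF expression for DW_OP_GNU_entry_value / DW_OP_GNU_parameter_ref, which is not shown here.

```rust
/// How a variable's value is recovered over some instruction range, in a
/// simplified model of a DWARF location list entry. (Hypothetical types.)
#[derive(Clone, Copy, PartialEq)]
enum LocKind {
    /// Ordinary location: register, stack slot, plain expression, ...
    Plain,
    /// Defined only via DW_OP_GNU_entry_value / DW_OP_GNU_parameter_ref.
    EntryValueOnly,
}

struct LocRange {
    kind: LocKind,
    bytes: u64,
}

/// Count the instruction bytes over which a variable is defined. Passing
/// `count_entry_values = false` treats entry-value-based definitions as
/// missing, so the drop in the metric measures that feature's impact.
fn defined_bytes(locs: &[LocRange], count_entry_values: bool) -> u64 {
    locs.iter()
        .filter(|l| count_entry_values || l.kind == LocKind::Plain)
        .map(|l| l.bytes)
        .sum()
}
```

Running the metric twice, once with each setting, and comparing the two ratios gives the impact numbers discussed above.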

Improving the local variable numbers may be more useful. I used debuginfo-quality to compare two binaries (clang-built and gcc-built), computing, for each function, the difference in the function's definition coverage ratios, looking only at local variables and sorting functions according to that difference:

debuginfo-quality --language cpp --functions --only-locals ~/tmp/ ~/tmp/

This gives us a list of functions starting with those where clang is generating the worst local variable information compared to gcc (and ending with the reverse). There are a lot of functions where clang failed to generate any variable definitions at all while gcc managed to generate definitions covering the whole function. I wonder if anyone is interested in looking at these functions and figuring out what needs to be fixed in clang.
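The comparison amounts to taking, for each function present in both binaries, the difference of the two definition coverage ratios and sorting. A hypothetical sketch of that step (names and signatures are mine, not the tool's):

```rust
use std::collections::HashMap;

/// Given per-function definition coverage ratios for two builds of the
/// same binary, return (function, a - b) pairs sorted so that the
/// functions where build `a` does worst relative to build `b` come first.
fn rank_by_coverage_diff(
    a: &HashMap<String, f64>, // e.g. the clang-built binary
    b: &HashMap<String, f64>, // e.g. the gcc-built binary
) -> Vec<(String, f64)> {
    let mut diffs: Vec<(String, f64)> = a
        .iter()
        .filter_map(|(name, &ra)| b.get(name).map(|&rb| (name.clone(), ra - rb)))
        .collect();
    // Most negative difference first: build `a`'s worst functions lead.
    diffs.sort_by(|x, y| x.1.partial_cmp(&y.1).unwrap());
    diffs
}
```

Functions that appear in only one binary (e.g. fully inlined away in the other) are simply skipped here; how to weigh those is a judgment call.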

Designing and implementing this kind of analysis is error-prone. I've made my analysis tool source code available, so feel free to point out any improvements that could be made.

Update Helpful people on Twitter pointed me to some other excellent work in this area. Dexter is another tool for measuring debuginfo quality; it's much more thorough than mine, but it's less scalable and depends on a particular program execution. I think it complements my work nicely. It has led to ongoing work to improve LLVM debuginfo. There is also CheckDebugify infrastructure in LLVM to detect loss of debuginfo, which is also driving improvements. Alexandre Oliva has an excellent writeup of what gcc does to preserve debuginfo through optimization passes.

Update #2 It turns out llvm-dwarfdump has a --statistics option which measures something very similar to what I'm measuring. One difference is that if a variable has any definitions at all, llvm-dwarfdump treats the program point where it's first defined as the start of its scope; that's an assumption I didn't want to make. There is a graph of this metric over the last 5.5 years of clang, using clang 3.4 as a benchmark. It shows that things got really bad a couple of years ago but have since been improving.

Sunday, 4 November 2018

What Is "Evil" Anyway?

I found this Twitter thread insightful, given its assumptions. I think that, perhaps inadvertently, it highlights the difficulties of honest discussion of evil in a secular context. The author laments:

It is beyond us, today, to conclude that we have enemies whose moral universe is such that loyalty to our own morality requires us to understand it and them as evil.
That is, evil means moral principles (and the people who hold them) which are incompatible with our own. That definition is honest and logical, and I think probably the best one can do under physicalist assumptions. Unfortunately it makes evil entirely subjective; it means other people can accurately describe us and our principles as evil in just the same way as we describe them as evil. All "evil" must be qualified as "evil according to me" or "evil according to you".

This is a major problem because (almost?) nobody actually thinks or talks about evil this way in day to day life, neither explicitly nor implicitly. Instead we think and act as if "evil" is an objective fact independent of the observer. Try tacking on to every expression of moral outrage the caveat "... but I acknowledge that other people have different moral assumptions which are objectively just as valid". It doesn't work.

Christians and many other monotheists avoid this problem by identifying a privileged frame of moral reference: God's. Our moral universe may or may not align with God's, or perhaps we don't care, or we may have trouble determining what God's is, but at least it lets us define evil objectively.

The Twitter thread raises a further issue: when one encounters evil people — people whose moral universe is incompatible with our own — what shall we do? Without a privileged frame of moral reference, one can't honestly seek to show them they are wrong. At best one can use non-rational means, including force, to encourage them to change their assumptions, or if that fails, perhaps they can be suppressed and their evil neutralized. This too is most unsatisfactory.

The Christian worldview is a lot more hopeful. We believe in a standard by which all will be measured. We believe in God's justice for transgressions. We believe in redemption through Jesus for those who fall short (i.e. everyone). We seek to love those who (for now) reject God's moral universe ... a group which sometimes includes ourselves. We see that even those most opposed or indifferent to God's purposes can change. These beliefs are not purely subjective, but grounded in objective truths about what God has done and is doing in the world.