Category Archives: Opinion

Where now for AofA?

For the first time in several years I will not be attending the annual AofA workshop/conference. This year it is being held in Brazil at a time that is inconvenient for me, though the scientific programme looks quite interesting.

I have never seen a full history of the field of analysis of algorithms (the SODA talk by Philippe Flajolet has some interesting comments). A rough summary of what I believe to be the case: before Knuth came along, people analysed algorithms by implementing them and guessing their asymptotic runtime via extrapolation. Knuth changed this by showing that serious classical mathematics could be brought to bear on the analysis of algorithms, and that this ultimately led to models predicting, to high accuracy, the real running time on large problems. Although many researchers in the area of algorithms (the “big-O crowd”) are content with less precise estimates, I am sure they were happy that such analysis was possible.
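The “implement and extrapolate” style can be sketched in a few lines. This is a toy of my own construction, not taken from any historical source: count the basic operations of insertion sort at successive doublings of n, and estimate the growth exponent from the log of each ratio.

```python
import math
import random

def insertion_sort_ops(a):
    """Sort a copy of a, returning the number of key comparisons made."""
    a = list(a)
    ops = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            ops += 1              # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return ops

random.seed(1)
prev = None
for n in [500, 1000, 2000, 4000]:
    ops = insertion_sort_ops([random.random() for _ in range(n)])
    if prev is not None:
        # doubling ratio: log2(T(2n)/T(n)) estimates the exponent (near 2 here)
        print(n, round(math.log2(ops / prev), 2))
    prev = ops
```

The doubling ratios point at quadratic growth without any proof, which is exactly the kind of empirical guesswork that Knuth’s exact analyses replaced.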

Recently I have seen more and more examples of situations where this kind of analysis is called into question. The main problems seem to be caching and branch prediction, which ensure that the time to execute an “elementary” operation can vary wildly. A relatively small number of papers have appeared that discuss these issues (after all, it should not be beyond the wit of man to say something analytical in these situations) but the results are necessarily less precise than in the old days (for a start, even finding out the branch prediction algorithm used by chip makers is not trivial). Let’s see where the next few years lead.

Elitism: throw out the bathwater, keep the baby

Denis Dutton was a lecturer for a first-year Philosophy course I took over 20 years ago at the University of Canterbury. At the time I was fascinated by his lecture style (“Aristotle was the last person who knew everything”, or equivalent turn of phrase, is what I recall), as well as his very peculiar pronunciation of the word “character” (it sounded like “caric-chuh”). Years later I found out that there was more to him when he started the Arts and Letters Daily website, which he still edits, though it is now owned by the Chronicle of Higher Education.

I still read A&LD frequently. There is a clear slant toward ideas that go against conventional wisdom, and particularly those that ridicule the excesses of “postmodern” (for want of a better word) thinkers in the humanities. Of course, many of the articles are elegant time-wasters, but the best ones are really worth reading.

He has just published an essay on elitism, which discusses what evolutionary psychology has to say about our anti-elitist instincts. After discussing hunter-gatherer group organization and zero-sum economics in the distant past, he says

“It is part of our Pleistocene inheritance that many people will resent the elitist values they associate with the rich, whether it’s the opera, chardonnay, gallery openings, being able to distinguish between words such as criterion and criteria, using apostrophes properly or spouting an apposite quote from Shakespeare off the top of your head.”

I find the evolutionary stuff interesting, but for the purposes of this post I want to focus on the last point: one should not throw the baby (higher culture and luxuries) out with the bathwater (rich and powerful rulers). The early trade unions and worker organizations represented people who worked in factories and were denied a proper education. They aimed to offer these people an expanded mind and an appreciation of things normally available only to the rich, such as poetry, opera and philosophy. This was hundreds of years ago, and we are all now (in the “first world”) essentially at least as rich as kings were back then. Yet how many people, despite all their leisure, connect with the higher elements of our culture today? Far too few, I venture to say, though I don’t have clear proof.

The rich and powerful have done terrible things in the past, but they have always patronized great artists and thinkers. The products of that sponsorship are now available to us all as a result of great struggles. I think we should be thankful that we live now, and take advantage of these hard-won fruits. But I am afraid that too few people are doing so. There are a lot of distractions in the modern world, like “reality TV” and media-fuelled celebrity obsession. And I do detect, at least in the Anglosphere, a real dumbing down of public discourse, as exemplified by careless use of language. Politicians seem to think nothing of cavorting onstage to Rolling Stones music, but Mozart or opera seem impossible. I guess politicians don’t want to be seen as “elitist”. But having sympathy for the masses does not mean we have to share all their interests, surely. I come from a working-class background and am the first person ever in my family tree to go to university, so I have no time for hereditary privilege or the feelings of superiority of the idle rich. That is not the same as wanting to reject all the good things that a privileged upbringing can yield.

Waiting on Godot?

When I was a child, New Zealand had more British influence than it does today, at least as far as popular culture goes. We watched American TV, but relatively few people visited the USA, and it seemed very different. Britain was much more familiar.

I have recently noticed a large increase in Americanisms here. I had an argument at a petrol station with an attendant who insisted on calling it “gas” (this is one of the sillier uses the US has given us – it is a liquid). Yesterday I talked to a call centre operator who told me she was “waiting on the computer” (I wanted to ask whether this was comfortable, but decided this would just confuse the poor benighted and easily influenced soul). At a lower level, I have seen the word “ass” in common use here, but it was always “arse” when I was growing up.

I have nothing against change in language for logical reasons. The head of my department doesn’t know the difference between “criterion” and “criteria”, but such things could be changed (“criterions”?) and have been changed in the past to make them easier to remember. What I find annoying is slavish adherence to foreign influence for no other reason than that we hear it on TV. “Waiting on” makes much less sense than “waiting for” in the context of a delay, since it already has a meaning: “I was waiting on Godot, and he was waiting on the pier” seems a bit confusing. So why adopt it?

Why (some) mathematics is hard

Here’s an interesting article by Jim Holt in the New Yorker about the work of Stanislas Dehaene on the number sense in humans. From studying accident victims, children, Amazon tribes, etc, he concludes:

… we are all born with an evolutionarily ancient mathematical instinct. To become numerate, children must capitalize on this instinct, but they must also unlearn certain tendencies that were helpful to our primate ancestors but that clash with skills needed today. And some societies are evidently better than others at getting kids to do this.

When it comes to mathematics education, what he says is both standard:

“Give a calculator to a five-year-old, and you will teach him how to make friends with numbers instead of despising them,” he has written. By removing the need to spend hundreds of hours memorizing boring procedures, he says, calculators can free children to concentrate on the meaning of these procedures, which is neglected under the educational status quo.

and nonstandard (as far as I know, these days):

“The idea that all children are different, and that they need to discover things their own way—I don’t buy it at all,” he said. “I believe there is one brain organization. We see it in babies, we see it in adults. Basically, with a few variations, we’re all travelling on the same road.” He admires the mathematics curricula of Asian countries like China and Japan, which provide children with a highly structured experience, anticipating the kind of responses they make at each stage and presenting them with challenges designed to minimize the number of errors.

A very interesting fact is that the encoding of numbers in a language does seem to affect the efficiency of arithmetic operations. Chinese number names are more regular, and shorter, than those of, say, English or French, and this is reflected in the arithmetic performance of native speakers of these languages.
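The regularity is easy to illustrate. The sketch below (my own toy, not from Dehaene’s work) builds Mandarin names for two-digit numbers from a single rule — “tens-digit, shi, units-digit” — where English needs special cases like “eleven”, “twelve” and “twenty”. Pinyin is used with tone marks omitted.

```python
# Mandarin digit names in pinyin, tones omitted
DIGITS = ["ling", "yi", "er", "san", "si", "wu", "liu", "qi", "ba", "jiu"]

def mandarin(n):
    """Name for 10..99, built from one rule with no special cases:
    optional tens digit, then 'shi' (ten), then optional units digit."""
    tens, units = divmod(n, 10)
    name = ("" if tens == 1 else DIGITS[tens] + " ") + "shi"
    if units:
        name += " " + DIGITS[units]
    return name

print(mandarin(11))  # "shi yi"     (ten-one)
print(mandarin(24))  # "er shi si"  (two-ten-four)
```

A child who knows ten digit names and one rule can name every number up to 99; the English learner must memorize a pile of irregular forms first.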

The (un)importance of theory

I went to a talk by Mike Powell (Cambridge University) yesterday. He is a well-known worker in approximation theory and optimization. His title, “The unimportance of theory in the development of optimization algorithms” certainly provoked me to attend.

As I understood his thesis, many practically useful algorithms have been developed for optimization problems (we are talking about continuous stuff here, not the discrete algorithms beloved of CS theorists), despite a lack of proper convergence theory. People (may) have been deterred from publishing good algorithms for lack of such theory. Also, the available worst-case bounds do not predict the running time on typical problem instances.

This of course is old news if you know about quicksort, or indeed almost any algorithm. In 2003 Robert Sedgewick told me about experiments he undertook when writing the latest version of his series of textbooks on algorithms. The known upper bounds turned out to be useless for comparing algorithms, since they were so huge compared to the typical runtime. This is why the field of analysis of algorithms has existed and developed since Knuth founded it in 1962.
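The quicksort gap is easy to see in a toy experiment (a sketch of my own, not Sedgewick’s actual experiments). For quicksort with a first-element pivot, the worst case — an already-sorted input — costs about n²/2 comparisons, while a typical random input costs about 2n ln n, smaller by orders of magnitude even at modest n.

```python
import math
import random

def quicksort_comparisons(a):
    """Count element comparisons made by quicksort with first-element pivot."""
    if len(a) <= 1:
        return 0
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    # len(rest) comparisons to partition, plus the recursive costs
    return len(rest) + quicksort_comparisons(left) + quicksort_comparisons(right)

n = 2000
random.seed(0)
typical = quicksort_comparisons([random.random() for _ in range(n)])
# sorted input drives this pivot rule to its worst case of n(n-1)/2 comparisons
# (computed analytically here to avoid deep recursion)
worst = n * (n - 1) // 2
print(typical, worst)  # typical is tens of thousands; worst is about two million
```

Judging this quicksort by its worst-case bound would condemn an algorithm that is excellent on typical inputs, which is Sedgewick’s point in miniature.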

But there is a big difference, it seems to me. For algorithms like quicksort, correctness is easy to prove and we are only discussing running time. But for optimization algorithms as discussed above, as far as I understand it, correctness can only be established by convergence theory. When I raised this, I was told that the algorithms in question had been validated by checking them on test cases. If I were relying on an algorithm to run a nuclear power station or fly a spacecraft, I would not be reassured by this.

The right approach to a poor theory is to find a better theory, not to claim that theories are not useful.

The remarkable Donald Knuth

On the occasion (today in the USA) of Don Knuth’s 70th birthday, several bloggers have posted tributes (see Jeffrey Shallit’s list). His list of achievements is really impressive, and he has had a big influence on me. To name a few areas, somewhat facetiously: his founding of the field of analysis of algorithms (AofA) led to the development of a community of people who are heavy users of generating function methodology and therefore may be interested in my research in that area, and it gave me something nice to teach as a mathematician moving into a CS department! More seriously, I can’t conceive of computer science as a real science without proper analysis of algorithms, and I would hate to have to write papers without using TeX.

Since unalloyed praise can get a bit cloying, I will mention that it might have been better for the world if Knuth (or someone else) had been able to build a school of AofA in the USA. The algorithms community in the USA is heavily dominated by “worst-case” analysis and indeed by analysis of problem complexity, which is a very weak form of analysis of algorithms, concentrating as it does on lower bounds and rather weak (many would say useless) upper bounds.

The main development of AofA along Knuthian lines has been in Europe, in France (Philippe Flajolet and others) and Austria particularly. In recent years Bob Sedgewick and others have tried to bridge the gap between these communities through meetings such as ANALCO, but at least to my eyes they have not yet had great success.

Of course some may say that the sort of very detailed analysis by Knuth is unnecessary for a working computer scientist. And even if his analyses made sense in the days of flat memory, the prevalence of such things as caching makes it impossible to make such detailed predictions nowadays. But perhaps we just need some better mathematics.

Goodness, greatness

During an enforced period of recuperation I have been thinking about various “big issues” in life. One is the relation between goodness, by which I mean consistent behaviour according to a moral code that places a high weight on the welfare of others, and greatness, by which I mean consistent highly skilled performance at some activity.

The “myth of monolithic personality” described by Gian-Carlo Rota (every great person is also good, and great at everything) was probably created by rulers seeking to show their subjects how godlike they were. I doubt that many people believe this nowadays. In fact the opposite idea, that goodness is inversely correlated with greatness, is surely more common. It seems plausible: after all, we only have so much energy, and acquiring great skill requires a lot of time spent practicing, not to mention introspection.

In the musical Chess, by Tim Rice and half of ABBA, the Russian (world chess champion) and his ex-wife trade recriminations which express this (perceived) selfishness of the great:

“Anyone can be a husband, lover / sooner them than me, when they discover / domestic bliss is shelter for their failing / Nothing could be worse than self-denial / having to rehearse the endless trial / of a partner’s rather sad demands prevailing”

“As you watch yourself caring / about a minor sporting triumph sharing / your win with esoterics, paranoids, hysterics / who don’t pay attention to / what goes on around them, they leave the ones they loved the way they found them / a normal person must dismiss you with disgust / and weep for those who trusted you”.

My question is: what theoretical reasons are there for this alleged inverse correlation? There are numerous examples of high achievers who appear to have happy personal lives, and of course many who don’t. Are there hidden variables here? For example, the type of endeavour in which greatness is being sought (how competitive is it, intrinsically?), or the performance level relative to maximum potential (are the really top people less stressed, and therefore nicer, than those just a little below, who don’t have as much talent and compensate by selfish overwork?). In other words, could we have a decent theory that answers: when, and why, is a genius likely to be a jerk?

Public relations and social capital for nerds

I just went to a nice talk by Michael Trick at the 42nd ORSNZ Conference being held at the University of Auckland today and tomorrow. The gist of it was that while Operations Research as a subject is doing really well, with many interesting applications arising in the service sector of the economy in particular, the OR societies are not thriving and it does not have a clear “brand” at least in the public mind. This is partly because of the “Bowling Alone” phenomenon described by Robert Putnam (less engagement in any sort of civic activity by younger people), partly because the very success of the field makes it more likely to fragment, and partly because of a lack of effort at cohesion. He urged the use of new technology (blogs, etc) to try to create more sense of community and focus.

The theoretical computer science community have realised that even within computer science their contribution is greatly misunderstood (see the Theory Matters wiki), and convincing the public is likely to be much harder. I would love to hear from anyone who knows an effective way to correct such misperceptions, which cause evils like the job market time lag; we academics really need to attack them more seriously.

Research funding in New Zealand: more money, less form-filling needed

I recently came across a submission by Nicholas Wormald (who a long time ago worked in the building where I am now) to the 2005/6 National Strategic Review of Mathematical Sciences in Australia. Two things stood out for me: how similar Australian government ideas are to those in New Zealand, and how different they seem to be from those in Canada. The rejection rate for the only serious fund for “pure” research in NZ, the Marsden Fund, is about 93%, resulting in much cynicism and quite likely a Matthew effect (one of my pet hates – I would love to see evidence that giving so much more money to the already successful is somehow better for science overall). It seems clear that many proposals are rejected simply for lack of funding, not because of any problem with the proposal. The Canadian NSERC system appears much more enlightened: smaller grants, spread more widely, with less paperwork.

If it makes sense to do so, please sign the Marsden Fund Open Letter from the New Zealand Association of Scientists, requesting a tripling of the fund.

The job market time lag: why?

Yet another press article has come out (Tech sector needs to be sexier) saying that the computer industry has a serious skills shortage. At the same time, enrolments in CS have been dropping for several years, not just in New Zealand but worldwide. Again we are being told that parents and guidance counsellors are giving poor advice to high school students, and not explaining just how strong their prospects of interesting and well-paid jobs are.

Is there a coherent theory that explains this sort of mismatch? It has been several years now – why do parents and guidance counsellors take so long to get the message? Don’t they look at any evidence and predictions by analysts? On what do they base their ideas? It seems rather odd to give advice to teenagers that may have a major impact on their future without using some rigorous analysis. Or am I missing the point?