Author Archives: wilson.mark.c

Where now for A of A?

For the first time in several years I will not be attending the annual AofA workshop/conference. This year it is being held in Brazil at an inconvenient time for me. The scientific programme looks quite interesting.

I have never seen a full history of the field of analysis of algorithms (the SODA talk by Philippe Flajolet has some interesting comments). A rough summary of what I believe to be the case: before Knuth came along, people analysed algorithms by implementing them and guessing their asymptotic runtime via extrapolation. Knuth changed this by showing that serious classical mathematics could be brought to bear on the analysis of algorithms, ultimately yielding models that predict to high accuracy the real running time on large problems. Although many researchers in the area of algorithms (the “big-O crowd”) are content with less precise estimates, I am sure they were happy that such analysis was possible.

Recently I have seen more and more examples of situations where this kind of analysis is called into question. The main problems seem to be caching and branch prediction, which mean that the time to execute an “elementary” operation can vary wildly. A relatively small number of papers have appeared that discuss these issues (after all, it should not be beyond the wit of man to say something analytical in these situations), but the results are necessarily less precise than in the old days (for a start, even finding out the branch prediction algorithms used by chip makers is not trivial). Let’s see where the next few years lead.
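To make the branch prediction point concrete, here is a minimal sketch of my own (not taken from any of the papers alluded to above): it simulates a 2-bit saturating-counter predictor, a common textbook model of the hardware, and shows that the very same branch pattern costs almost nothing on sorted data but mispredicts about half the time on random data.

```python
import random

def mispredictions(branches):
    # 2-bit saturating counter: a standard textbook model of a hardware
    # branch predictor.  Predict "taken" when the counter is >= 2.
    counter, misses = 0, 0
    for taken in branches:
        if (counter >= 2) != taken:
            misses += 1
        counter = min(counter + 1, 3) if taken else max(counter - 1, 0)
    return misses

random.seed(0)
data = [random.random() for _ in range(10_000)]
branch = lambda xs: [v < 0.5 for v in xs]   # the branch "if v < 0.5: ..."

sorted_misses = mispredictions(branch(sorted(data)))
random_misses = mispredictions(branch(data))
print(sorted_misses, random_misses)  # a handful vs. roughly half of 10,000
```

On real hardware the difference shows up directly in running time, so the “cost” of an elementary comparison depends on the history of the data that reached it — exactly what the classical unit-cost analyses assume away.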

Bernard Chazelle to visit NZ

As part of the programme in Algorithmics sponsored by the NZIMA, Professor Bernard Chazelle of Princeton University (he also has his own Wikipedia article) will visit New Zealand to give some public talks in March 2009. I have found some of his writings very stimulating, if a little bewildering in their language. In particular his essay The Algorithm: Idiom of Modern Science is very persuasive. With luck, public appreciation of algorithms will be considerably higher after his visit than it is now!

Elitism: throw out the bathwater, keep the baby

Denis Dutton was a lecturer for a first-year Philosophy course I took over 20 years ago at the University of Canterbury. At the time I was fascinated by his lecture style (“Aristotle was the last person who knew everything”, or equivalent turn of phrase, is what I recall), as well as his very peculiar pronunciation of the word “character” (it sounded like “caric-chuh”). Years later I found out that there was more to him when he started the Arts and Letters Daily website, which he still edits, though it is now owned by the Chronicle of Higher Education.

I still read A&LD frequently. There is a clear slant toward ideas that go against conventional wisdom, and particularly those that ridicule the excesses of “postmodern” (for want of a better word) thinkers in the humanities. Of course, many of the articles are elegant time-wasters, but the best ones are really worth reading.

He has just published an essay on elitism, which discusses what evolutionary psychology has to say about our anti-elitist instincts. After discussing hunter-gatherer group organization and zero-sum economics in the distant past, he says

“It is part of our Pleistocene inheritance that many people will resent the elitist values they associate with the rich, whether it’s the opera, chardonnay, gallery openings, being able to distinguish between words such as criterion and criteria, using apostrophes properly or spouting an apposite quote from Shakespeare off the top of your head.”

I find the evolutionary stuff interesting, but for the purposes of this post I want to focus on the last point. The main lesson for me is that one should not throw the baby (higher culture and luxuries) out with the bathwater (rich and powerful rulers). The early trade unions and worker organizations represented people who worked in factories and were denied proper education. They aimed to offer these people an expanded mind and appreciation of things normally only available to the rich, such as poetry, opera, philosophy, etc. This was hundreds of years ago, and we are all now (in the “first world”) essentially at least as rich as kings were back then. Yet how many people, despite all their leisure, connect with the higher elements of our culture today? Far too few, I venture to say, though I don’t have clear proof.

The rich and powerful have done terrible things in the past, but they have always patronized great artists and thinkers. The products of that sponsorship are now available to us all as a result of great struggles. I think we should be thankful that we live now, and take advantage of these hard-won fruits. But I am afraid that too few people are doing so. There are a lot of distractions in the modern world, like “reality TV” and media-fuelled celebrity obsession. And I do detect, at least in the Anglosphere, a real dumbing down of public discourse, as exemplified by careless use of language. Politicians seem to think nothing of cavorting onstage to Rolling Stones music, but Mozart or opera seem impossible. I guess politicians don’t want to be seen as “elitist”. But having sympathy for the masses does not mean we have to share all their interests, surely. I come from a working-class background and am the first person ever in my family tree to go to university, so I have no time for hereditary privilege or the feelings of superiority of the idle rich. That is not the same as wanting to reject all the good things that a privileged upbringing can yield.

Waiting on Godot?

When I was a child, New Zealand had more British influence than it does today, at least as far as popular culture goes. We watched American TV, but relatively few people visited the USA, and it seemed very different. Britain was much more familiar.

I have recently noticed a large increase in Americanisms here. I had an argument at a petrol station with an attendant who insisted on calling it “gas” (this is one of the sillier uses the US has given us – it is a liquid). Yesterday I talked to a call centre operator who told me she was “waiting on the computer” (I wanted to ask whether this was comfortable, but decided this would just confuse the poor benighted and easily influenced soul). At a lower level, I have seen the word “ass” in common use here, but it was always “arse” when I was growing up.

I have nothing against change in language for logical reasons. The head of my department doesn’t know the difference between “criterion” and “criteria”, but such things could be changed (“criterions”?) and have been changed in the past to make them easier to remember. What I find annoying is slavish adherence to foreign influence for no other reason than that we hear it on TV. “Waiting on” makes much less sense than “waiting for” in the context of a delay, since it already has a meaning: “I was waiting on Godot, and he was waiting on the pier” seems a bit confusing. So why adopt it?

Why (some) mathematics is hard

Here’s an interesting article by Jim Holt in the New Yorker about the work of Stanislas Dehaene on the number sense in humans. From studying accident victims, children, Amazon tribes, etc., he concludes:

… we are all born with an evolutionarily ancient mathematical instinct. To become numerate, children must capitalize on this instinct, but they must also unlearn certain tendencies that were helpful to our primate ancestors but that clash with skills needed today. And some societies are evidently better than others at getting kids to do this.

When it comes to mathematics education, what he says is both standard:

“Give a calculator to a five-year-old, and you will teach him how to make friends with numbers instead of despising them,” he has written. By removing the need to spend hundreds of hours memorizing boring procedures, he says, calculators can free children to concentrate on the meaning of these procedures, which is neglected under the educational status quo.

and nonstandard (as far as I know, these days):

“The idea that all children are different, and that they need to discover things their own way—I don’t buy it at all,” he said. “I believe there is one brain organization. We see it in babies, we see it in adults. Basically, with a few variations, we’re all travelling on the same road.” He admires the mathematics curricula of Asian countries like China and Japan, which provide children with a highly structured experience, anticipating the kind of responses they make at each stage and presenting them with challenges designed to minimize the number of errors.

A very interesting fact is that the way a language encodes numbers does seem to affect the efficiency of arithmetic operations. Chinese number names are more regular, and shorter, than those of English or French, say, and this is reflected in the arithmetic performance of native speakers of these languages.

Humour: US election news

Check this out!

I read a lot of Noam Chomsky in my younger days. Not so much now, and I certainly have become less enthusiastic about his political ideas. But Manufacturing Consent I still regard as an excellent book. I suspect Noam wouldn’t find this sort of cynicism funny, but it made me smile.

Algorithms research in NZ

Last week I was at the NZIMA Algorithms meeting in Napier, which I helped to organize. There were 5 invited speakers, and all their talks (they gave 2 each) were well worth hearing. I took on the role of amateur cameraman, so most of the talks have full video footage, which I hope to make available on the web soon.

At a strategy session it was decided that we will try to create a more formal community for those researching in algorithms (interpreted broadly, from very pure to very applied). The mechanism will probably be to set up something like a Google group. Please contact me if you are interested in joining.

The (un)importance of theory

I went to a talk by Mike Powell (Cambridge University) yesterday. He is a well-known worker in approximation theory and optimization. His title, “The unimportance of theory in the development of optimization algorithms” certainly provoked me to attend.

As I understood his thesis, many practically useful algorithms have been developed for optimization problems (we are talking about continuous problems here, not the discrete algorithms beloved of CS theorists), despite the lack of a proper convergence theory. People (may) have been deterred from publishing good algorithms for lack of such theory. Also, the available worst-case bounds do not predict the running time on typical problem instances.

This of course is old news if you know about quicksort, or indeed almost any algorithm. In 2003 Robert Sedgewick told me about experiments he undertook when writing the latest version of his series of textbooks on algorithms. The known upper bounds turned out to be useless for comparing algorithms, since they were so huge compared to the typical runtime. This is why the field of analysis of algorithms has existed and developed since Knuth started it in 1962.
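An experiment of the kind Sedgewick describes is easy to reproduce. The sketch below (my own illustration, not his code) counts the comparisons made by a simple randomized quicksort on a random permutation and sets the observed cost against the worst-case bound of roughly $latex n^2/2$:

```python
import math
import random

def qs_comparisons(xs):
    # Comparisons made by a simple randomized quicksort.
    if len(xs) <= 1:
        return 0
    pivot = random.choice(xs)
    smaller = [v for v in xs if v < pivot]
    larger = [v for v in xs if v > pivot]
    comps = 2 * len(xs)  # two comparisons per element during partitioning
    return comps + qs_comparisons(smaller) + qs_comparisons(larger)

random.seed(1)
n = 10_000
observed = qs_comparisons(random.sample(range(n), n))
worst_case = n * (n - 1) // 2
print(observed, round(n * math.log2(n)), worst_case)
# observed cost tracks n log n; the worst-case bound is orders of magnitude larger
```

The worst-case figure is so far above the typical cost that it says nothing useful about which of two competing sorting algorithms will win in practice — which is exactly why average-case analysis is needed.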

But there is a big difference, it seems to me. For algorithms like quicksort, correctness is easy to prove and we are only discussing running time. But for optimization algorithms as discussed above, as far as I understand it, correctness can only be established by convergence theory. When I raised this, I was told that the algorithms in question had been validated by checking them on test cases. If I were relying on an algorithm to run a nuclear power station or fly a spacecraft, I would not be reassured by this.

The right approach to a poor theory is to find a better theory, not to claim that theories are not useful.

Commutative ring theory strikes again

A long time ago I tried to do research in (noncommutative) ring theory. Although I have not worked in the area this millennium, I am always glad to see algebraic methods used in novel ways. One of the most interesting authors to me is Jesus de Loera, who has been involved in LattE (a lattice point enumeration program that implements Barvinok’s algorithm), Groebner basis approaches to integer programming, and now the use of Hilbert’s Nullstellensatz to show that, for example, certain graphs are not 3-colorable.

The idea is that for a graph with $latex n$ vertices and $latex e$ edges, the property of being 3-colorable is equivalent to the solvability over $latex \mathbb{C}$ of a certain set of polynomial equations in $latex n + e$ variables. The Nullstellensatz (constructive form) says that if the set is not solvable, then a certificate can be found. The worst-case upper bounds on certificate degree are doubly exponential, but for this particular problem it appears that the degree is low rather often. The paper above has lots of clever tricks to reduce the computational effort. Great stuff!
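As a small illustration, here is a sketch (using SymPy, and a simplified standard encoding in vertex variables only, rather than the $latex n + e$-variable system of the paper): each vertex variable is forced to be a cube root of unity (the three “colors”), and each edge polynomial vanishes, given that constraint, exactly when its endpoints receive different colors. A Groebner basis equal to $latex \{1\}$ certifies that the system is unsolvable — here for the complete graph $latex K_4$, which indeed needs four colors.

```python
from sympy import symbols, groebner

# Vertex variables for K4; the three "colors" are the cube roots of unity.
x = list(symbols('x0 x1 x2 x3'))
edges = [(i, j) for i in range(4) for j in range(i + 1, 4)]  # all 6 edges

# x_i^3 - 1 = 0 forces each vertex to take one of the three colors.
polys = [xi**3 - 1 for xi in x]
# x_i^2 + x_i*x_j + x_j^2 = (x_i^3 - x_j^3)/(x_i - x_j) vanishes, given the
# constraints above, exactly when the endpoints receive different colors.
polys += [x[i]**2 + x[i]*x[j] + x[j]**2 for i, j in edges]

gb = groebner(polys, *x)
print(list(gb))  # [1]: the ideal is the whole ring, so K4 is not 3-colorable
```

The Groebner basis computation is doing the heavy lifting here; the point of the Nullstellensatz certificates is that for non-3-colorable graphs one can often get away with much lower-degree algebra than this worst case suggests.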