
Departmental happenings

My department has started a blog aimed at increasing public understanding of computer science and raising our profile. The NZ secondary school curriculum and the news media clearly conflate CS (computer science) with IT (information technology), and this must be resisted.

The department’s annual public lecture series is coming up. Details:

The Gibbons Lectures 2012: The Turing Legacy

The Department of Computer Science is delighted to announce the 2012 Gibbons Lectures. The lectures describe developments in Computer Science research and are aimed at a general but technical audience: Computer Science students at all levels, IT practitioners, and people in other departments and outside the University. The lecture series is in memory of Associate Professor Peter Gibbons, who sadly passed away early in 2008.

2012 Synopsis:
Alan Matheson Turing was born in 1912. It is now widely accepted that he was one of the most important founders of both theoretical and practical computing, although he died in 1954 just when the field of computing was getting underway. Many of his contributions were not widely recognized at first, but this year, the hundredth anniversary of his birth, he is being celebrated by a series of events organized world-wide. We are joining the festivities by devoting this year’s Gibbons lectures to Turing and his influence.

Turing’s work was the basis for many areas of computing research and development that are still ongoing. In New Zealand, and in our Department, we have researchers who are experts in some of these areas, as well as experts on Turing and his achievements. The four Gibbons lectures this year will feature University of Auckland speakers discussing four topics, roughly in the order of Turing’s involvement with them during his lifetime.

Presented in association with the New Zealand Computer Society.

Schedule of lectures:
Thursday 26 April: Alan Turing and the Unsolvable Problem: To Halt or Not to Halt – That is the Question (Professor Cristian Calude)
Thursday 3 May: Alan Turing and the Secret Cyphers: Breaking the German Codes at Bletchley Park (Professor Jack Copeland). NOTE: this lecture was recorded in Nov 2011 and will be webcast only, at http://www.cs.auckland.ac.nz/our_department/Gibbons_Lectures/
Thursday 10 May: Alan Turing and the Computing Engine: Turing’s achievements in practical computing (Professor Brian Carpenter and Professor Bob Doran)
Thursday 17 May: Alan Turing and the Artificial Brain: The Development of Artificial Intelligence (Associate Professor Ian Watson)

Where: University of Auckland Conference Centre, 22 Symonds St, Building/room 423-342
When: 5.30pm for refreshments, prior to a 6.00pm start.
Please RSVP to robyn@cs.auckland.ac.nz for catering purposes.

Further information and abstracts are available on the Gibbons Lectures webpage at www.cs.auckland.ac.nz/our_department/Gibbons_Lectures/
Video streaming will be available via the webpage.

RIP Paul Callaghan

Dead of cancer at 64. I never met Paul Callaghan, New Zealand’s greatest public intellectual and by all accounts a thoroughly good human being. His vision for this country struck a chord with me. A basic obituary is in the NZ Herald.  Let’s hope that his work is not wasted. Productivity growth through better use of the mathematical and physical sciences is what we need.

Beyond the boycott

The boycott against Elsevier has been interesting so far. I have had some discussions with colleagues, most of whom have not signed, and I am still struggling to detect any principled reason for not signing. Worry about destroying the “brand” of the journal (from members of an editorial board of an Elsevier journal) is the only one I can sympathize with, although it should be noted that (in my opinion, at least) Elsevier has subtracted value from these brands after buying them from other publishers. Very few people have stated their reasons for not signing – Ariel Procaccia is a welcome exception. I suspect many researchers don’t give the issue more than a passing thought, others are too timid, and others like a free ride. Perhaps a few really think the current system is good, but in that case I would have to question their fitness for research work. The main feedback I have had informally is that it is all too hard: Elsevier owns the “best” journals in my field, I am an editor of one of their journals and I like it, and so on. One interesting opinion is that although the current system is a problem, it will soon be fixed by advances in technology. I wish I could be confident of that.

In any case, Elsevier has made some grudging concessions, so I guess fewer people will now feel pressure to sign up. However, researchers shouldn’t have to waste their time to extract such small concessions. The current system is clearly unsustainable, and it seems almost impossible to imagine the commercial incentives of a company such as Elsevier ever allowing it to do what is right for society, or for science and scholarship as a whole. Thus alternatives must be explored, and now is an important time for discussion. One forum for such discussions is Math2.0, which I read often and recommend highly.

Separating out the many related issues takes time and I will write several posts. For now, I have some concrete recommendations for mathematical researchers, none of them original to me. Many take very little effort.

  • Sign the boycott petition, or at least don’t work for journals that are exploiting free labour to make large profits.
  • Practice self-archiving rigorously. Use the arXiv and update your paper to the final accepted version. List all your papers on your own webpage. Encourage all colleagues to do the same – if you want to read their paper and can’t find it on their website, ask them to put it there.
  • Familiarize yourself with your publisher’s policies on author self-archiving, and stand up for your rights as an author (see this important article by Kristine Fowler in the Notices of the American Mathematical Society).
  • When citing a journal publication, also give the arXiv version of the paper (if they are essentially the same).
  • Encourage those involved in hiring and evaluation at your institution to ignore journal impact factors and use article-level metrics and other more nuanced measures.
  • Encourage granting agencies you deal with to require open access to all publications associated with grants they award.
  • If you are on the editorial board of a journal run by Elsevier or the like, talk to your co-editors about moving to another publisher with better practices, or at least register your displeasure with the current publisher and tell them they need to change if they want to keep you. Discuss with editors of other journals, and share approaches that work.
  • Find out what your professional society (AMS, IMU, …)  is doing about these issues, and whether its publishing arm is helping the cause of open access, or harming it. Get involved on committees in the organization where possible.
  • Talk to your institution’s librarians. Find out what they can offer in terms of institutional repositories, hosting journals, etc.

The Elsevier boycott

There is a lot of information available about this, including the petition page, the official statement explaining the petition, and the PolyMath journal publishing reform page. I won’t repeat that here. This is just a note to say that, after an internal struggle, I have signed the petition. Some of the details of this publisher’s bad behaviour were new and rather shocking to me. It is clear that a new system for dissemination, archiving, and evaluation of research is needed, and this boycott looks like a necessary, though not sufficient, step.

A downside to double blind review

A discussion started by Daniel Lemire reminded me of this issue. I recently participated in refereeing for a CS conference using double-blind review. I noticed an issue that I have not seen mentioned before (for example in the IMS Ad Hoc committee report). Several papers were not accepted, but they had some good ideas. If I now write a paper building on these ideas, I have no idea whom to cite or credit. I suppose I could ask the programme chair to put us in contact, but of course that may not occur for a long time (especially given the speed at which I can write papers these days). Still, this seems an obvious drawback not shared by single-blind review.

Prometheus

I don’t claim to resemble him in every way, but his unfortunate fate came to mind in the last few months as I suffered from recurrent corneal erosion. A short description: imagine that, intermittently, 10-20% of the skin of your leg is torn off and grows back within a few days. Now imagine that instead of the leg it is your cornea, that it happens only at night, and that it is your own eyelid sticking to the cornea and tearing off the top layer. This is certainly the most annoying, painful and debilitating condition I have come across that is essentially trivial, in the sense that it won’t cause any permanent damage. Doing anything with the eye for hours or days after an attack is very difficult, and for me going even an hour without reading is unusual, let alone several days.

Apparently there is a good chance of simple treatment succeeding, and more complicated options exist.

Trip to Iran

I have just returned home after 5 weeks in Iran, mostly for a family vacation. There were many interesting experiences, a lot of them positive. Skiing at about the height of Mt Cook just a few minutes from the city was a lot of fun, as was a bus trip to the ancient city of Kashan, which has had inhabitants nearby for the last 8000 years. Meeting relatives (by marriage) and some old and new friends, buying bread from the neighbourhood bakery, and eating the very high quality and cheap pastries were other highlights.

Of course the political and economic situation is very bad and getting worse. Taxi drivers (of whom we used a very large number – there was no way I was going to try driving, for reasons discussed below, and buses are not that convenient) almost uniformly blamed the government and had no faith in the idea of an Islamic Republic. The rial depreciated sharply against the US dollar and other currencies during our trip. The government appears to be using internet “filtering” (censorship) to prevent discussion of anything important, although, as usual, not very competently: the NZ Herald website was unavailable, while the NY Times was easily accessible. Most internet-savvy people I met were well aware of how to bypass the filters. The drawback is that speeds are reduced to the point where some services don’t really work (I couldn’t see anything on YouTube).

Iran seems to be the opposite of Sweden in the area of design. Door handles are loose, and when they are tightened the locks no longer work. Light switches routinely dangle from the wall with wiring exposed. Quality control in building seems to be the exception rather than the rule.

Tehran seems to be one big health and safety violation. Some highlights I noticed: most taxi drivers have disabled the seatbelts in the back, and seem bemused or offended if this is pointed out (my favourite was the driver who promised to drive carefully and slowly on our trip home, although most of the trip was on the motorway); exposed wiring is visible in many public places; and construction workers use welding torches with no eye protection. The most obvious health hazard is the foul, stinking air, which is at its worst at this time of year.

Many of the health hazards are traffic-related (my guess is that the major contribution to the air pollution comes from vehicles, many of which seem to be of low quality and run on low-grade fuel). Lane markings are completely ignored, and traffic proceeds (slowly in most places) by a process of everyone trying to claim the right of way and then negotiating silently with the other drivers, who by now are within a few centimetres of the car, approaching at an arbitrary angle. Apparently Iran has four times the death rate from traffic accidents that NZ has, yet we saw only a couple of accidents while there. The low speeds in Tehran must make fatal accidents less common there, so I wonder how bad it must be in other places.

This leads to the idea of the rule of law, and the difference between writing laws and enforcing them. A particular example is local government rules on footpaths: in the hilly areas essentially no footpath is wheelchair accessible, because there are steps every few metres (presumably each household deals with its own vehicle crossing, and no one checks that the crossings mesh together into a usable path).

It is hard to escape the conclusion that a country with a lot of natural resources, great potential for tourism, and a strong and enduring culture with (as yet) strong family units is being let down by an appallingly bad system of governance. I don’t subscribe to the view that just because the country has never had a decent government in 2500 years it can’t have one now, but it does seem that a change of culture is needed. Judging from some academic and industry contacts we made, that change is under way in some sectors a lot faster than we expected, but it will still take considerable time.

2011 referendum simulator: experience so far

Several months ago I realized that the 2011 referendum in NZ on the voting system for parliamentary elections was coming soon. Geoff Pritchard and I developed a simulator with the aim of enabling voters to understand the consequences of a change to another system. In order to do this in a way that is useful to the non-expert, some simplifying assumptions must be made. We had substantial media coverage and some criticism.

Initial surprises:

  • How few people bothered to read the detailed FAQ before criticizing.
  • How many people thought that the simulator was trying to “back-cast” historical elections, and were certain that our results were unrealistic, without giving any evidence.
  • How much the criticisms, even unfounded ones, helped to clarify my understanding of what we had actually done, and suggested further research.
  • How short the attention span of internet visitors is.

In particular I want to respond to comments by David Farrar on his well-known site Kiwiblog. The relevant extract:

Now in 2002 National actually won 21 electorate seats out of 69 or 70. So this model is saying if there were 50 extra electorate seats, National would win 11 fewer seats!!

Why? Because they have come up with a formula based on the last 50 years or so of FPP elections, which they applied to the party vote figures for 2002. They ignored the actual electorate vote. It is a classic academic approach.

The more pragmatic approach, which is what others have done, is to say well if National won 21 electorate seats in 2002 out of 70, then if there 120 seats, their estimated number of seats would be 21*120/70, which is 36 seats.

In fact we did not look at any of the historical FPP elections. The “formula” is based completely on the MMP party vote from the 2008 election (so yes, we did ignore the electorate vote, for what we think are good reasons).

However, this got me thinking about how we might try to validate our assumptions. One way, which I don’t (yet) claim is rigorous but which makes at least as much sense as the above, is to apply the simulator (the FPP part) to the historical FPP elections, scaling the 120 seats down to the historical house size (80 for many years, then increasing over time). The results surprised me greatly, as they are much better than expected, and this cries out for explanation (“further research”, always good for academics). Here they are, with a sketch of the scaling step after the list. Note that these simulator results explicitly do not use any historical data – seat boundaries and parties have changed, etc.

1969: Real was Nat 45, Lab 39; simulator scaled was Nat 46.9, Lab 37.1
1972: Real was Nat 55, Lab 32; simulator scaled was Nat 54.4, Lab 31.6
1975: Real was Nat 55, Lab 32; simulator scaled was Nat 55.1, Lab 31.9
1978: Real was Nat 51, Lab 40, SoCred 1; simulator scaled was Nat 47.5, Lab 44.5
1981: Real was Nat 47, Lab 43, SoCred 2; simulator scaled was Nat 48.3, Lab 43.7
1984: Real was Nat 37, Lab 56, SoCred 2; simulator scaled was Nat 37, Lab 58
1987: Real was Lab 57, Nat 40; simulator scaled was Lab 50.2, Nat 46.8
1990: Real was Nat 67, Lab 29, NewLab 1; simulator scaled was Nat 71, Lab 26
1993: Real was Nat 50, Lab 45, NZF 2, Alliance 2; simulator scaled was Nat 53.6, Lab 45.4
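
For concreteness, here is a minimal sketch in Python of the scaling step described above. The function name and the input numbers are purely illustrative – this is not the simulator’s actual code or output – with the input chosen so that the scaled values match the 1969 row in the list.

```python
# A minimal sketch (not the actual simulator code) of the scaling step:
# seat counts for a notional 120-seat house are scaled down to the
# historical house size before comparison with the real result.

def scale_to_house(seats_120, house_size):
    """Scale seat counts from a 120-seat house to a historical house size."""
    factor = house_size / 120
    return {party: round(n * factor, 1) for party, n in seats_120.items()}

# Hypothetical 120-seat FPP-simulator output (illustrative numbers only),
# scaled to the 84-seat house of 1969 (84 = 45 + 39 in the real result).
hypothetical_output = {"Nat": 67.0, "Lab": 53.0}
print(scale_to_house(hypothetical_output, 84))  # {'Nat': 46.9, 'Lab': 37.1}
```

Note that the “pragmatic approach” quoted above is the same arithmetic run in the opposite direction: 21 electorate seats out of 70, scaled up to a 120-seat house, gives 21 × 120/70 = 36.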

Big O(micron) and Big Omega and Big Theta

Teaching an algorithms course recently, I realized that the correct use of these asymptotic notations is apparently rather difficult for students to grasp. In fact, I have some colleagues whose understanding of these concepts is not maximally clear.

In 1976 Donald Knuth published an article explaining these notations in detail and recommending their use. After 35 years, and plenty of evidence that the notation works well, we still see resistance. Perhaps one reason is that the idea of “asymptotically equivalent to”, or “of the exact order of”, is formalized by the symbol Theta, while O is a simpler and more familiar symbol that evokes the word “order”. The most common misuse I have seen is exactly this: using O when Theta is meant. The principle that the most important and useful concepts should have the simplest notation suggests that O would be the better symbol for “exact order” in computer science. However, that would be inconsistent with mathematical usage, where big-O is used for upper bounds only.
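
For reference, here is a quick sketch of the definitions as they are normally used in algorithm analysis (following Knuth’s recommendation), written in LaTeX for positive functions f and g of n:

```latex
% O is an upper bound, Omega a lower bound, and Theta ("exact order") is both.
% (align* and \text require \usepackage{amsmath}.)
\begin{align*}
f(n) = O(g(n))      &\iff \exists\, C > 0,\ n_0 \ \text{such that}\ f(n) \le C\,g(n) \ \text{for all}\ n \ge n_0, \\
f(n) = \Omega(g(n)) &\iff \exists\, C > 0,\ n_0 \ \text{such that}\ f(n) \ge C\,g(n) \ \text{for all}\ n \ge n_0, \\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n)).
\end{align*}
```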

The main mistake I have seen (also mentioned by Knuth), apart from the perhaps excusable one above, is not excusable: the use of O to discuss lower bounds, as in the example “Algorithm A is better than Algorithm B because A is known to run in time O(n log n) and B in time O(n^2)”. If O is replaced by Theta, this might make sense, but the literature is full of algorithms whose O-bounds are not asymptotically tight – how do we know B doesn’t also run in time O(n)? Again, if you think that O means Theta, this mistake is an easy one to make.
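
To spell the point out with the definitions above (the algorithms here are hypothetical), an O-bound alone says nothing about how slow an algorithm must be:

```latex
% n = O(n^2): take C = 1 and n_0 = 1 in the definition of O above.
% Hence an algorithm B whose running time is exactly, say, 3n also
% "runs in time O(n^2)", so the two stated O-bounds are perfectly
% consistent with B being asymptotically faster than A. Only a lower
% bound, such as B's running time being \Omega(n^2), would support
% the claim that A is better.
\[
  n = O(n^2) \qquad \mbox{(take } C = 1 \mbox{ and } n_0 = 1\mbox{)}.
\]
```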