For the first time in several years I will not be attending the annual AofA workshop/conference. This year it is being held in Brazil, at an inconvenient time for me, though the scientific programme looks quite interesting.
I have never seen a full history of the field of analysis of algorithms (the SODA talk by Philippe Flajolet has some interesting comments). A rough summary of what I believe to be the case: before Knuth came along, people analysed algorithms by implementing them and guessing their asymptotic runtime via extrapolation. Knuth changed this by showing that serious classical mathematics could be brought to bear on the analysis of algorithms, ultimately leading to models that predict, to high accuracy, the real running time on large problem instances. Although many researchers in the area of algorithms (the “big-O crowd”) are content with less precise estimates, I am sure they were happy that such analysis was possible.
Recently I have seen more and more examples of situations where this kind of analysis is called into question. The main problems seem to be caching and branch prediction, which mean that the time to execute an “elementary” operation can vary wildly. A relatively small number of papers have appeared discussing these issues (after all, it should not be beyond the wit of man to say something analytical in these situations), but the results are necessarily less precise than in the old days (for a start, even finding out which branch prediction algorithm a chip maker uses is not trivial). Let’s see where the next few years lead.
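To make the branch prediction point concrete, here is a small sketch (my own illustration, not from any of the papers mentioned) that simulates one common textbook predictor, a 2-bit saturating counter, on the branch `x < 128` taken over random data versus the same data sorted. The same set of comparisons produces wildly different misprediction counts, which is exactly why counting “elementary operations” no longer tells the whole timing story.

```python
import random

def mispredictions(outcomes):
    # Simulate a single 2-bit saturating counter predictor:
    # states 0-1 predict "not taken", states 2-3 predict "taken".
    state, misses = 2, 0
    for taken in outcomes:
        if (state >= 2) != taken:
            misses += 1
        # Move the counter toward the actual outcome, saturating at 0 and 3.
        state = min(state + 1, 3) if taken else max(state - 1, 0)
    return misses

random.seed(1)
data = [random.randrange(256) for _ in range(10_000)]

# The branch outcome sequence for "if x < 128" over the data.
branches_random = [x < 128 for x in data]
branches_sorted = [x < 128 for x in sorted(data)]

# Random order: roughly half the branches are mispredicted.
# Sorted order: only a handful of misses, at the True/False boundary.
print(mispredictions(branches_random), mispredictions(branches_sorted))
```

Real chips use far more elaborate (and mostly undocumented) predictors, but even this toy model shows how the cost of the same comparison depends on the history of the data, not just on how many comparisons are made.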