Theoretical Computer Science
cc.complexity-theory average-case-complexity smoothed-analysis
Updated Sun, 22 May 2022 03:07:19 GMT

Paradigms for complexity analysis of algorithms


Worst-case and average-case analysis are well-known measures for the complexity of an algorithm. Recently, smoothed analysis has emerged as another paradigm to explain why some algorithms that are exponential in the worst case nonetheless work so well in practice, for instance the simplex algorithm.

My question is: are there any other paradigms for measuring the complexity of an algorithm? I'm particularly interested in ones that attempt to explain why some algorithms with bad worst-case complexity work well in practice.




Solution

There are natural variants of worst-case analysis that are also useful. Perhaps the most famous one is parameterized complexity. Here, we consider a "two-dimensional" measure: the usual input length $n$ and an additional non-negative integer $k$, the parameter. Even though an algorithm may run horribly in the worst case (over all values of $n$ and $k$), it may be that on all the instances one actually needs to solve, the parameter $k$ happens to be small, so the algorithm runs fast on those instances.
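To make the two-dimensional measure concrete, here is a minimal Python sketch of the textbook parameterized algorithm: Vertex Cover parameterized by the size $k$ of the cover we look for. This is a standard illustration of the idea rather than anything specific to the question; the function names are just for this sketch, and vertex labels are assumed to be comparable (e.g. integers). The search tree has at most $2^k$ leaves, so $k$, not the size of the graph, dominates the running time.

```python
def has_vertex_cover(graph, k):
    """Decide whether `graph` (a symmetric adjacency dict: vertex -> set of
    neighbours) has a vertex cover of size at most k, by bounded search."""
    # Find any edge that is still present (and hence still uncovered).
    edge = next(((u, v) for u, nbrs in graph.items() for v in nbrs if u < v), None)
    if edge is None:
        return True            # no edges left: nothing remains to cover
    if k == 0:
        return False           # an edge remains but the budget is spent
    u, v = edge
    # Every vertex cover must pick u or v to cover this edge: branch on both.
    return (has_vertex_cover(_without(graph, u), k - 1)
            or has_vertex_cover(_without(graph, v), k - 1))

def _without(graph, x):
    """Copy of the graph with vertex x and its incident edges removed."""
    return {u: {w for w in nbrs if w != x} for u, nbrs in graph.items() if u != x}
```

For instance, `has_vertex_cover({1: {2}, 2: {1, 3}, 3: {2}}, 1)` returns `True`, since vertex $2$ covers both edges of that path, no matter how large a graph with a small cover we feed in.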

For example, suppose you want to solve Maximum Independent Set on some class of graphs, and you develop an interesting algorithm that is surprisingly fast. Looking more closely at the graphs themselves, you find that they all happen to have treewidth at most $10$. Well, Bodlaender (cf. Niedermeier [1]) showed that when the treewidth is $k$, Maximum Independent Set is fixed-parameter tractable: it can be solved in $O(2^k (|E|+|V|))$ time. This gives some explanation of why your algorithm works well.
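The algorithm behind this bound is a dynamic program over a tree decomposition: for each bag and each independent subset of that bag, store the size of the largest independent set in the subtree below that agrees with the subset on the bag. Below is a straightforward Python sketch of that idea; it assumes a tree decomposition is already given as input, uses representations chosen just for this sketch, and is not tuned to match the $O(2^k (|E|+|V|))$ bound exactly.

```python
from itertools import combinations

def treewidth_mis(graph, bags, tree_edges, root):
    """Maximum independent set size by dynamic programming over a given
    tree decomposition.

    graph      : dict vertex -> set of neighbours (symmetric)
    bags       : dict decomposition-node -> frozenset of graph vertices
    tree_edges : edges of the decomposition tree, as pairs of node ids
    root       : decomposition node to root the tree at
    """
    # Root the decomposition tree and record a top-down order.
    adj = {t: [] for t in bags}
    for a, b in tree_edges:
        adj[a].append(b)
        adj[b].append(a)
    parent, order, stack = {root: None}, [], [root]
    children = {t: [] for t in bags}
    while stack:
        t = stack.pop()
        order.append(t)
        for s in adj[t]:
            if s != parent[t]:
                parent[s] = t
                children[t].append(s)
                stack.append(s)

    def independent(S):
        return all(v not in graph[u] for u, v in combinations(S, 2))

    def subsets(bag):
        bag = list(bag)
        for r in range(len(bag) + 1):
            for S in combinations(bag, r):
                yield frozenset(S)

    # table[t][S]: size of the largest independent set in the subtree rooted
    # at t whose intersection with bag t is exactly S.
    table = {}
    for t in reversed(order):                  # leaves of the decomposition first
        table[t] = {}
        for S in subsets(bags[t]):
            if not independent(S):
                continue
            size = len(S)
            for ch in children[t]:
                common = bags[t] & bags[ch]
                # The child's choice must agree with S on the shared vertices;
                # subtract |Sc & S| so shared vertices are not counted twice.
                size += max(val - len(Sc & S)
                            for Sc, val in table[ch].items()
                            if Sc & common == S & common)
            table[t][S] = size
    return max(table[root].values())
```

On the path graph a-b-c-d with bags $\{a,b\}$, $\{b,c\}$, $\{c,d\}$ arranged along a path, it returns $2$, and the exponential factor in the running time depends only on the bag size (treewidth plus one), not on the number of vertices:

```python
graph = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
bags = {0: frozenset("ab"), 1: frozenset("bc"), 2: frozenset("cd")}
print(treewidth_mis(graph, bags, [(0, 1), (1, 2)], root=0))   # -> 2
```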

[1] R. Niedermeier, Invitation to Fixed-Parameter Algorithms, Oxford Lecture Series in Mathematics and Its Applications, Oxford University Press, Oxford, 2006.