My question is simple:
What is the worst-case running time of the best known algorithm for computing an eigendecomposition of an $n \times n$ matrix?
Does eigendecomposition reduce to matrix multiplication, or are the best known algorithms $O(n^3)$ (e.g., via the SVD) in the worst case?
Please note that I am asking for a worst-case analysis (in terms of $n$ only), not for bounds with problem-dependent quantities such as the condition number.
EDIT: Given some of the answers below, let me adjust the question: I'd be happy with an $\epsilon$-approximation. The approximation can be multiplicative, additive, entry-wise, or any other reasonable definition you'd like. I am interested in whether there is a known algorithm whose dependence on $n$ is better than something like $O(\mathrm{poly}(1/\epsilon)\, n^3)$.
EDIT 2: See this related question on symmetric matrices.
Ryan answered a similar question on MathOverflow.
Basically, you can reduce eigenvalue computation to matrix multiplication by computing a symbolic determinant (the characteristic polynomial). This gives a running time of $O(n^{\omega+1} m)$ to obtain $m$ bits of the eigenvalues; the best currently known bound is $O(n^3 + n^2 \log^2 n \log b)$ for an approximation within $2^{-b}$.
Ryan's reference is Victor Y. Pan, Zhao Q. Chen: "The Complexity of the Matrix Eigenproblem", STOC 1999, pp. 507–516.
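To make the reduction concrete, here is a minimal numerical sketch (my own illustration, not Pan–Chen's algorithm): the Faddeev–LeVerrier recurrence computes the coefficients of the characteristic polynomial $\det(\lambda I - A)$ using $n$ matrix products, so with fast matrix multiplication the arithmetic cost is $O(n^{\omega+1})$; the eigenvalues are then the roots of that polynomial. This is illustrative only — it is numerically unstable for large $n$, and `np.roots` itself works by diagonalizing a companion matrix.

```python
import numpy as np

def char_poly_coeffs(A):
    """Faddeev-LeVerrier recurrence for the characteristic polynomial of A.

    Uses n matrix multiplications, illustrating the reduction of the
    eigenproblem to matrix multiplication. Returns [1, c_{n-1}, ..., c_0]
    for lambda^n + c_{n-1} lambda^{n-1} + ... + c_0.
    """
    n = A.shape[0]
    coeffs = [1.0]
    M = np.zeros_like(A, dtype=float)
    for k in range(1, n + 1):
        M = A @ M + coeffs[-1] * np.eye(n)      # one matrix product per step
        coeffs.append(-np.trace(A @ M) / k)
    return coeffs

def approx_eigenvalues(A):
    # The eigenvalues of A are the roots of its characteristic polynomial.
    return np.roots(char_poly_coeffs(A))

A = np.array([[2.0, 1.0], [1.0, 2.0]])          # eigenvalues 1 and 3
print(sorted(approx_eigenvalues(A).real))
```

For the $2 \times 2$ example, the recurrence produces $\lambda^2 - 4\lambda + 3$, whose roots are the eigenvalues $1$ and $3$.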
(I believe there is also a discussion of the relationship between the complexity of the eigenvalue problem and matrix multiplication in the older Aho, Hopcroft, and Ullman book "The Design and Analysis of Computer Algorithms"; however, I don't have the book in front of me and can't give the exact page number.)