Theoretical Computer Science
quantum-computing circuit-complexity oracles
Updated Fri, 24 Jun 2022 07:01:14 GMT

Quantum oracle implementation overhead

I am a physicist getting acquainted with one of the typical constructs used in the formulation and analysis of quantum algorithms (such as search problems or query-complexity models), namely the "oracle function". As far as I understand, the oracle is a quantum black box which computes the quantity to be analyzed. For example, in a qubit-circuit formulation of a universal quantum computer, the oracle evaluates a deterministically computed Boolean function (correct me here if I'm wrong).

My question is:

  • Is quantum oracle implementation overhead counted when the computational complexity of specific/well-known algorithms is compared to alternatives?

Examples: the deterministic Deutsch-Jozsa algorithm is exponentially faster than its classical counterpart only if the implementation of the unknown function $f$ requires fewer than $2^n$ classical "steps" (including obtaining the guarantee that $f$ is either balanced or constant). A stochastic example: the collapse of the Grover speedup under a faulty oracle with an arbitrarily small but constant failure probability.
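To make the hidden cost concrete, here is a minimal classical simulation of the Deutsch-Jozsa circuit (a sketch, not real hardware; the function names are mine). The quantum algorithm makes a single oracle query, yet simulating, or compiling, that one query for a generic $f$ touches all $2^n$ amplitudes, which is exactly the implementation overhead the question asks about.

```python
import math

def deutsch_jozsa_sketch(f, n):
    """Classically simulate Deutsch-Jozsa with one phase-oracle query.

    f: Boolean function on n-bit integers, promised constant or balanced.
    Returns "constant" or "balanced".
    """
    N = 2 ** n
    # Start in |0...0> and apply H^n: uniform superposition.
    amp = [1 / math.sqrt(N)] * N
    # One oracle "query": phase kickback flips the sign wherever f(x) = 1.
    # Note the simulation evaluates f on all 2^n inputs for this single query.
    amp = [a * (-1) ** f(x) for x, a in enumerate(amp)]
    # Apply H^n again; the amplitude of |0...0> is sum(amp) / sqrt(N).
    a0 = sum(amp) / math.sqrt(N)
    # |a0| = 1 iff f is constant, |a0| = 0 iff f is balanced.
    return "constant" if abs(a0) > 0.5 else "balanced"

parity = lambda x: bin(x).count("1") % 2   # a balanced function
print(deutsch_jozsa_sketch(parity, 3))       # → balanced
print(deutsch_jozsa_sketch(lambda x: 1, 3))  # → constant
```

One query suffices on the quantum side, whereas a deterministic classical algorithm needs $2^{n-1}+1$ queries in the worst case; the sketch shows where the $2^n$ work can reappear once the oracle itself must be implemented.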

Why one cares: of course, one can always say that we look only at the non-oracle part, or count only the number of queries, etc. But any talk of quantum speedup has a chance of empirical relevance only if the overhead of implementing all the quantum parts of the construction is taken into account (if it is trivial/constant, no problem). In other words, I find it meaningful to compare quantum algorithms only after they are "fully compiled" onto (at least hypothetically) realizable classically-driven hardware (e.g., a universal set of gates plus as many qubits as one needs, but we count all the time and memory required).
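The stochastic example mentioned above (Grover search with a faulty oracle) can be illustrated with a toy Monte-Carlo simulation. The failure model below, in which each oracle call silently acts as the identity with a constant probability, is my simplification for illustration, not the model from the literature; the function name and parameters are likewise mine.

```python
import math
import random

def grover_success_prob(n, marked, fail_p, trials=200, seed=1):
    """Monte-Carlo sketch of Grover search with a faulty oracle.

    Toy failure model (an assumption for illustration): each oracle call
    independently does nothing with probability fail_p. Returns the average
    probability of measuring the marked item after the usual iteration count.
    """
    random.seed(seed)
    N = 2 ** n
    iters = round(math.pi / 4 * math.sqrt(N))   # optimal count ~ (pi/4) sqrt(N)
    total = 0.0
    for _ in range(trials):
        amp = [1 / math.sqrt(N)] * N            # uniform superposition
        for _ in range(iters):
            if random.random() >= fail_p:       # oracle works this round
                amp[marked] = -amp[marked]      # phase-flip the marked item
            mean = sum(amp) / N                 # diffusion: reflect about mean
            amp = [2 * mean - a for a in amp]
        total += amp[marked] ** 2               # prob. of measuring the target
    return total / trials

print(grover_success_prob(8, 3, 0.0, trials=1))  # fail_p = 0: close to 1
print(grover_success_prob(8, 3, 0.25))           # noticeably degraded
```

Even this crude model shows the qualitative point: a constant per-query failure probability eats into the quadratic speedup, so the oracle's reliability (and cost) cannot be left outside the accounting.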

My question is probably fairly typical for a non-expert; perhaps I just got confused by some of the less sophisticated introductions to quantum computation. Please deconfuse me.


As pointed out by Logan Mayfield, Scott Aaronson's blog post on the topic completely resolves my question. Oracles are a fundamental tool for the rigorous investigation of computational complexity classes, bringing out the latent strengths of one class over another.

The root cause of my concerns was that little (as far as I understand, nothing) can be proven unconditionally about quantum speedups in the unrelativized (compiled to physical hardware) world. Of course, this obstacle does not diminish the power of the oracle framework as a major tool enabling progress in quantum computation theory, including the celebrated example of Shor's algorithm.

Personally, it is a consolation to learn from Scott that

This [relativized vs. unrelativized complexity] is an absolutely crucial distinction that trips up just about everyone when they're first learning quantum computing.