Cryptography

Is double encryption really a bad idea? Are meet-in-the-middle attacks practical at all?

Meet-in-the-middle attacks are used to justify that the cost of attacking ECC is $$O(\sqrt{n})$$ (for a group of order $$n$$) instead of $$O(n)$$, and that the cost of attacking double encryption with two $$n$$-bit keys is $$O(2^{n+1})$$ instead of $$O(2^{2n})$$.

From my understanding (and please correct me if I'm wrong), meet-in-the-middle attacks rely on storing intermediate results in a map. For example, if we double-encrypt with DES, we precompute the encryptions of a known plaintext under all $$2^{56}$$ possible keys, put them in a sorted map, then decrypt the ciphertext under all possible keys and check which intermediate ciphertexts match entries in the map. Since map lookups are $$O(\log{n})$$, the search itself is cheap.
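At toy scale, the tabulated attack just described can be sketched with a hypothetical 8-bit block cipher (the S-box construction below is illustrative only, not a real cipher, and all keys and values are made up for the demo):

```python
import random

random.seed(1)
# Toy 8-bit block cipher with an 8-bit key: a fixed public S-box
# whitened with the key on both sides. Illustrative only, not secure.
SBOX = list(range(256))
random.shuffle(SBOX)
INV = [0] * 256
for i, v in enumerate(SBOX):
    INV[v] = i

def enc(k, m):
    return SBOX[m ^ k] ^ k

def dec(k, c):
    return INV[c ^ k] ^ k

# Double encryption with two independent 8-bit keys.
k1, k2 = 42, 137
m = 99
c = enc(k2, enc(k1, m))

# Naive meet-in-the-middle: about 2 * 2^8 cipher operations instead of
# the 2^16 of brute force, at the cost of a 2^8-entry table.
forward = {}
for ka in range(256):
    forward.setdefault(enc(ka, m), []).append(ka)

candidates = []
for kb in range(256):
    mid = dec(kb, c)            # peel off the outer encryption
    for ka in forward.get(mid, []):
        candidates.append((ka, kb))

# The true key pair is among the candidates; a second known
# plaintext/ciphertext pair would typically narrow it to one.
assert (k1, k2) in candidates
```

With one known plaintext the attack usually leaves several candidate key pairs; testing them against a second plaintext/ciphertext pair filters out the false positives, exactly as in the textbook analysis of double DES.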

I get that this is possible for DES, albeit super-hard: the table alone takes on the order of half an exabyte ($$2^{56}$$ 8-byte blocks, plus the keys). Well, let's say the NSA has that.

But why is the meet-in-the-middle attack taken seriously at all against ECC and AES, whose keys are at least 160 or 128 bits? If we used all of Earth's atoms to build SSDs to store the map discussed above, it probably still wouldn't be enough.

My question is basically: given all these practical limitations, is double encryption a stupid idea for encryption algorithms other than DES that have relatively long keys? And for ECC, is it true that it can be broken with $$O(\sqrt{n})$$ complexity?

Solution

meet-in-the-middle attacks rely on storing the results in a map

Only the naive version of MitM does.

The impractical memory requirement (size and accesses) of naive MitM can be greatly lowered with a relatively modest increase in other computation costs, and the calculation distributed to independent devices. The reference paper is Paul C. van Oorschot and Michael J. Wiener's Parallel Collision Search with Cryptanalytic Applications (in Journal of Cryptology, 1999). It covers MitM in section 5.3.
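To give a feel for the memory reduction, here is a hypothetical toy along the lines of that paper's technique: collision search on a small random function, where only "distinguished points" (low byte zero) are ever stored, so memory shrinks by roughly the distinguishing factor. The 24-bit function and all parameters below are illustrative choices, not taken from the paper:

```python
import hashlib

MASK = (1 << 8) - 1   # low 8 bits zero => "distinguished point"

def f(x):
    """Toy 24-bit random function; finding a collision in f models the
    'meeting' step of MitM recast as collision search."""
    return int.from_bytes(hashlib.sha256(x.to_bytes(3, "big")).digest()[:3], "big")

def walk(start, cap=1 << 13):
    """Iterate f from start until a distinguished point; return (end, steps)."""
    x, n = start, 0
    while x & MASK:
        if n >= cap:           # abandon rare trails stuck in a small cycle
            return None, n
        x, n = f(x), n + 1
    return x, n

# Only distinguished endpoints are stored: roughly a factor-2^8 memory
# saving versus remembering every point on every trail.
seen = {}                      # endpoint -> (trail start, trail length)
collision = None
start = 0
while collision is None:
    end, length = walk(start)
    if end is not None and end in seen:
        s1, l1 = seen[end]     # two trails reached the same endpoint
        x, y = s1, start
        for _ in range(max(l1 - length, 0)):   # align the longer trail
            x = f(x)
        for _ in range(max(length - l1, 0)):
            y = f(y)
        if x != y:             # else one trail is a suffix of the other
            while f(x) != f(y):
                x, y = f(x), f(y)
            collision = (x, y)
    elif end is not None:
        seen[end] = (start, length)
    start += 1

a, b = collision
assert a != b and f(a) == f(b)
```

Because trails are started independently, many machines can run them in parallel and report only their distinguished endpoints to a central table, which is the distribution property the answer alludes to.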

That's also discussed in this question.

Is double-encryption a stupid idea

Largely so, because of MitM with such improvements. In the case of AES, using AES-256 is a much better idea than double AES-128: it foils MitM, and is faster.

Is it true that ECC can be broken with $$O(\sqrt{n})$$ complexity?

Yes, that's the cost of solving the Discrete Logarithm in a generic cyclic group of order $$n$$, including one defined using point addition on an Elliptic Curve. The reference method is Pollard's rho. That attack is seldom identified with MitM (as the question does), but there are similarities. Not coincidentally, parallel Pollard's rho is described in section 5.1 of the above paper.
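To make the $$O(\sqrt{n})$$ cost concrete, here is baby-step giant-step at toy scale; it is the memory-heavy, MitM-flavored cousin of Pollard's rho, spending $$\sqrt{n}$$ memory for a deterministic search of about $$\sqrt{n}$$ group operations (the prime, generator, and secret exponent below are all illustrative):

```python
from math import isqrt

# Baby-step giant-step: solves g^x = h in a cyclic group of order n
# with ~sqrt(n) group operations and ~sqrt(n) memory. Toy parameters.
p = 1019          # safe prime: p = 2*509 + 1
n = 509           # prime order of the subgroup of squares mod p
g = 4             # generator of that subgroup
x_secret = 123    # the discrete log we pretend not to know
h = pow(g, x_secret, p)

m = isqrt(n) + 1  # ceil(sqrt(n))

# Baby steps: table of g^j for j in [0, m).
baby = {pow(g, j, p): j for j in range(m)}

# Giant steps: look for h * (g^-m)^i in the table; a hit at (i, j)
# means h = g^(i*m + j), the "meeting in the middle" of the two lists.
giant = pow(g, -m, p)          # modular inverse of g^m
x = None
y = h
for i in range(m):
    if y in baby:
        x = (i * m + baby[y]) % n
        break
    y = (y * giant) % p

assert x == x_secret and pow(g, x, p) == h
```

Pollard's rho reaches the same $$O(\sqrt{n})$$ running time while storing only a constant number of group elements, which is why it, rather than baby-step giant-step, is the reference method for elliptic-curve groups.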