I can't say I entirely followed this paper, but I'm pretty sure it
neglects to take into account the fact that you can move to more
aggressive error correction as the computer scales up. E.g. rather than
just having each logical qubit encoded as 7 physical qubits, you could
have each of those 7 qubits encoded as 7 more, concatenating the code
to whatever depth the machine's size demands.

In case you missed this, since it appeared in the quantum physics
section: it is relevant to quantum cryptography (or rather, to
cryptography on quantum computers).
The argument here is that Shor's factoring algorithm depends on
the Quantum Fourier Transform, which is sensitive to errors in
the relative phases of the qubits.

Charles McElwain writes:
Follow-ups on this line of research will be interesting for the
evaluation of any impact of quantum computers on cryptography, and
even more generally, since the decoherence behavior would tend to make
quantum computers approximate merely improving classical computers.