A lot of the "hand-waving" is done for the same reason that we describe the 
atom using the planetary model to high school students, the cloud model to 
undergraduate college students, and the probabilistic model to graduate 
physics students: there is a lot of "stuff" behind quantum calculations that 
needs time to sink in.

My responses are all vastly simplified:

1. It's not phrased well in the show. Each qubit is both 0 and 1 
simultaneously, with the probability of each value determined by the wave 
function with which it was prepared.

2. Same problem as #1. Basically, a problem that is solvable by a quantum 
computer (kind of) assigns a probability to each qubit being zero or one. 
When the qubits are read back, the quantum wave function (which set those 
probabilities) collapses, so each is read as either zero or one.
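
A toy sketch of #1 and #2 (illustration only; real qubits involve complex 
amplitudes and interference, not just classical randomness): the state is a 
pair of amplitudes, and reading the qubit collapses it to 0 or 1 with the 
corresponding probabilities.

```python
import random
random.seed(0)  # reproducible runs

# Toy single-qubit model: the state is two amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1.  Measurement collapses it to 0 or 1
# with those probabilities.
def measure(alpha, beta):
    p0 = abs(alpha) ** 2          # probability of reading 0
    return 0 if random.random() < p0 else 1

# A qubit prepared to be 0 and 1 with equal weight:
alpha = beta = 2 ** -0.5
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # close to a 50/50 split
```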

3. Qubits are not at all like transistors. Each additional classical bit 
doubles the range of representable values but doesn't double the available 
calculations; each additional qubit does double the number of available 
calculations.
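
One way to see the doubling in #3: describing n classical bits takes n 
values, but a general n-qubit state takes 2^n amplitudes, and adding one 
more qubit doubles that list. A minimal sketch (again a toy model, tracking 
only the size of the state, not real quantum dynamics):

```python
def add_qubit(state):
    # Tensor a fresh |0> qubit onto an existing state vector:
    # each amplitude a becomes the pair (a, 0), doubling the length.
    return [amp for a in state for amp in (a, 0.0)]

state = [1.0]            # zero qubits: a single amplitude
for n in range(1, 6):
    state = add_qubit(state)
    print(n, "qubits ->", len(state), "amplitudes")
```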

4. Yes, very much so. Error correction is going to be critical. That's why 
getting more qubits into a single calculation is not just incrementally 
harder but polynomially harder (at least for now): errors compound as more 
qubits are introduced. (This is the part whose math I understand the least.)
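
For intuition on #4, the classical three-copy repetition code shows the 
basic trade: redundancy buys a much lower logical error rate. (This is only 
an analogy; actual quantum codes such as surface codes are far more 
involved, since qubits can't simply be copied or read without collapsing 
them.)

```python
import random
random.seed(1)  # reproducible runs

# Encode each logical bit as three copies; decode by majority vote.
# If each copy flips with probability p, the decoded bit is wrong only
# when two or more copies flip (~3p^2 for small p).
def noisy(bit, p):
    return bit ^ (random.random() < p)   # flip with probability p

def send(bit, p):
    copies = [noisy(bit, p) for _ in range(3)]
    return 1 if sum(copies) >= 2 else 0  # majority vote

p = 0.01
trials = 100_000
raw_errors = sum(noisy(0, p) for _ in range(trials))
coded_errors = sum(send(0, p) for _ in range(trials))
print(raw_errors, coded_errors)  # coded error count is far lower
```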

5. I think that was a "hopeful" statement. It's not entirely wrong but not 
entirely correct either. Quantum computers are great at solving problems 
involving a large number of probabilities (like factoring large composite 
numbers into their prime number components, as can be used to attack RSA). So, 
the shift in thinking will be not so much the programming aspect itself, but 
the fact that a programmer will have to describe the problem in a probabilistic 
way instead of a step-by-step way.
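
On the RSA point in #5: the private key falls out immediately once the 
public modulus is factored, which is why fast factoring is the classic 
threat. A textbook-sized sketch with toy numbers (nothing like real key 
sizes, and using Python 3.8+'s modular-inverse form of pow):

```python
# Why factoring attacks RSA: with the primes p and q in hand, the
# private exponent d is trivial to compute from the public exponent e.
p, q = 61, 53
n = p * q                 # public modulus
e = 17                    # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)       # private exponent: easy once p, q are known
msg = 42
cipher = pow(msg, e, n)   # encrypt with the public key
assert pow(cipher, d, n) == msg   # decrypt with the recovered key
print(d)
```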

Eric Rossman

-----Original Message-----
From: IBM Mainframe Discussion List <[email protected]> On Behalf Of Bob 
Bridges
Sent: Tuesday, December 5, 2023 5:41 PM
To: [email protected]
Subject: [EXTERNAL] Re: CBS's "60 Minutes": Quantum Computing

Ok, I watched it.  I learned some things, but I still don't get it:

1) Scott Pelley describes the possible states of a bit (0 or 1), and then says 
"Quantum encodes information on electrons ... [which] behave in a way so that 
they are heads AND tails and everything in between.  You've gone from handling 
one bit of information to exponentially more data".  Omitting the unfortunate 
misuse of "exponentially", if an electron can be in all states at once, how can 
a programmer, or the program, determine what data is recorded on it?  I don't 
see how that can be true; they must be using impressive language to gloss over 
the details.

2) Michio Kaku likens the difference to calculating a path through a maze.
A "classical" computer (his word) must check all possible turnings one at a 
time.  But a quantum computer (he claims) "scans all possible routes 
simultaneously".  I can't picture that, and therefore I'm doubtful; again, I 
suspect him of blathering about something that he really does understand but 
cannot describe accurately for a 60-Minutes audience.

3) We're shown a diagram of five Q-bits, and the voiceover says "Unlike 
transistors, each additional Q-bit doubles the computer's power".  That is 
~not~ "unlike transistors"; it's exactly what traditional bits do.  "It's 
exponential", continues the voiceover, which, again, is exactly what classical 
bits are.  "So, while 20 transistors are 20 times more powerful than one, 
twenty Q-bits are a ~million~ times more powerful...".  Somebody should have 
vetted this sequence.

4) Karina Chou (sp?) of Google says their quantum computer is making an error 
about every 100 steps; they're aiming for one every million or so.
Even at that target rate they surely need a lot of self-checking and 
self-correcting, no?

5) Dario Gil, when the interviewer asked whether programmers have to learn a 
new way of programming, responds "I think that's what's really nice, that you 
actually just use a regular laptop, and you write a program - very much like 
you would write a traditional program - but when you click on 'Go', it just 
happens to run on a very different kind of computer".  I cannot reconcile this 
with the above nor with other statements being made about quantum computing.

It's occurred to me that the whole quantum-computing mania might be no more 
than a huge hoax.  I don't believe it, no.  But so far I'm utterly clueless how 
to understand the claims about it.

Regardless, thanks, Mr Sipples.  Very interesting.

---
Bob Bridges, [email protected], cell 336 382-7313

/* Silence promotes the presence of God, prevents many harsh and proud words, 
and suppresses many dangers in the way of ridiculing or harshly judging our 
neighbors....If you are faithful in keeping silence when it is not necessary to 
speak, God will preserve you from evil when it is right for you to talk.  
-Francois Fenelon (1651-1715) */

-----Original Message-----
From: IBM Mainframe Discussion List <[email protected]> On Behalf Of 
Timothy Sipples
Sent: Monday, December 4, 2023 23:22

If you'd like to understand why IBM is so bullish on quantum computing - and so 
focused on quantum-safe cryptography - this "60 Minutes" story is well worth 
watching:

https://www.youtube.com/watch?v=K4ssT6Dzmnw 

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions, send email to 
[email protected] with the message: INFO IBM-MAIN
