OK, I watched it. I learned some things, but I still don't get it:

1) Scott Pelley describes the possible states of a bit (0 or 1), then says "Quantum encodes information on electrons ... [which] behave in a way so that they are heads AND tails and everything in between. You've gone from handling one bit of information to exponentially more data". Leaving aside the unfortunate misuse of "exponentially": if an electron can be in all states at once, how can a programmer, or the program, determine what data is recorded on it? I don't see how that can be true; they must be using impressive language to gloss over the details.
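(For what it's worth, my own understanding of point 1 — this toy sketch is my illustration, not anything from the segment — is that a program never reads "all states at once"; measuring forces a probabilistic collapse to a single 0 or 1, with probabilities set by the amplitudes:)

```python
import random

# Toy model of ONE qubit: its state is a pair of amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2 -- and destroys the
# superposition. The program only ever sees a single classical bit.
def measure(alpha, beta):
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition ("heads AND tails"):
alpha = beta = 2 ** -0.5
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly 5000 zeros and 5000 ones
```

So the art of quantum programming (as I gather) is arranging the computation so that, by the time you measure, the amplitudes have been steered toward the answer you want.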
2) Michio Kaku likens the difference to calculating a path through a maze. A "classical" computer (his word) must check all possible turnings one at a time, but a quantum computer (he claims) "scans all possible routes simultaneously". I can't picture that, and therefore I'm doubtful; again, I suspect him of blathering about something that he really does understand but cannot describe accurately for a 60 Minutes audience.

3) We're shown a diagram of five qubits, and the voiceover says "Unlike transistors, each additional Q-bit doubles the computer's power". That is ~not~ "unlike transistors"; it's exactly what traditional bits do. "It's exponential", continues the voiceover, which, again, is exactly what classical bits are. "So, while 20 transistors are 20 times more powerful than one, twenty Q-bits are a ~million~ times more powerful...". Somebody should have vetted this sequence.

4) Karina Chou (sp?) of Google says their quantum computer makes an error about every 100 steps; they're aiming for one every million or so. Even at that target rate they surely need a lot of self-checking and self-correcting, no?

5) Dario Gil, when the interviewer asks whether programmers will have to learn a new way of programming, responds "I think that's what's really nice, that you actually just use a regular laptop, and you write a program - very much like you would write a traditional program - but when you click on 'Go', it just happens to run on a very different kind of computer". I cannot reconcile this with the above, nor with other statements being made about quantum computing.

It has occurred to me that the whole quantum-computing mania might be no more than a huge hoax. I don't believe that, no. But so far I'm utterly clueless how to understand the claims about it. Regardless, thanks, Mr Sipples. Very interesting.
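(One footnote on point 3, my own arithmetic rather than the segment's: the "million" in the voiceover is just 2 to the 20th. Twenty bits — classical or quantum — span about a million possible ~values~; the segment's garble is over what "holding" those values means:)

```python
# n two-state units span 2**n basis states. Twenty of them span
# 2**20 = 1,048,576 -- "a million". Twenty classical bits can take any
# one of those million values, but only one at a time; the quantum
# claim is about amplitudes over all of them, which the voiceover
# flattened into "20 transistors are 20 times more powerful".
states = 2 ** 20
print(states)  # 1048576
```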
---
Bob Bridges, [email protected], cell 336 382-7313

/* Silence promotes the presence of God, prevents many harsh and proud words, and suppresses many dangers in the way of ridiculing or harshly judging our neighbors....If you are faithful in keeping silence when it is not necessary to speak, God will preserve you from evil when it is right for you to talk. -Francois Fenelon (1651-1715) */

-----Original Message-----
From: IBM Mainframe Discussion List <[email protected]> On Behalf Of Timothy Sipples
Sent: Monday, December 4, 2023 23:22

If you'd like to understand why IBM is so bullish on quantum computing - and so focused on quantum-safe cryptography - this "60 Minutes" story is well worth watching:

https://www.youtube.com/watch?v=K4ssT6Dzmnw

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN
