On Tue, Aug 21, 2018 at 7:43 PM Brent Meeker <[email protected]> wrote:

>
>
> On 8/21/2018 3:37 PM, Jason Resch wrote:
>
>
>
> On Tue, Aug 21, 2018 at 5:00 PM Brent Meeker <[email protected]> wrote:
>
>>
>>
>> On 8/21/2018 2:40 PM, [email protected] wrote:
>>
>>
>>> If I start a 200-qubit quantum computer at time = 0, and 100
>>> microseconds later it has produced a result that required going through
>>> 2^200 ≈ 1.6 x 10^60 states (more states than is possible for 200 things
>>> to go through in 100 microseconds even if they changed their state every
>>> Planck time, 5.39121 x 10^-44 seconds), then physically speaking it
>>> *must* have been simultaneous.  I don't see any other way to explain
>>> this result.  How can 200 things explore 10^60 states in 10^-4 seconds,
>>> when a Planck time is 5.39 x 10^-44 seconds?
>>>
>>
>> It's no more impressive numerically than an electron wave function
>> picking out one of 10^30 silver halide molecules on a photographic plate to
>> interact with (which is also non-local, aka simultaneous).
>>
>>
> Well consider the 1000 qubit quantum computer. This is a 1 followed by 301
> zeros.
>
>
> What is "this"?  It's the number of possible phase relations between the
> 1000 qubits.  If we send 1000 electrons toward our photographic plate
> through 1000 holes, the Schrodinger wave function approaching the
> photographic plate then also has 1e301 different phase relations.  The
> difference is only that we don't control them so as to cancel out "wrong
> answers".
>
>

The reason I think the quantum computer example is important to consider is
that when we control the phase relations to produce a useful result, it
becomes that much harder to deny the reality and significance of the
intermediate states. For instance, we can verify the result of a Shor
calculation that factors a large composite number simply by multiplying the
factors back together.  We can't so easily verify that the statistics of
the 1e301 phase relations are what they should be.
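To make that asymmetry concrete, here is a toy sketch in Python (the
function name `verify_factors` and the N = 15 example are illustrative,
not from any library): checking a factoring result is a single
multiplication, even though finding the factors classically is believed to
be infeasible at scale.

```python
# Verifying a factoring result is cheap: just multiply the candidate
# factors back together and compare.  Finding them is the hard part.
def verify_factors(n, p, q):
    """Return True if p and q are a nontrivial factorization of n."""
    return 1 < p < n and 1 < q < n and p * q == n

# Shor's algorithm run on N = 15 (the classic toy case) should yield 3 and 5.
print(verify_factors(15, 3, 5))    # True
print(verify_factors(15, 1, 15))   # False: trivial factorization rejected
```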


> This is not only over a googol^2 times the number of silver halide
> molecules in your plate, but more than a googol times the 10^80 atoms in
> the observable universe.
>
> What is it, in your mind, that is able to track and consistently compute
> over these 10^301 states, in this system composed of only 1000 atoms?
>
>
Are you aware of anything other than the many-worlds view that can account
for this?



>
>
>> Also note that you can only read off 200 bits of information (cf.
>> Holevo's theorem).
>>
>>
> True, but that is irrelevant to the number of intermediate states
> necessary for the computation that is performed to arrive at the final and
> correct answer.
>
>
> But you have to put in 2^200 complex numbers to initialize your qubits.
> So you're putting in a lot more information than you're getting out.
>

You just initialize each of the 200 qubits into an equal superposition,
e.g. by applying a Hadamard gate to each.  That takes 200 operations, not
2^200.
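The cost of that initialization is linear, not exponential, in the number
of qubits: one Hadamard per qubit yields the uniform superposition over
all 2^n basis states.  A small statevector sketch in plain NumPy
(illustrative only; 10 qubits keeps the array small):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def uniform_superposition(n):
    """Apply H to each of n qubits, starting from |00...0>."""
    state = np.array([1.0])  # amplitude of the empty register
    for _ in range(n):
        # Tensor on one more qubit prepared as H|0> = (|0> + |1>)/sqrt(2)
        state = np.kron(state, H @ np.array([1.0, 0.0]))
    return state

s = uniform_superposition(10)
# 2^10 = 1024 amplitudes, each equal to 1/sqrt(1024): n gates, 2^n states.
print(s.shape)
```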


> Those "intermediate states" are just interference patterns in the
> computer, not some inter-dimensional information flow.
>

What is interference but information flow between different parts of the
wave function?  Other "branches" of the superposition make their presence
known to us by causing different outcomes to manifest in our own branch.
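A minimal numeric illustration of that claim: amplitudes, not
probabilities, add, so a path we never observe directly still changes the
statistics we do observe.  (The amplitude values are illustrative.)

```python
# Two paths to the same detector, each with amplitude 0.5.
a1, a2 = 0.5, 0.5

p_constructive = abs(a1 + a2) ** 2               # in phase: paths reinforce
p_destructive = abs(a1 - a2) ** 2                # out of phase: paths cancel
p_no_interference = abs(a1) ** 2 + abs(a2) ** 2  # classical sum, no cross term

print(p_constructive, p_destructive, p_no_interference)  # 1.0 0.0 0.5
```

The 2*a1*a2 cross term is the "other branch" making itself felt in our
outcome frequencies.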


> Also, many quantum algorithms only give you an answer that is probably
> correct.  So you have to run it multiple times to have confidence in the
> result.
>

I would say it depends on the algorithm, the precision of the measurement,
and the construction of the computer.  If your algorithm computes the
square of a randomly initialized set of qubits, then (assuming perfect
construction of the quantum computer) the only answer you should get after
measurement will be a perfect square.
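That example can be sketched classically, assuming the square is computed
reversibly into an output register so that measuring the output just
samples x^2 for some basis-state input x (a toy model, not a full quantum
simulation):

```python
import math
import random

def measure_square(n_qubits=4):
    """Simulate measuring the output register after a reversible x -> x^2
    applied to a uniform superposition of n_qubits-bit inputs."""
    x = random.randrange(2 ** n_qubits)  # measurement picks a random branch
    return x * x

# A perfectly constructed machine yields nothing but perfect squares.
for _ in range(1000):
    y = measure_square()
    r = math.isqrt(y)
    assert r * r == y  # every measured outcome is a perfect square
```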


>
> Quantum computers will certainly impact cryptography, where there's heavy
> reliance on the hardness of factoring products of large primes and of
> discrete logarithms.  They should be able to solve protein folding and
> similar problems that are out of reach of classical computers.  But
> they're not a magic bullet.  Most problems will still be solved faster by
> conventional von Neumann computers or by specialized neural nets.  One
> reason is that even though a quantum algorithm is faster in the limit of
> large problem size, it may still be slower for the problem size of
> interest.  It's the same problem that shows up in classical algorithms;
> for example, the Coppersmith-Winograd algorithm for matrix multiplication
> takes O(n^2.376) compared to Strassen's O(n^2.807), but it is never used
> because it is only faster for matrices too large to be processed in
> existing computers.
>
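The crossover point described above comes from constant factors that big-O
notation hides.  A toy cost model in Python (the constants are invented
purely for illustration; they are not measurements of any real
matrix-multiply implementation):

```python
# Hypothetical cost model: the algorithm with the smaller exponent carries
# a huge constant factor, so it only wins for very large n.
def cost_fast_asymptotics(n, c=1e6):
    return c * n ** 2.376      # Coppersmith-Winograd-style exponent

def cost_slow_asymptotics(n, c=1.0):
    return c * n ** 2.807      # Strassen-style exponent

# Double n until the asymptotically faster algorithm actually wins.
n = 2
while cost_fast_asymptotics(n) > cost_slow_asymptotics(n):
    n *= 2
print("crossover near n =", n)  # far beyond practical matrix sizes
```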

So where do you stand concerning the reality of the immense number of
intermediate states the qubits pass through before being measured?

Jason

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.