On Saturday, January 12, 2019 at 7:09:29 PM UTC-6, Lawrence Crowell wrote:
>
> On Saturday, January 12, 2019 at 4:17:56 PM UTC-6, Brent wrote:
>>
>> On 1/12/2019 2:51 AM, Philip Thrift wrote:
>>
>> On Friday, January 11, 2019 at 7:19:06 PM UTC-6, Brent wrote: 
>>>
>>> On 1/11/2019 1:57 PM, Philip Thrift wrote:
>>>
>>> On Friday, January 11, 2019 at 2:46:35 PM UTC-6, Brent wrote: 
>>>>
>>>> On 1/11/2019 6:01 AM, John Clark wrote:
>>>>
>>>> On Thu, Jan 10, 2019 at 8:18 PM Brent Meeker <meek...@verizon.net> 
>>>> wrote:
>>>>
>>>> * > The fine structure constant is e^2/(hbar c).  Those three values 
>>>>> are measured independently of any Feynman diagrams*
>>>>>
>>>>
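A quick numerical check of that formula, as a sketch in Python: in SI units 
the same relation reads alpha = e^2/(4*pi*eps0*hbar*c), the Gaussian-units 
form above absorbing the factor of 4*pi*eps0. The constants below are 
CODATA/SI values.

    # Sketch: fine structure constant from independently measured constants.
    import math

    e    = 1.602176634e-19    # elementary charge, C (exact in the 2019 SI)
    hbar = 1.054571817e-34    # reduced Planck constant, J*s
    c    = 299792458.0        # speed of light, m/s (exact)
    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m (measured)

    alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
    print(alpha, 1 / alpha)   # ~0.0072973525..., i.e. ~1/137.036

No Feynman diagram appears anywhere in that arithmetic, which is Brent's 
point.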
>>>> Absolutely correct. So if you use Feynman diagrams to predict what some 
>>>> physical system is going to do, such as a physical system of 2 electrons 
>>>> being hit by a photon with a wavelength short enough to carry enough 
>>>> energy to overcome the electrons' repulsion, then you'd better get a 
>>>> number very close to the Fine Structure Constant. If you don't, then 
>>>> Feynman Diagrams aren't any good. 
>>>>
>>>> They didn't use 12,672 Feynman Diagrams because they wanted to know 
>>>> what the Fine Structure Constant was; they already knew that 
>>>> number to many decimal places from experiment. They used 12,672 
>>>> Feynman Diagrams because they wanted to see if Feynman Diagrams 
>>>> worked. And it turned out they worked spectacularly well in that 
>>>> situation, and that gives scientists great confidence they can use 
>>>> Feynman Diagrams in other situations to calculate what other physical 
>>>> systems that involve the Electromagnetic Force will do.
>>>>
>>>>
>>>> There's always an interplay between theory and experiment.  It's 
>>>> completely analogous to Maxwell's discovery that light is EM waves. There 
>>>> were already experimental values of the permittivity and permeability of 
>>>> the vacuum, and there were values for the speed of light.  Maxwell showed 
>>>> that his theory of EM predicted waves, and using the permittivity and 
>>>> permeability values, the speed of those waves matched that of light.  Now 
>>>> the speed of light is a defined constant, and so are the permittivity and 
>>>> permeability of the vacuum.  So connecting the three values by a 
>>>> theory allows their values to be defined.  In the case of the anomalous 
>>>> magnetic moment of the electron, hbar and c are already defined constants. 
>>>> So quantum field theory (for which Feynman diagrams are just a 
>>>> calculational tool) linked them and e to g.
>>>>
>>>> Brent
>>>>
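Maxwell's identification can be replayed numerically; a minimal sketch, 
assuming the CODATA values for the two vacuum constants:

    # Sketch: EM waves travel at 1/sqrt(mu0*eps0), which numerically
    # matches the measured speed of light -- Maxwell's identification
    # of light with electromagnetic waves.
    import math

    mu0  = 1.25663706212e-6   # vacuum permeability, H/m (~4*pi*1e-7)
    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m

    print(1 / math.sqrt(mu0 * eps0))   # ~299792458 m/s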
>>>>
>>>
>>>
>>> If Feynman Diagrams (the tools) are sufficient to match experimental 
>>> data, then Quantum Field Theory can be thrown in the wastebasket.
>>>
>>>
>>> ?? Feynman Diagrams are just a mathematical trick for summing up terms 
>>> to approximate the propagator of QFT.  
>>>
>>> Brent
>>>
>>
>>
>> You just make Feynman Diagrams the fundamental elements of the theory, 
>> and propagators derived from them.
>>
>>
>> How many diagrams?  The propagator has a clear interpretation as 
>> connecting the field at x with the field at y.  Feynman showed that his 
>> diagrams provided a good mnemonic for the infinite number of terms that 
>> would sum to the propagator.  If you take the diagrams as fundamental, you 
>> then need to specify how many.
>>
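To make "summing up terms" concrete: the anomalous moment a_e = (g-2)/2 is 
computed as a power series in alpha/pi, where the coefficient at each order 
is the sum of all Feynman diagrams of that order. A sketch of the first 
partial sums, using the standard low-order QED coefficients (quoted here 
from memory, so treat them as illustrative):

    import math

    alpha = 1 / 137.035999            # measured fine structure constant
    x = alpha / math.pi

    # Coefficients of a_e in powers of (alpha/pi).  C1 = 1/2 is Schwinger's
    # single-diagram result; C2 sums 7 diagrams; C3 sums 72 diagrams.
    C = [0.5, -0.328478965, 1.181241456]

    a_e = 0.0
    for n, cn in enumerate(C, start=1):
        a_e += cn * x**n
        print(f"through order {n}: a_e = {a_e:.12f}")
    # Already at third order this sits close to the measured
    # a_e ~ 0.001159652181.

Each extra order multiplies the number of diagrams, but suppresses the next 
correction by a factor of alpha/pi ~ 1/430 (times a growing coefficient), 
which is why a finite, truncated set of diagrams suffices in practice.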
>>
>> Just like histories are made fundamental, and Hilbert Spaces are derived 
>> from them.
>>
>>
>> The Hilbert spaces of QFT are infinite-dimensional vector spaces.  So you 
>> have the same problem: How many histories?
>>
>> Brent
>>
>  
> The number of diagrams grows exponentially. As I recall, the QED industry 
> is up to 12 orders of radiative corrections and renormalization. The 
> number of diagrams to evaluate and sum is in the millions, if not billions. 
> This stuff is done on supercomputers these days; people do not really 
> evaluate Feynman diagrams by hand, they write computer programs.
>
> LC
>
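The growth Lawrence describes is visible in the standard diagram counts for 
the a_e calculation (the 12,672 figure from earlier in the thread is the 
tenth-order count):

    # Sketch: published counts of Feynman diagrams contributing to a_e
    # at orders (alpha/pi)^1 through (alpha/pi)^5.
    counts = [1, 7, 72, 891, 12672]

    for n in range(1, len(counts)):
        print(f"order {n + 1}: {counts[n]:>6} diagrams, "
              f"x{counts[n] / counts[n - 1]:.1f} over the previous order")
    # The growth factor itself grows (7, ~10, ~12, ~14), i.e. the count
    # climbs factorially rather than at a fixed exponential rate.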


It seems supercomputers are the future of theoretical physics. Take the one 
at LSU, SuperMike-II:

http://www.hpc.lsu.edu/docs/guides.php?system=SuperMike2

*SuperMike-II is a 146 TFlops peak performance, 440-compute-node cluster 
running the Red Hat Enterprise Linux 6 operating system. Each node contains 
two 8-core Sandy Bridge Xeon 64-bit processors operating at a core 
frequency of 2.6 GHz. Fifty of the compute nodes also have two NVIDIA M2090 
GPUs that provide an additional 66 TFlops total peak performance.*

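The quoted 146 TFlops is straightforward to reconstruct; a sketch, assuming 
Sandy Bridge's 8 double-precision FLOPs per core per cycle with AVX:

    # Sketch: peak performance of SuperMike-II's CPU partition.
    nodes           = 440
    cores_per_node  = 2 * 8    # two 8-core Xeons per node
    ghz             = 2.6      # core clock frequency, GHz
    flops_per_cycle = 8        # DP FLOPs/cycle with AVX (assumption)

    print(nodes * cores_per_node * ghz * flops_per_cycle / 1000)  # ~146 TFlops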
An example of its use in LQG (loop quantum gravity):
https://www.lsu.edu/mediacenter/news/2018/12/20physastro_singh_prl.php 


- pt
