Thanks Dan,
 
 
Dan, is the output below normal when using --inline, or is this an indication that the installation is not right and that I need to make a change somewhere?
 
 
Buddy
 
**** some further information of interest, but not necessary  *********
I've tried a few things, FYI.
 
Baseline FiPy 1.0 is 3:35 minutes.
Installing and running FiPy 1.1 gets me down to 3:15.
Changing a time-stepping value in my input brings it down to 2:45.
Running with --inline goes down to 2:25, but....
 
when I run inline I get all the output shown below.  I assumed that it just needed to compile the first time, so I ran it again, but got similar output.  It seemed to take about 10-15 seconds to run through all the compiling the second time, which may mean the solution run time was more like 2:10-2:15 minutes, but I can't say for sure when the compiling ended and the problem solution began.  Also, when I run with the --inline flag, only one of my 2 plots updates: sometimes the conserved-field plot, sometimes the non-conserved.
****************
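One way to tell where the one-time weave compilation ends and the solution begins is to wrap the first and later steps in wall-clock timestamps. The helper below is not FiPy code, just a generic sketch with a stand-in step() function; the same pattern applied around the real sweep loop would separate compile cost from steady-state solve cost:

```python
import time

def timed(label, fn):
    """Run fn(), print its wall-clock duration, and return (result, elapsed)."""
    start = time.time()
    result = fn()
    elapsed = time.time() - start
    print("%s took %.2f s" % (label, elapsed))
    return result, elapsed

# step() is a placeholder for one sweep of the real solver; on the first
# call weave would compile its C++ extension, so the first timing includes
# the one-time compile cost, while later calls show the steady-state cost.
def step():
    return sum(i * i for i in range(100000))

_, first = timed("first step", step)
_, later = timed("later step", step)
```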
 
Output when inlining is shown below.

D:\fipy\Doing>python fec650_12.py --inline
T 923.15
domainND 200.0
dxND 0.125
nx 1600
<return>
fec650_12.py:62: DeprecationWarning: 'where' should be used instead of 'cells'
  uCi.setValue(uC0, igCells)
Mobphi 0.0388258954912
MobPhiND 9.02643166204e-010
epsilonSq 4.24264068712e-009
epsilonSqND 4242640687.12
W 29698.4848098
WND 0.796570632813
press <Return>
position 0 =  25.0625
numerix Numeric 23.8
repairing catalog by removing key
mingw32
running build_ext
customize Mingw32CCompiler
customize Mingw32CCompiler using build_ext
building 'sc_8d9957f925cf2a1299f6f09b582221465' extension
compiling C++ sources
g++ options: '-O2 -Wall -Wstrict-prototypes'
compile options: '-ID:\fipy\Python23\lib\site-packages\weave -ID:\fipy\Python23\lib\site-packages\weave\scxx -ID:\fipy\Python23\lib\site-packages\weav
e\blitz-20001213 -ID:\fipy\Python23\include -ID:\fipy\Python23\PC -c'
extra options: '-O3'
g++ -O2 -Wall -Wstrict-prototypes -ID:\fipy\Python23\lib\site-packages\weave -ID:\fipy\Python23\lib\site-packages\weave\scxx -ID:\fipy\Python23\lib\si
te-packages\weave\blitz-20001213 -ID:\fipy\Python23\include -ID:\fipy\Python23\PC -c c:\docume~1\damme\locals~1\temp\damme\python23_compiled\sc_8d9957
f925cf2a1299f6f09b582221465.cpp -o c:\docume~1\damme\locals~1\temp\damme\python23_intermediate\compiler_b963931666ea5c01256212eb01c9e42f\Release\docum
e~1\damme\locals~1\temp\damme\python23_compiled\sc_8d9957f925cf2a1299f6f09b582221465.o -O3
g++ -shared c:\docume~1\damme\locals~1\temp\damme\python23_intermediate\compiler_b963931666ea5c01256212eb01c9e42f\Release\docume~1\damme\locals~1\temp
\damme\python23_compiled\sc_8d9957f925cf2a1299f6f09b582221465.o c:\docume~1\damme\locals~1\temp\damme\python23_intermediate\compiler_b963931666ea5c012
56212eb01c9e42f\Release\fipy\python23\lib\site-packages\weave\scxx\weave_imp.o -LD:\fipy\Python23\libs -LD:\fipy\Python23\PCBuild -lpython23 -o c:\doc
ume~1\damme\locals~1\temp\damme\python23_compiled\sc_8d9957f925cf2a1299f6f09b582221465.pyd
tND = 423.156846455 t = 9.83775469577e-006 pos = 2.85095927286e-008 vel = 0.000131520426264 dt = 2.85126813114e-007 iter = 159 CN = 0.295850304292
tND = 852.05651092 t = 1.98090211976e-005 pos = 2.95841276068e-008 vel = 9.28783200769e-005 dt = 4.03754072737e-007 iter = 188 CN = 0.297042747349
tND = 1270.48963058 t = 2.9536956412e-005 pos = 3.04024197152e-008 vel = 7.76076435672e-005 dt = 4.83199827702e-007 iter = 210 CN = 0.298000243651
tND = 1691.67086314 t = 3.93287810819e-005 pos = 3.11100506673e-008 vel = 6.81265002052e-005 dt = 5.50446594014e-007 iter = 229 CN = 0.298286114598
tND = 2114.58009335 t = 4.91607790755e-005 pos = 3.17437464944e-008 vel = 6.15350172745e-005 dt = 6.09409108194e-007 iter = 246 CN = 0.29868697445
tND = 2523.69322339 t = 5.86720386709e-005 pos = 3.23031293509e-008 vel = 5.66131672933e-005 dt = 6.62390072715e-007 iter = 261 CN = 0.297985510241
tND = 2936.06148136 t = 6.8258975052e-005 pos = 3.28254475774e-008 vel = 5.26957912965e-005 dt = 7.11631784576e-007 iter = 275 CN = 0.298144349817
tND = 3345.40088404 t = 7.77754951429e-005 pos = 3.33106317117e-008 vel = 4.95322281928e-005 dt = 7.57082840167e-007 iter = 288 CN = 0.298126477132
tND = 3745.91616107 t = 8.70868676997e-005 pos = 3.37586693442e-008 vel = 4.70011180098e-005 dt = 7.97853361534e-007 iter = 300 CN = 0.299183680261
tND = 4168.06026185 t = 9.6901077595e-005 pos = 3.42066796828e-008 vel = 4.45667764359e-005 dt = 8.41433978379e-007 iter = 312 CN = 0.298494828962
press <return>
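Incidentally, the DeprecationWarning near the top of the log is only asking for the mask to be passed via the `where` keyword (i.e. `uCi.setValue(uC0, where=igCells)` rather than the old `cells` argument). The masked assignment it performs amounts to the following sketch, with a plain list standing in for the real CellVariable:

```python
# Toy stand-in for a masked setValue: assign new_value only where mask is True.
def set_value(values, new_value, where):
    return [new_value if m else v for v, m in zip(values, where)]

field = [0.0, 0.0, 0.0, 0.0]
mask = [False, True, True, False]
field = set_value(field, 1.0, where=mask)
```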
 


From: [email protected] [mailto:[EMAIL PROTECTED]] On Behalf Of Daniel Wheeler
Sent: Friday, June 09, 2006 10:22 AM
To: Multiple recipients of list
Subject: Re: Improving Speed

Try running with the --inline flag.

     $ python mycode.py --inline

This may give some improvements in run time.

On Jun 9, 2006, at 9:28 AM, Damm, Edward F. (E. Buddy) wrote:

Dave and Chet, you were right on.  Thanks!
 
Python/FiPy is giving the more accurate double precision answer.  I'll need to change all the numeric constants in the FORTRAN to "d0" to get a real comparison, but as the 2 models stand there is a wide difference in run times on a small test case.  Running for an elapsed 'real' time of 10e-5 seconds, my transformation front moves nominally 20 nanometers.  FORTRAN runtime is 20 seconds, and FiPy is 3 minutes, 35 seconds.  My real case will need to go ~10 micrometers, so run time is going to need to be reduced.
 
Any tips out there on logical steps to improving efficiency would be appreciated.  I'll get version 1.1 and start the work understanding sweeping, etc.
 
Thanks,
Buddy
 
 


From: [email protected] [mailto:[email protected]] On Behalf Of E David Huckaby
Sent: Thursday, June 08, 2006 4:48 PM
To: Multiple recipients of list
Subject: RE: FiPy Precision

Have you tried putting "d0" after your numeric constants in FORTRAN,
 
i.e.
compare 
     easy = 5.11*5.234*4.987
vs.
     easy = 5.11d0*5.234d0*4.987d0
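The effect of the missing d0 suffixes can be reproduced from Python by rounding every intermediate to single precision. The f32 helper below (a stdlib struct round-trip, standing in for FORTRAN single-precision arithmetic) is a sketch, not part of either code:

```python
import struct

def f32(x):
    """Round a Python float to the nearest IEEE-754 single-precision value."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Single precision at every step, like FORTRAN constants without d0:
p32 = f32(f32(f32(5.11) * f32(5.234)) * f32(4.987))

# Ordinary Python floats are doubles, like 5.11d0*5.234d0*4.987d0:
p64 = 5.11 * 5.234 * 4.987

print(p32)  # differs from p64 around the 8th significant digit
print(p64)
```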
 
Dave

>>> [EMAIL PROTECTED] 6/8/2006 4:22 PM >>>
Hi,
 
I've added a very simple calculation to both the FORTRAN and Numeric calcs, and called it 'easy'.  I've defined easy as:  easy = 5.11*5.234*4.987
 
the files are attached, and the output is below.  The discrepancy always seems to show up at the 8th or 9th significant digit for simple multiplication, or for Numeric functions.
 
FORTRAN
  M**p =   5.170837517496476E-002
  easy =   133.381011962891
 
 atan term =   -1.01175767177248
 
PYTHON
  M**p =  0.0517083688947
  easy 133.38100538
  atan term =  -1.01175769857
Any diagnosis is greatly appreciated.
 
If anyone out there cares to run these 2 and verify the output on their machine, that would perhaps rule out/in the problem being my pc.
 
Buddy
 
 


From: [email protected] [mailto:[email protected]] On Behalf Of Daniel Wheeler
Sent: Thursday, June 08, 2006 12:36 PM
To: Multiple recipients of list
Subject: Re: FiPy Precision

It depends on the size of the discrepancy. To diagnose this, we need a very simple calculation that
compares FORTRAN with Numeric for some very simple array manipulation (as Dan Lewis suggested).

On Jun 8, 2006, at 10:48 AM, Damm, Edward F. (E. Buddy) wrote:

Hello,
 
I'm getting some slight differences in some of my coefficients when comparing FiPy to the Fortran benchmark code I'm using.  I have confirmed that the equations are the same.  There are some other potential areas that may be responsible for the discrepancy, but I wonder if it may be due to precision.
 
I'm using double precision in the Fortran code, and in FiPy I write any whole number with a trailing decimal point and zero (e.g. 5.0), while anything with a fractional part is entered as such.
 
Is it reasonable to think that my discrepancies are due to precision?  If so, do you have suggestions on how to reduce/eliminate the discrepancy?
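One quick sanity check on the Python side: Python floats are IEEE-754 doubles, carrying about 15-16 significant digits, so a mismatch at the 8th or 9th digit is consistent with single-precision arithmetic or constants on the FORTRAN side rather than a problem in FiPy. This can be confirmed directly:

```python
import sys

# Python floats are C doubles: ~15 decimal digits are guaranteed.
print(sys.float_info.dig)      # decimal digits a double reliably carries
print(sys.float_info.epsilon)  # spacing between adjacent doubles near 1.0
```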
 
Thanks
 
Buddy





This message and any attachments are intended for the individual or
entity named above. If you are not the intended recipient, please
do not forward, copy, print, use or disclose this communication to
others; also please notify the sender by replying to this message,
and then delete it from your system. The Timken Company / The
Timken Corporation



Daniel Wheeler




Daniel Wheeler





