On May 19, 2012, at 8:09 AM, Matej Svejda wrote:

> I am updating the function H in the following way:
>
>     newHValues = [0.0] * len(centers)
>     for i in range(len(centers)):
>         alpha = centers[i]
>         newHValues[i] = h(alpha, t)
>     H.setValue(newHValues)
>
> The normalization is performed the same way:
>
>     for value in values:
>         normalization += value * dAlpha
>     for i in range(length):
>         newValues[i] = values[i] / normalization
>     phi.setValue(value=newValues)
>
> Is this slower than using variables?
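[To make the vectorization point concrete outside of FiPy, here is a minimal plain-NumPy sketch comparing the cell-by-cell loop in the quoted code with its vectorized equivalent; the `h`, `centers`, and `dAlpha` below are hypothetical stand-ins for the poster's actual definitions.]

```python
import numpy as np

# Hypothetical stand-in for the poster's h(alpha, t), chosen only so
# that this sketch is self-contained and runnable.
def h(alpha, t):
    return np.exp(-alpha * t)

centers = np.linspace(0.0, 1.0, 1001)  # stand-in for the cell centers
t = 0.5
dAlpha = centers[1] - centers[0]

# Cell-by-cell loop, as in the quoted code: one Python-level call per cell.
newHValues = [0.0] * len(centers)
for i in range(len(centers)):
    newHValues[i] = h(centers[i], t)

# Vectorized equivalent: a single NumPy expression over the whole array.
vectorized = h(centers, t)
assert np.allclose(newHValues, vectorized)

# The same idea applies to the normalization: replace the accumulation
# loop with a sum over the array, then divide once.
phi = np.exp(-centers)
normalization = (phi * dAlpha).sum()
phiNormalized = phi / normalization
```

The two versions compute identical values; the vectorized one pushes the per-cell iteration down into compiled NumPy code, which is the point made in the reply below.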
Definitely slower, although for 1D you may not notice it much. Basically, you should never (and should never have to) iterate over cells directly. You want to write vectorized expressions and let NumPy do the iterations for you. At a minimum, your H update inside your step loop should look like

    alpha = mesh.getCellCenters()[0]
    H.setValue(h(alpha, t))

but even better is to make the following definition *before* your step loop and then let FiPy take care of updating H as t changes:

    alpha = CellVariable(mesh=mesh, value=mesh.getCellCenters()[0])
    t = Variable()
    H = h(alpha, t)

For your renormalization, it's much more efficient to write

    phi.setValue(value=phi.getValue() / phi.getCellVolumeAverage())

> How would I combine the renormalization and the sweeping? I tried
> something along the lines of:
>
>     for step in range(timeSteps):
>         # update the H function variable
>         for i in range(sweepSteps):
>             eq.sweep(var=phi, boundaryConditions=BCs, dt=timeStepDuration)
>         normalize(phi, dAlpha)
>         phi.updateOld()
>
> where normalize directly sets the value of phi (as can be seen in the
> code I posted earlier). Is this the right way? It doesn't seem to make
> any difference...

You need to be sure that phi is defined with hasOld=True, and you need to perform phi.updateOld() at the beginning of your step loop but outside your sweep loop. See

    http://www.ctcms.nist.gov/fipy/documentation/FAQ.html#iterations-timesteps-and-sweeps-oh-my

It's quite possible that sweeping doesn't matter for this problem, but it's good practice.

> When compared to a solution that I get from Mathematica, my solution
> tends to evolve slower. When convolving the probability distribution
> phi for each timestep with an almost linear weight function, I always
> get a value that is lower than the one Mathematica calculates (see
> http://imgur.com/SCzVT ).
>
> Any ideas what the reason could be?

It is difficult to be sure, but the green curve (Mathematica solution?)
seems to have a small slope at alpha=1, whereas the blue curve (FiPy solution?) looks like it has zero slope. Could there be a difference in the boundary conditions? FiPy assumes zero flux on all boundaries unless specified otherwise.

_______________________________________________
fipy mailing list
[email protected]
http://www.ctcms.nist.gov/fipy
[ NIST internal ONLY: https://email.nist.gov/mailman/listinfo/fipy ]
