Author: bugman
Date: Tue Aug 26 12:20:49 2014
New Revision: 25279
URL: http://svn.gna.org/viewcvs/relax?rev=25279&view=rev
Log:
The exponential curve numeric gradient script now uses only floating-point
numbers, to avoid integer truncation problems.
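The truncation being guarded against here is most likely Python 2's integer division, where `/` between two ints silently discards the fractional part (behaviour Python 3 retains only in the `//` operator). A minimal sketch of the pitfall, runnable under Python 3 (the specific failing expression is an assumption, not taken from the diff):

```python
# Under Python 2, `/` between two ints truncated; Python 3 keeps that
# behaviour only in the floor-division operator `//`.
print(1 // 1000)       # truncates to 0
print(1.0 / 1000.0)    # floating-point division keeps the value: 0.001

# Writing every literal as a float removes the ambiguity entirely,
# which is what this commit does throughout integrate.py.
R, I0 = 1.0, 1000.0
print(R / I0)          # 0.001
```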
Modified:
trunk/test_suite/shared_data/curve_fitting/numeric_gradient/integrate.log
trunk/test_suite/shared_data/curve_fitting/numeric_gradient/integrate.py
Modified:
trunk/test_suite/shared_data/curve_fitting/numeric_gradient/integrate.log
URL: http://svn.gna.org/viewcvs/relax/trunk/test_suite/shared_data/curve_fitting/numeric_gradient/integrate.log?rev=25279&r1=25278&r2=25279&view=diff
==============================================================================
--- trunk/test_suite/shared_data/curve_fitting/numeric_gradient/integrate.log	(original)
+++ trunk/test_suite/shared_data/curve_fitting/numeric_gradient/integrate.log	Tue Aug 26 12:20:49 2014
@@ -1,4 +1,4 @@
-The gradient at [1, 1000] is:
+The gradient at [1.0, 1000.0] is:
[-1.0995282792650802e-09, 2.1826111665238544e-12]
-The gradient at [2, 500] is:
+The gradient at [2.0, 500.0] is:
[722.67864120737488, -11.564651301654292]
Modified:
trunk/test_suite/shared_data/curve_fitting/numeric_gradient/integrate.py
URL: http://svn.gna.org/viewcvs/relax/trunk/test_suite/shared_data/curve_fitting/numeric_gradient/integrate.py?rev=25279&r1=25278&r2=25279&view=diff
==============================================================================
--- trunk/test_suite/shared_data/curve_fitting/numeric_gradient/integrate.py	(original)
+++ trunk/test_suite/shared_data/curve_fitting/numeric_gradient/integrate.py	Tue Aug 26 12:20:49 2014
@@ -45,17 +45,17 @@
# The real parameters.
-R = 1
-I0 = 1000
+R = 1.0
+I0 = 1000.0
# The time points.
-times = [0, 1, 2, 3, 4]
+times = [0.0, 1.0, 2.0, 3.0, 4.0]
# The intensities for the above I0 and R.
I = [1000.0, 367.879441171, 135.335283237, 49.7870683679, 18.3156388887]
# The intensity errors.
-errors = [10, 10, 10, 10, 10]
+errors = [10.0, 10.0, 10.0, 10.0, 10.0]
# The numeric gradient at the minimum.
grad_R = derivative(func_R, R, dx=1e-5, order=11)
@@ -63,8 +63,8 @@
print("The gradient at %s is:\n %s" % ([R, I0], [grad_R, grad_I]))
# The numeric gradient off the minimum.
-R_off = 2
-I0_off = 500
+R_off = 2.0
+I0_off = 500.0
grad_R = derivative(func_R, R_off, dx=1e-5, order=11)
grad_I = derivative(func_I, I0_off, dx=1e-5, order=11)
print("The gradient at %s is:\n %s" % ([R_off, I0_off], [grad_R, grad_I]))
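The `derivative()` calls above compute finite-difference derivatives of the chi-squared target. The `func_R` and `func_I` definitions fall outside this diff, so the following is only a sketch under the assumption that they wrap the standard chi-squared of the decay curve `I0*exp(-R*t)` against the intensities set up above; it uses a plain two-point central difference in place of SciPy's higher-order (`order=11`) rule:

```python
from math import exp

# Data as set up in integrate.py after this commit (all floats).
times = [0.0, 1.0, 2.0, 3.0, 4.0]
I = [1000.0, 367.879441171, 135.335283237, 49.7870683679, 18.3156388887]
errors = [10.0, 10.0, 10.0, 10.0, 10.0]

def chi2(R, I0):
    """Assumed target function: chi-squared of the decay I0*exp(-R*t)."""
    return sum(((I[i] - I0 * exp(-R * t)) / errors[i]) ** 2
               for i, t in enumerate(times))

def central_diff(f, x, dx=1e-5):
    """Two-point central finite difference; the script instead uses
    scipy.misc.derivative() with an 11-point rule for higher accuracy."""
    return (f(x + dx) - f(x - dx)) / (2.0 * dx)

# At the true parameters [1.0, 1000.0] the gradient should be
# essentially zero, consistent with the near-zero values logged in
# integrate.log.
grad_R = central_diff(lambda r: chi2(r, 1000.0), 1.0)
grad_I0 = central_diff(lambda i0: chi2(1.0, i0), 1000.0)
print(grad_R, grad_I0)
```

With integer `times` or `errors`, a Python 2 run of such a target could truncate intermediate divisions, which is exactly why the commit forces every literal to a float.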
_______________________________________________
relax (http://www.nmr-relax.com)
This is the relax-commits mailing list
[email protected]
To unsubscribe from this list, get a password
reminder, or change your subscription options,
visit the list information page at
https://mail.gna.org/listinfo/relax-commits