Yes, usually the file is located in $conda_install/etc/openturns
Sofiane
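For reference, a minimal sketch of how one could point OpenTURNS at that directory from Python; the conda prefix below is only an illustrative assumption, adjust it to your install, and whether this silences the warning on osx is exactly what we are trying to confirm:

import os

# Set before openturns is imported so the configuration lookup sees it.
# The prefix is an example conda install path (assumption).
os.environ["OPENTURNS_CONFIG_PATH"] = "/opt/miniconda3/etc/openturns"

import openturns as ot
print(ot.__version__)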
On Thursday, June 15, 2017 at 12:03, Julien Schueller | Phimeca
<[email protected]> wrote:
I think the path was good when I tested it; does setting OPENTURNS_CONFIG_PATH
make it work on your box?
From: HADDAD Sofiane <[email protected]>
Sent: Thursday, June 15, 2017 11:37:33
To: Julien Schueller | Phimeca; Anita Laera
Cc: roy; users
Subject: Re: [ot-users] SpaceFillingC2 speed

Hi Julien,
Yes, maybe we should enforce the OPENTURNS_CONFIG_PATH variable
Sofiane
On Thursday, June 15, 2017 at 11:17, Julien Schueller | Phimeca <[email protected]>
wrote:
Hello Anita,
Are you also using openturns from conda on osx? Could you show us the script
where you set your default epsilon? That should work even without loading the
xml defaults.
j
From: Anita Laera <[email protected]>
Sent: Thursday, June 15, 2017 10:56:15
To: Julien Schueller | Phimeca
Cc: roy; [email protected]; users
Subject: Re: [ot-users] SpaceFillingC2 speed

I have the same message every time I start a calculation.
Also, when I specify a value for the epsilon of the centered gradient I want
to use for AbdoRackwitz() in FORM, it keeps using the default epsilon of 1e-5
(as in the ResourceMap).
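For what it's worth, here is a minimal sketch of two ways one might try to force a non-default epsilon; the ResourceMap key name and the toy model are assumptions on my side, and whether either actually works around the behaviour described above is exactly the open question:

import openturns as ot

# Option 1: change the default used for centered finite differences.
# Assumed key name; on older releases the method is SetAsNumericalScalar.
# Set this before the limit-state function is created.
ot.ResourceMap.SetAsScalar("CenteredFiniteDifferenceGradient-DefaultEpsilon", 1e-3)

# Toy black-box model (hypothetical), 2 inputs -> 1 output
def model(x):
    return [x[0] + 2.0 * x[1] - 4.0]

f = ot.PythonFunction(2, 1, model)

# Option 2: attach an explicit centered finite-difference gradient to f,
# bypassing the ResourceMap default entirely
# (getEvaluationImplementation on older releases).
epsilon = 1e-3
f.setGradient(ot.CenteredFiniteDifferenceGradient(epsilon, f.getEvaluation()))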
2017-06-15 10:44 GMT+02:00 Julien Schueller | Phimeca <[email protected]>:
Hi @roy
The message: "WRN - The configuration file has not been found, using default
parameters. "Is due to an error of the xml configuration loading code specific
to osx.I tried to debug it once, the openturns.conf file was really in the
correct location though.Sofiane, do you have this message when you compile on
osx box ?. I wonder if it's related to conda.
j
De :[email protected] <[email protected]> de la part de roy
<[email protected]>
Envoyé : jeudi 15 juin 2017 10:15:39
À : D. Barbier
Cc : users
Objet : Re: [ot-users] SpaceFillingC2 speed Hello Denis,
Indeed, OT is now faster when using ot.Sample(sample).
Regarding numba, the code has to be pure Python rather than numpy for it to
work efficiently.
import numpy as np
import timeit
import openturns as ot
from numba import jit, njit


def discrepancy(sample):
    # vectorized numpy implementation of the centered L2-discrepancy
    n_sample = len(sample)
    dim = sample.shape[1]
    abs_ = abs(sample - 0.5)
    disc1 = np.sum(np.prod(1 + 0.5 * abs_ - 0.5 * abs_ ** 2, axis=1))
    prod_arr = 1
    for i in range(dim):
        s0 = sample[:, i]
        prod_arr *= (1 + 0.5 * abs(s0[:, None] - 0.5) + 0.5 * abs(s0 - 0.5)
                     - 0.5 * abs(s0[:, None] - s0))
    disc2 = prod_arr.sum()
    c2 = (13 / 12) ** dim - 2 / n_sample * disc1 + 1 / (n_sample ** 2) * disc2
    return np.sqrt(c2)


@jit
def discrepancy_numba(sample):
    # same vectorized code, simply decorated with numba's @jit
    n_sample = len(sample)
    dim = sample.shape[1]
    abs_ = abs(sample - 0.5)
    disc1 = np.sum(np.prod(1 + 0.5 * abs_ - 0.5 * abs_ ** 2, axis=1))
    prod_arr = 1
    for i in range(dim):
        s0 = sample[:, i]
        prod_arr *= (1 + 0.5 * abs(s0[:, None] - 0.5) + 0.5 * abs(s0 - 0.5)
                     - 0.5 * abs(s0[:, None] - s0))
    disc2 = prod_arr.sum()
    c2 = (13 / 12) ** dim - 2 / n_sample * disc1 + 1 / (n_sample ** 2) * disc2
    return np.sqrt(c2)


@njit
def discrepancy_faster_numba(sample):
    # pure-python loops, no numpy broadcasting, compiled in nopython mode
    disc1 = 0
    n_sample = len(sample)
    dim = sample.shape[1]
    for i in range(n_sample):
        prod = 1
        for item in sample[i]:
            sub = abs(item - 0.5)
            prod *= 1 + 0.5 * sub - 0.5 * sub ** 2
        disc1 += prod
    disc2 = 0
    for i in range(n_sample):
        for j in range(n_sample):
            prod = 1
            for k in range(dim):
                a = 0.5 * abs(sample[i, k] - 0.5)
                b = 0.5 * abs(sample[j, k] - 0.5)
                c = 0.5 * abs(sample[i, k] - sample[j, k])
                prod *= 1 + a + b - c
            disc2 += prod
    c2 = (13 / 12) ** dim - 2 / n_sample * disc1 + 1 / (n_sample ** 2) * disc2
    return np.sqrt(c2)


sample = np.random.random_sample((500, 2))
ot_sample = ot.Sample(sample)
print(discrepancy(sample))
print(discrepancy_numba(sample))
print(discrepancy_faster_numba(sample))
print(ot.SpaceFillingC2().evaluate(sample))

print('Function time: ', timeit.repeat('discrepancy(sample)', number=500,
      repeat=4, setup="from __main__ import discrepancy, sample"))
print('numba time: ', timeit.repeat('discrepancy_numba(sample)', number=500,
      repeat=4, setup="from __main__ import discrepancy_numba, sample"))
print('Fast numba time: ', timeit.repeat('discrepancy_faster_numba(sample)',
      number=500, repeat=4, setup="from __main__ import discrepancy_faster_numba, sample"))
print('OT time: ', timeit.repeat('ot.SpaceFillingC2().evaluate(ot_sample)',
      number=500, repeat=4, setup="from __main__ import ot_sample, ot"))

Output:

WRN - The configuration file has not been found, using default parameters.
#### IF YOU HAPPEN TO KNOW HOW TO REMOVE THIS, BY THE WAY
0.0181493670249
0.0181493670249
0.018149367024149737
0.018149367024149737
Function time: [4.525451728957705, 4.541200206964277, 4.4143504980020225, 4.56408092204947]
numba time: [4.3976798499934375, 4.876463262015022, 5.385470865992829, 5.138608552981168]
Fast numba time: [0.6634743280010298, 0.6538278009975329, 0.7077985780197196, 0.6579875709721819]
OT time: [0.7988348260405473, 0.7220299079781398, 0.7797102630138397, 0.7526425909600221]
[Finished in 53.8s]
So numba is again faster here. Even if I use a larger sample (1000 points),
numba is slightly faster.
Sincerely,
Pamphile ROY
Doctoral researcher in Uncertainty Quantification
CERFACS - Toulouse (31) - France
+33 (0) 5 61 19 31 57
+33 (0) 7 86 43 24 22
On June 14, 2017 at 23:20, D. Barbier <[email protected]> wrote:
Hello Pamphile,
The problem is that your sample case is small, so the conversion from
a numpy array into an OT Sample has a significant cost.
If you rerun your benchmark on
otsample = ot.Sample(sample)
(or directly generate a random sample with OT), you will see that our
version is much faster.
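To make the point concrete, a small sketch (reusing the names from your script, purely illustrative) that times the conversion separately from the evaluation:

import timeit
import numpy as np
import openturns as ot

sample = np.random.random_sample((500, 2))
ot_sample = ot.Sample(sample)

# cost of the numpy -> ot.Sample conversion alone
print(timeit.timeit('ot.Sample(sample)', number=500,
                    setup='from __main__ import ot, sample'))

# evaluation on a pre-converted Sample does not pay that cost on every call
print(timeit.timeit('ot.SpaceFillingC2().evaluate(ot_sample)', number=500,
                    setup='from __main__ import ot, ot_sample'))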
BTW I was intrigued by your results with numba, but could not achieve
the same speedup; my gain is almost negligible. Can you please show
your test case with numba? Did you use a GPU?
Regards,
Denis
2017-06-14 10:32 GMT+02:00 roy <[email protected]>:
Hi,
Thanks for the feedback; indeed, that could explain the behaviours.
Pamphile ROY
Doctoral researcher in Uncertainty Quantification
CERFACS - Toulouse (31) - France
+33 (0) 5 61 19 31 57
+33 (0) 7 86 43 24 22
On June 14, 2017 at 10:15, HADDAD Sofiane <[email protected]> wrote:
Hi,
It also depends on the sample size.
With a sample size of 1000, I get this:
0.00975831343631
0.009758313432154839
Function time: [19.408187157008797, 21.296883990988135, 19.92589810100617]
OT time: [4.125010760006262, 4.1429947539872956, 4.138353090995224]
For small samples, maybe we spend more time on the generation of small
objects than on the evaluation itself.
Regards,
Sofiane
On Wednesday, June 14, 2017 at 0:22, D. Barbier <[email protected]> wrote:
On 2017-06-13 at 12:01 GMT+02:00, roy wrote:
Hi everyone,
I was playing with the centered discrepancy and wrote my own function before
I saw the SpaceFillingC2 class.
There is no issue, except that I get a 2x speedup with my Python version.
There might be room for improvement, as I can even get a 10x speedup on my
version using numba.
[...]
Hello Pamphile,
I will have a look, thanks a lot for your feedback.
Regards,
Denis
_______________________________________________
OpenTURNS users mailing list
[email protected]
http://openturns.org/mailman/listinfo/users