Hello,

I am not sure this issue is entirely relevant to this mailing list;
if not, perhaps you could redirect me.

The same application (a relatively heavy piece of code) produces different
values when it is compiled and run on two different machines.
Both run Fedora 40 (fully updated) with
gcc (GCC) 14.2.1 20240912 (Red Hat 14.2.1-3).
One has an
Intel(R) Core(TM) i5-7400 CPU @ 3.00GHz,
the other an
Intel(R) Core(TM) i7-8700 CPU @ 3.20GHz.
Specifically, this happens when I use the GSL library
(gsl-devel-2.7.1-8.fc40.x86_64)
for integration (gsl_integration_cquad).
Before the integration step, the values are strictly identical on both machines.
The same Makefile is used.
Now, if I copy the binary generated on machine A to machine B,
I get the same results as when it is run on machine A.
The sizes of the two binaries differ slightly.
I conclude that the issue is due to the compiler.
Indeed, the difference between the generated values seems fairly constant,
i.e., roughly proportional to the value itself: a relative difference
on the order of 2.7e-8,
i.e., far larger than the machine precision of a double (epsilon ~2.2e-16).

Which result is the correct one?
Why does this happen?
Can I solve the issue?

Thanks for any help.

===========================================================================
 Patrick DUPRÉ                                 | | email: [email protected]
===========================================================================

-- 
_______________________________________________
users mailing list -- [email protected]
To unsubscribe send an email to [email protected]
Fedora Code of Conduct: 
https://docs.fedoraproject.org/en-US/project/code-of-conduct/
List Guidelines: https://fedoraproject.org/wiki/Mailing_list_guidelines
List Archives: 
https://lists.fedoraproject.org/archives/list/[email protected]
Do not reply to spam, report it: 
https://pagure.io/fedora-infrastructure/new_issue
