http://gcc.gnu.org/bugzilla/show_bug.cgi?id=52485
Paolo Carlini <paolo.carlini at oracle dot com> changed:

           What          |Removed          |Added
           Status        |UNCONFIRMED      |
--- Comment #2 from Václav Šmilauer <eu at doxos dot eu> 2012-03-05 08:55:12 UTC ---
(In reply to comment #1)
> I think we should not have an option to disable user-defined literals at all.
> Since their code is not C++11, they should fix their code
Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What          |Removed          |Added
           CC            |                 |jakub at gcc dot gnu.org
--- Comment #4 from Václav Šmilauer <eu at doxos dot eu> 2012-03-05 09:20:26 UTC ---
I think you misunderstood: my source _is_ C++11, but the libs I have to use are
not; therefore I have to compile with -std=c++11. The fix is admittedly
trivial and
--- Comment #1 from Andrew Pinski <pinskia at gcc dot gnu.org> 2012-03-04 22:56:18 UTC ---
I think we should not have an option to disable user-defined literals at all.
Since their code is not C++11, they should fix their code to be C++11 if they