https://gcc.gnu.org/bugzilla/show_bug.cgi?id=92059
Bug ID: 92059
Summary: Crash on tr2::dynamic_bitset::operator=() with
optimization
Product: gcc
Version: 8.3.1
Status: UNCONFIRMED
Severity: normal
Priority: P3
Component: c++
Assignee: unassigned at gcc dot gnu.org
Reporter: jharris at simplexinvestments dot com
Target Milestone: ---
Created attachment 47017
--> https://gcc.gnu.org/bugzilla/attachment.cgi?id=47017&action=edit
Self-contained preprocessor output file
On CentOS 7 x86_64 with gcc 8.3.1, this simple program (.ii attached) crashes at
run time when built at any optimization level above -O0. The crash appears to be
in the underlying vector assignment, but the same assignment works fine
outside of tr2::dynamic_bitset.
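For comparison, a minimal sketch of that standalone vector assignment (the
unsigned long long element type is an assumption about dynamic_bitset's default
block storage; any element type behaves the same here):

#include <vector>

int main()
{
    // Stand-in for dynamic_bitset's internal storage: assigning one
    // single-element vector to another works fine on its own.
    std::vector<unsigned long long> v1(1), v2(1);
    v2 = v1; // no crash outside of tr2::dynamic_bitset
    return 0;
}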
It also works fine with an explicit template parameter of uint8_t. Both
valgrind and asan complain loudly on execution, and with the code as shown it
throws a std::bad_alloc.
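A sketch of that uint8_t variant, which runs cleanly (same setup as the
reproducer below):

#define _GLIBCXX_NODISCARD [[nodiscard]]
#include "./dynamic_bitset"
#include <cstdint>

int main()
{
    // Same reproducer, but with an explicit uint8_t block type: no crash.
    std::tr2::dynamic_bitset<std::uint8_t> b1(1), b2(1);
    b2 = b1;
    return 0;
}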
I used the latest source for tr2::dynamic_bitset from github, but the result is
the same with the header shipped in the compiler distribution (devtoolset-8).
#define _GLIBCXX_NODISCARD [[nodiscard]]
#include "./dynamic_bitset"

int main()
{
    std::tr2::dynamic_bitset<> b1(1), b2(1);
    b2 = b1; // crash
    return 0;
}
Compiled with gcc (GCC) 8.3.1 20190311 (Red Hat 8.3.1-3):

$ g++ -O3 --std=c++17 main.ii -o main
$ ./main
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Aborted (core dumped)
$ g++ -O1 --std=c++17 main.ii -o main
$ ./main
Segmentation fault (core dumped)
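For reference, an AddressSanitizer build also flags the assignment on
execution (example invocation; asan's report is omitted here):

$ g++ -O1 --std=c++17 -fsanitize=address main.ii -o main
$ ./main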