https://gcc.gnu.org/bugzilla/show_bug.cgi?id=106248

            Bug ID: 106248
           Summary: operator>> of std::basic_istream at boundary condition
                    behaves differently at different opt levels
           Product: gcc
           Version: 11.2.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: libstdc++
          Assignee: unassigned at gcc dot gnu.org
          Reporter: Ting.Wang.SH at ibm dot com
  Target Milestone: ---

SYMPTOM:
operator>> of std::basic_istream sets ios_base::eofbit differently under
different optimization levels when the buffer size equals the stream content
size. This is observed with gcc version 11.2.0 and libstdc++.so.6.0.30, and is
not observed with gcc version 9.4.0 and libstdc++.so.6.0.28.

This might not be a bug; however, having the behavior change with the
optimization level is a little annoying.

An example C++ program is

$ cat a.cc
#include <istream>
#include <sstream>
#include <iostream>

char a_[10];

int main(int argc, char *argv[])
{
  std::basic_string<char, std::char_traits<char>, std::allocator<char> > input((const char *)"  abcdefghi");
  std::basic_stringbuf<char, std::char_traits<char>, std::allocator<char> > sbuf(input);
  std::basic_istream<char, std::char_traits<char> > istr(&sbuf);
  istr >> a_;
  std::cout << "istr.rdstate: " << istr.rdstate() << std::endl;
  return 0;
}

$ g++ -O0 -o a.O0 a.cc
$ ./a.O0
istr.rdstate: 2
$ g++ -O3 -o a.O3 a.cc
$ ./a.O3
istr.rdstate: 0
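
For reference, the printed numbers are the libstdc++ values of the iostate
bits (goodbit is 0, eofbit is 2), which are implementation-defined. A minimal
sketch of the same boundary case, using std::istringstream in place of the
explicit basic_stringbuf/basic_istream pair and printing the state flags by
name instead of relying on the numeric rdstate() value:

#include <iostream>
#include <sstream>

char a_[10];

int main()
{
  // Stream holds two spaces plus nine non-whitespace characters,
  // so extraction into the 10-byte buffer stops exactly at end of stream.
  std::istringstream istr("  abcdefghi");
  istr >> a_;
  // eof()/fail()/bad() are portable, unlike the numeric rdstate() value.
  std::cout << "eof:  " << istr.eof()  << '\n'
            << "fail: " << istr.fail() << '\n'
            << "bad:  " << istr.bad()  << '\n';
  return 0;
}

Whether eof reports 1 or 0 here is the same question as the rdstate 2 vs 0
difference shown above.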
