I think we should get rid of it. There are plenty of discussions out there around C++ strings and how they relate to Unicode, and a wide string does not imply Unicode. The general consensus I have seen is to just use char and require UTF-8 encoding. We have a lot of confusion in our code around whether a string is Unicode or ASCII. I say we remove the confusion: all strings are Unicode, encoded as UTF-8 in 8-bit chars. Also convert all char* to std::string, for which we already have a ticket.
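Just to make the direction concrete, here is a minimal sketch of what that could look like on this interface, assuming UTF-8 std::string throughout. The method name getStringField and the stub implementation are illustrative only, not the existing geode-native API:

#include <iostream>
#include <string>

// Sketch: all string-typed field access goes through UTF-8 encoded
// std::string. Names here are illustrative, not the current API.
class PdxInstance {
public:
  virtual ~PdxInstance() = default;

  // Strings are always UTF-8; callers no longer need to reason about
  // wchar_t width or wide/narrow conversions.
  virtual std::string getStringField(const std::string& fieldName) const = 0;
};

class PdxInstanceImpl : public PdxInstance {
public:
  std::string getStringField(const std::string& fieldName) const override {
    // Stand-in for reading the serialized field; a real implementation
    // would decode the wire format into UTF-8 here.
    return "value-for-" + fieldName;
  }
};

int main() {
  PdxInstanceImpl instance;
  std::cout << instance.getStringField("name") << '\n';
  return 0;
}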
-Jake

> On Oct 26, 2017, at 12:15 PM, David Kimura <dkim...@pivotal.io> wrote:
>
> While working on removing out parameters, we noticed code that makes
> assumption that wchar_t is always 16 bits.
>
> virtual void PdxInstance::getField(const char* fieldName, wchar_t& value)
> const = 0;
>
> void PdxInstanceImpl::getField(const char* fieldname, wchar_t& value) const
> {
>   auto dataInput = getDataInputForField(fieldname);
>   uint16_t temp = dataInput->readInt16();
>   value = static_cast<wchar_t>(temp);
> }
>
> According to cppreference[1], this assumption is incorrect. If that is the
> case, should this implementation be fixed or can it be deleted altogether?
>
> Thanks,
> David
>
> [1] http://en.cppreference.com/w/cpp/language/types
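For what it's worth, a small standalone check makes the platform dependence easy to see. This is just an illustration of the sizeof difference cppreference describes, not geode-native code:

#include <cstdint>
#include <cstdio>

int main() {
  // wchar_t is 2 bytes on Windows but typically 4 bytes on Linux and macOS,
  // so reading a 16-bit value into a wchar_t relies on a platform-specific
  // detail rather than a guarantee of the language.
  std::printf("sizeof(wchar_t)  = %zu\n", sizeof(wchar_t));
  std::printf("sizeof(char16_t) = %zu\n", sizeof(char16_t));

  // A 16-bit code unit off the wire fits char16_t exactly; widening it to
  // wchar_t gives a value whose interpretation differs across platforms.
  std::uint16_t wire = 0x00E9;  // 'é' as a UTF-16 code unit
  char16_t exact = static_cast<char16_t>(wire);
  wchar_t widened = static_cast<wchar_t>(wire);
  (void)exact;
  (void)widened;
  return 0;
}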