aaron.ballman added a comment.

In https://reviews.llvm.org/D44231#1031380, @pfultz2 wrote:

> > Can you elaborate a bit more about this?
>
> This catches problems with calling `sizeof(f())` when `f` returns an 
> integer (or enum). Most likely, the integer represents the type to be 
> chosen; however, `sizeof` is then computed for the integer itself and not 
> for the type the integer represents. That is, the user has an enum for the 
> `data_type`:
>
>   enum data_type {
>       float_type,
>       double_type
>   };
>
>
> At some point the user may call a function to get the data type (perhaps 
> `x.GetType()`) and pass it to `sizeof`, as in `sizeof(x.GetType())`, which 
> is incorrect.
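
Putting the quoted example together, the pattern being flagged is roughly the 
following condensed sketch (the `tensor` wrapper and `element_size` function 
are hypothetical names added here for illustration):

  #include <cstddef>

  enum data_type {
      float_type,
      double_type
  };

  // Hypothetical wrapper type; the quoted example only mentions x.GetType().
  struct tensor {
      data_type type;
      data_type GetType() const { return type; }
  };

  std::size_t element_size(const tensor &x) {
      // The suspicious pattern: this yields sizeof(data_type), the size of
      // the enum itself, not sizeof(float) or sizeof(double) as the enum
      // value suggests was intended.
      return sizeof(x.GetType());
  }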


Can you point to some real-world code that would benefit from this check? I 
don't see this as a common issue in practice, and I am concerned about false 
positives. For instance, it's reasonable to write `sizeof(func_call())` in a 
context where you don't want to repeat the type name in multiple places. I've 
seen this construct used in LLVM's code base (see Prologue::getLength() in 
DWARFDebugLine.h for one such instance).
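
For illustration, a minimal sketch of that idiom; apart from the `Prologue` 
and `getLength()` names, the members below are invented and are not the 
actual DWARFDebugLine.h code:

  #include <cstdint>

  struct Prologue {
    uint64_t TotalLength; // The width of this field could change later.

    uint64_t getLength() const { return TotalLength; }

    // Reasonable use of sizeof on a call result: the intent is "the size of
    // whatever type getLength() returns", so the type name is not repeated
    // and a later change to the return type is picked up automatically.
    uint64_t getLengthFieldSize() const { return sizeof(getLength()); }
  };

A check that flags every `sizeof(call())` where the callee returns an integer 
would report this even though the `sizeof` of the call result is exactly what 
the author wants.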


Repository:
  rCTE Clang Tools Extra

https://reviews.llvm.org/D44231


