dblaikie added a comment.

In D136011#3862150 <https://reviews.llvm.org/D136011#3862150>, @labath wrote:

> In D136011#3860637 <https://reviews.llvm.org/D136011#3860637>, @dblaikie 
> wrote:
>
>> I think the place where this will go wrong is how lldb renders `char` 
>> values in programs built with the non-default char signedness (it'll render 
>> them with the default signedness, which might confuse a user who is looking 
>> at literals, etc., written with the other signedness) and how lldb 
>> interprets char literals (though that's already wrong, since the literals 
>> are already being parsed with the default char signedness, I think).
>
> Yes, I'm pretty sure that will happen. OTOH, I don't think this can be fixed 
> in a completely satisfactory way. If the whole program were built 
> consistently with the non-default signedness, we could try to detect that 
> and configure the internal AST defaults accordingly. But that's hard to 
> detect, and I'd be surprised if most programs are completely homogeneous 
> like that.
>
> So, overall, I quite like this fix.

Yeah, this line of reasoning (non-homogeneity) is one I'm on board with, 
thanks for the framing. Basically we can think of the debugger's expression 
context as some code that's always compiled with the default char signedness. 
Since char signedness isn't part of the ABI, some parts of the program can be 
built with one signedness and some with the other - and the debugger is just 
another part that's built with the default.
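To make that concrete, here is a minimal standalone sketch (not from this 
patch, and not lldb code) of how the same byte gets presented differently 
depending on the signedness a translation unit was compiled with; with 
Clang/GCC the -fsigned-char / -funsigned-char flags are what flip it:

  #include <cstdio>

  int main() {
    char c = '\xff';  // the byte 0xFF, whatever plain char happens to be
    // Built with -fsigned-char (or on a target where char is signed by
    // default) this prints -1; built with -funsigned-char it prints 255.
    std::printf("%d\n", static_cast<int>(c));
  }

Compiling that file twice, once with each flag, prints different values for 
the same bits in memory - and an expression evaluator that's always "built" 
with the default signedness will show the default's view of those bits.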

Works for me (though lldb is sufficiently far outside my wheelhouse that I'll 
leave it to others to approve).


Repository:
  rG LLVM Github Monorepo

CHANGES SINCE LAST ACTION
  https://reviews.llvm.org/D136011/new/

https://reviews.llvm.org/D136011
