You would need to check with the llvm/clang folks first on whether x86_64h is 
desired as the result of getDefaultTargetTriple(). I am not sure they want that. 
The debugger really wants to know what the hardware currently supports, and that 
might not be the same as what the default triple should be...
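
For illustration, something along these lines is what "ask the hardware" looks 
like on Darwin -- a rough, untested sketch using sysctlbyname() and the 
CPU_TYPE_*/CPU_SUBTYPE_* constants from <mach/machine.h>, not the actual Host 
code:

#include <mach/machine.h>
#include <sys/sysctl.h>
#include <string>

// Sketch: ask the kernel what CPU type/subtype the hardware reports and map
// that to an arch name.  The default target triple can still say plain x86_64
// even when the machine is Haswell-capable.
static std::string GetHostCPUArchName() {
  cpu_type_t cputype = 0;
  cpu_subtype_t cpusubtype = 0;
  size_t len = sizeof(cputype);
  ::sysctlbyname("hw.cputype", &cputype, &len, nullptr, 0);
  len = sizeof(cpusubtype);
  ::sysctlbyname("hw.cpusubtype", &cpusubtype, &len, nullptr, 0);

  if (cputype == CPU_TYPE_X86_64)
    return (cpusubtype == CPU_SUBTYPE_X86_64_H) ? "x86_64h" : "x86_64";
  return "unknown";
}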


> On Aug 19, 2014, at 2:24 PM, Zachary Turner <ztur...@google.com> wrote:
> 
> Ok. So it looks like x86_64h is not currently supported by llvm::Triple.  I 
> will find out about adding support for it.  If it's possible to get that into 
> llvm::Triple, I'll post a patch that updates Host::GetArchitecture() to use 
> it, and then maybe one of you guys can test it in the various configurations.
> 
> Thanks!
> 
> 
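> For reference, the sanity check I have in mind is roughly this (untested 
> sketch, using only the public llvm::Triple API):
> 
> #include "llvm/ADT/Triple.h"
> 
> // If llvm::Triple knows about x86_64h, parsing a triple that names it should
> // yield a real arch rather than UnknownArch.  Today this is expected to fail.
> bool TripleKnowsX86_64h() {
>   llvm::Triple triple("x86_64h-apple-macosx");
>   return triple.getArch() != llvm::Triple::UnknownArch;
> }
> 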
> On Tue, Aug 19, 2014 at 2:18 PM, <jing...@apple.com> wrote:
> Sorry, I didn't read closely enough.  Greg's answer is actually relevant to 
> your question...
> 
> Jim
> 
> > On Aug 19, 2014, at 2:04 PM, Zachary Turner <ztur...@google.com> wrote:
> >
> > In this case though, we're talking about Host::GetArchitecture, which is a 
> > static function and is supposed to report information about the OS that the 
> > debugger is running on, and not the target that the debuggee is running on. 
> > Does this mean that a single instance of LLDB cannot debug both an app 
> > running in the simulator and an app running on Darwin?  Do you have to 
> > exit LLDB and start a new instance of LLDB in "simulator mode" or "mac 
> > mode"?  Or am I misunderstanding?
> >
> >
> > On Tue, Aug 19, 2014 at 1:58 PM, <jing...@apple.com> wrote:
> > Mac OS X is more complex because lldb also supports apps running in the iOS 
> > simulator on OS X.  Those apps are x86 or x86_64 processes, but their OS is 
> > "ios", not "darwin".  The platforms and a bunch of other fiddly bits rely 
> > on getting the OS right as well.
> >
> > Jim
> >
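> > To make that concrete, here is a rough sketch (untested, just exercising 
> > the public llvm::Triple API) of two processes with the same arch but 
> > different OS fields in their triples:
> > 
> > #include "llvm/ADT/Triple.h"
> > 
> > // Same hardware arch, different OS.  Platform selection keys off getOS(),
> > // so reporting only "the host arch" loses information lldb needs.
> > void SimulatorVsMac() {
> >   llvm::Triple mac("x86_64-apple-macosx");
> >   llvm::Triple sim("x86_64-apple-ios");  // an iOS simulator process
> >   bool same_arch = mac.getArch() == sim.getArch();  // true
> >   bool same_os = mac.getOS() == sim.getOS();        // false: MacOSX vs IOS
> >   (void)same_arch;
> >   (void)same_os;
> > }
> >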
> > > On Aug 19, 2014, at 1:52 PM, Zachary Turner <ztur...@google.com> wrote:
> > >
> > > Looking over the Apple code path for Host::GetArchitecture, I'm a little 
> > > confused about why all this custom logic is needed.  What are the 
> > > situations in which llvm::sys::getDefaultTargetTriple() will return 
> > > something other than what we want?  A concrete example might help 
> > > illustrate the problem.
> > >
> > > I understand from the comment that it has something to do with being able 
> > > to run 32- and 64-bit executables on the same operating system.  Isn't 
> > > this the case everywhere?  I can run 32-bit executables on Windows x64 as 
> > > well.  llvm::Triple has a function called get32BitArchVariant() that I 
> > > thought returns a 32-bit triple for this case.  Does this not work for 
> > > some Apple configuration?
> > >
> > > It seems like this logic could be sunk into llvm::Triple somehow.  
> > > Conceptually, there should be two cases:
> > >
> > > 64-bit:
> > > host_arch_64 = llvm::sys::getDefaultTargetTriple()
> > > host_arch_32 = llvm::Triple(llvm::sys::getDefaultTargetTriple()).get32BitArchVariant()
> > >
> > > 32-bit:
> > > host_arch_64 = <empty>
> > > host_arch_32 = llvm::sys::getDefaultTargetTriple()
> > >
> > > Why doesn't this work?
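> > >
> > > In real code the above would look roughly like this (untested sketch):
> > >
> > > #include "llvm/ADT/Triple.h"
> > > #include "llvm/Support/Host.h"
> > >
> > > // Derive both candidate host triples from the default target triple,
> > > // instead of hand-rolled per-platform logic.
> > > void ComputeHostTriples(llvm::Triple &host_arch_64,
> > >                         llvm::Triple &host_arch_32) {
> > >   llvm::Triple def(llvm::sys::getDefaultTargetTriple());
> > >   if (def.isArch64Bit()) {
> > >     host_arch_64 = def;
> > >     host_arch_32 = def.get32BitArchVariant();
> > >   } else {
> > >     host_arch_64 = llvm::Triple();  // <empty>
> > >     host_arch_32 = def;
> > >   }
> > > }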
> >
> >
> 
> 
