Title: [229767] trunk/Source/JavaScriptCore
Revision: 229767
Author: [email protected]
Date: 2018-03-20 11:10:16 -0700 (Tue, 20 Mar 2018)

Log Message

Improve FunctionPtr and use it in the JIT CallRecord.
https://bugs.webkit.org/show_bug.cgi?id=183756
<rdar://problem/38641335>

Reviewed by JF Bastien.

1. FunctionPtr holds a C/C++ function pointer by default.  Change its default
   PtrTag to reflect that.

2. Delete the FunctionPtr::value() method.  It is effectively a duplicate of
   executableAddress().

3. Fix the FunctionPtr constructor that takes arbitrary pointers to be able to
   take "any" pointer.  "any" in this case means that the pointer may not be typed
   as a C/C++ function to the C++ compiler (due to upstream casting or usage of
   void* as a storage type), but it is still expected to be pointing to a C/C++
   function.

4. Added a FunctionPtr constructor that takes another FunctionPtr.  This is a
   convenience constructor that lets us retag the underlying pointer.  The other
   FunctionPtr is still expected to point to a C/C++ function.

5. Added PtrTag assertion placeholder functions to be implemented later.

6. Change the JIT CallRecord to embed a FunctionPtr callee instead of a raw
   void* pointer.  This improves type safety, and assists in getting pointer
   tagging right later.

7. Added versions of the JIT callOperation methods that take a PtrTag.
   This is preparation for more pointer tagging work later.

* assembler/MacroAssemblerARM.h:
(JSC::MacroAssemblerARM::linkCall):
* assembler/MacroAssemblerARMv7.h:
(JSC::MacroAssemblerARMv7::linkCall):
* assembler/MacroAssemblerCodeRef.h:
(JSC::FunctionPtr::FunctionPtr):
(JSC::FunctionPtr::operator bool const):
(JSC::FunctionPtr::operator! const):
(JSC::ReturnAddressPtr::ReturnAddressPtr):
(JSC::MacroAssemblerCodePtr::retagged const):
(JSC::MacroAssemblerCodeRef::retaggedCode const):
(JSC::FunctionPtr::value const): Deleted.
* assembler/MacroAssemblerMIPS.h:
(JSC::MacroAssemblerMIPS::linkCall):
* assembler/MacroAssemblerX86.h:
(JSC::MacroAssemblerX86::linkCall):
* assembler/MacroAssemblerX86_64.h:
(JSC::MacroAssemblerX86_64::callWithSlowPathReturnType):
(JSC::MacroAssemblerX86_64::linkCall):
* bytecode/AccessCase.cpp:
(JSC::AccessCase::generateImpl):
* ftl/FTLSlowPathCall.cpp:
(JSC::FTL::SlowPathCallContext::makeCall):
* ftl/FTLSlowPathCall.h:
(JSC::FTL::callOperation):
* ftl/FTLThunks.cpp:
(JSC::FTL::osrExitGenerationThunkGenerator):
(JSC::FTL::lazySlowPathGenerationThunkGenerator):
(JSC::FTL::slowPathCallThunkGenerator):
* jit/JIT.cpp:
(JSC::JIT::link):
(JSC::JIT::privateCompileExceptionHandlers):
* jit/JIT.h:
(JSC::CallRecord::CallRecord):
(JSC::JIT::appendCall):
(JSC::JIT::appendCallWithSlowPathReturnType):
(JSC::JIT::callOperation):
(JSC::JIT::callOperationWithProfile):
(JSC::JIT::callOperationWithResult):
(JSC::JIT::callOperationNoExceptionCheck):
(JSC::JIT::callOperationWithCallFrameRollbackOnException):
* jit/JITArithmetic.cpp:
(JSC::JIT::emitMathICFast):
(JSC::JIT::emitMathICSlow):
* jit/JITInlines.h:
(JSC::JIT::emitNakedCall):
(JSC::JIT::emitNakedTailCall):
(JSC::JIT::appendCallWithExceptionCheck):
(JSC::JIT::appendCallWithExceptionCheckAndSlowPathReturnType):
(JSC::JIT::appendCallWithCallFrameRollbackOnException):
(JSC::JIT::appendCallWithExceptionCheckSetJSValueResult):
(JSC::JIT::appendCallWithExceptionCheckSetJSValueResultWithProfile):
* jit/JITPropertyAccess.cpp:
(JSC::JIT::emitSlow_op_get_by_val):
(JSC::JIT::emitSlow_op_put_by_val):
(JSC::JIT::privateCompileGetByValWithCachedId):
(JSC::JIT::privateCompilePutByVal):
(JSC::JIT::privateCompilePutByValWithCachedId):
* jit/JITPropertyAccess32_64.cpp:
(JSC::JIT::emitSlow_op_put_by_val):
* jit/Repatch.cpp:
(JSC::linkPolymorphicCall):
* jit/SlowPathCall.h:
(JSC::JITSlowPathCall::JITSlowPathCall):
(JSC::JITSlowPathCall::call):
* jit/ThunkGenerators.cpp:
(JSC::nativeForGenerator):
* runtime/PtrTag.h:
(JSC::nextPtrTagID):
(JSC::assertIsCFunctionPtr):
(JSC::assertIsNullOrCFunctionPtr):
(JSC::assertIsNotTagged):
(JSC::assertIsTagged):
(JSC::assertIsNullOrTagged):
(JSC::assertIsTaggedWith):
(JSC::assertIsNullOrTaggedWith):
(JSC::uniquePtrTagID): Deleted.


Diff

Modified: trunk/Source/JavaScriptCore/ChangeLog (229766 => 229767)


--- trunk/Source/JavaScriptCore/ChangeLog	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/ChangeLog	2018-03-20 18:10:16 UTC (rev 229767)
@@ -1,3 +1,114 @@
+2018-03-20  Mark Lam  <[email protected]>
+
+        Improve FunctionPtr and use it in the JIT CallRecord.
+        https://bugs.webkit.org/show_bug.cgi?id=183756
+        <rdar://problem/38641335>
+
+        Reviewed by JF Bastien.
+
+        1. FunctionPtr holds a C/C++ function pointer by default.  Change its default
+           PtrTag to reflect that.
+
+        2. Delete the FunctionPtr::value() method.  It is effectively a duplicate of
+           executableAddress().
+
+        3. Fix the FunctionPtr constructor that takes arbitrary pointers to be able to
+           take "any" pointer.  "any" in this case means that the pointer may not be typed
+           as a C/C++ function to the C++ compiler (due to upstream casting or usage of
+           void* as a storage type), but it is still expected to be pointing to a C/C++
+           function.
+
+        4. Added a FunctionPtr constructor that takes another FunctionPtr.  This is a
+           convenience constructor that lets us retag the underlying pointer.  The other
+           FunctionPtr is still expected to point to a C/C++ function.
+
+        5. Added PtrTag assertion placeholder functions to be implemented later.
+
+        6. Change the JIT CallRecord to embed a FunctionPtr callee instead of a raw
+           void* pointer.  This improves type safety, and assists in getting pointer
+           tagging right later.
+
+        7. Added versions of the JIT callOperation methods that take a PtrTag.
+           This is preparation for more pointer tagging work later.
+
+        * assembler/MacroAssemblerARM.h:
+        (JSC::MacroAssemblerARM::linkCall):
+        * assembler/MacroAssemblerARMv7.h:
+        (JSC::MacroAssemblerARMv7::linkCall):
+        * assembler/MacroAssemblerCodeRef.h:
+        (JSC::FunctionPtr::FunctionPtr):
+        (JSC::FunctionPtr::operator bool const):
+        (JSC::FunctionPtr::operator! const):
+        (JSC::ReturnAddressPtr::ReturnAddressPtr):
+        (JSC::MacroAssemblerCodePtr::retagged const):
+        (JSC::MacroAssemblerCodeRef::retaggedCode const):
+        (JSC::FunctionPtr::value const): Deleted.
+        * assembler/MacroAssemblerMIPS.h:
+        (JSC::MacroAssemblerMIPS::linkCall):
+        * assembler/MacroAssemblerX86.h:
+        (JSC::MacroAssemblerX86::linkCall):
+        * assembler/MacroAssemblerX86_64.h:
+        (JSC::MacroAssemblerX86_64::callWithSlowPathReturnType):
+        (JSC::MacroAssemblerX86_64::linkCall):
+        * bytecode/AccessCase.cpp:
+        (JSC::AccessCase::generateImpl):
+        * ftl/FTLSlowPathCall.cpp:
+        (JSC::FTL::SlowPathCallContext::makeCall):
+        * ftl/FTLSlowPathCall.h:
+        (JSC::FTL::callOperation):
+        * ftl/FTLThunks.cpp:
+        (JSC::FTL::osrExitGenerationThunkGenerator):
+        (JSC::FTL::lazySlowPathGenerationThunkGenerator):
+        (JSC::FTL::slowPathCallThunkGenerator):
+        * jit/JIT.cpp:
+        (JSC::JIT::link):
+        (JSC::JIT::privateCompileExceptionHandlers):
+        * jit/JIT.h:
+        (JSC::CallRecord::CallRecord):
+        (JSC::JIT::appendCall):
+        (JSC::JIT::appendCallWithSlowPathReturnType):
+        (JSC::JIT::callOperation):
+        (JSC::JIT::callOperationWithProfile):
+        (JSC::JIT::callOperationWithResult):
+        (JSC::JIT::callOperationNoExceptionCheck):
+        (JSC::JIT::callOperationWithCallFrameRollbackOnException):
+        * jit/JITArithmetic.cpp:
+        (JSC::JIT::emitMathICFast):
+        (JSC::JIT::emitMathICSlow):
+        * jit/JITInlines.h:
+        (JSC::JIT::emitNakedCall):
+        (JSC::JIT::emitNakedTailCall):
+        (JSC::JIT::appendCallWithExceptionCheck):
+        (JSC::JIT::appendCallWithExceptionCheckAndSlowPathReturnType):
+        (JSC::JIT::appendCallWithCallFrameRollbackOnException):
+        (JSC::JIT::appendCallWithExceptionCheckSetJSValueResult):
+        (JSC::JIT::appendCallWithExceptionCheckSetJSValueResultWithProfile):
+        * jit/JITPropertyAccess.cpp:
+        (JSC::JIT::emitSlow_op_get_by_val):
+        (JSC::JIT::emitSlow_op_put_by_val):
+        (JSC::JIT::privateCompileGetByValWithCachedId):
+        (JSC::JIT::privateCompilePutByVal):
+        (JSC::JIT::privateCompilePutByValWithCachedId):
+        * jit/JITPropertyAccess32_64.cpp:
+        (JSC::JIT::emitSlow_op_put_by_val):
+        * jit/Repatch.cpp:
+        (JSC::linkPolymorphicCall):
+        * jit/SlowPathCall.h:
+        (JSC::JITSlowPathCall::JITSlowPathCall):
+        (JSC::JITSlowPathCall::call):
+        * jit/ThunkGenerators.cpp:
+        (JSC::nativeForGenerator):
+        * runtime/PtrTag.h:
+        (JSC::nextPtrTagID):
+        (JSC::assertIsCFunctionPtr):
+        (JSC::assertIsNullOrCFunctionPtr):
+        (JSC::assertIsNotTagged):
+        (JSC::assertIsTagged):
+        (JSC::assertIsNullOrTagged):
+        (JSC::assertIsTaggedWith):
+        (JSC::assertIsNullOrTaggedWith):
+        (JSC::uniquePtrTagID): Deleted.
+
 2018-03-20  Stanislav Ocovaj  <[email protected]>
 
         [MIPS] Optimize generated JIT code for loads/stores

Modified: trunk/Source/JavaScriptCore/assembler/MacroAssemblerARM.h (229766 => 229767)


--- trunk/Source/JavaScriptCore/assembler/MacroAssemblerARM.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/assembler/MacroAssemblerARM.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -1614,9 +1614,9 @@
     static void linkCall(void* code, Call call, FunctionPtr function)
     {
         if (call.isFlagSet(Call::Tail))
-            ARMAssembler::linkJump(code, call.m_label, function.value());
+            ARMAssembler::linkJump(code, call.m_label, function.executableAddress());
         else
-            ARMAssembler::linkCall(code, call.m_label, function.value());
+            ARMAssembler::linkCall(code, call.m_label, function.executableAddress());
     }
 
     static const bool s_isVFPPresent;

Modified: trunk/Source/JavaScriptCore/assembler/MacroAssemblerARMv7.h (229766 => 229767)


--- trunk/Source/JavaScriptCore/assembler/MacroAssemblerARMv7.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/assembler/MacroAssemblerARMv7.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -2128,9 +2128,9 @@
     static void linkCall(void* code, Call call, FunctionPtr function)
     {
         if (call.isFlagSet(Call::Tail))
-            ARMv7Assembler::linkJump(code, call.m_label, function.value());
+            ARMv7Assembler::linkJump(code, call.m_label, function.executableAddress());
         else
-            ARMv7Assembler::linkCall(code, call.m_label, function.value());
+            ARMv7Assembler::linkCall(code, call.m_label, function.executableAddress());
     }
 
     bool m_makeJumpPatchable;

Modified: trunk/Source/JavaScriptCore/assembler/MacroAssemblerCodeRef.h (229766 => 229767)


--- trunk/Source/JavaScriptCore/assembler/MacroAssemblerCodeRef.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/assembler/MacroAssemblerCodeRef.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -65,9 +65,10 @@
     FunctionPtr() { }
 
     template<typename ReturnType, typename... Arguments>
-    FunctionPtr(ReturnType(*value)(Arguments...), PtrTag tag = SlowPathPtrTag)
+    FunctionPtr(ReturnType(*value)(Arguments...), PtrTag tag = CFunctionPtrTag)
         : m_value(tagCFunctionPtr<void*>(value, tag))
     {
+        assertIsCFunctionPtr(value);
         PoisonedMasmPtr::assertIsNotPoisoned(m_value);
         ASSERT_VALID_CODE_POINTER(m_value);
     }
@@ -77,9 +78,10 @@
 #if CALLING_CONVENTION_IS_STDCALL && !OS(WINDOWS)
 
     template<typename ReturnType, typename... Arguments>
-    FunctionPtr(ReturnType(CDECL *value)(Arguments...), PtrTag tag = SlowPathPtrTag)
+    FunctionPtr(ReturnType(CDECL *value)(Arguments...), PtrTag tag = CFunctionPtrTag)
         : m_value(tagCFunctionPtr<void*>(value, tag))
     {
+        assertIsCFunctionPtr(value);
         PoisonedMasmPtr::assertIsNotPoisoned(m_value);
         ASSERT_VALID_CODE_POINTER(m_value);
     }
@@ -89,9 +91,10 @@
 #if COMPILER_SUPPORTS(FASTCALL_CALLING_CONVENTION)
 
     template<typename ReturnType, typename... Arguments>
-    FunctionPtr(ReturnType(FASTCALL *value)(Arguments...), PtrTag tag = SlowPathPtrTag)
+    FunctionPtr(ReturnType(FASTCALL *value)(Arguments...), PtrTag tag = CFunctionPtrTag)
         : m_value(tagCFunctionPtr<void*>(value, tag))
     {
+        assertIsCFunctionPtr(value);
         PoisonedMasmPtr::assertIsNotPoisoned(m_value);
         ASSERT_VALID_CODE_POINTER(m_value);
     }
@@ -98,24 +101,28 @@
 
 #endif // COMPILER_SUPPORTS(FASTCALL_CALLING_CONVENTION)
 
-    template<typename FunctionType>
-    explicit FunctionPtr(FunctionType* value, PtrTag tag = SlowPathPtrTag)
+    template<typename PtrType, typename = std::enable_if_t<std::is_pointer<PtrType>::value && !std::is_function<typename std::remove_pointer<PtrType>::type>::value>>
+    explicit FunctionPtr(PtrType value, PtrTag tag)
         // Using a C-ctyle cast here to avoid compiler error on RVTC:
         // Error:  #694: reinterpret_cast cannot cast away const or other type qualifiers
         // (I guess on RVTC function pointers have a different constness to GCC/MSVC?)
-        : m_value(tagCodePtr<void*>(value, tag))
+        : m_value(tagCFunctionPtr<void*>(value, tag))
     {
+        assertIsCFunctionPtr(value);
         PoisonedMasmPtr::assertIsNotPoisoned(m_value);
         ASSERT_VALID_CODE_POINTER(m_value);
     }
 
-    explicit FunctionPtr(MacroAssemblerCodePtr);
-
-    void* value() const
+    explicit FunctionPtr(FunctionPtr other, PtrTag tag)
+        : m_value(tagCFunctionPtr<void*>(other.executableAddress(), tag))
     {
+        assertIsCFunctionPtr(other.executableAddress());
         PoisonedMasmPtr::assertIsNotPoisoned(m_value);
-        return removeCodePtrTag(m_value);
+        ASSERT_VALID_CODE_POINTER(m_value);
     }
+
+    explicit FunctionPtr(MacroAssemblerCodePtr);
+
     void* executableAddress() const
     {
         PoisonedMasmPtr::assertIsNotPoisoned(m_value);
@@ -122,6 +129,9 @@
         return m_value;
     }
 
+    explicit operator bool() const { return !!m_value; }
+    bool operator!() const { return !m_value; }
+
 private:
     void* m_value { nullptr };
 };
@@ -147,7 +157,7 @@
     }
 
     explicit ReturnAddressPtr(FunctionPtr function)
-        : m_value(function.value())
+        : m_value(function.executableAddress())
     {
         PoisonedMasmPtr::assertIsNotPoisoned(m_value);
         ASSERT_VALID_CODE_POINTER(m_value);
@@ -210,6 +220,11 @@
 
     PoisonedMasmPtr poisonedPtr() const { return m_value; }
 
+    MacroAssemblerCodePtr retagged(PtrTag oldTag, PtrTag newTag) const
+    {
+        return MacroAssemblerCodePtr(retagCodePtr(executableAddress(), oldTag, newTag));
+    }
+
     template<typename T = void*>
     T executableAddress() const
     {
@@ -351,7 +366,7 @@
 
     MacroAssemblerCodePtr retaggedCode(PtrTag oldTag, PtrTag newTag) const
     {
-        return MacroAssemblerCodePtr(retagCodePtr(m_codePtr.executableAddress(), oldTag, newTag));
+        return m_codePtr.retagged(oldTag, newTag);
     }
 
     size_t size() const
@@ -380,7 +395,6 @@
     : m_value(ptr.executableAddress())
 {
     PoisonedMasmPtr::assertIsNotPoisoned(m_value);
-    ASSERT_VALID_CODE_POINTER(m_value);
 }
 
 } // namespace JSC

Modified: trunk/Source/JavaScriptCore/assembler/MacroAssemblerMIPS.h (229766 => 229767)


--- trunk/Source/JavaScriptCore/assembler/MacroAssemblerMIPS.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/assembler/MacroAssemblerMIPS.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -3320,9 +3320,9 @@
     static void linkCall(void* code, Call call, FunctionPtr function)
     {
         if (call.isFlagSet(Call::Tail))
-            MIPSAssembler::linkJump(code, call.m_label, function.value());
+            MIPSAssembler::linkJump(code, call.m_label, function.executableAddress());
         else
-            MIPSAssembler::linkCall(code, call.m_label, function.value());
+            MIPSAssembler::linkCall(code, call.m_label, function.executableAddress());
     }
 
 };

Modified: trunk/Source/JavaScriptCore/assembler/MacroAssemblerX86.h (229766 => 229767)


--- trunk/Source/JavaScriptCore/assembler/MacroAssemblerX86.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/assembler/MacroAssemblerX86.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -369,9 +369,9 @@
     static void linkCall(void* code, Call call, FunctionPtr function)
     {
         if (call.isFlagSet(Call::Tail))
-            X86Assembler::linkJump(code, call.m_label, function.value());
+            X86Assembler::linkJump(code, call.m_label, function.executableAddress());
         else
-            X86Assembler::linkCall(code, call.m_label, function.value());
+            X86Assembler::linkCall(code, call.m_label, function.executableAddress());
     }
 };
 

Modified: trunk/Source/JavaScriptCore/assembler/MacroAssemblerX86_64.h (229766 => 229767)


--- trunk/Source/JavaScriptCore/assembler/MacroAssemblerX86_64.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/assembler/MacroAssemblerX86_64.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -152,7 +152,7 @@
     }
 
 #if OS(WINDOWS)
-    Call callWithSlowPathReturnType()
+    Call callWithSlowPathReturnType(PtrTag)
     {
         // On Win64, when the return type is larger than 8 bytes, we need to allocate space on the stack for the return value.
         // On entry, rcx should contain a pointer to this stack space. The other parameters are shifted to the right,
@@ -177,7 +177,7 @@
         add64(TrustedImm32(4 * sizeof(int64_t)), X86Registers::ecx);
 
         DataLabelPtr label = moveWithPatch(TrustedImmPtr(nullptr), scratchRegister());
-        Call result = Call(m_assembler.call(scratchRegister()), Call::Linkable);
+        Call result = Call(m_assembler.call(scratchRegister(), tag), Call::Linkable);
 
         add64(TrustedImm32(8 * sizeof(int64_t)), X86Registers::esp);
 
@@ -1953,11 +1953,11 @@
     static void linkCall(void* code, Call call, FunctionPtr function)
     {
         if (!call.isFlagSet(Call::Near))
-            X86Assembler::linkPointer(code, call.m_label.labelAtOffset(-REPATCH_OFFSET_CALL_R11), function.value());
+            X86Assembler::linkPointer(code, call.m_label.labelAtOffset(-REPATCH_OFFSET_CALL_R11), function.executableAddress());
         else if (call.isFlagSet(Call::Tail))
-            X86Assembler::linkJump(code, call.m_label, function.value());
+            X86Assembler::linkJump(code, call.m_label, function.executableAddress());
         else
-            X86Assembler::linkCall(code, call.m_label, function.value());
+            X86Assembler::linkCall(code, call.m_label, function.executableAddress());
     }
 };
 

Modified: trunk/Source/JavaScriptCore/bytecode/AccessCase.cpp (229766 => 229767)


--- trunk/Source/JavaScriptCore/bytecode/AccessCase.cpp	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/bytecode/AccessCase.cpp	2018-03-20 18:10:16 UTC (rev 229767)
@@ -863,9 +863,10 @@
 #endif
             jit.storePtr(GPRInfo::callFrameRegister, &vm.topCallFrame);
 
-            operationCall = jit.call(NoPtrTag);
+            PtrTag callTag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+            operationCall = jit.call(callTag);
             jit.addLinkTask([=] (LinkBuffer& linkBuffer) {
-                linkBuffer.link(operationCall, FunctionPtr(this->as<GetterSetterAccessCase>().m_customAccessor.opaque));
+                linkBuffer.link(operationCall, FunctionPtr(this->as<GetterSetterAccessCase>().m_customAccessor.opaque, callTag));
             });
 
             if (m_type == CustomValueGetter || m_type == CustomAccessorGetter)
@@ -1007,11 +1008,12 @@
                 if (!reallocating) {
                     jit.setupArguments<decltype(operationReallocateButterflyToHavePropertyStorageWithInitialCapacity)>(baseGPR);
                     
-                    CCallHelpers::Call operationCall = jit.call(NoPtrTag);
+                    PtrTag callTag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+                    CCallHelpers::Call operationCall = jit.call(callTag);
                     jit.addLinkTask([=] (LinkBuffer& linkBuffer) {
                         linkBuffer.link(
                             operationCall,
-                            FunctionPtr(operationReallocateButterflyToHavePropertyStorageWithInitialCapacity));
+                            FunctionPtr(operationReallocateButterflyToHavePropertyStorageWithInitialCapacity, callTag));
                     });
                 } else {
                     // Handle the case where we are reallocating (i.e. the old structure/butterfly
@@ -1019,11 +1021,12 @@
                     jit.setupArguments<decltype(operationReallocateButterflyToGrowPropertyStorage)>(
                         baseGPR, CCallHelpers::TrustedImm32(newSize / sizeof(JSValue)));
                     
-                    CCallHelpers::Call operationCall = jit.call(NoPtrTag);
+                    PtrTag callTag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+                    CCallHelpers::Call operationCall = jit.call(callTag);
                     jit.addLinkTask([=] (LinkBuffer& linkBuffer) {
                         linkBuffer.link(
                             operationCall,
-                            FunctionPtr(operationReallocateButterflyToGrowPropertyStorage));
+                            FunctionPtr(operationReallocateButterflyToGrowPropertyStorage, callTag));
                     });
                 }
                 

Modified: trunk/Source/JavaScriptCore/ftl/FTLSlowPathCall.cpp (229766 => 229767)


--- trunk/Source/JavaScriptCore/ftl/FTLSlowPathCall.cpp	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/ftl/FTLSlowPathCall.cpp	2018-03-20 18:10:16 UTC (rev 229767)
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2018 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -118,9 +118,11 @@
     return SlowPathCallKey(m_thunkSaveSet, callTarget, m_argumentRegisters, m_offset);
 }
 
-SlowPathCall SlowPathCallContext::makeCall(VM& vm, void* callTarget)
+SlowPathCall SlowPathCallContext::makeCall(VM& vm, FunctionPtr callTarget)
 {
-    SlowPathCall result = SlowPathCall(m_jit.call(NoPtrTag), keyWithTarget(callTarget));
+    void* executableAddress = callTarget.executableAddress();
+    assertIsCFunctionPtr(executableAddress);
+    SlowPathCall result = SlowPathCall(m_jit.call(NoPtrTag), keyWithTarget(executableAddress));
 
     m_jit.addLinkTask(
         [result, &vm] (LinkBuffer& linkBuffer) {

Modified: trunk/Source/JavaScriptCore/ftl/FTLSlowPathCall.h (229766 => 229767)


--- trunk/Source/JavaScriptCore/ftl/FTLSlowPathCall.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/ftl/FTLSlowPathCall.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2018 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -59,7 +59,7 @@
 
     // NOTE: The call that this returns is already going to be linked by the JIT using addLinkTask(),
     // so there is no need for you to link it yourself.
-    SlowPathCall makeCall(VM&, void* callTarget);
+    SlowPathCall makeCall(VM&, FunctionPtr callTarget);
 
 private:
     SlowPathCallKey keyWithTarget(void* callTarget) const;
@@ -84,7 +84,7 @@
     {
         SlowPathCallContext context(usedRegisters, jit, sizeof...(ArgumentTypes) + 1, resultGPR);
         jit.setupArguments<void(ExecState*, ArgumentTypes...)>(arguments...);
-        call = context.makeCall(vm, function.value());
+        call = context.makeCall(vm, function);
     }
     if (exceptionTarget)
         exceptionTarget->append(jit.emitExceptionCheck(vm));

Modified: trunk/Source/JavaScriptCore/ftl/FTLThunks.cpp (229766 => 229767)


--- trunk/Source/JavaScriptCore/ftl/FTLThunks.cpp	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/ftl/FTLThunks.cpp	2018-03-20 18:10:16 UTC (rev 229767)
@@ -125,15 +125,17 @@
 MacroAssemblerCodeRef osrExitGenerationThunkGenerator(VM* vm)
 {
     unsigned extraPopsToRestore = 0;
+    PtrTag tag = ptrTag(JITThunkPtrTag, nextPtrTagID());
     return genericGenerationThunkGenerator(
-        vm, FunctionPtr(compileFTLOSRExit, NoPtrTag), "FTL OSR exit generation thunk", extraPopsToRestore, FrameAndStackAdjustmentRequirement::Needed);
+        vm, FunctionPtr(compileFTLOSRExit, tag), "FTL OSR exit generation thunk", extraPopsToRestore, FrameAndStackAdjustmentRequirement::Needed);
 }
 
 MacroAssemblerCodeRef lazySlowPathGenerationThunkGenerator(VM* vm)
 {
     unsigned extraPopsToRestore = 1;
+    PtrTag tag = ptrTag(JITThunkPtrTag, nextPtrTagID());
     return genericGenerationThunkGenerator(
-        vm, FunctionPtr(compileFTLLazySlowPath, NoPtrTag), "FTL lazy slow path generation thunk", extraPopsToRestore, FrameAndStackAdjustmentRequirement::NotNeeded);
+        vm, FunctionPtr(compileFTLLazySlowPath, tag), "FTL lazy slow path generation thunk", extraPopsToRestore, FrameAndStackAdjustmentRequirement::NotNeeded);
 }
 
 static void registerClobberCheck(AssemblyHelpers& jit, RegisterSet dontClobber)
@@ -197,7 +199,8 @@
     
     registerClobberCheck(jit, key.argumentRegisters());
     
-    AssemblyHelpers::Call call = jit.call(NoPtrTag);
+    PtrTag callTag = ptrTag(JITThunkPtrTag, nextPtrTagID());
+    AssemblyHelpers::Call call = jit.call(callTag);
 
     jit.loadPtr(AssemblyHelpers::Address(MacroAssembler::stackPointerRegister, key.offset()), GPRInfo::nonPreservedNonReturnGPR);
     jit.restoreReturnAddressBeforeReturn(GPRInfo::nonPreservedNonReturnGPR);
@@ -223,7 +226,7 @@
     jit.ret();
 
     LinkBuffer patchBuffer(jit, GLOBAL_THUNK_ID);
-    patchBuffer.link(call, FunctionPtr(key.callTarget()));
+    patchBuffer.link(call, FunctionPtr(key.callTarget(), callTag));
     return FINALIZE_CODE(patchBuffer, NoPtrTag, "FTL slow path call thunk for %s", toCString(key).data());
 }
 

Modified: trunk/Source/JavaScriptCore/jit/JIT.cpp (229766 => 229767)


--- trunk/Source/JavaScriptCore/jit/JIT.cpp	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/jit/JIT.cpp	2018-03-20 18:10:16 UTC (rev 229767)
@@ -804,8 +804,8 @@
     }
 
     for (auto& record : m_calls) {
-        if (record.to)
-            patchBuffer.link(record.from, FunctionPtr(record.to, SlowPathPtrTag));
+        if (record.callee)
+            patchBuffer.link(record.from, record.callee);
     }
 
     for (unsigned i = m_getByIds.size(); i--;)
@@ -917,7 +917,8 @@
         poke(GPRInfo::argumentGPR0);
         poke(GPRInfo::argumentGPR1, 1);
 #endif
-        m_calls.append(CallRecord(call(SlowPathPtrTag), std::numeric_limits<unsigned>::max(), FunctionPtr(lookupExceptionHandlerFromCallerFrame, SlowPathPtrTag).value()));
+        PtrTag tag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+        m_calls.append(CallRecord(call(tag), std::numeric_limits<unsigned>::max(), FunctionPtr(lookupExceptionHandlerFromCallerFrame, tag)));
         jumpToExceptionHandler(*vm());
     }
 
@@ -936,7 +937,8 @@
         poke(GPRInfo::argumentGPR0);
         poke(GPRInfo::argumentGPR1, 1);
 #endif
-        m_calls.append(CallRecord(call(SlowPathPtrTag), std::numeric_limits<unsigned>::max(), FunctionPtr(lookupExceptionHandler, SlowPathPtrTag).value()));
+        PtrTag tag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+        m_calls.append(CallRecord(call(tag), std::numeric_limits<unsigned>::max(), FunctionPtr(lookupExceptionHandler, tag)));
         jumpToExceptionHandler(*vm());
     }
 }

Modified: trunk/Source/JavaScriptCore/jit/JIT.h (229766 => 229767)


--- trunk/Source/JavaScriptCore/jit/JIT.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/JavaScriptCore/jit/JIT.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -70,16 +70,16 @@
     struct CallRecord {
         MacroAssembler::Call from;
         unsigned bytecodeOffset;
-        void* to;
+        FunctionPtr callee;
 
         CallRecord()
         {
         }
 
-        CallRecord(MacroAssembler::Call from, unsigned bytecodeOffset, void* to = 0)
+        CallRecord(MacroAssembler::Call from, unsigned bytecodeOffset, FunctionPtr callee)
             : from(from)
             , bytecodeOffset(bytecodeOffset)
-            , to(to)
+            , callee(callee)
         {
         }
     };
@@ -267,18 +267,18 @@
         void privateCompilePatchGetArrayLength(ReturnAddressPtr returnAddress);
 
         // Add a call out from JIT code, without an exception check.
-        Call appendCall(const FunctionPtr function)
+        Call appendCall(const FunctionPtr function, PtrTag tag)
         {
-            Call functionCall = call(NoPtrTag);
-            m_calls.append(CallRecord(functionCall, m_bytecodeOffset, function.value()));
+            Call functionCall = call(tag);
+            m_calls.append(CallRecord(functionCall, m_bytecodeOffset, FunctionPtr(function, tag)));
             return functionCall;
         }
 
 #if OS(WINDOWS) && CPU(X86_64)
-        Call appendCallWithSlowPathReturnType(const FunctionPtr function)
+        Call appendCallWithSlowPathReturnType(const FunctionPtr function, PtrTag tag)
         {
-            Call functionCall = callWithSlowPathReturnType();
-            m_calls.append(CallRecord(functionCall, m_bytecodeOffset, function.value()));
+            Call functionCall = callWithSlowPathReturnType(tag);
+            m_calls.append(CallRecord(functionCall, m_bytecodeOffset, FunctionPtr(function, tag)));
             return functionCall;
         }
 #endif
@@ -704,62 +704,105 @@
             linkAllSlowCasesForBytecodeOffset(m_slowCases, iter, m_bytecodeOffset);
         }
 
-        MacroAssembler::Call appendCallWithExceptionCheck(const FunctionPtr);
+        MacroAssembler::Call appendCallWithExceptionCheck(const FunctionPtr, PtrTag);
 #if OS(WINDOWS) && CPU(X86_64)
-        MacroAssembler::Call appendCallWithExceptionCheckAndSlowPathReturnType(const FunctionPtr);
+        MacroAssembler::Call appendCallWithExceptionCheckAndSlowPathReturnType(const FunctionPtr, PtrTag = NoPtrTag);
 #endif
-        MacroAssembler::Call appendCallWithCallFrameRollbackOnException(const FunctionPtr);
-        MacroAssembler::Call appendCallWithExceptionCheckSetJSValueResult(const FunctionPtr, int);
-        MacroAssembler::Call appendCallWithExceptionCheckSetJSValueResultWithProfile(const FunctionPtr, int);
+        MacroAssembler::Call appendCallWithCallFrameRollbackOnException(const FunctionPtr, PtrTag);
+        MacroAssembler::Call appendCallWithExceptionCheckSetJSValueResult(const FunctionPtr, PtrTag, int);
+        MacroAssembler::Call appendCallWithExceptionCheckSetJSValueResultWithProfile(const FunctionPtr, PtrTag, int);
         
         template<typename OperationType, typename... Args>
         std::enable_if_t<FunctionTraits<OperationType>::hasResult, MacroAssembler::Call>
+        callOperation(OperationType operation, PtrTag tag, int result, Args... args)
+        {
+            setupArguments<OperationType>(args...);
+            return appendCallWithExceptionCheckSetJSValueResult(operation, tag, result);
+        }
+
+        template<typename OperationType, typename... Args>
+        std::enable_if_t<FunctionTraits<OperationType>::hasResult, MacroAssembler::Call>
         callOperation(OperationType operation, int result, Args... args)
         {
+            PtrTag tag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+            return callOperation(operation, tag, result, args...);
+        }
+
+        template<typename OperationType, typename... Args>
+        MacroAssembler::Call callOperation(OperationType operation, PtrTag tag, Args... args)
+        {
             setupArguments<OperationType>(args...);
-            return appendCallWithExceptionCheckSetJSValueResult(operation, result);
+            return appendCallWithExceptionCheck(operation, tag);
         }
 
         template<typename OperationType, typename... Args>
         MacroAssembler::Call callOperation(OperationType operation, Args... args)
         {
+            PtrTag tag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+            return callOperation(operation, tag, args...);
+        }
+
+        template<typename OperationType, typename... Args>
+        std::enable_if_t<FunctionTraits<OperationType>::hasResult, MacroAssembler::Call>
+        callOperationWithProfile(OperationType operation, PtrTag tag, int result, Args... args)
+        {
             setupArguments<OperationType>(args...);
-            return appendCallWithExceptionCheck(operation);
+            return appendCallWithExceptionCheckSetJSValueResultWithProfile(operation, tag, result);
         }
 
-
         template<typename OperationType, typename... Args>
         std::enable_if_t<FunctionTraits<OperationType>::hasResult, MacroAssembler::Call>
         callOperationWithProfile(OperationType operation, int result, Args... args)
         {
-            setupArguments<OperationType>(args...);
-            return appendCallWithExceptionCheckSetJSValueResultWithProfile(operation, result);
+            PtrTag tag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+            return callOperationWithProfile(operation, tag, result, args...);
         }
 
         template<typename OperationType, typename... Args>
-        MacroAssembler::Call callOperationWithResult(OperationType operation, JSValueRegs resultRegs, Args... args)
+        MacroAssembler::Call callOperationWithResult(OperationType operation, PtrTag tag, JSValueRegs resultRegs, Args... args)
         {
             setupArguments<OperationType>(args...);
-            auto result = appendCallWithExceptionCheck(operation);
+            auto result = appendCallWithExceptionCheck(operation, tag);
             setupResults(resultRegs);
             return result;
         }
 
         template<typename OperationType, typename... Args>
-        MacroAssembler::Call callOperationNoExceptionCheck(OperationType operation, Args... args)
+        MacroAssembler::Call callOperationWithResult(OperationType operation, JSValueRegs resultRegs, Args... args)
         {
+            PtrTag tag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+            return callOperationWithResult(operation, tag, resultRegs, args...);
+        }
+
+        template<typename OperationType, typename... Args>
+        MacroAssembler::Call callOperationNoExceptionCheck(OperationType operation, PtrTag tag, Args... args)
+        {
             setupArguments<OperationType>(args...);
             updateTopCallFrame();
-            return appendCall(operation);
+            return appendCall(operation, tag);
         }
 
         template<typename OperationType, typename... Args>
-        MacroAssembler::Call callOperationWithCallFrameRollbackOnException(OperationType operation, Args... args)
+        MacroAssembler::Call callOperationNoExceptionCheck(OperationType operation, Args... args)
         {
+            PtrTag tag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+            return callOperationNoExceptionCheck(operation, tag, args...);
+        }
+
+        template<typename OperationType, typename... Args>
+        MacroAssembler::Call callOperationWithCallFrameRollbackOnException(OperationType operation, PtrTag tag, Args... args)
+        {
             setupArguments<OperationType>(args...);
-            return appendCallWithCallFrameRollbackOnException(operation);
+            return appendCallWithCallFrameRollbackOnException(operation, tag);
         }
 
+        template<typename OperationType, typename... Args>
+        MacroAssembler::Call callOperationWithCallFrameRollbackOnException(OperationType operation, Args... args)
+        {
+            PtrTag tag = ptrTag(JITOperationPtrTag, nextPtrTagID());
+            return callOperationWithCallFrameRollbackOnException(operation, tag, args...);
+        }
+
         template<typename SnippetGenerator>
         void emitBitBinaryOpFastPath(Instruction* currentInstruction);
 

Modified: trunk/Source/_javascript_Core/jit/JITArithmetic.cpp (229766 => 229767)


--- trunk/Source/_javascript_Core/jit/JITArithmetic.cpp	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/_javascript_Core/jit/JITArithmetic.cpp	2018-03-20 18:10:16 UTC (rev 229767)
@@ -707,9 +707,9 @@
     if (!generatedInlineCode) {
         ArithProfile* arithProfile = mathIC->arithProfile();
         if (arithProfile && shouldEmitProfiling())
-            callOperationWithResult(profiledFunction, resultRegs, srcRegs, arithProfile);
+            callOperationWithResult(profiledFunction, NoPtrTag, resultRegs, srcRegs, arithProfile);
         else
-            callOperationWithResult(nonProfiledFunction, resultRegs, srcRegs);
+            callOperationWithResult(nonProfiledFunction, NoPtrTag, resultRegs, srcRegs);
     } else
         addSlowCase(mathICGenerationState.slowPathJumps);
 
@@ -780,9 +780,9 @@
             emitGetVirtualRegister(op2, rightRegs);
         ArithProfile* arithProfile = mathIC->arithProfile();
         if (arithProfile && shouldEmitProfiling())
-            callOperationWithResult(profiledFunction, resultRegs, leftRegs, rightRegs, arithProfile);
+            callOperationWithResult(profiledFunction, NoPtrTag, resultRegs, leftRegs, rightRegs, arithProfile);
         else
-            callOperationWithResult(nonProfiledFunction, resultRegs, leftRegs, rightRegs);
+            callOperationWithResult(nonProfiledFunction, NoPtrTag, resultRegs, leftRegs, rightRegs);
     } else
         addSlowCase(mathICGenerationState.slowPathJumps);
 
@@ -820,11 +820,11 @@
     ArithProfile* arithProfile = mathIC->arithProfile();
     if (arithProfile && shouldEmitProfiling()) {
         if (mathICGenerationState.shouldSlowPathRepatch)
-            mathICGenerationState.slowPathCall = callOperationWithResult(reinterpret_cast<J_JITOperation_EJMic>(profiledRepatchFunction), resultRegs, srcRegs, TrustedImmPtr(mathIC));
+            mathICGenerationState.slowPathCall = callOperationWithResult(reinterpret_cast<J_JITOperation_EJMic>(profiledRepatchFunction), NoPtrTag, resultRegs, srcRegs, TrustedImmPtr(mathIC));
         else
-            mathICGenerationState.slowPathCall = callOperationWithResult(profiledFunction, resultRegs, srcRegs, arithProfile);
+            mathICGenerationState.slowPathCall = callOperationWithResult(profiledFunction, NoPtrTag, resultRegs, srcRegs, arithProfile);
     } else
-        mathICGenerationState.slowPathCall = callOperationWithResult(reinterpret_cast<J_JITOperation_EJMic>(repatchFunction), resultRegs, srcRegs, TrustedImmPtr(mathIC));
+        mathICGenerationState.slowPathCall = callOperationWithResult(reinterpret_cast<J_JITOperation_EJMic>(repatchFunction), NoPtrTag, resultRegs, srcRegs, TrustedImmPtr(mathIC));
 
 #if ENABLE(MATH_IC_STATS)
     auto slowPathEnd = label();
@@ -886,11 +886,11 @@
     ArithProfile* arithProfile = mathIC->arithProfile();
     if (arithProfile && shouldEmitProfiling()) {
         if (mathICGenerationState.shouldSlowPathRepatch)
-            mathICGenerationState.slowPathCall = callOperationWithResult(bitwise_cast<J_JITOperation_EJJMic>(profiledRepatchFunction), resultRegs, leftRegs, rightRegs, TrustedImmPtr(mathIC));
+            mathICGenerationState.slowPathCall = callOperationWithResult(bitwise_cast<J_JITOperation_EJJMic>(profiledRepatchFunction), NoPtrTag, resultRegs, leftRegs, rightRegs, TrustedImmPtr(mathIC));
         else
-            mathICGenerationState.slowPathCall = callOperationWithResult(profiledFunction, resultRegs, leftRegs, rightRegs, arithProfile);
+            mathICGenerationState.slowPathCall = callOperationWithResult(profiledFunction, NoPtrTag, resultRegs, leftRegs, rightRegs, arithProfile);
     } else
-        mathICGenerationState.slowPathCall = callOperationWithResult(bitwise_cast<J_JITOperation_EJJMic>(repatchFunction), resultRegs, leftRegs, rightRegs, TrustedImmPtr(mathIC));
+        mathICGenerationState.slowPathCall = callOperationWithResult(bitwise_cast<J_JITOperation_EJJMic>(repatchFunction), NoPtrTag, resultRegs, leftRegs, rightRegs, TrustedImmPtr(mathIC));
 
 #if ENABLE(MATH_IC_STATS)
     auto slowPathEnd = label();

Modified: trunk/Source/_javascript_Core/jit/JITInlines.h (229766 => 229767)


--- trunk/Source/_javascript_Core/jit/JITInlines.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/_javascript_Core/jit/JITInlines.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2008, 2012-2013, 2015-2016 Apple Inc. All rights reserved.
+ * Copyright (C) 2008-2018 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -120,7 +120,8 @@
 {
     ASSERT(m_bytecodeOffset != std::numeric_limits<unsigned>::max()); // This method should only be called during hot/cold path generation, so that m_bytecodeOffset is set.
     Call nakedCall = nearCall();
-    m_calls.append(CallRecord(nakedCall, m_bytecodeOffset, function.executableAddress()));
+    assertIsNullOrTaggedWith(function.executableAddress(), NearCallPtrTag);
+    m_calls.append(CallRecord(nakedCall, m_bytecodeOffset, FunctionPtr(function)));
     return nakedCall;
 }
 
@@ -128,7 +129,8 @@
 {
     ASSERT(m_bytecodeOffset != std::numeric_limits<unsigned>::max()); // This method should only be called during hot/cold path generation, so that m_bytecodeOffset is set.
     Call nakedCall = nearTailCall();
-    m_calls.append(CallRecord(nakedCall, m_bytecodeOffset, function.executableAddress()));
+    assertIsNullOrTaggedWith(function.executableAddress(), NearCallPtrTag);
+    m_calls.append(CallRecord(nakedCall, m_bytecodeOffset, FunctionPtr(function)));
     return nakedCall;
 }
 
@@ -149,35 +151,35 @@
     storePtr(callFrameRegister, &m_vm->topCallFrame);
 }
 
-ALWAYS_INLINE MacroAssembler::Call JIT::appendCallWithExceptionCheck(const FunctionPtr function)
+ALWAYS_INLINE MacroAssembler::Call JIT::appendCallWithExceptionCheck(const FunctionPtr function, PtrTag tag)
 {
     updateTopCallFrame();
-    MacroAssembler::Call call = appendCall(function);
+    MacroAssembler::Call call = appendCall(function, tag);
     exceptionCheck();
     return call;
 }
 
 #if OS(WINDOWS) && CPU(X86_64)
-ALWAYS_INLINE MacroAssembler::Call JIT::appendCallWithExceptionCheckAndSlowPathReturnType(const FunctionPtr function)
+ALWAYS_INLINE MacroAssembler::Call JIT::appendCallWithExceptionCheckAndSlowPathReturnType(const FunctionPtr function, PtrTag tag)
 {
     updateTopCallFrame();
-    MacroAssembler::Call call = appendCallWithSlowPathReturnType(function);
+    MacroAssembler::Call call = appendCallWithSlowPathReturnType(function, tag);
     exceptionCheck();
     return call;
 }
 #endif
 
-ALWAYS_INLINE MacroAssembler::Call JIT::appendCallWithCallFrameRollbackOnException(const FunctionPtr function)
+ALWAYS_INLINE MacroAssembler::Call JIT::appendCallWithCallFrameRollbackOnException(const FunctionPtr function, PtrTag tag)
 {
     updateTopCallFrame(); // The callee is responsible for setting topCallFrame to their caller
-    MacroAssembler::Call call = appendCall(function);
+    MacroAssembler::Call call = appendCall(function, tag);
     exceptionCheckWithCallFrameRollback();
     return call;
 }
 
-ALWAYS_INLINE MacroAssembler::Call JIT::appendCallWithExceptionCheckSetJSValueResult(const FunctionPtr function, int dst)
+ALWAYS_INLINE MacroAssembler::Call JIT::appendCallWithExceptionCheckSetJSValueResult(const FunctionPtr function, PtrTag tag, int dst)
 {
-    MacroAssembler::Call call = appendCallWithExceptionCheck(function);
+    MacroAssembler::Call call = appendCallWithExceptionCheck(function, tag);
 #if USE(JSVALUE64)
     emitPutVirtualRegister(dst, returnValueGPR);
 #else
@@ -186,9 +188,9 @@
     return call;
 }
 
-ALWAYS_INLINE MacroAssembler::Call JIT::appendCallWithExceptionCheckSetJSValueResultWithProfile(const FunctionPtr function, int dst)
+ALWAYS_INLINE MacroAssembler::Call JIT::appendCallWithExceptionCheckSetJSValueResultWithProfile(const FunctionPtr function, PtrTag tag, int dst)
 {
-    MacroAssembler::Call call = appendCallWithExceptionCheck(function);
+    MacroAssembler::Call call = appendCallWithExceptionCheck(function, tag);
     emitValueProfilingSite();
 #if USE(JSVALUE64)
     emitPutVirtualRegister(dst, returnValueGPR);

Modified: trunk/Source/_javascript_Core/jit/JITPropertyAccess.cpp (229766 => 229767)


--- trunk/Source/_javascript_Core/jit/JITPropertyAccess.cpp	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/_javascript_Core/jit/JITPropertyAccess.cpp	2018-03-20 18:10:16 UTC (rev 229767)
@@ -275,7 +275,7 @@
     
     emitGetVirtualRegister(base, regT0);
     emitGetVirtualRegister(property, regT1);
-    Call call = callOperation(operationGetByValOptimize, dst, regT0, regT1, byValInfo);
+    Call call = callOperation(operationGetByValOptimize, NoPtrTag, dst, regT0, regT1, byValInfo);
 
     m_byValCompilationInfo[m_byValInstructionIndex].slowPathTarget = slowPath;
     m_byValCompilationInfo[m_byValInstructionIndex].returnAddress = call;
@@ -492,7 +492,7 @@
     emitGetVirtualRegister(property, regT1);
     emitGetVirtualRegister(value, regT2);
     bool isDirect = Interpreter::getOpcodeID(currentInstruction->u.opcode) == op_put_by_val_direct;
-    Call call = callOperation(isDirect ? operationDirectPutByValOptimize : operationPutByValOptimize, regT0, regT1, regT2, byValInfo);
+    Call call = callOperation(isDirect ? operationDirectPutByValOptimize : operationPutByValOptimize, NoPtrTag, regT0, regT1, regT2, byValInfo);
 
     m_byValCompilationInfo[m_byValInstructionIndex].slowPathTarget = slowPath;
     m_byValCompilationInfo[m_byValInstructionIndex].returnAddress = call;
@@ -1291,8 +1291,8 @@
         patchBuffer.link(m_exceptionChecks, byValInfo->exceptionHandler);
 
     for (const auto& callSite : m_calls) {
-        if (callSite.to)
-            patchBuffer.link(callSite.from, FunctionPtr(callSite.to, SlowPathPtrTag));
+        if (callSite.callee)
+            patchBuffer.link(callSite.from, callSite.callee);
     }
     gen.finalize(patchBuffer);
 
@@ -1345,8 +1345,8 @@
     patchBuffer.link(slowCases, CodeLocationLabel(MacroAssemblerCodePtr::createFromExecutableAddress(returnAddress.value())).labelAtOffset(byValInfo->returnAddressToSlowPath));
     patchBuffer.link(done, byValInfo->badTypeJump.labelAtOffset(byValInfo->badTypeJumpToDone));
     if (needsLinkForWriteBarrier) {
-        ASSERT(m_calls.last().to == operationWriteBarrierSlowPath);
-        patchBuffer.link(m_calls.last().from, operationWriteBarrierSlowPath, SlowPathPtrTag);
+        ASSERT(m_calls.last().callee.executableAddress() == operationWriteBarrierSlowPath);
+        patchBuffer.link(m_calls.last().from, FunctionPtr(operationWriteBarrierSlowPath, SlowPathPtrTag));
     }
     
     bool isDirect = Interpreter::getOpcodeID(currentInstruction->u.opcode) == op_put_by_val_direct;
@@ -1381,8 +1381,8 @@
         patchBuffer.link(m_exceptionChecks, byValInfo->exceptionHandler);
 
     for (const auto& callSite : m_calls) {
-        if (callSite.to)
-            patchBuffer.link(callSite.from, FunctionPtr(callSite.to, SlowPathPtrTag));
+        if (callSite.callee)
+            patchBuffer.link(callSite.from, callSite.callee);
     }
     gen.finalize(patchBuffer);
 

Modified: trunk/Source/_javascript_Core/jit/JITPropertyAccess32_64.cpp (229766 => 229767)


--- trunk/Source/_javascript_Core/jit/JITPropertyAccess32_64.cpp	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/_javascript_Core/jit/JITPropertyAccess32_64.cpp	2018-03-20 18:10:16 UTC (rev 229767)
@@ -568,7 +568,7 @@
     poke(regT1, pokeOffset++);
     poke(regT0, pokeOffset++);
     poke(TrustedImmPtr(byValInfo), pokeOffset++);
-    Call call = appendCallWithExceptionCheck(isDirect ? operationDirectPutByValOptimize : operationPutByValOptimize);
+    Call call = appendCallWithExceptionCheck(isDirect ? operationDirectPutByValOptimize : operationPutByValOptimize, NoPtrTag);
 #else
     // The register selection below is chosen to reduce register swapping on ARM.
     // Swapping shouldn't happen on other platforms.
@@ -575,7 +575,7 @@
     emitLoad(base, regT2, regT1);
     emitLoad(property, regT3, regT0);
     emitLoad(value, regT5, regT4);
-    Call call = callOperation(isDirect ? operationDirectPutByValOptimize : operationPutByValOptimize, JSValueRegs(regT2, regT1), JSValueRegs(regT3, regT0), JSValueRegs(regT5, regT4), byValInfo);
+    Call call = callOperation(isDirect ? operationDirectPutByValOptimize : operationPutByValOptimize, NoPtrTag, JSValueRegs(regT2, regT1), JSValueRegs(regT3, regT0), JSValueRegs(regT5, regT4), byValInfo);
 #endif
 
     m_byValCompilationInfo[m_byValInstructionIndex].slowPathTarget = slowPath;

Modified: trunk/Source/_javascript_Core/jit/Repatch.cpp (229766 => 229767)


--- trunk/Source/_javascript_Core/jit/Repatch.cpp	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/_javascript_Core/jit/Repatch.cpp	2018-03-20 18:10:16 UTC (rev 229767)
@@ -1067,11 +1067,15 @@
     
     RELEASE_ASSERT(callCases.size() == calls.size());
     for (CallToCodePtr callToCodePtr : calls) {
+#if CPU(ARM_THUMB2)
         // Tail call special-casing ensures proper linking on ARM Thumb2, where a tail call jumps to an address
         // with a non-decorated bottom bit but a normal call calls an address with a decorated bottom bit.
         bool isTailCall = callToCodePtr.call.isFlagSet(CCallHelpers::Call::Tail);
-        patchBuffer.link(
-            callToCodePtr.call, FunctionPtr(tagCodePtr(isTailCall ? callToCodePtr.codePtr.dataLocation() : callToCodePtr.codePtr.executableAddress(), CodeEntryPtrTag)));
+        void* target = isTailCall ? callToCodePtr.codePtr.dataLocation() : callToCodePtr.codePtr.executableAddress();
+        patchBuffer.link(callToCodePtr.call, FunctionPtr(MacroAssemblerCodePtr(target)));
+#else
+        patchBuffer.link(callToCodePtr.call, FunctionPtr(callToCodePtr.codePtr.retagged(CodeEntryPtrTag, NearCallPtrTag)));
+#endif
     }
     if (isWebAssembly || JITCode::isOptimizingJIT(callerCodeBlock->jitType()))
         patchBuffer.link(done, callLinkInfo.callReturnLocation().labelAtOffset(0));

Modified: trunk/Source/_javascript_Core/jit/SlowPathCall.h (229766 => 229767)


--- trunk/Source/_javascript_Core/jit/SlowPathCall.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/_javascript_Core/jit/SlowPathCall.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2013 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2018 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
@@ -34,11 +34,12 @@
 
 class JITSlowPathCall {
 public:
-    JITSlowPathCall(JIT* jit, Instruction* pc, SlowPathFunction stub)
+    JITSlowPathCall(JIT* jit, Instruction* pc, SlowPathFunction slowPathFunction)
         : m_jit(jit)
-        , m_stub(stub)
+        , m_slowPathFunction(slowPathFunction)
         , m_pc(pc)
     {
+        assertIsCFunctionPtr(slowPathFunction);
     }
 
     JIT::Call call()
@@ -61,8 +62,9 @@
         m_jit->move(JIT::callFrameRegister, JIT::argumentGPR0);
         m_jit->move(JIT::TrustedImmPtr(m_pc), JIT::argumentGPR1);
 #endif
-        JIT::Call call = m_jit->call(NoPtrTag);
-        m_jit->m_calls.append(CallRecord(call, m_jit->m_bytecodeOffset, m_stub.value()));
+        PtrTag tag = ptrTag(SlowPathPtrTag, nextPtrTagID());
+        JIT::Call call = m_jit->call(tag);
+        m_jit->m_calls.append(CallRecord(call, m_jit->m_bytecodeOffset, FunctionPtr(m_slowPathFunction, tag)));
 
 #if CPU(X86) && USE(JSVALUE32_64)
         m_jit->addPtr(MacroAssembler::TrustedImm32(16), MacroAssembler::stackPointerRegister);
@@ -82,7 +84,7 @@
 
 private:
     JIT* m_jit;
-    FunctionPtr m_stub;
+    SlowPathFunction m_slowPathFunction;
     Instruction* m_pc;
 };
 

Modified: trunk/Source/_javascript_Core/jit/ThunkGenerators.cpp (229766 => 229767)


--- trunk/Source/_javascript_Core/jit/ThunkGenerators.cpp	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/_javascript_Core/jit/ThunkGenerators.cpp	2018-03-20 18:10:16 UTC (rev 229767)
@@ -415,8 +415,9 @@
 #endif
     jit.move(JSInterfaceJIT::callFrameRegister, JSInterfaceJIT::argumentGPR0);
 #endif
-    jit.move(JSInterfaceJIT::TrustedImmPtr(FunctionPtr(operationVMHandleException, NoPtrTag).value()), JSInterfaceJIT::regT3);
-    jit.call(JSInterfaceJIT::regT3, NoPtrTag);
+    PtrTag tag = ptrTag(ExceptionHandlerPtrTag, nextPtrTagID());
+    jit.move(JSInterfaceJIT::TrustedImmPtr(tagCFunctionPtr(operationVMHandleException, tag)), JSInterfaceJIT::regT3);
+    jit.call(JSInterfaceJIT::regT3, tag);
 #if CPU(X86) && USE(JSVALUE32_64)
     jit.addPtr(JSInterfaceJIT::TrustedImm32(8), JSInterfaceJIT::stackPointerRegister);
 #elif OS(WINDOWS)

Modified: trunk/Source/_javascript_Core/runtime/PtrTag.h (229766 => 229767)


--- trunk/Source/_javascript_Core/runtime/PtrTag.h	2018-03-20 17:53:59 UTC (rev 229766)
+++ trunk/Source/_javascript_Core/runtime/PtrTag.h	2018-03-20 18:10:16 UTC (rev 229767)
@@ -42,6 +42,8 @@
     CodeEntryWithArityCheckPtrTag,
     ExceptionHandlerPtrTag,
     JITCodePtrTag,
+    JITOperationPtrTag,
+    JITThunkPtrTag,
     NativeCodePtrTag,
     SlowPathPtrTag,
 
@@ -52,8 +54,10 @@
     YarrBacktrackPtrTag,
 };
 
+uintptr_t nextPtrTagID();
+
 #if !USE(POINTER_PROFILING)
-inline uintptr_t uniquePtrTagID() { return 0; }
+inline uintptr_t nextPtrTagID() { return 0; }
 
 template<typename... Arguments>
 inline constexpr PtrTag ptrTag(Arguments&&...) { return NoPtrTag; }
@@ -94,6 +98,16 @@
 template<typename PtrType, typename = std::enable_if_t<std::is_pointer<PtrType>::value>>
 inline PtrType untagCFunctionPtr(PtrType ptr, PtrTag) { return ptr; }
 
+template<typename PtrType> void assertIsCFunctionPtr(PtrType) { }
+template<typename PtrType> void assertIsNullOrCFunctionPtr(PtrType) { }
+
+template<typename PtrType> void assertIsNotTagged(PtrType) { }
+template<typename PtrType> void assertIsTagged(PtrType) { }
+template<typename PtrType> void assertIsNullOrTagged(PtrType) { }
+
+template<typename PtrType> void assertIsTaggedWith(PtrType, PtrTag) { }
+template<typename PtrType> void assertIsNullOrTaggedWith(PtrType, PtrTag) { }
+
 #endif // !USE(POINTER_PROFILING)
 
 } // namespace JSC