tqchen commented on code in PR #97:
URL: https://github.com/apache/tvm-rfcs/pull/97#discussion_r1064788523


##########
rfcs/0097-unify-packed-and-object.md:
##########
@@ -0,0 +1,677 @@
+Authors: @cloud-mxd, @junrushao,  @tqchen
+
+- Feature Name: Further Unify Packed and Object in TVM Runtime
+- Start Date: 2023-01-08
+- RFC PR: [apache/tvm-rfcs#0097](https://github.com/apache/tvm-rfcs/pull/97)
+- GitHub Issue: [apache/tvm#0000](https://github.com/apache/tvm/issues/0000)
+
+## Summary
+
+This RFC proposes to further unify PackedFunc and Object in the TVM Runtime. The key improvements include: unifying `type_code`, solidifying AnyValue support for both stack and object values, opening the door to small-string support and NLP preprocessing, and enabling a universal container.
+
+## Motivation
+
+FFI is one of the main components of TVM. We use the PackedFunc convention to safely type-erase values and pass them around. To support a general set of data structures for compilation purposes, we also have the Object system, which the PackedFunc API is aware of.
+
+Object supports reference counting, dynamic type casting and checking, as well as structural equality/hashing/serialization in the compiler.
+Right now most of the things of interest are Objects, including containers like Map and Array, PackedFunc itself, Module, and various IR objects.
+Object requires heap allocation and reference counting, which can be optimized through pooling. Objects are suitable for most deep learning runtime needs,
+such as containers, as long as allocations are infrequent.
+In the meantime, we still need to operate with values on the stack, specifically when we pass around int and float values.
+It can be wasteful to invoke heap allocation, or even pooling, when the operation is meant to be low cost. As a result, the FFI mechanism also provides ways to pass **stack values** directly around without object allocation.
+
+This post summarizes lessons from our own and other related projects, along with needs around the overall TVM FFI and Object system, and seeks to use these lessons to further solidify the current system. We summarize some of the needs and observations as follows:
+
+### N0: First class stack small string and AnyValue
+
+**Lesson from matxscript:** Data preprocessing is an important part of the ML pipeline. Pre-processing in NLP involves strings and containers. Additionally, when translating programs written by users (in Python), there may not be sufficient type annotations. We commonly get one of the programs below:
+
+```cpp
+// This can be part of data processing code translated 
+// from user that comes without type annotation
+AnyValue unicode_split_any(const AnyValue& word) {
+  List ret;
+  for (size_t i = 0; i < word.size(); ++i) {
+     AnyValue res = word[i];
+     ret.push_back(res);   
+  }
+  return ret;
+}
+// This is a better-typed execution code.
+// Note that word[i] returns a UCS4String container to match Python semantics.
+// UCS4String stores each unicode code point in a fixed-length 4-byte value to
+// ease random access to the elements.
+List unicode_split(const UCS4String& word) {
+  List ret;
+  for (size_t i = 0; i < word.size(); ++i) {
+     UCS4String res = word[i];
+     ret.push_back(res);   
+  }
+  return ret;
+}
+```
+
+- Need a base AnyValue to support both stack values and objects.
+    - This is to provide a safety net for translation.
+- The AnyValue needs to accommodate small strings (on the stack) to enable fast string processing. Specifically, note that this particular example creates a `UCS4String res` for every character of the word. If we run a heap allocation for each invocation, or even do reference counting, this can become expensive.
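The small-string concern above can be illustrated with a minimal sketch. All names here (`SmallStrValue`, the 15-byte inline capacity) are illustrative assumptions, not the RFC's actual design: strings short enough to fit in the value's own storage never touch the heap.

```cpp
#include <cassert>
#include <cstring>
#include <string>

// Hypothetical sketch of small-string storage: short strings live inline in
// the value itself; only longer strings fall back to a heap allocation.
class SmallStrValue {
 public:
  static constexpr size_t kInlineCap = 15;  // illustrative inline capacity

  explicit SmallStrValue(const std::string& s) {
    if (s.size() <= kInlineCap) {
      std::memcpy(inline_buf_, s.data(), s.size());
      inline_len_ = static_cast<unsigned char>(s.size());
    } else {
      heap_ptr_ = new std::string(s);  // fallback: heap allocation
    }
  }
  ~SmallStrValue() { delete heap_ptr_; }
  SmallStrValue(const SmallStrValue&) = delete;
  SmallStrValue& operator=(const SmallStrValue&) = delete;

  bool IsInline() const { return heap_ptr_ == nullptr; }
  std::string ToString() const {
    return heap_ptr_ ? *heap_ptr_ : std::string(inline_buf_, inline_len_);
  }

 private:
  char inline_buf_[kInlineCap] = {};
  unsigned char inline_len_ = 0;
  std::string* heap_ptr_ = nullptr;
};
```

With this layout, each `UCS4String res` in the `unicode_split_any` loop would stay a pure stack value, with no allocation or reference counting per character.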
+
+While it is possible to rewrite the program with stronger typing and get more efficient code, it is important to acknowledge the need for efficient type-erased runtime support (with minimal overhead), especially given that many ML users come from Python.
+
+### N1: Universal Container
+
+In the above example it is important to note that the container `List` should hold any value. While it is possible to also provide different variants of specialized containers (such as `vector<int>`), to interact with a language like Python it would be nice to have a single universal container across the codebase. We have experienced similar issues in our compilation stack. As an example, while it is possible to use Array to hold IR nodes such as Expr, we cannot use it to hold POD int values, or other POD data types such as DLDataType.
+
+Having an efficient universal container helps to simplify conversions across languages as well. For example, a list from Python can be turned into a single container without worrying about the content type. The execution runtime will also be able to directly leverage the universal container to support all possible cases that a developer might write.
+
+### N2: Further Unify POD Value, Object and AnyValue
+
+TVM currently does have an AnyValue: `TVMRetValue`, which is used to hold the managed result of a C++ PackedFunc return, can serve as an any-value. Additionally, if the value is an object, `ObjectRef` serves as a nice wrapper that comes with various mechanisms, including structural equality and hashing.
+If we create a boxed Object for each stack value, e.g. Integer to represent int, we will be able to effectively represent every value as an Object as well.
+Both TVMRetValue and Object leverage a code field at the beginning of the data structure to identify the type. TVMRetValue's codes are statically assigned; Object's codes contain a statically assigned segment for runtime objects and a dynamically assigned segment (indexed by type_key) for other objects.
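A minimal sketch of the boxing idea described above (field names and the specific type index are assumptions for illustration, not TVM's actual layout): both the boxed object and the runtime identify the payload through the leading code field.

```cpp
#include <cassert>
#include <cstdint>

// Illustrative object header: the leading code field identifies the type,
// mirroring how both TVMRetValue and Object begin with a type code.
struct ObjectHeader {
  uint32_t type_index;   // statically/dynamically assigned type code
  uint32_t ref_counter;  // reference count for managed objects
};

constexpr uint32_t kBoxedIntTypeIndex = 1;  // assumed static assignment

// A boxed int: a stack value wrapped into an Object-like layout.
struct BoxedInt {
  ObjectHeader header{kBoxedIntTypeIndex, 1};
  int64_t value;
};

// Unboxing: check the leading code field, then read the payload.
inline int64_t UnboxInt(const ObjectHeader* obj) {
  assert(obj->type_index == kBoxedIntTypeIndex);
  return reinterpret_cast<const BoxedInt*>(obj)->value;
}
```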
+
+There are two interesting regimes of operation:
+
+- R0: On one hand, if we do not need frequent stack value operations, it is desirable to simply use Object, because an object is more compact in a register (an 8-byte pointer value) and we can easily obtain the underlying container pointer for weak references.
+    
+    ```cpp
+    void ObjectOperation(ObjectRef obj) {
+      if (const IntImmNode* int_ptr = obj.as<IntImmNode>()) {
+        LOG(INFO) << int_ptr->value;
+      }
+    }
+    ```
+    
+- R1: On the other hand, when we operate on frequent processing that is also not well-typed (as in the `unicode_split_any` example), it is important to also support an AnyValue that comes with stack value support.
+
+As a point of reference, Python uses objects as the base for everything, but that indeed creates overhead for str and int (which we seek to eliminate). Java and C# support both stack values and their object counterparts, via a process called [boxing](https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/types/boxing-and-unboxing) that enables most runtime containers to store values as objects.
+
+Right now we have both mechanisms. It would be **desirable to further unify Object and AnyValue** to support both R0 and R1. Additionally, it would be nice to have automatic conversions if we decide that both mechanisms are supported: say a caller passes in a boxed int value, the callee should be able to easily get the int out of it (or treat it as an int) without having to do explicit casting. That way the same routine can be implemented via either R0 or R1, transparently to the caller.
+
+- This is also important for compilers and runtimes, as different compilers and runtimes might have their own considerations operating under R0/R1.
+
+## Guide-level explanation and Design Goals
+
+We have the following design goals:
+
+- G0: Automatic switching between object-focused scenarios and stack-mixed scenarios that require AnyValue.
+- G1: Enable efficient string processing, specifically small-string support 
for NLP use-cases.
+- G2: Enable efficient universal container (e.g. Array that stores everything).

Review Comment:
   The intention is to have both Array and List follow the same container, but still enable specialization.


