gemini-code-assist[bot] commented on code in PR #429:
URL: https://github.com/apache/tvm-ffi/pull/429#discussion_r2771441186
##########
docs/packaging/cpp_tooling.rst:
##########
@@ -190,19 +249,14 @@ The following settings help CMake Tools integrate with TVM-FFI and generate the
}
}
-
.. important::
Make sure ``Python_EXECUTABLE`` and ``tvm_ffi_ROOT`` match the virtual
environment you intend to use.
-
-clangd
-------
-
-Create a ``.clangd`` file at your project root and point it to the CMake
-compilation database. The snippet below also removes NVCC flags that clangd
-does not understand:
+**clangd.** Create a ``.clangd`` file at the project root pointing to the CMake
+compilation database. The snippet below also strips NVCC flags that clangd
+does not recognize:
Review Comment:

For better document structure and consistency, consider using a level 3
subheading for 'clangd' instead of bolded text.
```suggestion
clangd
~~~~~~

Create a ``.clangd`` file at the project root pointing to the CMake
compilation database. The snippet below also strips NVCC flags that clangd
does not recognize:
```
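For reference, a `.clangd` along these lines would match the description (a sketch only; the `Remove` list is illustrative and should contain whichever NVCC flags clangd rejects in your build, not necessarily these):

```yaml
# Sketch of a project-root .clangd (hypothetical flag list).
CompileFlags:
  # Point clangd at the directory holding compile_commands.json.
  CompilationDatabase: build/
  # Strip NVCC-only flags that clangd does not understand.
  Remove:
    - -forward-unknown-to-host-compiler
    - --generate-code=*
    - -rdc=true
```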
##########
docs/packaging/cpp_tooling.rst:
##########
@@ -135,47 +137,104 @@ On non-Apple platforms, this is currently a no-op.
:DESTINATION: Install destination directory relative to
``CMAKE_INSTALL_PREFIX``.
-CMake Example
-~~~~~~~~~~~~~
+Set ``tvm_ffi_ROOT``
+""""""""""""""""""""
-.. code-block:: cmake
+If ``find_package(tvm_ffi CONFIG REQUIRED)`` fails because CMake cannot locate
+the package, pass ``tvm_ffi_ROOT`` explicitly:
- find_package(tvm_ffi CONFIG REQUIRED) # requires tvm_ffi_ROOT
- tvm_ffi_configure_target(my-shared-lib) # configure TVM-FFI linkage
- install(TARGETS my-shared-lib DESTINATION .)
- tvm_ffi_install(my-shared-lib DESTINATION .) # install extra artifacts
+.. code-block:: bash
+
+   cmake -S . -B build \
+     -Dtvm_ffi_ROOT="$(tvm-ffi-config --cmakedir)"
-Set ``tvm_ffi_ROOT``
-~~~~~~~~~~~~~~~~~~~~
+.. note::
+
+   When packaging Python wheels with scikit-build-core, ``tvm_ffi_ROOT`` is
+   discovered automatically from the active Python environment.
-For a pure C++ build, CMake may fail when it reaches
-.. code-block:: cmake
+GCC/NVCC
+~~~~~~~~
+
+For quick prototyping or CI scripts without CMake, invoke ``g++`` or ``nvcc``
+directly with flags from ``tvm-ffi-config``.
+The examples below are from the :doc:`Quick Start <../get_started/quickstart>` tutorial:
- find_package(tvm_ffi CONFIG REQUIRED)
+.. tabs::
-if it cannot locate the TVM-FFI package. In that case, set
-``tvm_ffi_ROOT`` to the TVM-FFI CMake package directory.
+   .. group-tab:: C++
+
+      .. literalinclude:: ../../examples/quickstart/raw_compile.sh
+         :language: bash
+         :start-after: [cpp_compile.begin]
+         :end-before: [cpp_compile.end]
+
+   .. group-tab:: CUDA
+
+      .. literalinclude:: ../../examples/quickstart/raw_compile.sh
+         :language: bash
+         :start-after: [cuda_compile.begin]
+         :end-before: [cuda_compile.end]
+
+The three ``tvm-ffi-config`` flags provide:
+
+:``--cxxflags``: Include paths and compile definitions (``-I...``, ``-D...``)
+:``--ldflags``: Library search paths (``-L...``, ``-Wl,-rpath,...``)
+:``--libs``: Libraries to link (``-ltvm_ffi``)
+
+**RPATH handling.** The resulting shared library links against ``libtvm_ffi.so``,
+so the dynamic linker must be able to find it at load time:
+
+- **Python distribution.** ``import tvm_ffi`` preloads ``libtvm_ffi.so`` into the
+  process before any user library is loaded, so the RPATH requirement is already
+  satisfied without additional linker flags.
+- **Pure C++ distribution.** You must ensure ``libtvm_ffi.so`` is on the library
+  search path. Either set ``-Wl,-rpath,$(tvm-ffi-config --libdir)`` at link time,
+  or place ``libtvm_ffi.so`` alongside your binary.
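For the pure C++ case, the link step amounts to embedding the library directory as an RPATH. A minimal sketch, with a hypothetical path: ``LIBDIR`` stands in for ``$(tvm-ffi-config --libdir)``, which requires an installed ``apache-tvm-ffi``:

```shell
# Sketch: assemble the linker flags for a pure C++ distribution.
# LIBDIR stands in for "$(tvm-ffi-config --libdir)".
LIBDIR=/opt/tvm-ffi/lib
LINK_FLAGS="-L$LIBDIR -ltvm_ffi -Wl,-rpath,$LIBDIR"
echo "$LINK_FLAGS"
# Then link with, e.g.:
#   g++ my_lib.o -shared -o libmylib.so $LINK_FLAGS
```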
+
+
+Library Distribution
+--------------------
+
+When distributing pre-built shared libraries on Linux, glibc symbol versioning
+can cause load-time failures on systems with a different glibc version.
+The standard solution is the `manylinux <https://github.com/pypa/manylinux>`_
+approach: **build on old glibc, run on new**.
+
+**Build environment.** Use a manylinux Docker image:
+
-   cmake -S . -B build \
-     -Dtvm_ffi_ROOT="$(tvm-ffi-config --cmakedir)"
+.. code-block:: bash
+
+   docker pull quay.io/pypa/manylinux2014_x86_64
+Build host and device code inside the container. For CUDA:
+
-.. note::
+.. code-block:: bash
+
-   When packaging Python wheels with scikit-build-core, ``tvm_ffi_ROOT`` is
-   discovered automatically from the active Python environment, so you usually
-   do not need to set it explicitly.
+   nvcc -shared -Xcompiler -fPIC your_kernel.cu -o kernel.so \
+     $(tvm-ffi-config --cxxflags) \
+     $(tvm-ffi-config --ldflags) \
+     $(tvm-ffi-config --libs)
+
+**Verify glibc requirements.** Inspect the minimum glibc version your binary requires:
+
+.. code-block:: bash
+
+   objdump -T your_kernel.so | grep GLIBC_
+
+The ``apache-tvm-ffi`` wheel is already manylinux-compatible, so linking against
+it inside a manylinux build environment produces portable binaries.
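The highest versioned symbol reported is the effective glibc requirement. A sketch of extracting it, with canned lines standing in for real ``objdump -T your_kernel.so`` output:

```shell
# Sketch: find the maximum glibc version a binary requires.
# `sample` stands in for real `objdump -T your_kernel.so` output.
sample='00000000 DF *UND* GLIBC_2.17 memcpy
00000000 DF *UND* GLIBC_2.28 fcntl64
00000000 DF *UND* GLIBC_2.14 memmove'
max_glibc="$(echo "$sample" | grep -o 'GLIBC_[0-9.]*' | sort -Vu | tail -n1)"
echo "$max_glibc"
```

The binary then loads on any system whose glibc is at least that version.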
-VSCode/Cursor
--------------
+Editor Setup
+------------
-The following settings help CMake Tools integrate with TVM-FFI and generate the
-``compile_commands.json`` used by clangd:
+The following configuration enables code completion and diagnostics in
+VSCode, Cursor, or any editor backed by clangd.
+
+**CMake Tools (VSCode/Cursor).** Add these workspace settings so CMake Tools
+can locate TVM-FFI and generate ``compile_commands.json``:
Review Comment:

For better document structure and consistency with other sections, consider
using a level 3 subheading for 'CMake Tools (VSCode/Cursor)' instead of bolded
text.
```suggestion
CMake Tools (VSCode/Cursor)
~~~~~~~~~~~~~~~~~~~~~~~~~~~

Add these workspace settings so CMake Tools
can locate TVM-FFI and generate ``compile_commands.json``:
```
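For reference, the workspace settings the hunk describes look roughly like this (a sketch; the paths are illustrative and must be written out concretely, since VSCode settings do not expand shell substitutions like `$(tvm-ffi-config --cmakedir)`):

```json
{
  "cmake.configureSettings": {
    "Python_EXECUTABLE": "/path/to/venv/bin/python",
    "tvm_ffi_ROOT": "/path/to/venv/lib/python3.11/site-packages/tvm_ffi/cmake",
    "CMAKE_EXPORT_COMPILE_COMMANDS": true
  }
}
```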
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]