commits
Messages by Thread
Re: [I] [Bug] [Relax] InternalError: Check failed: (*it).second == var [tvm]
via GitHub
Re: [I] [Bug] [Relax] InternalError: Check failed: (*it).second == var [tvm]
via GitHub
(tvm) branch main updated: [LLVM] Fix for getHostCPUFeatures API change (#17199)
tqchen
[PR] [LLVM] Fix for getHostCPUFeatures API change [tvm]
via GitHub
Re: [PR] [LLVM] Fix for getHostCPUFeatures API change [tvm]
via GitHub
Re: [PR] [LLVM] Fix for getHostCPUFeatures API change [tvm]
via GitHub
(tvm) branch nightly updated (6704175fc7 -> 1b6c00d756)
github-bot
[PR] [Relax] Disable fusion for fetching from the packed params in FuseOps [tvm]
via GitHub
Re: [PR] [Relax] Disable fusion for fetching from the packed params in FuseOps [tvm]
via GitHub
(tvm) branch main updated: [Disco] Implement SocketSession (#17182)
wuwei
(tvm) branch main updated: [Cython][FFI] Fix crash when call del operator for handle (#17190)
tqchen
[I] [Bug] onnx quantize op support MatMulIntegerToFloat QLinearConv [tvm]
via GitHub
(tvm) tag v0.17.0 created (now eeebcfa0ad)
ysh329
[I] [RESULT][VOTE] Release Apache TVM v0.17.0 [tvm]
via GitHub
Re: [I] [RESULT][VOTE] Release Apache TVM v0.17.0 [tvm]
via GitHub
(tvm) branch nightly updated (89b91e2b11 -> 6704175fc7)
github-bot
(tvm) branch main updated: Pass to eliminate redundant branch and overcompute (#17170)
sanirudh
[I] [Bug] Error opening FastRPC channel [tvm]
via GitHub
Re: [I] [Bug] Error opening FastRPC channel [tvm]
via GitHub
Re: [I] [Bug] Error opening FastRPC channel [tvm]
via GitHub
Re: [I] [Bug] Error opening FastRPC channel [tvm]
via GitHub
Re: [I] [Bug] Error opening FastRPC channel [tvm]
via GitHub
Re: [I] [Bug] Error opening FastRPC channel [tvm]
via GitHub
Re: [I] [Bug] Error opening FastRPC channel [tvm]
via GitHub
Re: [I] [Bug] Error opening FastRPC channel [tvm]
via GitHub
Re: [I] [Bug] Error opening FastRPC channel [tvm]
via GitHub
[I] [Bug] Conflicting code is generated in lib0.c causing conflicting types for tvmgen_default_run and tvmgen_default__tvm_main__ [tvm]
via GitHub
Re: [I] [Bug] Conflicting code is generated in lib0.c causing conflicting types for tvmgen_default_run and tvmgen_default__tvm_main__ [tvm]
via GitHub
(tvm) branch main updated: Add support for `torch.nn.functional.max_pool2d` (#17189)
tqchen
Re: [PR] Add support for `torch.nn.functional.max_pool2d` [tvm]
via GitHub
(tvm) branch main updated: [TIR][Analyzer] Simplify `x==x` expressions for all dtypes (#17158)
tqchen
(tvm) branch main updated: [Disco] Cross-group and p2p send/receive primitives (#17191)
tqchen
Re: [PR] [CLML][CI] Fix for a few CLML regression issues [tvm]
via GitHub
(tvm) branch main updated: [CLML][CI] Fix for a few CLML regression issues (#17117)
srk
(tvm) branch nightly updated (162d43a997 -> 89b91e2b11)
github-bot
Re: [I] [Bug] Building errors for hexagon_launcher [tvm]
via GitHub
Re: [I] [Bug] Building errors for hexagon_launcher [tvm]
via GitHub
Re: [I] [Bug] Building errors for hexagon_launcher [tvm]
via GitHub
Re: [I] [Bug] Building errors for hexagon_launcher [tvm]
via GitHub
Re: [I] [Bug] Building errors for hexagon_launcher [tvm]
via GitHub
Re: [I] [Bug] Building errors for hexagon_launcher [tvm]
via GitHub
(tvm) branch main updated: [KVCache] Partial layers support (#17192)
ruihangl
[PR] [KVCache] Partial layers support [tvm]
via GitHub
Re: [PR] [KVCache] Partial layers support [tvm]
via GitHub
(tvm) branch main updated: Remove and replace deprecated `distutils.util.strtobool()` (#17185)
tqchen
(tvm) branch main updated: [DLIGHT][GPU] Add OpenCL dequant matmul schedule (#17187)
tqchen
[PR] [Disco] Cross-group and p2p send/receive primitives [tvm]
via GitHub
Re: [PR] [Disco] Cross-group and p2p send/receive primitives [tvm]
via GitHub
[PR] Fix crash when call del operator for handle [tvm]
via GitHub
Re: [PR] [Cython][FFI] Fix crash when call del operator for handle [tvm]
via GitHub
Re: [PR] [Cython][FFI] Fix crash when call del operator for handle [tvm]
via GitHub
Re: [PR] [Cython][FFI] Fix crash when call del operator for handle [tvm]
via GitHub
Re: [PR] [Cython][FFI] Fix crash when call del operator for handle [tvm]
via GitHub
Re: [PR] [Cython][FFI] Fix crash when call del operator for handle [tvm]
via GitHub
Re: [PR] [Cython][FFI] Fix crash when call del operator for handle [tvm]
via GitHub
Re: [PR] [Cython][FFI] Fix crash when call del operator for handle [tvm]
via GitHub
Re: [PR] [Cython][FFI] Fix crash when call del operator for handle [tvm]
via GitHub
Re: [PR] [Cython][FFI] Fix crash when call del operator for handle [tvm]
via GitHub
Re: [PR] [Cython][FFI] Fix crash when call del operator for handle [tvm]
via GitHub
(tvm) branch main updated: [Disco] Group-wise operation (#17180)
ruihangl
(tvm) branch main updated: [MetaSchedule] Replace `xgboost.rabit` with `xgboost.collective` because it's deprecated (#17166)
tqchen
Re: [PR] [MetaSchedule] Replace `xgboost.rabit` with `xgboost.collective` because it's deprecated [tvm]
via GitHub
(tvm) branch nightly updated (3c7adfb1f7 -> 162d43a997)
github-bot
(tvm) branch main updated: [Relax][PyTorch] Add support for torch.einsum (#17186)
ruihangl
(tvm) branch main updated: Add `packaging` to `python/gen_requirements.py` (#17188)
tqchen
(tvm) branch main updated: [Hexagon] [CMake] Fix v66 build issue (#17169)
tqchen
(tvm) branch main updated: [FFI] Add python signal handler for ctypes FFI (#17181)
tqchen
Re: [PR] [FFI] Add python signal handler for ctypes FFI [tvm]
via GitHub
(tvm) branch main updated: [Relax][PyTorch] Add support for torch.permute (#17184)
wuwei
(tvm) branch main updated: [Relax] Integrate cuDNN attention (#17157)
wuwei
(tvm) branch main updated: [MetaSchedule]Add a testcase for padded conv2d in meta_schedule (#17171)
cbalint13
[PR] Add `packaging` to `python/gen_requirements.py` [tvm]
via GitHub
Re: [PR] Add `packaging` to `python/gen_requirements.py` [tvm]
via GitHub
Re: [PR] [DLIGHT][GPU] Add OpenCL dequant matmul schedule [tvm]
via GitHub
Re: [PR] [DLIGHT][GPU] Add OpenCL dequant matmul schedule [tvm]
via GitHub
[PR] [Relax][PyTorch] Add support for torch.einsum [tvm]
via GitHub
Re: [PR] [Relax][PyTorch] Add support for torch.einsum [tvm]
via GitHub
Re: [PR] [Relax][PyTorch] Add support for torch.einsum [tvm]
via GitHub
Re: [PR] [Relax][PyTorch] Add support for torch.einsum [tvm]
via GitHub
[PR] Remove and replace deprecated `distutils.util.strtobool()` [tvm]
via GitHub
Re: [PR] Remove and replace deprecated `distutils.util.strtobool()` [tvm]
via GitHub
(tvm) branch main updated: [Relay][FQ2I]: Use appropriate dtype while quantizing relay.op.nn.pad… (#17177)
sanirudh
[PR] [Relax][PyTorch] Add support for torch.permute [tvm]
via GitHub
Re: [PR] [Relax][PyTorch] Add support for torch.permute [tvm]
via GitHub
Re: [PR] [Relax][PyTorch] Add support for torch.permute [tvm]
via GitHub
Re: [PR] [Relax][PyTorch] Add support for torch.permute [tvm]
via GitHub
[I] [Bug] pytorch relax frontend failed to import models with torch.permute [tvm]
via GitHub
Re: [I] [Bug] pytorch relax frontend failed to import models with torch.permute [tvm]
via GitHub
[PR] [Disco] Implement SocketSession [tvm]
via GitHub
Re: [PR] [Disco] Implement SocketSession [tvm]
via GitHub
Re: [PR] [Disco] Implement SocketSession [tvm]
via GitHub
svn commit: r70452 - in /dev/tvm/tvm-v0.17.0-rc0: ./ apache-tvm-src-v0.17.0.rc0.tar.gz apache-tvm-src-v0.17.0.rc0.tar.gz.asc apache-tvm-src-v0.17.0.rc0.tar.gz.sha512
ysh329
svn commit: r70451 - /dev/tvm/tvm-v0.17.0-rc0/
ysh329
[PR] [Disco] Group allreduce and allgather [tvm]
via GitHub
Re: [PR] [Disco] Group-wise operation [tvm]
via GitHub
[I] [VOTE] Release Apache TVM v0.17.0.rc0 [tvm]
via GitHub
Re: [I] [VOTE] Release Apache TVM v0.17.0.rc0 [tvm]
via GitHub
Re: [I] [VOTE] Release Apache TVM v0.17.0.rc0 [tvm]
via GitHub
Re: [I] [VOTE] Release Apache TVM v0.17.0.rc0 [tvm]
via GitHub
Re: [I] [VOTE] Release Apache TVM v0.17.0.rc0 [tvm]
via GitHub
Re: [I] [VOTE] Release Apache TVM v0.17.0.rc0 [tvm]
via GitHub
Re: [I] [VOTE] Release Apache TVM v0.17.0.rc0 [tvm]
via GitHub
Re: [I] [VOTE] Release Apache TVM v0.17.0.rc0 [tvm]
via GitHub
Re: [I] [VOTE] Release Apache TVM v0.17.0.rc0 [tvm]
via GitHub
svn commit: r70439 - in /dev/tvm/tvm-v0.17.0-rc0: ./ apache-tvm-src-v0.17.0.rc0.tar.gz apache-tvm-src-v0.17.0.rc0.tar.gz.asc apache-tvm-src-v0.17.0.rc0.tar.gz.sha512
ysh329
[I] [Release] v0.17.0 Release Candidate Notes [tvm]
via GitHub
Re: [I] [Release] v0.17.0 Release Candidate Notes [tvm]
via GitHub
(tvm) branch nightly updated (070546eb4a -> 3c7adfb1f7)
github-bot
(tvm) branch main updated: Use `packaging.version.parse` instead of `distutils.version.LooseVersion` (#17173)
tqchen
[PR] [Relay][FQ2I]: Use appropriate dtype while quantizing relay.op.nn.pad… [tvm]
via GitHub
Re: [PR] [Relay][FQ2I]: Use appropriate dtype while quantizing relay.op.nn.pad… [tvm]
via GitHub
(tvm) branch nightly updated (73078f11dc -> 070546eb4a)
github-bot
(tvm) branch main updated: [TVMJS] Check DataType.NUMPY2STR when saving array (#17174)
lunderberg
[I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
Re: [I] [Relax][Bug] Cannot find PackedFunc tir_zeros [tvm]
via GitHub
[I] [Relax][Bug] CodeGenVM cannot handle this intrinsic tensor_to_shape [tvm]
via GitHub
Re: [I] [Relax][Bug] CodeGenVM cannot handle this intrinsic tensor_to_shape [tvm]
via GitHub
Re: [I] [Relax][Bug] CodeGenVM cannot handle this intrinsic tensor_to_shape [tvm]
via GitHub
Re: [I] [Relax][Bug] CodeGenVM cannot handle this intrinsic tensor_to_shape [tvm]
via GitHub
Re: [I] [Relax][Bug] CodeGenVM cannot handle this intrinsic tensor_to_shape [tvm]
via GitHub
Re: [I] [Relax][Bug] CodeGenVM cannot handle this intrinsic tensor_to_shape [tvm]
via GitHub
Re: [I] [Relax][Bug] CodeGenVM cannot handle this intrinsic tensor_to_shape [tvm]
via GitHub
Re: [I] [Relax][Bug] CodeGenVM cannot handle this intrinsic tensor_to_shape [tvm]
via GitHub
Re: [I] [Relax][Bug] CodeGenVM cannot handle this intrinsic tensor_to_shape [tvm]
via GitHub
Re: [I] [Relax][Bug] CodeGenVM cannot handle this intrinsic tensor_to_shape [tvm]
via GitHub
[PR] [TVMJS] Check DataType.NUMPY2STR when saving array [tvm]
via GitHub
Re: [PR] [TVMJS] Check DataType.NUMPY2STR when saving array [tvm]
via GitHub
(tvm) branch main updated: [Relax] [ONNX] Add support for Sign and Not (#17167)
tqchen
(tvm) branch main updated: [Meta Schedule][XGBoost] enable custom callback func test with xgboost>=1.6.0 (#17168)
syfeng
(tvm) branch main updated: [Relax][BugFix] Fix a bug about the IR construction in test file (#17121)
syfeng
Re: [PR] [Relax][BugFix] Fix a bug about the IR construction in test file [tvm]
via GitHub
Re: [PR] [Relax][BugFix] Fix a bug about the IR construction in test file [tvm]
via GitHub
Re: [PR] [Relax][BugFix] Fix a bug about the IR construction in test file [tvm]
via GitHub
[PR] Use `packaging.version.parse` instead of `distutils.version.LooseVersion` [tvm]
via GitHub
Re: [PR] Use `packaging.version.parse` instead of `distutils.version.LooseVersion` [tvm]
via GitHub
[PR] [MetaSchedule]Add a testcase for padded conv2d in meta_schedule [tvm]
via GitHub
Re: [PR] [MetaSchedule]Add a testcase for padded conv2d in meta_schedule [tvm]
via GitHub
Re: [PR] [MetaSchedule]Add a testcase for padded conv2d in meta_schedule [tvm]
via GitHub
Re: [PR] [MetaSchedule]Add a testcase for padded conv2d in meta_schedule [tvm]
via GitHub
[PR] Pass to eliminate redundant branch and overcompute [tvm]
via GitHub
Re: [PR] Pass to eliminate redundant branch and overcompute [tvm]
via GitHub
Re: [PR] Pass to eliminate redundant branch and overcompute [tvm]
via GitHub
Re: [PR] Pass to eliminate redundant branch and overcompute [tvm]
via GitHub
Re: [PR] [Hexagon] [CMake] Fix v66 build issue [tvm]
via GitHub
Re: [PR] [Hexagon] [CMake] Fix v66 build issue [tvm]
via GitHub
Re: [PR] [Hexagon] [CMake] Fix v66 build issue [tvm]
via GitHub
Re: [PR] [Hexagon] [CMake] Fix v66 build issue [tvm]
via GitHub
Re: [PR] [Hexagon] [CMake] Fix v66 build issue [tvm]
via GitHub
Re: [PR] [Hexagon] [CMake] Fix v66 build issue [tvm]
via GitHub
Re: [PR] [Hexagon] [CMake] Fix v66 build issue [tvm]
via GitHub
Re: [PR] [Hexagon] [CMake] Fix v66 build issue [tvm]
via GitHub
[PR] [Meta Schedule][XGBoost] enable callback func test with xgboost>=1.6.0 [tvm]
via GitHub
Re: [PR] [Meta Schedule][XGBoost] enable custom callback func test with xgboost>=1.6.0 [tvm]
via GitHub
[PR] [Relax] [ONNX] Add support for Sign and Not [tvm]
via GitHub
Re: [PR] [Relax] [ONNX] Add support for Sign and Not [tvm]
via GitHub
Re: [PR] [Relax] [ONNX] Add support for Sign and Not [tvm]
via GitHub
Re: [PR] [CI][AArch64] Enable ONNX and PyTorch tests on AArch64 [tvm]
via GitHub
Re: [PR] [docs] Add tvm.driver.tvmc module to Python documentation [tvm]
via GitHub
(tvm) branch nightly updated (b654852b15 -> 73078f11dc)
github-bot
(tvm) branch main updated (51d7c5e47a -> 73078f11dc)
wuwei
(tvm) branch main updated: [Hexagon] Support RPC execution of existing shared lib (#17162)
cbalint13
(tvm) branch main updated: [Relax] Fix fuseOps via pattern (#17160)
tqchen
Re: [PR] [Relax] Fix fuseOps via pattern [tvm]
via GitHub
[I] [Bug] [Relax] [TIR] FuseOps and FuseTIR cannot fuse no-op reshape into other `PrimFunc`s, nor can they eliminate that [tvm]
via GitHub
[I] [Bug] tvm/src/contrib/torch/tvm_module_wrapper/RuntimeModuleWrapperTVM.cc:32:10: fatal error: ../../support/base64.h: No such file or directory [tvm]
via GitHub
Re: [I] [Bug] tvm/src/contrib/torch/tvm_module_wrapper/RuntimeModuleWrapperTVM.cc:32:10: fatal error: ../../support/base64.h: No such file or directory [tvm]
via GitHub
Re: [I] [Bug] tvm/src/contrib/torch/tvm_module_wrapper/RuntimeModuleWrapperTVM.cc:32:10: fatal error: ../../support/base64.h: No such file or directory [tvm]
via GitHub
Re: [I] [Bug] tvm/src/contrib/torch/tvm_module_wrapper/RuntimeModuleWrapperTVM.cc:32:10: fatal error: ../../support/base64.h: No such file or directory [tvm]
via GitHub
[I] [Bug] Building for Qualcomm Hexagon DSP V66 architecture [tvm]
via GitHub
Re: [I] [Bug] Building for Qualcomm Hexagon DSP V66 architecture [tvm]
via GitHub
Re: [I] [Bug] Building for Qualcomm Hexagon DSP V66 architecture [tvm]
via GitHub
Re: [I] [Bug] Building for Qualcomm Hexagon DSP V66 architecture [tvm]
via GitHub
Re: [I] [Bug] Building for Qualcomm Hexagon DSP V66 architecture [tvm]
via GitHub
Re: [PR] [Hexagon] Support RPC execution of existing shared lib [tvm]
via GitHub
Re: [PR] [Hexagon] Support RPC execution of existing shared lib [tvm]
via GitHub
Re: [PR] [Hexagon] remove #if defined(__hexagon__) where it is no longer needed [tvm]
via GitHub
[PR] Fix Inlining of Non-Output Consumers in TileWithTensorIntrin with Padding [tvm]
via GitHub
Re: [PR] [TIR]Fix Inlining of Non-Output Consumers in TileWithTensorIntrin with Padding [tvm]
via GitHub
Re: [PR] [TIR]Fix Inlining of Non-Output Consumers in TileWithTensorIntrin with Padding [tvm]
via GitHub
Re: [PR] [TIR]Fix Inlining of Non-Output Consumers in TileWithTensorIntrin with Padding [tvm]
via GitHub
Re: [PR] [TIR]Fix Inlining of Non-Output Consumers in TileWithTensorIntrin with Padding [tvm]
via GitHub
Re: [PR] [TIR]Fix Inlining of Non-Output Consumers in TileWithTensorIntrin with Padding [tvm]
via GitHub
Re: [PR] [TIR]Fix Inlining of Non-Output Consumers in TileWithTensorIntrin with Padding [tvm]
via GitHub
Re: [PR] [TIR]Fix Inlining of Non-Output Consumers in TileWithTensorIntrin with Padding [tvm]
via GitHub
Re: [PR] [TIR]Fix Inlining of Non-Output Consumers in TileWithTensorIntrin with Padding [tvm]
via GitHub
Re: [PR] [TIR]Fix Inlining of Non-Output Consumers in TileWithTensorIntrin with Padding [tvm]
via GitHub
(tvm) tag v0.18.dev0 created (now 9a9386de08)
ysh329
(tvm) tag v0.17.0.rc0 created (now eeebcfa0ad)
ysh329