Re: [PR] [CI] Revert CUDA, PyTorch and ONNX upgrade [tvm]

2026-02-16 Thread via GitHub


mshr-h commented on PR #18787:
URL: https://github.com/apache/tvm/pull/18787#issuecomment-3912285748

   Closing, as skipping the tests works.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


-
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]





mshr-h closed pull request #18787: [CI] Revert CUDA, PyTorch and ONNX upgrade
URL: https://github.com/apache/tvm/pull/18787







tqchen commented on code in PR #18787:
URL: https://github.com/apache/tvm/pull/18787#discussion_r2813735610


##
docker/install/ubuntu_install_onnx.sh:
##
@@ -20,25 +20,25 @@ set -e
 set -u
 set -o pipefail
 
-# Get the Python version
-PYTHON_VERSION=$(python3 -c "import sys; print(f'{sys.version_info.major}.{sys.version_info.minor}')")
-
 # Set default value for first argument
 DEVICE=${1:-cpu}
 
 # Install the onnx package
 pip3 install \
-onnx==1.20.1 \
-onnxruntime==1.23.2 \
-onnxoptimizer==0.4.2
+future \

Review Comment:
   Let us wait and see if skipping works.
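As an aside, the exact pins touched by this diff (onnx==1.20.1, onnxruntime==1.23.2, onnxoptimizer==0.4.2) can be sanity-checked inside a built image. A minimal sketch; the `check_pins` helper is hypothetical and not part of the TVM install scripts:

```python
from importlib.metadata import PackageNotFoundError, version

# Pins mirror the versions removed in the diff above.
PINS = {"onnx": "1.20.1", "onnxruntime": "1.23.2", "onnxoptimizer": "0.4.2"}

def check_pins(pins):
    """Return {package: installed_version_or_None} for every mismatch."""
    mismatches = {}
    for pkg, want in pins.items():
        try:
            got = version(pkg)
        except PackageNotFoundError:
            got = None  # package not installed at all
        if got != want:
            mismatches[pkg] = got
    return mismatches
```

Running `check_pins(PINS)` after `docker build` would surface any drift between the install script and the image.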








tqchen commented on PR #18787:
URL: https://github.com/apache/tvm/pull/18787#issuecomment-3909830648

   Yes, I think it is OK to skip the OpenCL tests for now.







mshr-h commented on PR #18787:
URL: https://github.com/apache/tvm/pull/18787#issuecomment-3909777844

   I'm trying to skip all the OpenCL tests to see if CI passes:
https://ci.tlcpack.ai/blue/organizations/jenkins/tvm-gpu/detail/PR-18775/36/pipeline







mshr-h commented on PR #18787:
URL: https://github.com/apache/tvm/pull/18787#issuecomment-3909742895

   All of the OpenCL tests are failing. @tqchen 







tqchen commented on PR #18787:
URL: https://github.com/apache/tvm/pull/18787#issuecomment-3909711315

   Do we know which test was failing? I feel it is important for the CI to 
stay up to date, so in the case of OpenCL, perhaps we can temporarily skip some of the 
tests?
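Temporarily skipping a test suite is usually done with a conditional skip marker. TVM's CI uses pytest for this, but the same idea can be sketched self-contained with the standard library's unittest; the `OPENCL_BROKEN` flag is a hypothetical stand-in for a real runtime check:

```python
import unittest

# Hypothetical flag; real CI would derive this from an OpenCL runtime probe.
OPENCL_BROKEN = True

class TestOpenCL(unittest.TestCase):
    @unittest.skipIf(OPENCL_BROKEN, "OpenCL temporarily disabled, see PR discussion")
    def test_kernel(self):
        self.fail("would only run on a working OpenCL runtime")

# Run the suite: the test is reported as skipped, not failed.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestOpenCL)
result = unittest.TestResult()
suite.run(result)
```

Because the condition lives in one place, re-enabling the tests later is a one-line change.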

