Revert package changes
MasterJH5574 committed Feb 12, 2025
1 parent 1ec0c6e commit 205f5f6
Showing 5 changed files with 8 additions and 20 deletions.
4 changes: 2 additions & 2 deletions ci/jenkinsfile.groovy
@@ -18,8 +18,8 @@
import org.jenkinsci.plugins.pipeline.modeldefinition.Utils

run_cpu = "bash ci/bash.sh mlcaidev/ci-cpu:4d61e5d -e GPU cpu -e MLC_CI_SETUP_DEPS 1"
-run_cuda = "bash ci/bash.sh mlcaidev/ci-cu128:4d61e5d -e GPU cuda-12.8 -e MLC_CI_SETUP_DEPS 1"
-run_rocm = "bash ci/bash.sh mlcaidev/ci-rocm63:4d61e5d -e GPU rocm-6.3 -e MLC_CI_SETUP_DEPS 1"
+run_cuda = "bash ci/bash.sh mlcaidev/ci-cu121:4d61e5d -e GPU cuda-12.1 -e MLC_CI_SETUP_DEPS 1"
+run_rocm = "bash ci/bash.sh mlcaidev/ci-rocm57:4d61e5d -e GPU rocm-5.7 -e MLC_CI_SETUP_DEPS 1"

pkg_cpu = "bash ci/bash.sh mlcaidev/package-rocm61:5b6f876 -e GPU cpu -e MLC_CI_SETUP_DEPS 1"
pkg_cuda = "bash ci/bash.sh mlcaidev/package-cu128:5b6f876 -e GPU cuda-12.8 -e MLC_CI_SETUP_DEPS 1"
4 changes: 2 additions & 2 deletions ci/task/test_model_compile.sh
@@ -9,11 +9,11 @@ pip install --force-reinstall wheels/*.whl

if [[ ${GPU} == cuda* ]]; then
TARGET=cuda
-pip install --pre -U --no-index -f https://mlc.ai/wheels mlc-ai-nightly-cu128
+pip install --pre -U --no-index -f https://mlc.ai/wheels mlc-ai-nightly-cu123
export LD_LIBRARY_PATH=/usr/local/cuda/compat/:$LD_LIBRARY_PATH
elif [[ ${GPU} == rocm* ]]; then
TARGET=rocm
-pip install --pre -U --no-index -f https://mlc.ai/wheels mlc-ai-nightly-rocm63
+pip install --pre -U --no-index -f https://mlc.ai/wheels mlc-ai-nightly-rocm57
elif [[ ${GPU} == metal ]]; then
TARGET=metal
pip install --pre -U --force-reinstall -f https://mlc.ai/wheels mlc-ai-nightly-cpu
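
For context, a hedged sketch of how this task script would typically be exercised with the container commands above — the GPU and MLC_CI_SETUP_DEPS values are the ones the jenkinsfile passes via -e, and the script branches on the GPU prefix; the exact CI invocation may differ:

# Assumed invocation, not taken verbatim from the CI config.
# GPU selects the cuda*/rocm*/metal branch above; MLC_CI_SETUP_DEPS is set
# because the jenkinsfile passes it alongside the GPU flag.
GPU=cuda-12.1 MLC_CI_SETUP_DEPS=1 bash ci/task/test_model_compile.sh
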
2 changes: 1 addition & 1 deletion ci/task/test_unittest.sh
@@ -8,7 +8,7 @@ if [[ -n ${MLC_CI_SETUP_DEPS:-} ]]; then
# Install dependency
pip install --force-reinstall wheels/*.whl
pip install --quiet pytest
-pip install --pre -U --no-index -f https://mlc.ai/wheels mlc-ai-nightly-cu128
+pip install --pre -U --no-index -f https://mlc.ai/wheels mlc-ai-nightly-cu123
export LD_LIBRARY_PATH=/usr/local/cuda/compat/:$LD_LIBRARY_PATH
fi

4 changes: 3 additions & 1 deletion cmake/gen_cmake_config.py
@@ -88,7 +88,9 @@
cmake_config_str += f"set(CMAKE_CUDA_ARCHITECTURES {user_input})\n"
break
else:
-print(f"Invalid input: {user_input}. FlashInfer requires 80, 86, 89, 90, 100 or 120")
+print(
+    f"Invalid input: {user_input}. FlashInfer requires 80, 86, 89, 90, 100 or 120"
+)

print("\nWriting the following configuration to config.cmake...")
print(cmake_config_str)
14 changes: 0 additions & 14 deletions docs/install/tvm.rst
@@ -53,13 +53,6 @@ A nightly prebuilt Python package of Apache TVM Unity is provided.
conda activate your-environment
python -m pip install --pre -U -f https://mlc.ai/wheels mlc-ai-nightly-cu123
-.. tab:: CUDA 12.8
-
-.. code-block:: bash
-conda activate your-environment
-python -m pip install --pre -U -f https://mlc.ai/wheels mlc-ai-nightly-cu128
.. tab:: ROCm 6.1

.. code-block:: bash
@@ -74,13 +67,6 @@ A nightly prebuilt Python package of Apache TVM Unity is provided.
conda activate your-environment
python -m pip install --pre -U -f https://mlc.ai/wheels mlc-ai-nightly-rocm62
-.. tab:: ROCm 6.3
-
-.. code-block:: bash
-conda activate your-environment
-python -m pip install --pre -U -f https://mlc.ai/wheels mlc-ai-nightly-rocm63
.. tab:: Vulkan

Supported in all Linux packages.
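
After installing one of these nightly wheels, a quick sanity check (a minimal sketch, not necessarily the exact validation the install docs recommend) is to confirm which TVM build Python picks up and whether the chosen GPU backend is visible:

# Hypothetical post-install check; tvm.cuda().exist reports whether a CUDA device is usable.
python -c "import tvm; print(tvm.__file__)"
python -c "import tvm; print(tvm.cuda().exist)"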
