
trtllm-serve: command not found #2526

Open
Wonder-donbury opened this issue Dec 3, 2024 · 1 comment
Wonder-donbury commented Dec 3, 2024

(using nvcr.io/nvidia/tritonserver:24.10-trtllm-python-py3)

I've cloned your main branch into a folder and tried to run trtllm-serve, but it says:

root@llm-inference-ubuntu:/# trtllm-serve
bash: trtllm-serve: command not found

I think the trtllm-serve executable hasn't been added to the PATH yet.

@pulkitmehtaworkmetacube
I installed it with pip install --extra-index-url https://pypi.nvidia.com/ tensorrt-llm and I am able to use trtllm-serve. Please check.
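As a quick diagnostic after installing, you can check whether the trtllm-serve entry point actually landed on your PATH. This is a minimal sketch using only Python's standard library; the pip command in the hint is the one quoted above, and the check works for any CLI name:

```python
import shutil

def check_cli(name: str) -> str:
    """Return where a CLI tool resolves on PATH, or an install hint if missing."""
    path = shutil.which(name)
    if path is None:
        return (f"{name} not found on PATH; try: "
                "pip install --extra-index-url https://pypi.nvidia.com/ tensorrt-llm")
    return f"{name} found at {path}"

print(check_cli("trtllm-serve"))
```

If this reports the tool as missing even after a successful pip install, the scripts directory (e.g. ~/.local/bin for a user install) may simply not be on PATH in that shell.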
