Failed TensorRT-LLM Benchmark #2694
Comments
@FrankD412, could you please take a look at this?
@maulikmadhavi -- I'm taking a look to see if I can reproduce this.
@maulikmadhavi -- I think the issue here is that you're mounting your code repository here:
The container build with the
I don't typically use the Docker instances, but I have seen the same issue. I simply update my LD_LIBRARY_PATH.
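For reference, a dockerless workaround along those lines might look like the sketch below. The library directory is an assumption (a typical pip user install); adjust it to wherever the tensorrt_llm shared libraries actually live on your machine.

```shell
# Hypothetical location of the TensorRT-LLM shared libraries -- adjust
# to your actual install (e.g. the site-packages of your virtualenv).
TRT_LLM_LIB_DIR="$HOME/.local/lib/python3.10/site-packages/tensorrt_llm/libs"

# Prepend it to LD_LIBRARY_PATH, preserving any existing value.
export LD_LIBRARY_PATH="$TRT_LLM_LIB_DIR${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```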
@FrankD412 Thanks for your test.
@sbaby171 thanks for sharing your env path settings. I agree that it is often good to have a dockerless setup.
Hi @FrankD412, the purpose of using the mounted dir is to save TRT engines built with different parameters. It works with
I believe some conflicts are caused when I use the local repo. Thanks
Glad you found a solution. Normally, I do something similar. You could also map another directory for
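A hedged sketch of that layout: keep the container's baked-in /app (the built TensorRT-LLM install) intact and mount a host directory only for the generated engines. The image ID is the one from the report; the host path and the /engines mount point are illustrative, not from the original thread.

```shell
# Mount a dedicated host directory for TRT engines instead of
# overwriting the container's /app with the local repo checkout.
mkdir -p "$HOME/trt_engines"
docker run --rm --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 \
  --gpus=device=1 -it -p 8000:8000 \
  -v "$HOME/trt_engines":/engines \
  89fg611dcfd
```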
System Info
Who can help?
Information
Tasks
examples folder (such as GLUE/SQuAD, ...)
Reproduction
Follow the steps as per the performance benchmarking link.
Expected behavior
Running docker using make:
=> make -C docker release_run
actual behavior
Running docker using docker directly:
=> docker run --rm --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 --gpus=device=1 -it -p 8000:8000 -v <path-to-TensorRT-LLM/>:/app/ 89fg611dcfd
It stays here for quite a long time; upon Ctrl+C
additional notes
=> Need to map the port and a directory to save time and avoid repeated HF model downloads
Thanks
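The repeated-download point above can be addressed by also mounting the Hugging Face cache into the container, so model weights are fetched only once across runs. A minimal sketch, assuming the library's default cache locations on both host and container; verify the paths for your setup (image ID is the one from the report):

```shell
# Reuse the host's Hugging Face cache inside the container so model
# weights persist across container runs.
docker run --rm --ipc=host --gpus=device=1 -it -p 8000:8000 \
  -v "$HOME/.cache/huggingface":/root/.cache/huggingface \
  89fg611dcfd
```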