At the moment the setup.py script imports onnx_tensorrt only to extract version information. But this import leads to an import of tensorrt, which requires the CUDA runtime libraries to be available. This makes it problematic to build Docker images with onnx_tensorrt inside in a CI/CD environment without GPUs and the CUDA runtime.
Some refactoring is required to extract the version from the package without importing tensorrt, so that builds in CI/CD environments without GPUs become possible.
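One way to do such a refactoring is to read the version string out of the package source without executing it. A minimal sketch, assuming the version lives in a `__version__ = "..."` line in `onnx_tensorrt/__init__.py` (the exact file layout and version format here are assumptions for illustration):

```python
# Sketch: extract __version__ from a package's __init__.py without
# importing the package (and hence without pulling in tensorrt/CUDA).
import re

def read_version(init_path):
    """Return the __version__ string found in init_path, or raise."""
    with open(init_path) as f:
        source = f.read()
    match = re.search(r'^__version__\s*=\s*[\'"]([^\'"]+)[\'"]',
                      source, re.MULTILINE)
    if match is None:
        raise RuntimeError("Unable to find __version__ in %s" % init_path)
    return match.group(1)

# Demo against a temporary file standing in for onnx_tensorrt/__init__.py:
import os, tempfile
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as tmp:
    tmp.write('__version__ = "1.2.3"\n')
print(read_version(tmp.name))  # prints 1.2.3
os.remove(tmp.name)
```

setup.py could then call `read_version("onnx_tensorrt/__init__.py")` instead of `import onnx_tensorrt`, and no tensorrt or CUDA import ever happens at build time.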
Log when tensorrt is not available, on any command requiring setup.py (even when not installing):
ERROR: Command errored out with exit status 1:
command: /home/dener/.pyenv/versions/eris-ct-worker/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/home/dener/.pyenv/versions/eris-ct-worker/src/onnx-tensorrt/setup.py'"'"'; __file__='"'"'/home/dener/.pyenv/versions/eris-ct-worker/src/onnx-tensorrt/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-ywrl82zv
cwd: /home/dener/.pyenv/versions/eris-ct-worker/src/onnx-tensorrt/
Complete output (11 lines):
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/dener/.pyenv/versions/eris-ct-worker/src/onnx-tensorrt/setup.py", line 18, in <module>
import onnx_tensorrt
File "/home/dener/.pyenv/versions/3.8.5/envs/eris-ct-worker/src/onnx-tensorrt/onnx_tensorrt/__init__.py", line 23, in <module>
from . import backend
File "/home/dener/.pyenv/versions/3.8.5/envs/eris-ct-worker/src/onnx-tensorrt/onnx_tensorrt/backend.py", line 22, in <module>
from .tensorrt_engine import Engine
File "/home/dener/.pyenv/versions/3.8.5/envs/eris-ct-worker/src/onnx-tensorrt/onnx_tensorrt/tensorrt_engine.py", line 21, in <module>
import tensorrt as trt
ModuleNotFoundError: No module named 'tensorrt'
Log for the situation when no CUDA runtime is available:
ERROR: Command errored out with exit status 1:
command: /usr/bin/python3 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/setup.py'"'"'; __file__='"'"'/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-crge83fu
cwd: /tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/
Complete output (13 lines):
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/setup.py", line 18, in <module>
import onnx_tensorrt
File "/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/onnx_tensorrt/__init__.py", line 23, in <module>
from . import backend
File "/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/onnx_tensorrt/backend.py", line 22, in <module>
from .tensorrt_engine import Engine
File "/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/onnx_tensorrt/tensorrt_engine.py", line 22, in <module>
import pycuda.driver
File "/usr/local/lib/python3.8/dist-packages/pycuda/driver.py", line 62, in <module>
from pycuda._driver import * # noqa
ImportError: libcuda.so.1: cannot open shared object file: No such file or directory
----------------------------------------
I haven't tried, but I'm sure it is still reproducible on master, as I can see that the issue persists in the code.
Let me describe step-by-step what happens for more clarity:
1. You invoke python3 setup.py install to install the package. (In fact, any setup.py command triggers this: getting the version, listing dependencies, etc.)
2. Python evaluates the setup.py script.
3. setup.py imports the onnx_tensorrt package, just to get its version.
4. Python processes onnx_tensorrt/__init__.py, which imports onnx_tensorrt/backend.py.
5. Python processes onnx_tensorrt/backend.py, which imports tensorrt.
6. tensorrt requires the CUDA runtime in order to be imported.
So this prevents you from installing the package unless tensorrt and the CUDA runtime are available. That means you can't build a Docker image in CI on a machine without a GPU for later use on machines with GPUs. You also can't do dependency management or use any other setup.py features without TensorRT and the CUDA runtime.
At the moment I have ugly hacks in my Dockerfiles to work around the problem:
To solve the issue, as I said above, refactoring is required to extract the version from the package without importing tensorrt. Here is more info on how to handle Python package versioning correctly.
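For reference, recent setuptools can also resolve the version declaratively; with new enough setuptools, the `attr:` directive is evaluated by static analysis where possible, so the package is not actually imported at build time. A sketch (package name assumed, and whether static analysis succeeds depends on the setuptools version and how `__version__` is defined):

```
[metadata]
name = onnx_tensorrt
version = attr: onnx_tensorrt.__version__
```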
RuRo linked a pull request on Oct 26, 2021 that will close this issue.