Container build not installing/finding dependencies contained in model.tar file #167
Description
Issue: inference.py dependencies aren't installed in the SageMaker TensorFlow Serving container.
Resulting error: `ModuleNotFoundError: No module named 'nltk'`
Versioning details
SageMaker environment: conda_python3
TensorFlow version: 2.3.0
TensorFlow Serving container version: 2.0 (also tried 2.1, 2.2, and 2.3)
Directory structure containing the model and dependencies (prior to tarring):

```
1/
├── variables/
│   ├── variables.data-00000-of-00001
│   └── variables.index
├── saved_model.pb
└── code/
    ├── inference.py
    ├── requirements.txt
    ├── word_vectors.txt
    └── bigram.pkl
```
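For reference, the archive was built and uploaded roughly like this (a minimal sketch: the tarfile packing reflects the tree above, while the S3 key prefix is illustrative):

```python
import tarfile

import sagemaker

# Pack the version directory shown above into model.tar.gz,
# keeping "1/..." paths inside the archive.
with tarfile.open("model.tar.gz", "w:gz") as tar:
    tar.add("1", arcname="1")

# Upload to S3; the resulting URI is the model_data passed to Model() below.
model_data = sagemaker.Session().upload_data("model.tar.gz", key_prefix="model")
```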
I have also tried deploying from a separate directory that uses code/lib/<external_module>, where lib contains the nltk package itself rather than a requirements.txt. Neither approach works; both return the same ModuleNotFoundError. A sketch of that layout follows.
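The alternative layout looked roughly like this (assuming the nltk package directory was copied into lib/):

```
1/
├── variables/
├── saved_model.pb
└── code/
    ├── inference.py
    └── lib/
        └── nltk/
```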
Deployment from a SageMaker notebook using the Python SDK:

```python
from sagemaker.tensorflow.serving import Model  # SageMaker Python SDK v1 import path

tensorflow_serving_model = Model(
    model_data=model_data,
    role=role,
    framework_version='2.0',
    entry_point='inference.py',  # running without the entry point works as expected
)

predictor = tensorflow_serving_model.deploy(
    initial_instance_count=1,
    instance_type='ml.c4.xlarge',
)
```
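Once the endpoint is up, I request predictions like this (the payload below is a hypothetical placeholder; the real input shape depends on the model):

```python
import numpy as np

# Hypothetical input; substitute the model's actual input shape.
payload = np.zeros((1, 128), dtype=np.float32)
result = predictor.predict(payload)
```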
requirements.txt:

```
nltk==3.4.5
more_itertools==8.2.0
gensim==3.8.3
```
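For context, inference.py follows the TFS container's input_handler/output_handler pattern. A trimmed sketch is below; the handler bodies are placeholders, but the top-level imports are exactly where the error is raised:

```python
# inference.py (trimmed); the handler bodies are placeholders.
import json

import nltk              # fails: No module named 'nltk'
import gensim
import more_itertools


def input_handler(data, context):
    # Pre-process the request body into a TF Serving JSON payload.
    text = json.loads(data.read().decode("utf-8"))
    # ... tokenization with nltk / the gensim word vectors goes here ...
    return json.dumps({"instances": [text]})


def output_handler(response, context):
    # Pass the TF Serving response straight through.
    return response.content, context.accept_header
```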
Note: there are no issues with the model file itself. When I instantiate the Model instance without specifying the inference.py entry point, the model deploys successfully and I get predictions back after passing an ndarray.
Any thoughts on how to get nltk (and the other dependencies) installed on the serving container? Thank you!