Update URL config protocol for loading TF models #1505
Conversation
It seems the tests failed because Keras is not able to load the model. This PR changes the protocol for loading models for TF > 2.18, which ships with Keras 3, but the testing environment is still using TF 2.15, so I guess that is why it is failing.
We can try to update it to tensorflow>2.18, after
Is it normal to see this warning?
This happens in my environment as well, and the warning appears only once per session. The warning is about the optimizer, which is not involved when loading the model for inference, so I think it is fine.
Before you submit this PR: make sure to put all operations-related information in a wiki-note, a PR should be about code and is publicly accessible
What does the code in this PR do / what does it improve?
Make the loading of the TensorFlow model (for position reconstruction) compatible with newer versions of TensorFlow.
Can you briefly describe how it works?
- If the model file ends with `.keras`, load the file directly with `tf.keras.models.load_model`.
- If it ends with `.tar.gz`, extract the tarball. Then first execute the `registration.py` in it, then load the `.keras` file in it with `tf.keras.models.load_model`.

Can you give a minimal working example (or illustrate with a figure)?
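The two-branch loading protocol described above could be sketched roughly as follows. This is a minimal illustration, not the actual implementation in the PR: the function name `load_position_model` and the injectable `load_model` argument are hypothetical, and the real code may locate `registration.py` and the `.keras` file differently.

```python
import os
import tarfile
import tempfile


def load_position_model(path, load_model=None):
    """Sketch of the loading protocol: direct .keras load, or
    tarball extraction + registration.py execution + .keras load.

    load_model defaults to tf.keras.models.load_model; it is
    injectable here purely to keep the sketch testable without TF.
    """
    if load_model is None:
        import tensorflow as tf
        load_model = tf.keras.models.load_model

    if path.endswith(".keras"):
        # Branch 1: a bare .keras file is loaded directly.
        return load_model(path)

    if path.endswith(".tar.gz"):
        # Branch 2: extract the tarball into a temporary directory.
        with tempfile.TemporaryDirectory() as tmpdir:
            with tarfile.open(path, "r:gz") as tar:
                tar.extractall(tmpdir)

            # First execute registration.py (e.g. to register any
            # custom objects/layers the model needs), if present.
            reg = os.path.join(tmpdir, "registration.py")
            if os.path.exists(reg):
                with open(reg) as f:
                    exec(compile(f.read(), reg, "exec"), {})

            # Then load the .keras file found in the extracted tree.
            keras_files = [
                f for f in os.listdir(tmpdir) if f.endswith(".keras")
            ]
            return load_model(os.path.join(tmpdir, keras_files[0]))

    raise ValueError(f"Unsupported model file: {path}")
```

With `load_model` left at its default, this would call `tf.keras.models.load_model` on the resolved `.keras` path in either branch.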
Please include the following if applicable:
Notes on testing