I ran the following code:
print(graph.get_operation_by_name("input_image").outputs[0].shape)
print(graph.get_operation_by_name("output_image").outputs[0].shape)
and got the following results:
(256,256,3)
()
So why is the output shape empty? Because of this, the converted SavedModel cannot be used with Triton Inference Server or TF-Serving.
To serve with TF-Serving, the model needs to be converted into a SavedModel. How can I convert the ckpt model into a SavedModel?
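One common way to do this is with the TF1 compatibility API: restore the graph and weights from the checkpoint, look up the input/output tensors by name, and export with `simple_save`. The sketch below is self-contained for illustration, so it first writes a tiny toy checkpoint; the paths, the variable, and the toy graph are placeholders, while the tensor names `input_image` / `output_image` come from this issue. For a real model, replace the first block with your own checkpoint.

```python
import os
import tempfile

import tensorflow as tf

tf.compat.v1.disable_eager_execution()

tmp = tempfile.mkdtemp()
ckpt_prefix = os.path.join(tmp, "model.ckpt")
# tf-serving expects a numeric version subdirectory, e.g. export/1
export_dir = os.path.join(tmp, "export", "1")

# --- Stand-in for training: build a tiny graph and write a checkpoint. ---
with tf.Graph().as_default():
    with tf.compat.v1.Session() as sess:
        x = tf.compat.v1.placeholder(
            tf.float32, [None, 256, 256, 3], name="input_image")
        w = tf.Variable(2.0, name="w")
        y = tf.multiply(x, w, name="output_image")
        sess.run(tf.compat.v1.global_variables_initializer())
        tf.compat.v1.train.Saver().save(sess, ckpt_prefix)

# --- Conversion: restore the checkpoint and export a SavedModel. ---
with tf.Graph().as_default() as graph:
    with tf.compat.v1.Session(graph=graph) as sess:
        # import_meta_graph rebuilds the graph; restore loads the weights.
        saver = tf.compat.v1.train.import_meta_graph(ckpt_prefix + ".meta")
        saver.restore(sess, ckpt_prefix)

        # Look up tensors by name (op name + ":0" for the first output).
        inp = graph.get_tensor_by_name("input_image:0")
        out = graph.get_tensor_by_name("output_image:0")

        # simple_save writes a SavedModel with a serving_default signature,
        # which is what tf-serving and Triton's TF backend look for.
        tf.compat.v1.saved_model.simple_save(
            sess, export_dir,
            inputs={"input_image": inp},
            outputs={"output_image": out},
        )

print(os.path.exists(os.path.join(export_dir, "saved_model.pb")))
```

You can then inspect the exported signatures with `saved_model_cli show --dir export/1 --all` to confirm the input and output shapes before pointing tf-serving or Triton at the directory. Note that if the output op has no static shape in the graph (as in this issue), the exported signature will also lack shape information; setting a static shape on the tensor before export (e.g. with `tf.ensure_shape`) avoids that.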