Inputs to the model #43
Hi Tom, I just downloaded and re-installed the latest version. There is now a new folder generated with a .pb file for each model; I think it might make things easier.
Finally I worked it out. Here is the code for single-case inference, @tom-samsung:

```python
import numpy as np
import tensorflow.compat.v1 as tf

# To make TF 2.x compatible with TF 1.x code, disable the TF 2.x behaviours.
tf.disable_eager_execution()

from tensorflow.python.client import session
from tensorflow.python.framework import importer
from tensorflow.python.framework import ops
from tensorflow.python.summary import summary
from tensorflow.python.tools import saved_model_utils
from tensorflow.core.framework import graph_pb2 as gpb
from google.protobuf import text_format as pbtf


def extract_tensors(signature_def, graph):
    output = dict()
    for key in signature_def:
        value = signature_def[key]
        if isinstance(value, tf.TensorInfo):
            output[key] = graph.get_tensor_by_name(value.name)
    return output


def extract_input_name(signature_def, graph):
    input_tensors = extract_tensors(signature_def['serving_default'].inputs, graph)
    # Assuming one input in the model.
    name_list = []
    for key in list(input_tensors.keys()):
        name_list.append(input_tensors.get(key).name)
    return name_list


def extract_output_name(signature_def, graph):
    output_tensors = extract_tensors(signature_def['serving_default'].outputs, graph)
    # Assuming one output in the model.
    name_list = []
    for key in list(output_tensors.keys()):
        name_list.append(output_tensors.get(key).name)
    return name_list


def ass_input_dict(tensor_input_sample):
    # Key each feature value by its placeholder tensor name: "1:0", "2:0", ...
    dict_input = {str(i + 1) + ":0": [tensor_input_sample[i]]
                  for i in range(len(tensor_input_sample))}
    return dict_input


checkpoint_path = "/tmp/run/tuner-1/160/saved_model/assets/"

with tf.Session(graph=tf.Graph()) as sess:
    serve = tf.saved_model.load(sess, tags=["serve"], export_dir=checkpoint_path)
    # print(type(serve))  # <class 'tensorflow.core.protobuf.meta_graph_pb2.MetaGraphDef'>
    # input_tensor_name = extract_input_name(serve.signature_def, sess.graph)
    output_tensor_name = extract_output_name(serve.signature_def, sess.graph)
    input_dict = ass_input_dict(sen_vec.detach().numpy())  # sen_vec: my input sample (a torch tensor)
    prediction = sess.run(output_tensor_name, feed_dict=input_dict)
    print(prediction)
```
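To make the feed dict explicit: `ass_input_dict` just keys each feature value by its placeholder tensor name ("1:0", "2:0", ...). For a toy 3-feature sample (made-up values), it builds something like this:

```python
def ass_input_dict(tensor_input_sample):
    # Same helper as above: map each feature to its placeholder tensor name.
    return {str(i + 1) + ":0": [tensor_input_sample[i]]
            for i in range(len(tensor_input_sample))}

sample = [0.1, 0.5, -2.0]  # made-up 3-feature sample
print(ass_input_dict(sample))
# {'1:0': [0.1], '2:0': [0.5], '3:0': [-2.0]}
```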
Hey @Xiaoping777
I have a problem with understanding the inputs to the model. I set up my experiments on a dataset with 2496 columns, using a CSV file, and provided label_index and record_defaults via a list.
The parameters I set in single_trainer.SingleTrainer (as per the README) were label_index, logits_dimension, record_defaults, filename, and spec.
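Concretely, my setup looked roughly like this, following the README example; the path and values below are placeholders rather than my real configuration:

```python
from model_search import constants
from model_search import single_trainer
from model_search.data import csv_data

# Placeholder values; my real run used my own CSV with 2496 columns.
trainer = single_trainer.SingleTrainer(
    data=csv_data.Provider(
        label_index=0,                 # index of the label column
        logits_dimension=2,            # number of classes
        record_defaults=[0.0] * 2496,  # one default per column, passed as a list
        filename="/path/to/my_data.csv"),
    spec=constants.DEFAULT_DNN)

# ...and then trainer.try_models(...) was called as in the README.
```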
After the experiments were done, I started to look at the graphs to understand how to use them in a pipeline where Keras models are used (I wanted to wrap a selected graph with a Keras Lambda layer and use it for inference).
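To illustrate what I mean by wrapping a graph, here is a rough sketch of the kind of thing I had in mind, based on the usual frozen-graph wrapping pattern. The graph path and tensor names are hypothetical, and it assumes a single input placeholder and a single output tensor, which is exactly the part I am unsure about for these models:

```python
import tensorflow as tf


def wrap_frozen_graph(graph_def, inputs, outputs):
    # Import the GraphDef into a wrapped tf.function and prune it to the
    # requested input/output tensors so it can be called like a function.
    def _imports_graph_def():
        tf.compat.v1.import_graph_def(graph_def, name="")
    wrapped = tf.compat.v1.wrap_function(_imports_graph_def, [])
    return wrapped.prune(
        tf.nest.map_structure(wrapped.graph.as_graph_element, inputs),
        tf.nest.map_structure(wrapped.graph.as_graph_element, outputs))


# Hypothetical path and tensor names.
graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile("/path/to/selected_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

frozen_fn = wrap_frozen_graph(graph_def, inputs="input:0", outputs="output:0")

# Wrap the frozen function in a Lambda layer so it can sit in a Keras pipeline.
inp = tf.keras.Input(shape=(2496,), dtype=tf.float32)
out = tf.keras.layers.Lambda(lambda x: frozen_fn(x))(inp)
model = tf.keras.Model(inp, out)
```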
When looking at the graph I see:

```
import/record_defaults_0
import/record_defaults_1
import/record_defaults_2
import/record_defaults_3
import/record_defaults_4
...
```
up to 2496, as it should be. I also see:

```
import/Phoenix/search_generator_0/Input/input_layer/1_1/ExpandDims/dim
import/Phoenix/search_generator_0/Input/input_layer/1_1/ExpandDims
import/Phoenix/search_generator_0/Input/input_layer/1_1/Shape
import/Phoenix/search_generator_0/Input/input_layer/1_1/strided_slice/stack
import/Phoenix/search_generator_0/Input/input_layer/1_1/strided_slice/stack_1
import/Phoenix/search_generator_0/Input/input_layer/1_1/strided_slice/stack_2
import/Phoenix/search_generator_0/Input/input_layer/1_1/strided_slice
import/Phoenix/search_generator_0/Input/input_layer/1_1/Reshape/shape/1
import/Phoenix/search_generator_0/Input/input_layer/1_1/Reshape/shape
import/Phoenix/search_generator_0/Input/input_layer/1_1/Reshape
```
for all 2496 inputs.
But I only see:

```
input_1
input_2
input_3
input_4
...
input_21
```
21 inputs instead of 2496. Could you please help me to understand this situation?
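For reference, this is roughly how I am listing the node names above; the path to one of the generated .pb graphs is a placeholder here:

```python
import tensorflow.compat.v1 as tf

# Placeholder path to one of the generated .pb graphs.
graph_path = "/path/to/graph.pb"

graph_def = tf.GraphDef()
with tf.io.gfile.GFile(graph_path, "rb") as f:
    graph_def.ParseFromString(f.read())

# Importing with the default name scope is what produces the "import/..." prefixes above.
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def)
    for op in graph.get_operations():
        if op.type == "Placeholder":
            print(op.name, op.outputs[0].shape)
```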
My final goal is to do something like the Lambda-layer wrapping sketched above, but unfortunately I do not understand the inputs at this point.
Thank you for any help!