
When using the saved_model format for storing a pretrained model, model loading causes many unnecessary warnings #193

aynesss opened this issue Feb 14, 2022 · 0 comments



aynesss commented Feb 14, 2022

I use TF 2.4.1.
I would like to fine-tune the pretrained SavedModel, so I wrote this:

import tensorflow as tf
import tensorflow_hub as hub

target_model_path = "gs://simclr-checkpoints-tf2/simclrv1/pretrain/1x/saved_model/"

def create_model():
    # tf.saved_model.load() returns a restored object with no Keras
    # attributes such as .input/.output, and tf.saved_model.save()
    # expects (obj, export_dir) and returns None, so wrap the
    # SavedModel in a hub.KerasLayer to use it inside a Keras
    # functional model. If the SavedModel returns a dict of outputs,
    # select one via the output_key= argument.
    baseModel = hub.KerasLayer(target_model_path, trainable=True)

    # SimCLR v1 was pretrained on 224x224 ImageNet images.
    inputs = tf.keras.Input(shape=(224, 224, 3))
    headModel = baseModel(inputs)
    model_output = tf.keras.layers.Dense(3, activation="softmax", name="output")(headModel)

    model = tf.keras.Model(inputs=inputs, outputs=model_output)
    for layer in model.layers:
        layer.trainable = True

    # compile the model
    model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.001),
                  loss=tf.keras.losses.CategoricalCrossentropy(),  # from_logits=True
                  metrics=[tf.keras.metrics.CategoricalAccuracy()])  # TopKCategoricalAccuracy(k=1)
    return model
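
For reference, just building the model is enough to trigger the warnings, since the SavedModel is imported inside create_model():

model = create_model()
model.summary()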

I only add a classification layer, but loading the model produces many unnecessary warnings like this:

WARNING:absl:Importing a function (__inference___call___15879) with ops with custom gradients. Will likely fail if a gradient is requested.
WARNING:absl:Importing a function (__inference___call___15879) with ops with custom gradients. Will likely fail if a gradient is requested.

Any ideas what the possible reason behind this could be? The official finetuning colab, https://colab.research.google.com/github/google-research/simclr/blob/master/tf2/colabs/finetuning.ipynb#scrollTo=6WXspghpERRG, also has the same problem.
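
As far as I can tell, the warnings are emitted by absl logging while the SavedModel's functions (which use tf.custom_gradient) are being imported, so one workaround is to raise the absl log level before loading. A minimal sketch, assuming the goal is only to silence the messages:

import absl.logging

# Suppress absl WARNING messages (such as the per-function import
# warnings above) by raising the verbosity threshold to ERROR.
absl.logging.set_verbosity(absl.logging.ERROR)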
