
parametric umap fails to save model #867

Open
niederle opened this issue May 25, 2022 · 5 comments · May be fixed by #933

Comments

@niederle

Hello,

I played around with the parametric umap going through the mnist notebook.

when running
embedder.save('/path/to/my/model')

I get this output, followed by the error message below:

WARNING:tensorflow:Compiled the loaded model, but the compiled metrics have yet to be built. model.compile_metrics will be empty until you train or evaluate the model.
INFO:tensorflow:Assets written to: C:\Users\niederle.SCREENING-PC-4\_deleteLater\model\encoder\assets
Keras encoder model saved to C:\Users\niederle.SCREENING-PC-4\_deleteLater\model\encoder
INFO:tensorflow:Assets written to: C:\Users\niederle.SCREENING-PC-4\_deleteLater\model\parametric_model\assets
Keras full model saved to C:\Users\niederle.SCREENING-PC-4\_deleteLater\model\parametric_model
WARNING:tensorflow:Compiled the loaded model, but the compiled metrics have yet to be built. model.compile_metrics will be empty until you train or evaluate the model.
INFO:tensorflow:Assets written to: ram://b7a7c013-4fdb-4e8a-a035-80904aa57781/assets
---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
Input In [42], in <cell line: 1>()
----> 1 embedder.save('C:\\Users\\niederle.SCREENING-PC-4\\_deleteLater\\model')

File ~\anaconda3\envs\analysis_env\lib\site-packages\umap\parametric_umap.py:415, in ParametricUMAP.save(self, save_location, verbose)
    413 model_output = os.path.join(save_location, "model.pkl")
    414 with open(model_output, "wb") as output:
--> 415     pickle.dump(self, output, pickle.HIGHEST_PROTOCOL)
    416 if verbose:
    417     print("Pickle of ParametricUMAP model saved to {}".format(model_output))

File ~\anaconda3\envs\analysis_env\lib\site-packages\umap\parametric_umap.py:379, in ParametricUMAP.__getstate__(self)
    377 def __getstate__(self):
    378     # this function supports pickling, making sure that objects can be pickled
--> 379     return dict(
    380         (k, v)
    381         for (k, v) in self.__dict__.items()
    382         if should_pickle(k, v) and k != "optimizer"
    383     )

File ~\anaconda3\envs\analysis_env\lib\site-packages\umap\parametric_umap.py:382, in <genexpr>(.0)
    377 def __getstate__(self):
    378     # this function supports pickling, making sure that objects can be pickled
    379     return dict(
    380         (k, v)
    381         for (k, v) in self.__dict__.items()
--> 382         if should_pickle(k, v) and k != "optimizer"
    383     )

File ~\anaconda3\envs\analysis_env\lib\site-packages\umap\parametric_umap.py:873, in should_pickle(key, val)
    871     pickled = codecs.encode(pickle.dumps(val), "base64").decode()
    872     # unpickle object
--> 873     unpickled = pickle.loads(codecs.decode(pickled.encode(), "base64"))
    874 except (
    875     pickle.PicklingError,
    876     tf.errors.InvalidArgumentError,
   (...)
    881     AttributeError,
    882 ) as e:
    883     warn("Did not pickle {}: {}".format(key, e))

File ~\anaconda3\envs\analysis_env\lib\site-packages\keras\saving\pickle_utils.py:48, in deserialize_model_from_bytecode(serialized_model)
     46       with tf.io.gfile.GFile(dest_path, "wb") as f:
     47         f.write(archive.extractfile(name).read())
---> 48 model = save_module.load_model(temp_dir)
     49 tf.io.gfile.rmtree(temp_dir)
     50 return model

File ~\anaconda3\envs\analysis_env\lib\site-packages\keras\utils\traceback_utils.py:67, in filter_traceback.<locals>.error_handler(*args, **kwargs)
     65 except Exception as e:  # pylint: disable=broad-except
     66   filtered_tb = _process_traceback_frames(e.__traceback__)
---> 67   raise e.with_traceback(filtered_tb) from None
     68 finally:
     69   del filtered_tb

File ~\anaconda3\envs\analysis_env\lib\site-packages\tensorflow\python\saved_model\load.py:915, in load_partial(export_dir, filters, tags, options)
    912   loader = Loader(object_graph_proto, saved_model_proto, export_dir,
    913                   ckpt_options, options, filters)
    914 except errors.NotFoundError as err:
--> 915   raise FileNotFoundError(
    916       str(err) + "\n You may be trying to load on a different device "
    917       "from the computational device. Consider setting the "
    918       "`experimental_io_device` option in `tf.saved_model.LoadOptions` "
    919       "to the io_device such as '/job:localhost'.")
    920 root = loader.get(0)
    921 root.graph_debug_info = loader.adjust_debug_info_func_names(debug_info)

FileNotFoundError: Unsuccessful TensorSliceReader constructor: Failed to find any matching files for ram://3bbf5d07-6468-48ed-be1d-e056e006589b/variables/variables
 You may be trying to load on a different device from the computational device. Consider setting the `experimental_io_device` option in `tf.saved_model.LoadOptions` to the io_device such as '/job:localhost'.

I am running the code on a Windows 10 machine in a conda environment consisting of the following packages:

#
# Name                    Version                   Build  Channel
absl-py                   1.0.0                    pypi_0    pypi
asttokens                 2.0.5              pyhd3eb1b0_0    anaconda
astunparse                1.6.3                    pypi_0    pypi
backcall                  0.2.0              pyhd3eb1b0_0    anaconda
blas                      1.0                         mkl
bottleneck                1.3.2            py38h2a96729_1
brotli                    1.0.9                ha925a31_2    anaconda
ca-certificates           2022.4.26            haa95532_0    anaconda
cachetools                5.1.0                    pypi_0    pypi
certifi                   2021.10.8        py38haa95532_2    anaconda
charset-normalizer        2.0.12                   pypi_0    pypi
colorama                  0.4.4              pyhd3eb1b0_0    anaconda
cycler                    0.11.0             pyhd3eb1b0_0    anaconda
debugpy                   1.5.1            py38hd77b12b_0    anaconda
decorator                 5.1.1              pyhd3eb1b0_0    anaconda
entrypoints               0.4              py38haa95532_0    anaconda
executing                 0.8.3              pyhd3eb1b0_0    anaconda
flatbuffers               1.12                     pypi_0    pypi
fonttools                 4.25.0             pyhd3eb1b0_0    anaconda
freetype                  2.10.4               hd328e21_0    anaconda
gast                      0.4.0                    pypi_0    pypi
google-auth               2.6.6                    pypi_0    pypi
google-auth-oauthlib      0.4.6                    pypi_0    pypi
google-pasta              0.2.0                    pypi_0    pypi
grpcio                    1.46.3                   pypi_0    pypi
h5py                      3.7.0                    pypi_0    pypi
icc_rt                    2019.0.0             h0cc432a_1
icu                       58.2             vc14hc45fdbb_0  [vc14]  anaconda
idna                      3.3                      pypi_0    pypi
importlib-metadata        4.11.4                   pypi_0    pypi
intel-openmp              2021.3.0          haa95532_3372
ipykernel                 6.9.1            py38haa95532_0    anaconda
ipython                   8.2.0            py38haa95532_0    anaconda
jedi                      0.18.1           py38haa95532_1    anaconda
joblib                    1.1.0              pyhd8ed1ab_0    conda-forge
jpeg                      9e                   h2bbff1b_0    anaconda
jupyter_client            7.2.2            py38haa95532_0    anaconda
jupyter_core              4.9.2            py38haa95532_0    anaconda
keras                     2.9.0                    pypi_0    pypi
keras-preprocessing       1.1.2                    pypi_0    pypi
kiwisolver                1.3.2            py38hd77b12b_0    anaconda
libclang                  14.0.1                   pypi_0    pypi
libpng                    1.6.37               h2a8f88b_0    anaconda
libtiff                   4.2.0                hd0e1b90_0    anaconda
libwebp                   1.2.2                h2bbff1b_0    anaconda
libzlib                   1.2.11            h8ffe710_1013    conda-forge
llvmlite                  0.37.0           py38h57a6900_0    conda-forge
lz4-c                     1.9.3                h2bbff1b_1    anaconda
markdown                  3.3.7                    pypi_0    pypi
matplotlib                3.5.1            py38haa95532_1    anaconda
matplotlib-base           3.5.1            py38hd77b12b_1    anaconda
matplotlib-inline         0.1.2              pyhd3eb1b0_2    anaconda
mkl                       2021.3.0           haa95532_524
mkl-service               2.4.0            py38h2bbff1b_0
mkl_fft                   1.3.0            py38h277e83a_2
mkl_random                1.2.2            py38hf11a4ad_0
munkres                   1.1.4                      py_0    anaconda
nest-asyncio              1.5.5            py38haa95532_0    anaconda
numba                     0.54.1           py38h5858985_0    conda-forge
numexpr                   2.7.3            py38hb80d3ca_1
numpy                     1.20.3           py38ha4e8547_0
numpy-base                1.20.3           py38hc2deb75_0
oauthlib                  3.2.0                    pypi_0    pypi
openssl                   1.1.1n               h2bbff1b_0    anaconda
opt-einsum                3.3.0                    pypi_0    pypi
packaging                 21.3               pyhd3eb1b0_0    anaconda
pandas                    1.3.3            py38h6214cd6_0
parso                     0.8.3              pyhd3eb1b0_0    anaconda
pickleshare               0.7.5           pyhd3eb1b0_1003    anaconda
pillow                    9.0.1            py38hdc2b20a_0    anaconda
pip                       21.0.1           py38haa95532_0
prompt-toolkit            3.0.20             pyhd3eb1b0_0    anaconda
protobuf                  3.19.4                   pypi_0    pypi
pure_eval                 0.2.2              pyhd3eb1b0_0    anaconda
pyasn1                    0.4.8                    pypi_0    pypi
pyasn1-modules            0.2.8                    pypi_0    pypi
pygments                  2.11.2             pyhd3eb1b0_0    anaconda
pynndescent               0.5.4              pyh6c4a22f_0    conda-forge
pyparsing                 3.0.9                    pypi_0    pypi
pyqt                      5.9.2            py38ha925a31_4    anaconda
python                    3.8.12               h6244533_0
python-dateutil           2.8.2              pyhd3eb1b0_0
python_abi                3.8                      2_cp38    conda-forge
pytz                      2021.3             pyhd3eb1b0_0
pywin32                   302              py38h2bbff1b_2    anaconda
pyzmq                     22.3.0           py38hd77b12b_2    anaconda
qt                        5.9.7            vc14h73c81de_0  [vc14]  anaconda
requests                  2.27.1                   pypi_0    pypi
requests-oauthlib         1.3.1                    pypi_0    pypi
rsa                       4.8                      pypi_0    pypi
scikit-learn              0.24.2           py38hf11a4ad_1
scipy                     1.7.1            py38hbe87c03_2
setuptools                58.0.4           py38haa95532_0
sip                       6.5.1            py38hd77b12b_0    anaconda
six                       1.16.0             pyhd3eb1b0_0
sqlite                    3.36.0               h2bbff1b_0
stack_data                0.2.0              pyhd3eb1b0_0    anaconda
tbb                       2021.3.0             h2d74725_0    conda-forge
tensorboard               2.9.0                    pypi_0    pypi
tensorboard-data-server   0.6.1                    pypi_0    pypi
tensorboard-plugin-wit    1.8.1                    pypi_0    pypi
tensorflow                2.9.1                    pypi_0    pypi
tensorflow-estimator      2.9.0                    pypi_0    pypi
tensorflow-io-gcs-filesystem 0.26.0                   pypi_0    pypi
termcolor                 1.1.0                    pypi_0    pypi
threadpoolctl             3.0.0              pyh8a188c0_0    conda-forge
tk                        8.6.11               h2bbff1b_0    anaconda
toml                      0.10.2             pyhd3eb1b0_0    anaconda
tornado                   6.1              py38h2bbff1b_0    anaconda
traitlets                 5.1.1              pyhd3eb1b0_0    anaconda
typing-extensions         4.2.0                    pypi_0    pypi
umap-learn                0.5.1            py38haa244fe_1    conda-forge
urllib3                   1.26.9                   pypi_0    pypi
vc                        14.2                 h21ff451_1
vs2015_runtime            14.27.29016          h5e58377_2
wcwidth                   0.2.5              pyhd3eb1b0_0    anaconda
werkzeug                  2.1.2                    pypi_0    pypi
wheel                     0.37.0             pyhd3eb1b0_1
wincertstore              0.2              py38haa95532_2
wrapt                     1.14.1                   pypi_0    pypi
xz                        5.2.5                h62dcd97_0    anaconda
zipp                      3.8.0                    pypi_0    pypi
zlib                      1.2.11            h8ffe710_1013    conda-forge
zstd                      1.4.5                h04227a9_0    anaconda

I have no idea what is going wrong or whether I can do anything to solve the issue. I would appreciate any feedback, and I hope this is the right place to ask for help.

@timsainb
Collaborator

I'm not sure what's going on here; I haven't seen this error before. Are you running the same notebook from the repo?

FileNotFoundError: Unsuccessful TensorSliceReader constructor: Failed to find any matching files for ram://3bbf5d07-6468-48ed-be1d-e056e006589b/variables/variables
 You may be trying to load on a different device from the computational device. Consider setting the `experimental_io_device` option in `tf.saved_model.LoadOptions` to the io_device such as '/job:localhost'.

It sounds like a low-level tensorflow issue with finding a variable that is supposed to be stored in RAM. Do you get the same error when you try saving other tensorflow models?

@niederle
Author

Yes, I ran the notebook from the repo.

If I run the example code for saving a TensorFlow model, I do not get any error.
There are also no problems if I run the example code to save a Keras model.
Is that the way to test?

@jonathan-conder-sm jonathan-conder-sm linked a pull request Oct 20, 2022 that will close this issue
@kurose635

@jonathan-conder-sm
Thanks!
I was able to save the model as a pkl file.
However, this warning appears: WARNING:tensorflow:No training configuration found in save file, so the model was not compiled. Compile it manually.

Then, when loading the file:
load_embedder = load_ParametricUMAP("load file name")
additional_embedding = load_embedder.transform(test_images)

raise ValueError(f'Input {input_index} of layer "{layer_name}" is '
ValueError: Input 0 of layer "sequential" is incompatible with the layer: expected shape=(None, 28, 28, 1), found shape=(1000, 784)

Could you please advise how to solve this?
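The shape mismatch above suggests the restored encoder expects image-shaped input while transform is being fed flattened vectors. Assuming MNIST-style data (the variable names here are illustrative, not from the notebook), reshaping the batch to the (batch, height, width, channels) layout before calling transform would match the expected input shape:

```python
import numpy as np

# Hypothetical flat batch: 1000 MNIST-style samples of 784 pixels each
test_images = np.zeros((1000, 784), dtype=np.float32)

# Reshape to the (batch, height, width, channels) layout the encoder expects
test_images_4d = test_images.reshape(-1, 28, 28, 1)
print(test_images_4d.shape)  # (1000, 28, 28, 1)
```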

@konan-ai

Does anyone know how to save and load the base model?

The ParametricUMAP module is useless without this functionality.

@FareedFarag

FareedFarag commented Oct 18, 2023

I found a quick fix for this issue if you're experiencing the "FileNotFoundError: Unsuccessful TensorSliceReader constructor: ..." error.

The error indicates that Keras models shouldn't be pickled and should instead be saved in a supported format like .keras, .h5, etc. The issue arises when the library tries to pickle the whole ParametricUMAP (embedder) object. The encoder/decoder and the parametric UMAP models themselves, however, are saved without issues.
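For context, the should_pickle check in umap.parametric_umap probes each attribute with a base64-encoded pickle round-trip, and skips any attribute that fails it. A minimal sketch of that probe on plain objects (the roundtrips helper name is mine, not part of umap):

```python
import pickle
import codecs

def roundtrips(val):
    """Return True if val survives the base64 pickle round-trip used by should_pickle."""
    try:
        # pickle, base64-encode, then decode and unpickle again
        pickled = codecs.encode(pickle.dumps(val), "base64").decode()
        pickle.loads(codecs.decode(pickled.encode(), "base64"))
    except Exception:
        return False
    return True

print(roundtrips({"a": 1}))     # True: plain data pickles fine
print(roundtrips(lambda x: x))  # False: lambdas cannot be pickled
```

Attributes holding Keras models can raise exceptions during this round-trip that the original narrow exception list does not cover.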

The should_pickle function doesn't account for the FileNotFoundError exception, which in turn results in the script crashing in the try block without catching the exception. Here is a slightly modified version of the function:

import umap
import pickle, codecs
import tensorflow as tf
from numba import TypingError
from warnings import warn

def should_pickle(key, val):
    """
    Checks whether a dictionary item can be pickled.

    Parameters
    ----------
    key : str
        key for the dictionary element
    val : object
        element of the dictionary

    Returns
    -------
    picklable : bool
        whether the dictionary item can be pickled
    """
    try:
        ## make sure the object can be pickled and then re-read
        # pickle object
        pickled = codecs.encode(pickle.dumps(val), "base64").decode()
        # unpickle object
        unpickled = pickle.loads(codecs.decode(pickled.encode(), "base64"))
    except (
        pickle.PicklingError,
        tf.errors.InvalidArgumentError,
        TypeError,
        tf.errors.InternalError,
        tf.errors.NotFoundError,
        OverflowError,
        TypingError,
        AttributeError,
    ) as e:
        warn("Did not pickle {}: {}".format(key, e))
        return False
    except ValueError as e:
        warn(f"Failed at pickling {key}:{val} due to {e}")
        return False
    except FileNotFoundError as e:
        warn(f"Failed at pickling {key}:{val} due to {e}")
        return False
    return True

Then override the function as follows:
umap.parametric_umap.should_pickle = should_pickle

Now you should be able to save the model as follows:
embedder.save(save_location=filePath)

When you load the model back using umap.parametric_umap.load_ParametricUMAP(), you should be able to transform new samples. One thing that doesn't get pickled is the weights of the parametric UMAP's optimizer. However, you don't need them for transforming new samples, since the embedding has already been fit.
