
Request for dependency update, especially 'protobuf <= 3.20.3' #98

Open
DENGARDEN opened this issue Sep 10, 2024 · 4 comments

@DENGARDEN
Hi, I'm enjoying your genet package for DeepPrime inference, but its older dependencies are limiting its extensibility.
In particular, the "protobuf<=3.20.3" pin is causing problems for my use cases.
From what I can find, the 3.20.x line has reached end-of-support status, so it should be updated for continued use.
Can I request a dependency update, or may I post a pull request for this?

Thank you for your contribution to open science!

@Goosang-Yu
Owner

Thank you for the great suggestion!

The reason I restricted the protobuf version is to ensure compatibility with TensorFlow <= 2.10. I’m currently considering a few aspects of this. TensorFlow no longer supports GPU on native Windows from version 2.11 onward. Although genet currently does not support Windows, I plan to extend that support in the future, which is why I’m maintaining TensorFlow version compatibility for now.

If you have any ideas or code suggestions related to this, I would really appreciate your input!

@DENGARDEN
Author

Then the core issue is expanding support to native Windows users who need GPU support (without WSL2).
How about adapting the inference stage to ONNX Runtime for native Windows users?
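
For illustration, here is a minimal sketch of what inference through ONNX Runtime could look like on native Windows. The model file name, input name, and input shape below are placeholders, not anything genet currently ships:

```python
# Hypothetical sketch: running an exported ONNX model with onnxruntime.
# "deepspcas9.onnx" and the (1, 30, 4) one-hot input shape are assumptions
# for illustration only.
import numpy as np
import onnxruntime as ort

# CPUExecutionProvider works everywhere; on native Windows with a CUDA GPU,
# "CUDAExecutionProvider" could be listed first instead.
session = ort.InferenceSession(
    "deepspcas9.onnx",
    providers=["CPUExecutionProvider"],
)

input_name = session.get_inputs()[0].name
dummy_input = np.zeros((1, 30, 4), dtype=np.float32)  # assumed one-hot 30-nt sequence

outputs = session.run(None, {input_name: dummy_input})
print(outputs[0])
```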

@Goosang-Yu
Owner

Sounds like a great idea!

I haven't used ONNX before, so could you explain how to use it after the conversion? It would be a good idea to start by converting only the DeepSpCas9 model and comparing the results with the TensorFlow version to see if the values match. If you convert DeepSpCas9 to ONNX and share it with me, I will compare the results with the original model.

I've heard that sometimes the output values might slightly change after converting to the ONNX format. I'm curious to know how much of a difference this might make. Have you encountered this in your experience?
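
As a starting point, a minimal sketch of that comparison could look like the following, assuming the DeepSpCas9 model can be loaded as a Keras model and tf2onnx is available. The paths, input name, and shapes are placeholders:

```python
# Hypothetical sketch of the convert-then-compare step. "deepspcas9_keras"
# is a placeholder path, not genet's actual model file.
import numpy as np
import tensorflow as tf
import tf2onnx
import onnxruntime as ort

keras_model = tf.keras.models.load_model("deepspcas9_keras")  # placeholder

# Export to ONNX with an explicit input signature.
spec = (tf.TensorSpec(keras_model.inputs[0].shape, tf.float32, name="input"),)
tf2onnx.convert.from_keras(keras_model, input_signature=spec,
                           opset=13, output_path="deepspcas9.onnx")

# Run both backends on the same random batch and compare outputs.
x = np.random.rand(8, *keras_model.inputs[0].shape[1:]).astype(np.float32)
tf_out = keras_model(x).numpy()

sess = ort.InferenceSession("deepspcas9.onnx",
                            providers=["CPUExecutionProvider"])
onnx_out = sess.run(None, {sess.get_inputs()[0].name: x})[0]

# Small float32 discrepancies are normal after conversion; differences
# within roughly 1e-5 would usually be considered a match.
print("max abs diff:", np.max(np.abs(tf_out - onnx_out)))
```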

@DENGARDEN
Author

Apologies, as I have limited experience with TensorFlow, this was just one strategy I considered. Successfully converting to ONNX would be beneficial, given the numerous "great legacy" prediction models available for CRISPR systems. I'll attempt it and post the outcomes here. :)
