
Is there a plan to fix the thread-safety problem in inference? #26

Open

YiFang99 opened this issue Jul 17, 2024 · 3 comments
Labels
inference Something about inference

Comments

@YiFang99

No description provided.

@YiFang99
Author

Is this model going to support batch generation for multimodal inputs?

@JoyBoy-Su
Collaborator

Thanks a lot for your interest! We will add this to our TODO list!

@JoyBoy-Su added the inference (Something about inference) label on Jul 17, 2024
@YiFang99
Author

Really great work! Looking forward to more modules!
