Issues: mlc-ai/web-llm
Issues list
[Tracking][WebLLM] Function calling (beta) and Embeddings
#526 · opened Aug 4, 2024 by CharlieFRuan · 5 of 7 tasks
Phi 3 Mini output near random (Phi-3-mini-4k-instruct-q4f16_1-MLC)
#519 · opened Jul 28, 2024 by cdrini
Custom model outputs garbage in Firefox Nightly, works fine in Chrome
#518 · opened Jul 26, 2024 by gulan28
Running LLM in a web worker fails due to loglevel dependency
label: status: tracking (tracking work in progress)
#511 · opened Jul 21, 2024 by jauniusmentimeter
How to let the user cancel loading the model and stop it from fetching params
#499 · opened Jul 11, 2024 by JohnReginaldShutler
Microsoft just released a more capable new version of Phi 3 Mini
#495 · opened Jul 2, 2024 by flatsiedatsie
Inconsistent and unreliable outputs on mobile as opposed to on PC/laptop for -1k models
#485 · opened Jun 20, 2024 by JohnReginaldShutler