
Request for non-quantized pre-trained models #53

Closed
yf-fan-org opened this issue Jan 17, 2025 · 3 comments

yf-fan-org commented Jan 17, 2025

Is it possible to provide a non-quantized model? I have found that the quantized models sometimes don't work well in real tests, and I'd like to test how accurate the non-quantized version is.

@yf-fan-org (Author)

My native language is not English, and detection sometimes fails when my pronunciation is not quite right. I wondered if I could improve it by adding some additional audio data. Thank you very much for your answer! @kahrendt


Nold360 commented Jan 17, 2025

I wonder if fine-tuning the models would be possible... 🤔 Maybe that would help?
#38

@kahrendt (Owner)

In my benchmarking tests, quantization doesn't reduce model accuracy in a consistent way. Sometimes the quantized model performs slightly better than the floating-point model and sometimes slightly worse. By slightly, I mean the probability cutoff needed to hold the false rejection rate to 5% differs by about 0.01.

There isn't an easy way to run the non-quantized models; they won't work on an ESP32. They really only work using the test functions in the mWW code itself.
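For anyone who wants to reproduce that kind of comparison off-device, here is a rough sketch of how one might run a float and a quantized .tflite model on the same positive samples and find each model's probability cutoff for a 5% false rejection rate. The model file names, the `positive_features.npy` input, and the feature shapes are assumptions for illustration, not microWakeWord's actual test utilities.

```python
# Rough sketch (assumed file names and feature shapes), not microWakeWord's
# own test code: compare a float and a quantized TFLite model on the same
# positive wake-word samples and report each model's probability cutoff for
# a 5% false rejection rate.
import numpy as np
import tensorflow as tf


def run_model(model_path, features):
    """Run a TFLite classifier over a batch of feature windows, one at a time."""
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    in_scale, in_zp = inp["quantization"]
    out_scale, out_zp = out["quantization"]

    probs = []
    for window in features:
        x = window[np.newaxis, ...].astype(np.float32)
        if in_scale:  # quantize the input for int8 models
            x = np.round(x / in_scale + in_zp)
        interpreter.set_tensor(inp["index"], x.astype(inp["dtype"]))
        interpreter.invoke()
        y = interpreter.get_tensor(out["index"]).astype(np.float32).squeeze()
        if out_scale:  # de-quantize the output for int8 models
            y = (y - out_zp) * out_scale
        probs.append(float(y))
    return np.array(probs)


# Hypothetical pre-computed feature windows for true wake-word utterances.
positive_features = np.load("positive_features.npy")

for name, path in [("float", "wake_word_float.tflite"),
                   ("quantized", "wake_word_quantized.tflite")]:
    probs = run_model(path, positive_features)
    # The cutoff that still accepts 95% of true samples, i.e. a 5% FRR.
    cutoff = np.percentile(probs, 5)
    print(f"{name}: cutoff for 5% false rejection rate ~ {cutoff:.3f}")
```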
