confidence metric broken #13
Labels
bug
good first issue
help wanted
In commit 3836f92, we switched from running embedding inference on the server to using OpenAI embeddings, since we don't have enough space on Render.

Since the switch, the confidence metric seems to be much smaller and lower-variance on average. Tweak the current formulas so that the confidence metric displayed in the inbox makes sense and is reasonably normalized.
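One possible starting point, as a minimal sketch: this assumes the confidence is derived from cosine similarity between a pair of embeddings, and that OpenAI embeddings concentrate similarities in a narrow high band (which would explain the compressed range). The `SIM_FLOOR`/`SIM_CEIL` bounds below are hypothetical placeholders to be calibrated against our actual similarity distribution, not values from the codebase.

```python
import numpy as np

# Hypothetical bounds; calibrate against the observed similarity
# distribution. OpenAI embedding cosine similarities tend to cluster
# in a narrow high band, which compresses a raw-similarity confidence.
SIM_FLOOR = 0.70  # similarity at or below this maps to confidence 0
SIM_CEIL = 0.95   # similarity at or above this maps to confidence 1

def confidence(query_vec: np.ndarray, ref_vec: np.ndarray) -> float:
    """Cosine similarity rescaled to a better-spread [0, 1] confidence."""
    sim = float(
        np.dot(query_vec, ref_vec)
        / (np.linalg.norm(query_vec) * np.linalg.norm(ref_vec))
    )
    # Affine rescale into [0, 1], clamped at both ends.
    return min(1.0, max(0.0, (sim - SIM_FLOOR) / (SIM_CEIL - SIM_FLOOR)))
```

Percentile-based bounds computed from a sample of real similarities would likely be more robust than hand-picked constants.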