Replies: 1 comment 4 replies
- @MacBeef thanks for the feedback 😃 V2 has a local embedding model, so the Smart View feature will work offline. Still working on the local chat model though 🌴
- I really hope Smart Connections gets LM server integration so I can use LocalAI. Obsidian should be able to run offline imo.
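
  For what it's worth, LocalAI exposes an OpenAI-compatible API, so an integration would mostly mean letting the plugin point its client at a local base URL. A rough sketch of the client side (the port, endpoint, and model name are placeholders for whatever a local LocalAI instance serves, not anything Smart Connections supports today):

  ```ts
  // Sketch only: calling a LocalAI server through its OpenAI-compatible API.
  // Assumes the `openai` npm package; base URL and model name are placeholders.
  import OpenAI from "openai";

  const client = new OpenAI({
    baseURL: "http://localhost:8080/v1", // assumed LocalAI endpoint (default port)
    apiKey: "not-needed",                // LocalAI does not require a real key by default
  });

  async function main() {
    const completion = await client.chat.completions.create({
      model: "local-model", // hypothetical name of a model loaded in LocalAI
      messages: [{ role: "user", content: "Summarize the note I have open." }],
    });
    console.log(completion.choices[0].message.content);
  }

  main().catch(console.error);
  ```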