Chat with Documents #166

Open
Utopiah opened this issue Jan 13, 2025 · 1 comment

Comments

@Utopiah

Utopiah commented Jan 13, 2025

Hi there, how can the equivalent of "Chat with Documents" as explained in the documentation https://lmstudio.ai/docs/basics/rag be done here?

@ryan-the-crayon
Contributor

This functionality is still in active development. That said, it is possible today, though not yet well documented. See this script presented at the NYU hackathon event:

import { LMStudioClient } from "@lmstudio/sdk";
import { readFile } from "fs/promises";
import { basename } from "path";
import { createInterface } from "readline/promises";

const client = new LMStudioClient();

// The document to chat with.
const documentPath = "./Frozen Script.txt";
const rl = createInterface({
    input: process.stdin,
    output: process.stdout,
});

const question = await rl.question("Enter a question: ");
rl.close();

// Upload the document as a temporary file so it can be used for retrieval.
const document = await client.files.uploadTempFile(
    basename(documentPath),
    await readFile(documentPath)
);

// Load an embedding model and retrieve the passages most relevant to the question.
const nomic = await client.embedding.getOrLoad("text-embedding-nomic-embed-text-v1.5@q8_0");
const results = await client.retrieval.retrieve(question, [document], {
    embeddingModel: nomic,
});

// Build a prompt that grounds the answer in the top retrieved citation.
const prompt = `
Given the following citation, please answer the user's question:

----- Citation -----
${results.entries[0].content}
----- End of Citation -----

Question: ${question}
`;

// Load an LLM and stream its answer to stdout as it is generated.
const llama = await client.llm.getOrLoad("llama-3.2-3b-instruct");
const prediction = llama.respond([
    {
        role: "user",
        content: prompt,
    },
]);

for await (const { content } of prediction) {
    process.stdout.write(content);
}

// After the stream finishes, print prediction statistics.
const { stats } = await prediction;
console.info("\nStats:");
console.info(stats);
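The script above puts only the single top citation into the prompt. If you want the answer grounded in several retrieved passages, a small pure helper can stitch the top-N entries into the prompt instead. This is a sketch; it assumes each retrieval entry exposes a `content` string, as in the script above, and the helper name is hypothetical:

```typescript
// Build a prompt that includes several retrieved citations (hypothetical helper).
// `citations` would typically be results.entries.map((e) => e.content).
function buildPrompt(citations: string[], question: string): string {
    // Number each citation block so the model can reference them.
    const blocks = citations
        .map((c, i) => `----- Citation ${i + 1} -----\n${c}`)
        .join("\n");
    return `Given the following citations, please answer the user's question:\n\n${blocks}\n----- End of Citations -----\n\nQuestion: ${question}`;
}
```

You could then call it with something like `buildPrompt(results.entries.slice(0, 3).map((e) => e.content), question)` in place of the single-citation template.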
