This repository is the TypeScript version of mistralai/client-js.
You can use the Mistral JavaScript/TypeScript client to interact with the Mistral AI API.
The examples in this repository are still the official examples.
You can install the library in your project using:
npm i mistralai-ts
You can watch a free course on using the Mistral JavaScript client here.
import MistralClient from 'mistralai-ts';
const apiKey = process.env.MISTRAL_API_KEY || 'your_api_key';
const client = new MistralClient(apiKey);
Or in CommonJS:
const MistralClient = require('mistralai-ts');
const apiKey = process.env.MISTRAL_API_KEY || 'your_api_key';
const client = new MistralClient(apiKey);
// List available models
const listModelsResponse = await client.listModels();
const listModels = listModelsResponse.data;
listModels.forEach((model) => {
  console.log('Model:', model);
});
// Chat with streaming
const chatStreamResponse = await client.chatStream({
  model: 'mistral-tiny',
  messages: [{role: 'user', content: 'What is the best French cheese?'}],
});
console.log('Chat Stream:');
for await (const chunk of chatStreamResponse) {
  if (chunk.choices[0].delta.content !== undefined) {
    const streamText = chunk.choices[0].delta.content;
    process.stdout.write(streamText);
  }
}
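If you also want the complete reply as a single string, you can accumulate the streamed deltas. This is a minimal sketch that reuses the chatStream call and chunk shape shown above.
// Sketch: accumulate streamed deltas into one string
let fullReply = '';
const accumulatedStream = await client.chatStream({
  model: 'mistral-tiny',
  messages: [{role: 'user', content: 'What is the best French cheese?'}],
});
for await (const chunk of accumulatedStream) {
  // Each chunk carries an incremental delta, as in the streaming example above
  fullReply += chunk.choices[0].delta.content ?? '';
}
console.log('Full reply:', fullReply);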
// Chat without streaming
const chatResponse = await client.chat({
  model: 'mistral-tiny',
  messages: [{role: 'user', content: 'What is the best French cheese?'}],
});
console.log('Chat:', chatResponse.choices[0].message.content);
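If this port mirrors the official JavaScript client, chat() also accepts optional sampling parameters. The option names used below (temperature, maxTokens, randomSeed) are assumptions taken from the official client and may differ in this repository.
// Sketch: chat with optional sampling parameters (option names assumed from the official client)
const sampledChatResponse = await client.chat({
  model: 'mistral-tiny',
  messages: [{role: 'user', content: 'What is the best French cheese?'}],
  temperature: 0.5, // assumed option name
  maxTokens: 128, // assumed option name
  randomSeed: 42, // assumed option name
});
console.log('Chat (sampled):', sampledChatResponse.choices[0].message.content);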
// Create embeddings for a batch of inputs
const input = [];
for (let i = 0; i < 1; i++) {
  input.push('What is the best French cheese?');
}
const embeddingsBatchResponse = await client.embeddings({
  model: 'mistral-embed',
  input: input,
});
console.log('Embeddings Batch:', embeddingsBatchResponse.data);
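As a quick sanity check of what comes back, the sketch below inspects the first embedding and defines a small cosine-similarity helper. It assumes each item in embeddingsBatchResponse.data exposes an embedding number array, as in the Mistral API response; the helper itself is plain JavaScript, not part of the client.
// Sketch: inspect the first embedding and compute cosine similarity
// (assumes each item in embeddingsBatchResponse.data has an `embedding` number array)
const cosineSimilarity = (a, b) => {
  const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
  const norm = (v) => Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
  return dot / (norm(a) * norm(b));
};
const [firstEmbedding] = embeddingsBatchResponse.data;
console.log('Embedding dimension:', firstEmbedding.embedding.length);
console.log('Self-similarity (should be ~1):',
    cosineSimilarity(firstEmbedding.embedding, firstEmbedding.embedding));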
import fs from 'fs';

// Create a new file
const file = fs.readFileSync('file.jsonl');
const createdFile = await client.files.create({ file });
// List files
const files = await client.files.list();
// Retrieve a file
const retrievedFile = await client.files.retrieve({ fileId: createdFile.id });
// Delete a file
const deletedFile = await client.files.delete({ fileId: createdFile.id });
// Create a new fine-tuning job
// (trainingFile and validationFile are files previously uploaded with client.files.create)
const createdJob = await client.jobs.create({
  model: 'open-mistral-7b',
  trainingFiles: [trainingFile.id],
  validationFiles: [validationFile.id],
  hyperparameters: {
    trainingSteps: 10,
    learningRate: 0.0001,
  },
});
// List jobs
const jobs = await client.jobs.list();
// Retrieve a job
const retrievedJob = await client.jobs.retrieve({ jobId: createdJob.id });
// Cancel a job
const canceledJob = await client.jobs.cancel({ jobId: createdJob.id });
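If you need to wait for a fine-tuning job to finish before using it, a simple polling loop over jobs.retrieve works. The status field and its values below are assumptions based on the Mistral fine-tuning API and may be named differently in this port.
// Sketch: poll a job until it leaves the queued/running states
// (the `status` values are assumptions based on the Mistral fine-tuning API)
let polledJob = await client.jobs.retrieve({ jobId: createdJob.id });
while (polledJob.status === 'QUEUED' || polledJob.status === 'RUNNING') {
  // Wait 10 seconds between polls to avoid hammering the API
  await new Promise((resolve) => setTimeout(resolve, 10000));
  polledJob = await client.jobs.retrieve({ jobId: createdJob.id });
}
console.log('Job finished with status:', polledJob.status);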
You can run the examples in the examples directory by installing them locally:
cd examples
npm install .
Running the examples requires a Mistral AI API key.
Get your own Mistral API Key: https://docs.mistral.ai/#api-access
MISTRAL_API_KEY='your_api_key' node chat_with_streaming.js
Set your Mistral API Key as an environment variable. You only need to do this once.
# set Mistral API Key (using zsh for example)
$ echo 'export MISTRAL_API_KEY=[your_api_key]' >> ~/.zshenv
# reload the environment (or just quit and open a new terminal)
$ source ~/.zshenv
You can then run the examples without appending the API key:
node chat_with_streaming.js
After the environment variable is set, the client will pick up MISTRAL_API_KEY on its own:
import MistralClient from 'mistralai-ts';
const client = new MistralClient();
- 20240705---synced progress from the last PR
- 20240506---initialized the project from the official JS version