# v3.4.0: Documentation, support for OpenAI o1 models, general updates
This release includes new features, new documentation, and bug fixes.
## New Features
### Support for OpenAI® o1 models
You can now use the OpenAI models o1-mini and o1-preview to generate text from MATLAB®. When you create an `openAIChat` object, set the `ModelName` name-value argument to `"o1-mini"` or `"o1-preview"`.
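For example, a minimal sketch (this assumes your OpenAI API key is available in the `OPENAI_API_KEY` environment variable, and that the prompt is purely illustrative):

```matlab
% Create a chat object backed by the o1-mini model.
chat = openAIChat(ModelName="o1-mini");

% Generate text from a prompt.
txt = generate(chat, "List three applications of linear algebra.");
```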
### Temporarily override model parameters when generating text
You can now set model parameters such as `MaxNumTokens` and `ResponseFormat` for a single API call by using the corresponding name-value arguments of the `generate` function. The `generate` function then uses the specified parameter for text generation instead of the corresponding model parameter of the `openAIChat`, `azureChat`, or `ollamaChat` input.
For a full list of supported model parameters, see `generate`.
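As a sketch of how a per-call override might look (the prompt and token limit are illustrative; the chat object's own `MaxNumTokens` setting is left untouched):

```matlab
% Chat object created with its default model parameters.
chat = openAIChat("You are a concise assistant.");

% Override MaxNumTokens for this one call only.
txt = generate(chat, "Summarize the plot of Hamlet.", MaxNumTokens=100);
```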
### Support for min-p sampling in Ollama™
You can now set the minimum probability ratio to tune the frequency of improbable tokens when generating text using Ollama models. You can do this in two different ways:
- When you create an `ollamaChat` object, specify the `MinP` name-value argument.
- When you generate text using the `generate` function with an `ollamaChat` object, specify the `MinP` name-value argument.
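Both approaches might look like this (a sketch, assuming a local Ollama server with the `"mistral"` model pulled; the model name and `MinP` values are illustrative):

```matlab
% Option 1: set MinP when constructing the chat object.
chat = ollamaChat("mistral", MinP=0.05);

% Option 2: override MinP for a single call to generate.
txt = generate(chat, "Write a haiku about autumn.", MinP=0.1);
```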
## New Documentation
There is now detailed documentation for the features included in LLMs with MATLAB:
- `openAIChat`
- `azureChat`
- `ollamaChat`
- `generate`
- `openAIFunction`
- `addParameter`
- `openAIImages`
- `openAIImages.generate`
- `edit`
- `createVariation`
- `messageHistory`
- `addSystemMessage`
- `addUserMessage`
- `addUserMessageWithImages`
- `addToolMessage`
- `addResponseMessage`
- `removeMessage`
You can find these pages in a new `functions` directory inside the `doc` directory.
Full Changelog: v3.3.0...v3.4.0