
v3.4.0: Documentation, support for OpenAI o1 models, general updates

@ccreutzi ccreutzi released this 27 Sep 14:35
· 40 commits to main since this release

This release includes new features, new documentation, and bug fixes.

New Features

Support for OpenAI® o1 models

You can now use the OpenAI models o1-mini and o1-preview to generate text from MATLAB®. When you create an openAIChat object, set the ModelName name-value argument to "o1-mini" or "o1-preview".
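As a minimal sketch (assuming a valid OpenAI API key is available in your environment), selecting an o1 model might look like this:

```matlab
% Create a chat object backed by the o1-mini model.
chat = openAIChat(ModelName="o1-mini");

% Generate text as usual.
txt = generate(chat, "Why is the sky blue?");
```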

Temporarily override model parameters when generating text

You can now set model parameters such as MaxNumTokens and ResponseFormat for a single API call by using the corresponding name-value arguments of the generate function. The generate function then uses the specified value for text generation instead of the corresponding model parameter of the openAIChat, azureChat, or ollamaChat input.

For a full list of supported model parameters, see generate.
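For example, a per-call override might look like the following sketch (assuming a configured OpenAI API key; the prompt is illustrative):

```matlab
% Create a chat object with default model parameters.
chat = openAIChat("You are a helpful assistant.");

% Override the maximum number of generated tokens for this call only.
% The MaxNumTokens property of the chat object is left unchanged.
txt = generate(chat, "Summarize special relativity.", MaxNumTokens=50);
```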

Support for min-p sampling in Ollama™

You can now set the minimum probability ratio to tune the frequency of improbable tokens when generating text using Ollama models. You can do this in two different ways:

  1. When you create an ollamaChat object, specify the MinP name-value argument.
  2. When you generate text using the generate function with an ollamaChat object, specify the MinP name-value argument.
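Both options above can be sketched as follows (assuming a local Ollama server with the mistral model pulled; the model name and MinP values are illustrative):

```matlab
% Option 1: set MinP when creating the chat object.
chat = ollamaChat("mistral", MinP=0.05);

% Option 2: override MinP for a single generate call.
txt = generate(chat, "Write a haiku about autumn.", MinP=0.1);
```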

New Documentation

There is now detailed documentation for the features included in LLMs with MATLAB. You can find these pages in a new functions directory inside the doc directory.

Full Changelog: v3.3.0...v3.4.0