What Would You Like to See with the Gateway?
Some Mistral models are supported through Vertex AI's OpenAI-compatible endpoint, whereas others, such as mistral-large, need the request body transformed into the Mistral format and invoked via the rawPredict endpoint (url="https://$GOOGLE_REGION-aiplatform.googleapis.com/v1/projects/$GOOGLE_PROJECT_ID/locations/$GOOGLE_REGION/publishers/mistralai/models/$MODEL:rawPredict").

See this documentation for reference: https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/call-vertex-using-openai-library
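For illustration, here is a minimal sketch of what such a rawPredict call could look like. It assumes the Mistral-native chat shape ({ model, messages }) and a standard OAuth2 access token for Vertex AI; the model ID and request fields are placeholders and may differ per model, so treat this as a sketch rather than the final implementation.

```typescript
// Hypothetical sketch of invoking mistral-large on Vertex AI via rawPredict.
// Placeholders (region, projectId, accessToken, model) must be filled in; the
// body uses the Mistral-native { model, messages } shape rather than the
// OpenAI-compatible one, which is the transformation the gateway would apply.
const region = "us-central1";                // $GOOGLE_REGION
const projectId = "my-project";              // $GOOGLE_PROJECT_ID
const model = "mistral-large";               // illustrative model ID
const accessToken = "<oauth2-access-token>"; // e.g. from `gcloud auth print-access-token`

const url =
  `https://${region}-aiplatform.googleapis.com/v1/projects/${projectId}` +
  `/locations/${region}/publishers/mistralai/models/${model}:rawPredict`;

const response = await fetch(url, {
  method: "POST",
  headers: {
    Authorization: `Bearer ${accessToken}`,
    "Content-Type": "application/json",
  },
  // Request body already transformed to the Mistral format.
  body: JSON.stringify({
    model,
    messages: [{ role: "user", content: "Hello from the gateway" }],
  }),
});

console.log(await response.json());
```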
Changes needed:
- Self-deployed models that are OpenAI-compliant are already supported via the endpoints.endpointId format for specifying the model ID; no changes are needed there.
- Use the existing Mistral integration transformers to transform the request and response for the larger Mistral models (see the sketch after this list).
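As a purely illustrative sketch of the routing decision (the type and function names below are hypothetical placeholders, not the gateway's actual API), the Vertex provider could keep the OpenAI-compatible path for endpoints.endpointId models and switch to rawPredict plus the existing Mistral transformers for models like mistral-large:

```typescript
// Purely illustrative routing sketch -- VertexRoute and pickVertexRoute are
// hypothetical names, not the gateway's real API.
type VertexRoute =
  | { kind: "openai-compatible" }                     // body passes through unchanged
  | { kind: "rawPredict"; publisher: "mistralai" };   // body needs the Mistral transform

function pickVertexRoute(model: string): VertexRoute {
  // Self-deployed models referenced as endpoints.<endpointId> already work
  // against the OpenAI-compatible endpoint, so leave them untouched.
  if (model.startsWith("endpoints.")) return { kind: "openai-compatible" };

  // Larger partner models such as mistral-large go through :rawPredict,
  // reusing the existing Mistral request/response transformers.
  if (model.startsWith("mistral-large")) {
    return { kind: "rawPredict", publisher: "mistralai" };
  }

  return { kind: "openai-compatible" };
}
```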
Context for your Request

No response
Your Twitter/LinkedIn
No response