IBM watsonx.ai Models
See Model Configuration for a full list of available model properties.
Overview
Use the procedure outlined below to configure and use IBM watsonx.ai models within Profound AI.
Selecting a Model to Use
IBM watsonx.ai provides various models to choose from. The model you select should be based on the capabilities you intend to use within Profound AI:
Granite models are best suited for fast backend / batch processing via the Ask AI Agent plugin.
For an interactive chat experience that includes Data Access, Data Analytics, and custom Agent Routines, use a larger model such as mistralai/mistral-large.
Install IBM watsonx.ai Node.js SDK
From the command line within your Profound AI installation directory, issue the following command:
npm install @ibm-cloud/watsonx-ai
Steps
Log in to IBM watsonx.ai.
Create or select a project.
In the project, generate a new API key.
Edit your Profound AI configuration file to add a model with a watsonx provider property.
If the apiVersion and endpoint properties are not specified in the configuration, default IBM watsonx.ai values will be used for those properties.
Example
The example below shows a sample configuration for the mistralai/mistral-large IBM watsonx.ai model:
models: {
  "IBM watsonx": {
    provider: "watsonx",
    model: "mistralai/mistral-large",
    apiKey: "...",
    projectId: "....",
    stream: true
  }
}
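As a sketch of the optional properties mentioned in the steps above, the fragment below configures a Granite model (suited to the Ask AI Agent plugin) with apiVersion and endpoint made explicit rather than left to the defaults. The entry name, the model ID ibm/granite-3-8b-instruct, the apiVersion date, and the regional endpoint URL are illustrative assumptions, not confirmed values; check your watsonx.ai project's model catalog and region before using them.

```javascript
models: {
  // Hypothetical entry; assumes ibm/granite-3-8b-instruct is available
  // in your watsonx.ai project -- verify the exact model ID in the catalog.
  "IBM Granite": {
    provider: "watsonx",
    model: "ibm/granite-3-8b-instruct",
    apiKey: "...",
    projectId: "...",
    // Optional: if omitted, default IBM watsonx.ai values are used.
    // Both values below are illustrative assumptions.
    apiVersion: "2024-05-31",
    endpoint: "https://us-south.ml.cloud.ibm.com",
    stream: true
  }
}
```

If you set endpoint explicitly, use the regional URL that matches where your watsonx.ai project was created.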