IBM watsonx.ai Models
Set Up an Account
Log in to IBM watsonx.ai.
Create or select a project.
Create an API Key.
Install the IBM watsonx.ai Node.js SDK
From the command line, within your Profound AI installation directory, issue the following command:
npm install @ibm-cloud/watsonx-ai
Configure the Model
Edit your Profound AI configuration file to add a model with a “watsonx” provider.
For example:
models: {
  "IBM watsonx": {
    provider: "watsonx",
    model: "mistralai/mistral-large",
    apiKey: "...",       // your IBM Cloud API key
    projectId: "...",    // your watsonx.ai project ID
    stream: true
  }
}
If the apiVersion and endpoint properties are not specified in the configuration, default watsonx.ai values are used for those properties.
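If you need to pin these values, for instance to target a specific region, both properties can be set directly on the model entry. The version date and endpoint URL below are illustrative only; use the values that apply to your IBM Cloud account and region.

models: {
  "IBM watsonx": {
    provider: "watsonx",
    model: "mistralai/mistral-large",
    apiKey: "...",
    projectId: "...",
    stream: true,
    apiVersion: "2024-05-31",                       // example API version date
    endpoint: "https://us-south.ml.cloud.ibm.com"   // example regional endpoint
  }
}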
Select a Model
IBM watsonx.ai provides a variety of models. Choose a model based on the capabilities you intend to use within Profound AI.
Granite models are best suited for fast backend / batch processing via the Ask AI Agent plugin.
For an interactive chat experience that includes Data Access, Data Analytics, and custom Agent Routines, use a larger model, such as “mistralai/mistral-large”.
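For example, you might register both kinds of models side by side and pick the appropriate one per task. The Granite model ID below is illustrative; confirm the exact model IDs available in your watsonx.ai project.

models: {
  "IBM Granite": {
    provider: "watsonx",
    model: "ibm/granite-13b-chat-v2",   // illustrative Granite model ID
    apiKey: "...",
    projectId: "...",
    stream: true
  },
  "IBM watsonx": {
    provider: "watsonx",
    model: "mistralai/mistral-large",   // larger model for interactive chat
    apiKey: "...",
    projectId: "...",
    stream: true
  }
}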