IBM watsonx.ai Models

See Model Configuration for a full list of available model properties.

Overview

Use the procedure outlined below to configure and use IBM watsonx.ai models within Profound AI.

Selecting a Model to Use

IBM watsonx.ai provides various models to choose from. The model you select should be based on the capabilities you intend to use within Profound AI:

  • Granite models are best suited for fast backend / batch processing via the Ask AI Agent plugin.

  • For an interactive chat experience that includes Data Access, Data Analytics, and custom Agent Routines, use a larger model such as mistralai/mistral-large.

Install IBM watsonx.ai Node.js SDK

From the command line within your Profound AI installation directory, issue the following command:

npm install @ibm-cloud/watsonx-ai

Steps

  1. Log in to IBM watsonx.ai.

  2. Create or select a project.

  3. In the project, generate a new API key.

  4. Edit your Profound AI configuration file to add a model with a watsonx provider property.

    Note: If the apiVersion and endpoint properties are not specified in the configuration, the default IBM watsonx.ai values are used for those properties.
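If you prefer to pin apiVersion and endpoint explicitly rather than rely on the defaults, a model entry might look like the following sketch. The version date and regional URL shown are illustrative assumptions, not values guaranteed by Profound AI; use the API version and regional endpoint that apply to your IBM Cloud project.

```javascript
// Sketch: pinning apiVersion and endpoint explicitly.
// The date and regional URL below are assumed example values --
// substitute the ones that match your IBM Cloud region and API version.
models: {
  "IBM watsonx": {
    provider: "watsonx",
    model: "mistralai/mistral-large",
    apiKey: "...",
    projectId: "....",
    apiVersion: "2024-05-31",                      // assumed example version date
    endpoint: "https://us-south.ml.cloud.ibm.com"  // assumed regional endpoint
  }
}
```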

Example

The example below shows a sample configuration for the mistralai/mistral-large IBM watsonx model:

models: {
  "IBM watsonx": {
    provider: "watsonx",
    model: "mistralai/mistral-large",
    apiKey: "...",
    projectId: "....",
    stream: true
  }
}