...
```js
models: {
  "My Custom Model": {
    endpoint: "https://myserver",
    apiFormat: "completion",
    model: "Llama-2-70b-chat-hf-function-calling-v2",
    apiKey: "......"
  }
}
```
The configuration above assumes that the model's API follows the OpenAI Completion API format. This applies, for example, if you're hosting an open-source large language model locally with a tool like Ollama.
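As an illustration, here is a sketch of the same `models` entry pointed at a locally hosted model. The endpoint, model name, and placeholder API key are assumptions based on Ollama's defaults (its OpenAI-compatible API is served at `http://localhost:11434/v1`), not values prescribed by this guide; adjust them to match your own deployment.

```js
// Minimal sketch: the same "models" entry pointed at a local Ollama server.
// Endpoint, model name, and apiKey below are assumptions; adjust to your setup.
models: {
  "Local Llama": {
    endpoint: "http://localhost:11434/v1",
    apiFormat: "completion",
    model: "llama2",
    apiKey: "not-needed" // local servers typically ignore the key, but the field may still be required
  }
}
```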
Using a custom provider
If the model's API requires uniquely formatted input and output, you can create a custom provider. For example:
...