Configuration

Assistant configuration

When you deploy MOSTLY AI, the Assistant is included as part of the deployment of Kubernetes services and containers. For the Assistant to work, you need to configure access to the Large Language Model (LLM) and DataLLM. The Assistant can use any LLM chosen by your organization.

📑

Note: You can only configure the Assistant with the Super Admin account.

Enable the Assistant

Enabling and configuring the Assistant is a one-time operation.

Steps

  1. From the Account menu, select Assistant settings.

    MOSTLY AI - Assistant configuration
  2. On the Assistant settings page, select Enable Assistant.

  3. Adjust the remaining configuration options as described in the sections below.

  4. When done, click Save.

Configuration parameters

MOSTLY AI uses LiteLLM to integrate with different LLM providers. For a list of supported LLMs, see Providers in the LiteLLM documentation. Most of the Assistant configuration parameters are based on what LiteLLM requires.

LLM Model

For LLM Model, set the LLM model you want to use from your LLM provider, for example, openai/gpt-4o. For reference, see the available OpenAI models and Mistral AI models. You can also use any model and provider supported by LiteLLM.
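LiteLLM model identifiers follow a provider/model convention, as in openai/gpt-4o. As a minimal sketch (the helper name below is hypothetical, not part of MOSTLY AI or LiteLLM), this is how such an identifier breaks down:

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split a LiteLLM-style model identifier into (provider, model).

    LiteLLM model strings are typically prefixed with the provider name,
    e.g. "openai/gpt-4o" or "mistral/mistral-large-latest".
    """
    provider, sep, model = model_id.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider/model', got {model_id!r}")
    return provider, model

print(split_model_id("openai/gpt-4o"))  # -> ('openai', 'gpt-4o')
```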

LLM API Key

For LLM API Key, enter your LLM API key. See your LLM provider's documentation to learn how to get an API key.

LLM Extra Variables

For LLM extra variables, define any additional variables that your LLM provider requires. See the LiteLLM docs for details.

For example, AWS Bedrock requires:

  • aws_region_name=us-east-1
  • aws_access_key_id=XXX
  • aws_secret_access_key=XXX
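Each extra variable is a key=value pair. A minimal sketch of how such pairs resolve into the provider-specific keyword arguments that LiteLLM accepts (the parsing helper is hypothetical; the Bedrock model name in the comment is only an illustration):

```python
def parse_extra_vars(lines: list[str]) -> dict[str, str]:
    """Parse 'key=value' extra-variable lines into keyword arguments."""
    extra = {}
    for line in lines:
        key, sep, value = line.partition("=")
        if not sep:
            raise ValueError(f"expected 'key=value', got {line!r}")
        extra[key.strip()] = value.strip()
    return extra

bedrock_vars = parse_extra_vars([
    "aws_region_name=us-east-1",
    "aws_access_key_id=XXX",
    "aws_secret_access_key=XXX",
])

# The variables are then forwarded to LiteLLM as keyword arguments,
# roughly like the following (not run here; requires valid AWS credentials):
# import litellm
# litellm.completion(
#     model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
#     messages=[{"role": "user", "content": "Hello"}],
#     **bedrock_vars,
# )
print(bedrock_vars["aws_region_name"])  # us-east-1
```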

DataLLM API Key

For DataLLM API key, go to https://data.mostly.ai and get an API key for DataLLM.

The key is required when Data consumers at your organization need to use DataLLM to generate data from scratch (mock data) or enrich existing datasets. With the key configured, the Assistant can help them run DataLLM Python client code.
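Once the key is configured, the Assistant can help data consumers produce snippets along these lines. This is only a sketch: the request fields shown, as well as the datallm package, DataLLM class, and mock() call in the comment, are assumptions about the DataLLM Python client, not taken from this page.

```python
# Hypothetical mock-data request a data consumer might describe to the
# Assistant: generate rows from nothing, guided by per-column prompts.
mock_request = {
    "n": 10,  # number of rows to generate
    "data_description": "customers of a retail bank",
    "columns": {
        "full_name": {"prompt": "full name of the customer"},
        "age": {"prompt": "age in years", "dtype": "integer"},
    },
}

# With the configured DataLLM API key, the Assistant could turn this into
# client code roughly like (not run here; requires the client and a key):
# from datallm import DataLLM
# datallm = DataLLM(api_key="your-api-key", base_url="https://data.mostly.ai")
# df = datallm.mock(**mock_request)

print(sorted(mock_request["columns"]))  # -> ['age', 'full_name']
```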

System Instructions

The System instructions define the guidance and examples that shape how your LLM replies to questions in the context of the MOSTLY AI Assistant.

This is pre-defined by MOSTLY AI and can be adjusted if you have a specific use case.

Custom Instructions

Define a custom set of instructions for your MOSTLY AI Assistant to match the needs of your organization.

Steps

  1. Switch to Custom.
  2. Define a set of instructions that modify the existing ones defined by MOSTLY AI.

    MOSTLY AI - Assistant custom instructions
💡

Prefer adjusting the existing instructions over replacing them. Overwriting them entirely will completely change how the Assistant replies to prompts.