Updated on 2025-01-23 GMT+08:00

Configuring Information on the LLM Engine

The LLM-based session summary function is implemented by invoking workflows of the LLM Engine. This section describes how to import the workflow and configure the prompt template.

Prerequisites

Contact the system administrator to sign in to the AICC, choose Call Center > Tenant Management, and enable the LLM Engine and Automatic Conversation Summary features.

Procedure

  1. Contact Huawei technical support to download the AICC_*.*.*_ODFS.zip package from https://support.huawei.com.
  2. Decompress the AICC_*.*.*_ODFS.zip package.

    *.*.* indicates the AICC version number.

  3. In the decompressed package, go to the /flowinfo/LLM_Agent_Assistant.zip/ directory to obtain the workflow.zip package.
  4. Decompress the workflow.zip package.
  5. Obtain the textSummary.tar.gz package from the decompressed files. (Steps 2 to 5 can also be scripted; see the sketch after this procedure.)
  6. Sign in to the AICC as a tenant space administrator, choose Configuration Center > Chatbot Management > Large Model Service > Workflow Management > Workflow Orchestration, and click the import button to import the textSummary.tar.gz package.
  7. Sign in to the AICC as a tenant space administrator, choose Configuration Center > Chatbot Management > Large Model Service > Large Model Zone > Prompt Template, and click the Customized Template tab.
  8. Click the Automatic Conversation Summary scenario and click the Automatic Conversation Summary prompt template. The Template Details page is displayed. Click Unpublish to change the status of the template to Not released.
  9. Click the Automatic Conversation Summary prompt template and then click Edit. On the editing page that is displayed, select an available LLM. For details about how to configure a model, see Configuring a Model. Turn on the Model Parameters switch and set temperature to 0.2 and max_tokens to 300. (The note after this procedure explains what these two parameters control.)
  10. Click Save and Publish to publish the prompt template.
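
The package-extraction part of the procedure (steps 2 to 5) can also be scripted. The following is a minimal sketch, assuming the downloaded ODFS package sits in the current directory; the file name carries your AICC version number in place of *.*.*, and the target directories odfs and workflow are hypothetical names used only for this illustration.

    import glob
    import zipfile

    # Step 2: decompress the downloaded ODFS package (name contains the AICC version).
    odfs_package = glob.glob("AICC_*_ODFS.zip")[0]
    with zipfile.ZipFile(odfs_package) as zf:
        zf.extractall("odfs")  # "odfs" is a hypothetical target directory

    # Steps 3-4: decompress workflow.zip found under /flowinfo/LLM_Agent_Assistant.zip/.
    with zipfile.ZipFile("odfs/flowinfo/LLM_Agent_Assistant.zip/workflow.zip") as zf:
        zf.extractall("workflow")  # "workflow" is a hypothetical target directory

    # Step 5: textSummary.tar.gz is imported as-is on the Workflow Orchestration page,
    # so it is not decompressed here.
    print("Package to import:", glob.glob("workflow/**/textSummary.tar.gz", recursive=True))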
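
For context on the two model parameters set in step 9: temperature controls how deterministic the generated summary is (0.2 keeps the output focused and repeatable), and max_tokens caps the length of the generated summary (roughly 300 tokens). The LLM Engine applies these values internally once the template is published; the snippet below only illustrates their conventional role in an LLM request and uses a hypothetical send_chat_request helper, not any AICC API.

    from typing import Any

    def send_chat_request(payload: dict[str, Any]) -> None:
        """Hypothetical stand-in for whatever client the LLM Engine uses internally."""
        print(payload)

    # Conventional meaning of the two parameters configured in the prompt template:
    #   temperature=0.2 -> low sampling randomness, so summaries stay factual
    #   max_tokens=300  -> generation stops after about 300 tokens
    send_chat_request({
        "messages": [{"role": "user", "content": "Summarize this service conversation ..."}],
        "temperature": 0.2,
        "max_tokens": 300,
    })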