Configuring Information on the LLM Engine
Updated on 2025-01-23 GMT+08:00
Prerequisites
Contact the system administrator to sign in to the AICC and enable the LLM Engine feature for the tenant space.
Procedure
- Contact Huawei technical support to download the AICC_*.*.*_ODFS.zip package from https://support.huawei.com.
- Decompress the AICC_*.*.*_ODFS.zip package.
*.*.* indicates the AICC version number.
- Go to the /flowinfo/LLM_Agent_Assistant.zip/ directory to obtain the workflow.zip package.
- Decompress the workflow.zip package.
- Obtain the textRefine.tar.gz package. (A scripted sketch of these extraction steps follows the procedure.)
- Sign in to the AICC as a tenant space administrator and click the import button to import the textRefine.tar.gz package.
Figure 1 Import success page
- Click the Customized Template tab.
- Select the Refine response scenario and click the Refine response prompt template. The Template Details page is displayed. Click Unpublish to change the status of the template to Not released.
- Edit the Refine response prompt template and select an available model from the Large Model drop-down list. For details about how to configure a model, see Configuring a Model.
- Click Save and Publish to publish the prompt template.
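If you prefer to script steps 2 to 5, the following Python sketch shows one way to unpack the nested packages and locate textRefine.tar.gz before importing it on the AICC page. The version number in the package name, the working directory names, and the assumption that LLM_Agent_Assistant.zip and workflow.zip are standard ZIP archives are illustrative only; adjust them to match your actual package.

```python
# Hypothetical sketch of steps 2-5: unpack the nested packages and
# locate textRefine.tar.gz. Paths and the archive layout are assumptions
# based on the procedure text, not a fixed AICC package structure.
import zipfile
from pathlib import Path

odfs_package = Path("AICC_1.0.0_ODFS.zip")   # replace 1.0.0 with your AICC version
work_dir = Path("odfs_extracted")

# Step 2: decompress the AICC_*.*.*_ODFS.zip package.
with zipfile.ZipFile(odfs_package) as zf:
    zf.extractall(work_dir)

# Step 3: the LLM_Agent_Assistant.zip package sits under flowinfo/.
assistant_zip = work_dir / "flowinfo" / "LLM_Agent_Assistant.zip"
assistant_dir = work_dir / "LLM_Agent_Assistant"
with zipfile.ZipFile(assistant_zip) as zf:
    zf.extractall(assistant_dir)

# Step 4: decompress workflow.zip to reach textRefine.tar.gz (step 5).
workflow_dir = assistant_dir / "workflow"
with zipfile.ZipFile(assistant_dir / "workflow.zip") as zf:
    zf.extractall(workflow_dir)

# textRefine.tar.gz is imported as-is on the AICC page; do not extract it.
text_refine = workflow_dir / "textRefine.tar.gz"
print("Package to import on the AICC page:", text_refine.resolve())
```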