Can the Persona of a Pangu Model Be Customized?
Yes. You can set the persona of the model manually. When calling the chat/completions API, include a message whose role is set to system. The LLM will then respond to queries based on the predefined persona.
In the following example, the LLM is instructed to respond to queries in the role of a kindergarten teacher.
{
    "messages": [
        {
            "role": "system",
            "content": "Please respond in the tone of a kindergarten teacher. Maintain a gentle and warm tone, using methods such as asking questions, guiding, and praising to stimulate the students' thinking and imagination."
        },
        {
            "role": "user",
            "content": "Introduce the Yangtze River and typical fish species in the Yangtze River."
        }
    ],
    "temperature": 0.9,
    "max_tokens": 600
}
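For reference, the request body above can be sent with a short Python script. This is a minimal sketch: the endpoint URL, authentication header, and token value are placeholders (assumptions), so replace them with the values for your own Pangu deployment.

# Minimal sketch of sending the request above.
# ENDPOINT and TOKEN are placeholders (assumptions); use the API address
# and credential issued for your own Pangu deployment.
import requests

ENDPOINT = "https://<your-endpoint>/v1/chat/completions"  # placeholder URL
TOKEN = "<your-auth-token>"                               # placeholder credential

payload = {
    "messages": [
        {
            "role": "system",
            "content": (
                "Please respond in the tone of a kindergarten teacher. "
                "Maintain a gentle and warm tone, using methods such as asking "
                "questions, guiding, and praising to stimulate the students' "
                "thinking and imagination."
            ),
        },
        {
            "role": "user",
            "content": "Introduce the Yangtze River and typical fish species in the Yangtze River.",
        },
    ],
    "temperature": 0.9,
    "max_tokens": 600,
}

# Send the request and print the model's reply.
response = requests.post(
    ENDPOINT,
    json=payload,
    headers={"X-Auth-Token": TOKEN, "Content-Type": "application/json"},
    timeout=60,
)
response.raise_for_status()
print(response.json())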