Analyzing the Model's Reasoning Logic
Understanding the inner workings of LLMs can be challenging due to their "black-box" nature. However, you can analyze their reasoning process by posing follow-up questions based on their responses or by checking their understanding of the task requirements.
- Asking follow-up questions
If the LLM gives an inaccurate answer, you can ask a follow-up question to identify the cause of the error and determine how to optimize the prompt, thereby steering the LLM towards more accurate responses.
Example: Use reverse questioning to uncover the logic behind model answers and eliminate ambiguity for clearer understanding.
Before the modification:
Prompt: If your work tenure is 5 years or less, you are entitled to 5 days of annual leave. If it exceeds 5 years, your annual leave equals your years of service.
Question: How many days of annual leave are you entitled to if you have worked for 3 years?
Model answer: 15 days
Follow-up question: Why is it 15 days?
Model answer: According to the rule provided, if the work tenure is 5 years or less, the annual leave is 5 days per year. If the work tenure exceeds 5 years, the annual leave equals the number of years of service. Therefore, for an employee with 3 years of service, the annual leave should be: 3 years × 5 days/year = 15 days.
After the modification:
Prompt: If your work tenure is 5 years or less, you are entitled to 5 days of annual leave. If it exceeds 5 years, your annual leave equals your years of service.
Question: How many days of annual leave per year are you entitled to if you have worked for 3 years?
Model answer: 5 days
When asking follow-up questions, clearly specify what you are referring to, for example, "the xxx above." For instance: "Why do you think the xxx above belongs to the xxx category? Why isn't the xxx above classified into the xxx category?" This helps the LLM understand that you are referring to information or a situation discussed earlier. Otherwise, it may treat your question as an unrelated new query.
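The follow-up technique above depends on keeping the conversation history so that phrases like "the answer above" have a clear referent. The sketch below illustrates this pattern; `call_llm` is a hypothetical placeholder, not a real client, and a production version would send the message list to an actual chat-completion API.

```python
def call_llm(messages):
    """Hypothetical placeholder for a chat-completion client.

    A real implementation would send `messages` (a list of role/content
    dicts) to an LLM API and return the assistant's reply text.
    """
    return "Placeholder reply explaining the model's reasoning."

def ask_with_followup(rule, question, followup):
    """Ask a question, then ask a follow-up about the model's own answer.

    The full history is kept so the follow-up unambiguously refers to
    the answer the model just gave, not to a new, unrelated query.
    """
    messages = [{"role": "user", "content": f"{rule}\nQuestion: {question}"}]
    answer = call_llm(messages)

    # Append the model's answer, then explicitly reference "the answer
    # above" in the follow-up so the model knows what is being probed.
    messages.append({"role": "assistant", "content": answer})
    messages.append(
        {"role": "user", "content": f"Regarding the answer above: {followup}"}
    )
    explanation = call_llm(messages)
    return answer, explanation
```

The explanation returned by the follow-up is what reveals the model's reasoning logic, and it is what you inspect to decide how to rewrite the prompt.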
- Checking its understanding of the task requirements
You can ask the LLM to restate the task requirements given in the prompt, to check whether it has understood them correctly.
For example: "There is a task called xxx. I will provide you with ..., and you need to ... Do you understand this task? Please provide a detailed explanation, including examples if necessary."
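This comprehension check can be wrapped into a small prompt-building helper. The function below is a minimal sketch; the name `build_comprehension_check` and the exact wording are illustrative, not part of any API.

```python
def build_comprehension_check(task_description):
    """Append a request for the model to restate the task in its own words.

    `task_description` is the "There is a task called ..., you need to ..."
    text from the original prompt. Sending the combined prompt lets you
    verify the model's understanding before running the real task.
    """
    return (
        f"{task_description}\n\n"
        "Do you understand this task? Please provide a detailed "
        "explanation, including examples if necessary."
    )
```

If the model's restatement omits or distorts a requirement, that is the part of the prompt to clarify before proceeding.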