Updated on 2024-11-13 GMT+08:00

Managing Test Cases

You can import test cases from the local PC to the test case library in CodeArts TestPlan, and export test cases from the test case library. You can also add test cases in batches, manage test cases through the feature directory, associate test cases with requirements, comment on test cases, filter test cases, customize the columns to be displayed in the test case list, and set test case fields.

Importing a Manual Test Case

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Click the Manual Test tab, click Import on the right of the page, and choose Import from File from the drop-down list.

    Alternatively, click the All Cases tab, click Import on the right of the page, and choose Import from File from the drop-down list. In the displayed dialog box, set Execution Type to manual test or auto function test.

  4. Decide whether to allow the uploaded cases to overwrite the existing cases with the same IDs, and select the corresponding option.

    YES: If an uploaded case has the same ID as an existing case, the existing case will be overwritten.

    NO: All uploaded cases will be added to the case list without overwriting existing ones.

  5. In the dialog box that is displayed, click Download Template.

    Enter the test case information based on the format requirements in the template, return to the Testing Case page, upload the created test case file, and click OK.

    • Max. 5,000 test cases can be imported each time.
    • The size of a single file cannot exceed 5 MB.
    • Currently, CodeArts TestPlan supports the Excel format. If the data does not meet the import criteria, a message is displayed, asking you to download the error report. Modify the data and import it again.
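
If you prepare the import file with a script, you can check it against these limits locally before uploading. The following is a minimal Python sketch using the openpyxl library; the file name cases.xlsx is an illustrative placeholder, and the script assumes the first row of the sheet holds the template headers.

    # Minimal pre-upload check for a manual test case import file.
    # Assumes cases.xlsx follows the downloaded template, with row 1 as headers.
    import os
    from openpyxl import load_workbook

    MAX_CASES = 5000                # per-import limit stated above
    MAX_SIZE = 5 * 1024 * 1024      # 5 MB per file

    def check_import_file(path: str) -> None:
        if os.path.getsize(path) > MAX_SIZE:
            raise ValueError("File exceeds the 5 MB limit.")
        sheet = load_workbook(path, read_only=True).active
        case_rows = sheet.max_row - 1  # subtract the header row
        if case_rows > MAX_CASES:
            raise ValueError(f"{case_rows} cases exceed the 5,000-case limit; split the file.")
        print(f"{case_rows} cases, ready to import.")

    check_import_file("cases.xlsx")

This check only mirrors the documented limits; the import itself still validates the file content and provides an error report if needed.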

Importing an Automated API Test Case

In CodeArts TestPlan, you can generate test cases by importing files in any of the following formats:

  • Postman: Postman Collection v2.1 standard, Postman Collection JSON file
  • Swagger: Swagger 2.0 and 3.0 standards, YAML file
  • Excel: Excel file based on the given template

Importing a Postman or Swagger File

Only one test case can be imported at a time, and only test steps can be generated for the imported test case; preparations and follow-ups are not supported. If you do not have a file to import yet, a minimal Postman Collection example is sketched after this procedure.

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Click the Auto API Test tab and choose Import > Import from File on the right of the page. The Import Case from File window is displayed.
  4. Select Postman or Swagger.

    Drag a file from the local PC to the window, or click Click or drag to add a file and select a file from the local PC. Click Next.

  5. In the displayed list, select the items for which you want to generate test cases based on the step sequence and click Save.
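
If you do not have a Postman export at hand, a minimal Postman Collection v2.1 file can be generated with a short script and then imported as described above. In the Python sketch below, the collection name, request name, header, and URL are illustrative placeholders; the schema URL is the standard Postman Collection v2.1.0 identifier.

    # Generate a minimal Postman Collection v2.1 JSON file for import.
    # Collection name, request name, header, and URL are placeholders.
    import json

    collection = {
        "info": {
            "name": "Sample API cases",
            "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json",
        },
        "item": [
            {
                "name": "Query service status",
                "request": {
                    "method": "GET",
                    "header": [{"key": "Accept", "value": "application/json"}],
                    "url": "https://example.com:8080/status?verbose=1",
                },
            }
        ],
    }

    with open("sample_collection.json", "w", encoding="utf-8") as f:
        json.dump(collection, f, indent=2)

Each item in the collection should then appear as a selectable entry in step 5, and the selected items become test steps of the generated case.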

Importing an Excel File

Max. 5,000 test cases can be imported each time using an Excel file.

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Click the Auto API Test tab and choose Import > Import from File on the right of the page.

    Alternatively, click the All Cases tab, click Import on the right of the page, and choose Import from File from the drop-down list. In the displayed dialog box, set Execution Type to Auto API Test.

  4. Select Excel and click Download Template.

  5. Open the Excel template on the local PC and edit the test case information based on the comments in the template headers. The columns marked with asterisks (*) are required.

    The fields in the template are as follows. A scripted example of filling in this template is provided after this procedure.

    • Case Name (mandatory): The value can contain only 1 to 128 characters and allows letters, digits, and special characters (-_/|*&`'^~;:(){}=+,×...—!@#$%.[]<>?–").
    • Case Description (optional): Max. 500 characters.
    • Request Type (mandatory): Only GET, POST, PUT, and DELETE are supported.
    • Request Header Parameter (optional): Format: key=value. If there are multiple parameters, separate them with &, for example, key=value&key1=value1.
    • Request Address (mandatory): The request protocol can be HTTP or HTTPS. The format is https://ip:port/pathParam?query=1.
    • Environment Group (optional): Environment parameter group.
    • IP Variable Name (optional): Name of the variable generated in the corresponding Environment Group. The content of Request Address is extracted to generate the corresponding global variable.
    • Request Body Type (optional): The value can be raw, json, or formdata, which corresponds to the text, JSON request body, or form parameter format on the page, respectively. If this parameter is not set, the JSON format is used by default.
    • Request Body (optional): If the request body type is formdata, the request body format is key=value. If there are multiple parameters, separate them with &, for example, key=value&key2=value2. When cases are imported using an Excel file, formdata does not support request bodies in file format.
    • Checkpoint Matching Mode (optional): Exact match or Fuzzy match. Exact match indicates Equals, and Fuzzy match indicates Contains.
    • Expected Checkpoint Value (optional): Target value of the corresponding checkpoint.

  6. Save the edited Excel file and drag it from the local PC to the Import Case from File window, or click Click or drag to add a file and select a file from the local PC. Click Next.
  7. View the import result.

    • Import successful: New test cases are displayed in the list. The number of new test cases is the same as the number of rows in the Excel file.
    • Import failed: A failure message is displayed in the upper right corner.

      Download the error list from the Import Case from File window. Modify the Excel file based on the error causes, and import again.
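
As noted in the field list, the template can also be filled in by a script, which is convenient when the request data already exists elsewhere. The Python sketch below writes one case row with the openpyxl library; the column headers are simplified from the field list, so copy the exact header row (and comments) from the downloaded template before importing, and treat the row values as placeholders.

    # Sketch: fill in one auto API test case row for the Excel import template.
    # Column names are simplified; use the headers from the downloaded template.
    from openpyxl import Workbook

    headers = [
        "Case Name", "Case Description", "Request Type", "Request Header Parameter",
        "Request Address", "Request Body Type", "Request Body",
        "Checkpoint Matching Mode", "Expected Checkpoint Value",
    ]
    row = [
        "Login returns token",                      # 1-128 characters
        "A valid user can log in",                  # up to 500 characters
        "POST",                                     # GET, POST, PUT, or DELETE
        "Content-Type=application/x-www-form-urlencoded&Accept=*/*",  # key=value&key1=value1
        "https://example.com:8443/login",           # HTTP or HTTPS request address
        "formdata",                                 # raw, json, or formdata (default: json)
        "username=demo&password=demo123",           # formdata body as key=value pairs
        "Fuzzy match",                              # Exact match (Equals) or Fuzzy match (Contains)
        "token",                                    # target value of the checkpoint
    ]

    wb = Workbook()
    sheet = wb.active
    sheet.append(headers)
    sheet.append(row)
    wb.save("api_cases.xlsx")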

Importing a Custom Automated Test Case

In CodeArts TestPlan, you can generate custom automated test cases by importing an Excel file based on the given template.

Max. 5,000 test cases can be imported each time using an Excel file.

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Click the custom execution mode tab, click Import on the right, choose Import from File from the drop-down list, and click Download Template.
  4. Open the Excel template on the local PC and edit the test case information based on the comments in the template headers. The columns marked with asterisks (*) are required.

    The fields in the template are as follows. A scripted example of preparing this template is provided after this procedure.

    • Case Name: Name of a test case (mandatory). Describe the test scenario or function of the test case. The value contains 1 to 128 characters.
    • ID: ID of a test case. The value contains 1 to 128 characters.
    • Script Path: Relative path of the script file in the repository.
    • Processor: Person who needs to complete the test case.
    • Status: Status of a test case. The value can be New, Designing, Testing, or Completed.
    • Result: Result of a test case. The value can be Succeeded, Failed, Pending check, Not available, Blocked, or other custom results.
    • Type: Type of a test case. The value can be Function Test, Performance Test, Compatibility Test, Usability Test, Reliability Test, Security Test, or Serviceability Test.
    • Level: Test case level based on the importance of the scenario or function.
      • L0: verification of underlying functions. Each module should have 10 to 20 test cases. L0 test cases account for 5% of all test cases.
      • L1: verification of basic functions for inherited features or before sprint acceptance. L1 test cases account for 20% of all test cases.
      • L2: verification of important features for manual tests in non-regression versions. L2 test cases account for 60% of all test cases.
      • L3: verification of minor and non-important functions, and exception tests on basic and important functions. L3 test cases account for 10% to 15% of all test cases.
      • L4: verification of special input, scenarios, and threshold conditions. L4 test cases account for less than 5% of all test cases.
    • Sprint: Sprint in which the current test case is tested.
    • Module: Module to which a test case belongs. The module list comes from the project settings.
    • Requirement ID: ID of the requirement to be associated with the test case.
    • Requirement Name: Name of the requirement to be associated with the test case.
    • Description: Description of the test case.
    • Prerequisite: Prerequisites for executing the current test case.
    • Folder: Feature directory to which the test case belongs.
    • Test Step: Step description and expected result.
    • Expected Result: Target value of the corresponding checkpoint.

  5. Save the edited Excel file and drag it from the local PC to the Import Case from File window, or click Click or drag to add a file and select a file from the local PC. Click Next.
  6. View the import result.

    • Import successful: New test cases are displayed in the list. The number of new test cases is the same as the number of rows in the Excel file.
    • Import failed: A failure message is displayed in the upper right corner.

      Download the error list from the Import Case from File window. Modify the Excel file based on the error causes, and import again.
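
As noted in the field list, this template can also be prepared by a script when the cases come from an existing inventory. The Python sketch below uses the pandas library with simplified column names taken from the field list and illustrative row values; the downloaded template's exact headers remain authoritative, and Status, Result, Type, and Level should use the values described above.

    # Sketch: prepare custom automated test cases for the Excel import template.
    # Column names are simplified; use the headers from the downloaded template.
    import pandas as pd

    cases = [
        {
            "Case Name": "Nightly smoke: order service",  # mandatory, 1-128 characters
            "Script Path": "tests/smoke/test_order.py",   # relative path in the repository
            "Status": "Testing",                          # New / Designing / Testing / Completed
            "Result": "Pending check",                    # Succeeded, Failed, Pending check, ...
            "Type": "Function Test",
            "Level": "L1",                                # L0-L4, per the level guidance above
            "Requirement ID": "REQ-1024",                 # requirement to associate, if any
            "Description": "Runs the smoke suite against the order service.",
        },
    ]

    pd.DataFrame(cases).to_excel("custom_auto_cases.xlsx", index=False)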

Exporting Test Cases

  1. Click an execution type tab, click More on the right, and choose Export from the drop-down list.
  2. In the dialog box that is displayed, select the export scope and click OK.

    • Export All: exports all cases in the list.
    • Partial Export: exports the cases from Start position to End position. By default, Start position is set to the case in the first row of the table and End position to the case in the last row.

  3. Open the exported Excel file on the local PC and view the exported test cases.
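
If you need more than a visual check, the exported file can also be processed with a script, for example to summarize cases by result. The Python sketch below assumes the export is named exported_cases.xlsx and contains a Result column; adjust both to match your actual export.

    # Sketch: summarize an exported test case file by result.
    # File name and the "Result" column are assumptions; adjust to your export.
    import pandas as pd

    df = pd.read_excel("exported_cases.xlsx")
    print(f"{len(df)} cases exported")
    if "Result" in df.columns:
        print(df["Result"].value_counts())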

Adding Test Cases in Batches

In CodeArts TestPlan, you can add test cases to test plans in batches from the test case library, including manual test cases and automated API test cases.

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Click Test case library in the upper left corner of the page and select the target test plan from the drop-down list.
  4. Click the Manual Test or Auto API Test tab, click Import in the right area and select Add Existing Cases from the drop-down list.
  5. In the displayed dialog box, select a test case and click OK.

    • Test cases that already exist in the test plan cannot be added again.
    • All test cases related to requirements in the test plan can be added.

Reviewing Test Cases Online

You can review the created test cases.

Creating a Review

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Click the tab of the corresponding test type, click the corresponding icon on the right of the case to be reviewed, and click Create Review.
  4. In the dialog box that is displayed, configure the following information and click Confirm.

    • Name (mandatory): By default, the name of a new review is the same as the test case name.
    • Test Case Modification Time (mandatory): By default, the time is set to the current system date.
    • Review Auto-Close (mandatory): You can enable or disable automatic closure for the review.
      • Yes: The review will be automatically closed after it is created.
      • No: The test case will be manually reviewed and closed by the person to whom the review is assigned.
    • Expected Closure Time (mandatory): If you select No for Review Auto-Close, you can select the expected closure time.
    • Review Comment (mandatory): Enter review information containing a maximum of 1,000 characters.
    • Assigned To (mandatory): If you select No for Review Auto-Close, select the person who will review and close the test case.

Batch Review

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. In the test case list, select the test cases to be reviewed in batches.
  4. Click Batch Review.
  5. In the dialog box that is displayed, configure the following information and click Confirm.

    • Review Auto-Close: If this parameter is set to Yes, the review status is automatically set to Closed. If it is set to No, the test case will be manually reviewed and closed by the person to whom the review is assigned.
    • Expected Closure Time: If you select No for Review Auto-Close, you can select the expected closure time.
    • Review Comment: Enter review information containing a maximum of 1,000 characters.
    • Assigned To: If you select No for Review Auto-Close, select the person who will review and close the test case.

Checking Review Records

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Click Review Record. The created review record is displayed on the page.
  4. Click the search bar on the review record page and select a filter field.
  5. Enter a keyword to search for the corresponding review record.
  6. You can delete, edit, and close reviews that are not closed.

    • To delete an unclosed review, click the corresponding icon in the Operation column and click OK.
    • To edit an unclosed review, click the corresponding icon in the Operation column. In the displayed dialog box, edit the review.
    • To close an unclosed review, click the corresponding icon in the Operation column. In the Close Review dialog box, close the review.

Test Cases and Requirements

Associating a Test Case with a Requirement

Test cases can be associated only with Epic, Feature, and Story work items of a Scrum project and default Requirement work items of a Kanban project.

CodeArts TestPlan supports association between test cases and requirements. The procedure is as follows:

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Select a test case to be associated with a requirement, click the corresponding icon on the right, and select Associated Requirement. Alternatively, on the All Cases tab page, click the icon in the Operation column of the case to be associated with a requirement.

    To associate multiple test cases with one requirement, select the required test cases in the list and click Associated Requirements in Batches at the bottom of the page.

  4. In the displayed dialog box, select the requirement to be associated. You can select a requirement on the Current Plan or All requirements tab page. Click OK.

Adding Test Cases Related to Requirements

  • Prerequisites

    The requirements in the test plan have been associated with test cases in the test case library.

The procedure for adding test cases related to requirements is the same as that in Adding Test Cases in Batches. In the dialog box that is displayed, select the Select all test cases related to the requirements in this test plan check box.

Managing Test Cases Based on Requirements

On the Testing Case page, click the Requirements directory in the left pane.

  • By default, all associated requirements belong to the Requirements directory.
  • Click a requirement in the Requirements directory to view all cases associated with the requirement.
  • Click the corresponding icon on the right of the requirement name to view the requirement details or create a test case associated with the requirement.

Setting Requirement Change Notification

When a requirement is associated with a test case and the requirement is modified in CodeArts Req, a red dot is displayed after the requirement name on the Testing Case page. You need to supplement or modify the test case associated with the requirement.

Test Cases and Bugs

When a test case fails to be executed, the test case is usually associated with a bug. You can create a bug or associate the test case with an existing bug.

The following uses manual test cases as an example.

Test cases can be associated only with Bug work items of a Scrum project and default Bug work items of a Kanban project.

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Select a test case to be associated with a bug. You can create or associate a bug using either of the following methods:

    • Perform the following operations on the Testing Case page:
      • Click the corresponding icon in the Operation column to associate an existing bug in the current project.
      • Click the corresponding icon in the Operation column, and select Create and Associate Defects to create a bug as prompted.
    • Click a test case to associate it with a bug.

      Click the test case name. On the page that is displayed, select Defects and click Create and Associate Defects.

  4. After a defect is created or associated, view the defect information on the Defects tab page. You can click the corresponding icon to disassociate the current bug.

Commenting on a Test Case

You can comment on test cases.

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Select a test case to be commented on, click the test case name, and click the Details tab.
  4. Enter your comments in the Comments text box at the bottom of the page and click Save.

    The comments that are successfully saved are displayed below the Comments text box.

Filtering Test Cases

CodeArts TestPlan supports custom filtering of test cases. The following procedure uses the Manual Test tab on the Testing Case page as an example.

Using the Default Filter Criteria

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. On the Manual Test tab page, select an option from the All cases or All drop-down list.

    • All cases: displays all cases in the current test plan or case library.
    • My cases: displays all cases whose Processor is the current login user.
    • Unassociated with test suites: displays test cases that are not associated with any test suite.

You can click the two drop-down list boxes and filter all cases, your test cases, or test cases that are not associated with test suites.

Setting Advanced Filter Criteria

If the default filter criteria do not meet your requirements, you can customize filter criteria.

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. Click Advanced Filter above the test case list. Common filter criteria are displayed on the page.
  4. Set filter criteria as required and click Filter. Test cases that meet the filter criteria are displayed on the page.

    You can also click Save and Filter. In the dialog box that is displayed, enter the filter name and click OK. The saved filter is added to the All cases drop-down list.

  5. (Optional) If advanced filter criteria still do not meet your requirements, click Add Filter and select a filter field from the drop-down list box as required. The new filter field is displayed on the page. Repeat step 4 to complete the filtering. You can add custom filter fields for advanced filtering.

Customizing Test Case List Columns

CodeArts TestPlan supports customizing columns to be displayed in the test case list. The following procedure uses manual test cases as an example.

  1. Log in to the CodeArts homepage, search for your target project, and click the project name to access the project.
  2. In the navigation pane, choose Testing > Testing Case.
  3. On the Manual Test tab page, click the corresponding icon in the last column of the test case list. In the dialog box that is displayed, select the fields to be displayed, deselect the fields to be hidden, and drag the selected fields to rearrange their display sequence. You can also add custom headers to the test case list.

Searching for a Test Case

You can search for test cases by name, ID, or description.

  1. Create a test case.
  2. In the search box above the test case list, enter a keyword of the name, ID, or description.
  3. Click the search icon.
  4. The required test cases are filtered and displayed in the list.