Step 2: Develop Data
This step describes how to use the data in BI reports to analyze the 10 products users like most and the 10 products users dislike most. Jobs are executed periodically, and the results are exported to tables every day for data analysis.
Analyze 10 Products Users Like Most
- Log in to the DataArts Studio console. Locate an instance and click Access. On the displayed page, locate a workspace and click DataArts Factory.
Figure 1 DataArts Factory
- Create a DLI SQL script and enter DLI SQL statements in the editor.
Figure 2 Creating a script
- In the SQL editor, enter the following SQL statement and click Execute to calculate the 10 products users like most from the original data tables in the OBS bucket and save the result to the top_like_product table.
INSERT OVERWRITE TABLE top_like_product
SELECT product.brand AS brand, COUNT(product.brand) AS like_count
FROM action
JOIN product ON (action.product_id = product.product_id)
WHERE action.type = 'like'
GROUP BY brand
ORDER BY like_count DESC
LIMIT 10
Figure 3 Script for analyzing the 10 products users like most
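Note that the INSERT OVERWRITE statement assumes the top_like_product table already exists in DLI. If you have not created it yet, a minimal DDL sketch is shown below; the column types are assumptions inferred from the SELECT list above.
-- Hypothetical DDL sketch: column types are assumptions inferred from the query.
CREATE TABLE IF NOT EXISTS top_like_product (
  brand      STRING,
  like_count BIGINT
);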
- After debugging the script, click Save to save the script and name it top_like_product. Click Submit to submit the script version. This script will be referenced later in Developing and Scheduling a Job.
- After the script is saved and executed successfully, you can use the following SQL statement to view data in the top_like_product table. You can also download or dump the table data by referring to Figure 4.
SELECT * FROM top_like_product
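If you prefer to export the result set to OBS rather than view it in the editor, the following sketch uses the standard Spark SQL export syntax, assuming it is available on your DLI queue; the OBS path is a placeholder.
-- Sketch: export the result table to a placeholder OBS path as CSV.
-- Assumes Spark SQL's INSERT OVERWRITE DIRECTORY syntax is supported by DLI.
INSERT OVERWRITE DIRECTORY 'obs://your-bucket/output/top_like_product'
USING csv
SELECT * FROM top_like_product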
Analyze 10 Products Users Dislike Most
- Log in to the DataArts Studio console. Locate an instance and click Access. On the displayed page, locate a workspace and click DataArts Factory.
Figure 5 DataArts Factory
- Create a DLI SQL script and enter DLI SQL statements in the editor.
Figure 6 Creating a script
- In the SQL editor, enter the following SQL statement and click Execute to calculate the 10 products users dislike most from the original data table in the OBS bucket and save the result to the top_bad_comment_product table.
INSERT OVERWRITE TABLE top_bad_comment_product
SELECT DISTINCT product_id, comment_num, bad_comment_rate
FROM comment
WHERE comment_num > 3
ORDER BY bad_comment_rate DESC
LIMIT 10
Figure 7 Script for analyzing the 10 products users dislike most
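As with the previous script, the INSERT OVERWRITE statement assumes the top_bad_comment_product table already exists. A minimal DDL sketch follows; the column types are assumptions inferred from the comment table fields used above.
-- Hypothetical DDL sketch: column types are assumptions.
CREATE TABLE IF NOT EXISTS top_bad_comment_product (
  product_id       STRING,
  comment_num      INT,
  bad_comment_rate DOUBLE
);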
- After debugging the script, click Save to save the script and name it top_bad_comment_product. Click Submit to submit the script version. This script will be referenced later in Developing and Scheduling a Job.
- After the script is saved and executed successfully, you can use the following SQL statement to view data in the top_bad_comment_product table. You can also download or dump the table data by referring to Figure 8.
SELECT * FROM top_bad_comment_product
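As a quick sanity check, you can also confirm that the LIMIT 10 clause took effect.
-- The result table should contain at most 10 rows.
SELECT COUNT(*) FROM top_bad_comment_product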
Developing and Scheduling a Job
Assume that the BI report data in the OBS bucket changes every day. To update the analysis results every day, use the job orchestration and scheduling functions of DataArts Factory.
- Log in to the DataArts Studio console. Locate an instance and click Access. On the displayed page, locate a workspace and click DataArts Factory.
Figure 9 DataArts Factory
- Create a batch job named BI_analysis.
Figure 10 Creating a job
Figure 11 Configuring the job
- Open the created job, drag two Dummy nodes and two DLI SQL nodes to the canvas, connect the nodes by dragging connection lines between them, and orchestrate the job as shown in Figure 12.
Key nodes:
- Begin (Dummy node): serves only as a start identifier.
- top_like_product (DLI SQL node): In Node Properties, associate this node with the DLI SQL script top_like_product developed in Analyze 10 Products Users Like Most.
- top_bad_comment_product (DLI SQL node): In Node Properties, associate this node with the DLI SQL script top_bad_comment_product developed in Analyze 10 Products Users Dislike Most.
- Finish (Dummy node): serves only as an end identifier.
- Click the test button to test the job.
- If the job runs properly, click Scheduling Setup in the right pane and configure the scheduling policy for the job.
Figure 13 Configuring scheduling
Note:
- Scheduling Type: Select Run periodically.
- Scheduling Properties: The job is executed at 01:00 every day from Feb 09 to Feb 28, 2022.
- Dependency Properties: You can configure a dependency job for this job. You do not need to configure it in this practice.
- Cross-Cycle Dependency: Select Independent on the previous schedule cycle.
- Click Save and Submit, and then click Execute. The job will then be executed automatically every day, and the BI report analysis results will be saved to the top_like_product and top_bad_comment_product tables.
- If you want to check the job execution result, choose Monitoring > Monitor Instance in the left navigation pane.
Figure 14 Viewing the job execution status
You can also configure notifications to be sent through SMS messages or emails when a job encounters exceptions or fails.
Now you have learned the data development process based on e-commerce BI reports. In addition, you can analyze the age distribution and gender ratio of users and their browsing, purchase, and evaluation of products to provide valuable information for marketing decision-making, advertising, credit rating, brand monitoring, and user behavior prediction.
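For example, a gender-ratio query might look like the following sketch; the user_info table and the user_id and gender columns are hypothetical names used for illustration only.
-- Hypothetical sketch: gender ratio of users who bought products.
-- The user_info table and the user_id/gender columns are assumed names,
-- not part of this practice's data set.
SELECT user_info.gender AS gender,
       COUNT(DISTINCT action.user_id) AS buyer_count
FROM action
JOIN user_info ON (action.user_id = user_info.user_id)
WHERE action.type = 'buy'
GROUP BY gender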