Developing and Scheduling an Import GES Job

Updated on 2024-08-30 GMT+08:00

This section describes how to invoke data integration jobs through data development to periodically synchronize raw data in MySQL to MRS Hive and OBS and convert it into standard GES vertex and edge data sets. Graph metadata is then generated automatically from these data sets, so that the graph data (vertex data sets, edge data sets, and metadata) is imported to GES on a recurring schedule.

Figure 1 Business scenarios
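The standard vertex and edge data sets produced by the conversion are plain CSV files. The following Python sketch writes a tiny, purely illustrative sample of what the vertex_user and edge_friends data sets might look like; the column layout follows the common GES convention (vertex: ID, label, properties; edge: source ID, target ID, label), but the property columns and values shown here are assumptions, not taken from the actual sample data.

    import csv

    # Illustrative vertex records: ID, label, then property columns (assumed).
    users = [
        ("u001", "user", "male", "25"),
        ("u002", "user", "female", "31"),
    ]
    # Illustrative edge records: source vertex ID, target vertex ID, label.
    friends = [
        ("u001", "u002", "friends"),
    ]

    with open("vertex_user.csv", "w", newline="") as f:
        csv.writer(f).writerows(users)
    with open("edge_friends.csv", "w", newline="") as f:
        csv.writer(f).writerows(friends)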

Procedure

Assume that the raw data tables in the MySQL database are updated every day. If you want to update the latest graph data generated based on the raw data to GES every day, perform the following steps to compile jobs and periodically schedule the jobs:

  1. On the DataArts Studio console, locate a workspace and click DataArts Factory.
  2. Create a data development batch processing job and name it import_ges.

    Figure 2 Creating a job

  3. On the job development page, drag one Dummy node, eight CDM Job nodes, and two Import GES nodes to the canvas, and connect the nodes as shown in Figure 3. A conceptual sketch of the resulting dependencies follows the figure.

    The Dummy node serves only as an end identifier. The CDM Job nodes are used to invoke data integration jobs created in Creating a Data Integration Job. The Import GES nodes are used to import graph data to GES.
    Figure 3 Compiling a job
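    The node layout in Figure 3 can be read as a small dependency graph: the eight CDM Job nodes run first, each Import GES node runs after the CDM jobs that prepare its data, and the Dummy node marks the end of the job. The Python sketch below models these dependencies as plain data; the node names match those configured in the following steps, but the Dummy node name and the exact wiring are illustrative assumptions rather than a prescribed layout.

      # Conceptual model of the job's dependencies. Each Import GES node waits for
      # the CDM jobs that produce its vertex and edge data; the Dummy node ("End"
      # here is an assumed name) only marks the end of the job.
      dependencies = {
          "Import_GES_user-friend": [
              "vertex_user_rds2hive", "vertex_user_rds2obs",
              "edge_friends_rds2hive", "edge_friends_rds2obs",
          ],
          "Import_GES_movie-rate": [
              "vertex_movie_rds2hive", "vertex_movie_rds2obs",
              "edge_rate_rds2hive", "edge_rate_rds2obs",
          ],
          "End": ["Import_GES_user-friend", "Import_GES_movie-rate"],
      }
      for node, upstream in dependencies.items():
          print(f"{node} runs after: {', '.join(upstream)}")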

  4. Configure the eight CDM Job nodes in the job so that they invoke the data integration jobs created earlier, converting the raw data into standard GES vertex/edge data sets and synchronizing the data to OBS and MRS Hive. A summary sketch of the eight jobs follows the node descriptions.

    Figure 4 Configuring CDM nodes

    CDM node description:

    • vertex_user_rds2hive (CDM Job node): In Node Properties, select the CDM cluster in Creating a Data Integration Job and associate it with the CDM job vertex_user_rds2hive.
    • vertex_user_rds2obs (CDM Job node): In Node Properties, select the CDM cluster in Creating a Data Integration Job and associate it with the CDM job vertex_user_rds2obs.
    • edge_friends_rds2hive (CDM Job node): In Node Properties, select the CDM cluster in Creating a Data Integration Job and associate it with the CDM job edge_friends_rds2hive.
    • edge_friends_rds2obs (CDM Job node): In Node Properties, select the CDM cluster in Creating a Data Integration Job and associate it with the CDM job edge_friends_rds2obs.
    • vertex_movie_rds2hive (CDM Job node): In Node Properties, select the CDM cluster in Creating a Data Integration Job and associate it with the CDM job vertex_movie_rds2hive.
    • vertex_movie_rds2obs (CDM Job node): In Node Properties, select the CDM cluster in Creating a Data Integration Job and associate it with the CDM job vertex_movie_rds2obs.
    • edge_rate_rds2hive (CDM Job node): In Node Properties, select the CDM cluster in Creating a Data Integration Job and associate it with the CDM job edge_rate_rds2hive.
    • edge_rate_rds2obs (CDM Job node): In Node Properties, select the CDM cluster in Creating a Data Integration Job and associate it with the CDM job edge_rate_rds2obs.
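    All eight nodes follow the same pattern: each selects the CDM cluster from Creating a Data Integration Job and associates one CDM job, writing either to MRS Hive or to OBS. The sketch below only tabulates that pattern; the job names are exactly those listed above, and the destination grouping is inferred from the rds2hive/rds2obs suffixes.

      # The eight CDM jobs, grouped by the data set they produce and the
      # destination they write to (inferred from the job name suffixes).
      cdm_jobs = {
          ("vertex_user",  "MRS Hive"): "vertex_user_rds2hive",
          ("vertex_user",  "OBS"):      "vertex_user_rds2obs",
          ("edge_friends", "MRS Hive"): "edge_friends_rds2hive",
          ("edge_friends", "OBS"):      "edge_friends_rds2obs",
          ("vertex_movie", "MRS Hive"): "vertex_movie_rds2hive",
          ("vertex_movie", "OBS"):      "vertex_movie_rds2obs",
          ("edge_rate",    "MRS Hive"): "edge_rate_rds2hive",
          ("edge_rate",    "OBS"):      "edge_rate_rds2obs",
      }
      for (data_set, destination), job_name in cdm_jobs.items():
          print(f"{job_name}: {data_set} -> {destination}")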

  5. Configure the two Import GES nodes in the job separately. Only one vertex table and one edge table can be selected for each Import GES node when generating metadata, so this practice uses two Import GES nodes to import the data in sequence. A sketch of the two node configurations follows Figure 6.

    Import GES node description:
    • Import_GES_user-friend: In Node Properties, after a graph name is selected, set the edge data set and vertex data set to the edge_friends edge table and vertex_user vertex table, respectively. In addition, select Overwrite previous repetitive edges. Otherwise, a large number of duplicate edges are generated after periodic scheduling.

      Set Metadata Source to New and click the generation button next to Metadata. The New dialog box is displayed, as shown in Figure 6. In the New dialog box, select the edge_friends edge table and vertex_user vertex table in MRS, set Output Directory to the OBS directory where the vertex and edge tables are located, and click Create. The system then automatically sets the Metadata field to the OBS directory containing the generated metadata schema.

    • Import_GES_movie-rate: In Node Properties, after a graph name is selected, set the edge data set and vertex data set to the edge_rate edge table and vertex_movie vertex table, respectively. In addition, select Overwrite previous repetitive edges. Otherwise, a large number of duplicate edges are generated after periodic scheduling.

      Set Metadata Source to New and click the generation button next to Metadata. The New dialog box is displayed, as shown in Figure 6. In the New dialog box, select the edge_rate edge table and vertex_movie vertex table in MRS, set Output Directory to the OBS directory where the vertex and edge tables are located, and click Create. The system then automatically sets the Metadata field to the OBS directory containing the generated metadata schema.

    Figure 5 Configuring the Import GES node
    Figure 6 New
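    Because each Import GES node takes exactly one vertex table and one edge table, the two nodes end up with mirror-image configurations. The sketch below records the settings described above as plain data; the OBS output paths are hypothetical placeholders, since the actual value is whatever directory you selected in the New dialog box.

      # Settings of the two Import GES nodes as described above. The OBS paths
      # are hypothetical placeholders; substitute the directory chosen in the
      # New dialog box.
      import_ges_nodes = {
          "Import_GES_user-friend": {
              "vertex_table": "vertex_user",
              "edge_table": "edge_friends",
              "overwrite_repetitive_edges": True,   # avoids duplicate edges on daily runs
              "metadata_source": "New",
              "metadata_output_dir": "obs://your-bucket/ges/user-friend/",  # placeholder
          },
          "Import_GES_movie-rate": {
              "vertex_table": "vertex_movie",
              "edge_table": "edge_rate",
              "overwrite_repetitive_edges": True,
              "metadata_source": "New",
              "metadata_output_dir": "obs://your-bucket/ges/movie-rate/",  # placeholder
          },
      }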

  6. After configuring the job, click the test button to test it.

    Figure 7 Testing the job

  7. If the job runs properly, click Scheduling Setup in the right pane and configure the scheduling policy for the job.

    Figure 8 Configuring scheduling type

    Parameter descriptions:

    • From: Starting at 00:00 on April 1, 2023, the job is executed at 00:00 every day (the sketch after this parameter list illustrates the resulting run times).
    • Dependency Properties: You can configure a dependency job for this job. You do not need to configure it in this practice.
    • Cross-Cycle Dependency: Select Independent on the previous schedule cycle.
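    This scheduling policy simply means one run per day at midnight, starting on April 1, 2023. The short Python sketch below is only a back-of-the-envelope illustration of the resulting run times, not how DataArts Factory computes its schedule internally.

      from datetime import datetime, timedelta

      # Daily schedule starting at 00:00 on April 1, 2023 (workspace time zone,
      # GMT+08:00 in this practice).
      first_run = datetime(2023, 4, 1, 0, 0)
      for i in range(3):
          print((first_run + timedelta(days=i)).strftime("%Y-%m-%d %H:%M"))
      # 2023-04-01 00:00
      # 2023-04-02 00:00
      # 2023-04-03 00:00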

  8. Click the save and submit button to save the job configuration, and then click the execute scheduling button to start scheduling the job. The job will be executed automatically every day, and the daily data will be imported to GES.
  9. If you want to check the job execution result, choose Monitoring > Monitor Instance in the left navigation pane. You can also spot-check the generated files directly in OBS, as sketched after Figure 9.

    Figure 9 Viewing the job execution status
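    In addition to the Monitor Instance page, you can verify a scheduled run by listing the files that the CDM jobs and metadata generation wrote to OBS. The sketch below assumes the esdk-obs-python SDK; the credentials, endpoint, bucket, and prefix are placeholders to adapt to your environment.

      from obs import ObsClient   # pip install esdk-obs-python

      # Placeholder values; replace with your own credentials, endpoint, and paths.
      client = ObsClient(access_key_id="YOUR_AK",
                         secret_access_key="YOUR_SK",
                         server="https://obs.ap-southeast-1.myhuaweicloud.com")
      resp = client.listObjects("your-bucket", prefix="ges/user-friend/")
      if resp.status < 300:
          for obj in resp.body.contents:
              print(obj.key, obj.size, obj.lastModified)
      client.close()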
