DataArts Studio

      • What's New
      • Service Overview
        • DataArts Studio Infographics
        • What Is DataArts Studio?
        • Basic Concepts
        • Functions
        • Advantages
        • Application Scenarios
        • Versions
        • Billing
        • Permission Management
        • Permissions
        • Notes and Constraints
        • Related Services
      • Data Governance Methodology
        • Purpose
        • Intended Audience
        • Data Governance Framework
          • Framework
          • Data Governance Modules
          • Module Relationships
        • Data Governance Organizational Architecture
          • Framework
          • Responsibilities
        • Measurement and Evaluation System
          • Data Governance Methods
          • Measurement Dimensions
          • Measurement and Scoring Rules
        • Huawei Data Governance Cases
          • Thoughts
          • Practices
          • Effects
        • Thoughts on Data Governance and the Impact of COVID-19
        • Implementation of the Data Governance Methodology
      • Getting Started
        • Quick Start Guide
        • Beginners: DLI-powered Data Development Based on E-commerce BI Reports
          • Scenario
          • Step 1: Prepare Data
          • Step 2: Develop Data
          • Step 3: Unsubscribe from Services
        • Novices: DWS-powered Data Integration and Development Based on Movie Scores
          • Scenario
          • Step 1: Prepare Data
          • Step 2: Integrate Data
          • Step 3: Develop Data
          • Step 4: Unsubscribe from Services
        • Experienced Users: MRS Hive-powered Data Governance Based on Taxi Trip Data
          • Example Scenario
          • Step 1: Design a Process
          • Step 2: Prepare Data
          • Step 3: DataArts Migration
          • Step 4: Metadata Collection
          • Step 5: Design Data Architecture
          • Step 6: Develop Data
          • Step 7: DataArts Quality
          • Step 8: View Data Assets
          • Step 9: Unsubscribe from Services
      • User Guide
        • DataArts Studio Development Process
        • Buying and Configuring a DataArts Studio Instance
          • Buying a DataArts Studio Instance
          • Buying a DataArts Studio Incremental Package
            • Introduction to Incremental Packages
            • Buying a DataArts Migration Incremental Package
            • Buying a DataArts Migration Resource Group Incremental Package
            • Buying a DataArts DataService Exclusive Cluster Incremental Package
          • Accessing the DataArts Studio Instance Console
          • Creating and Configuring a Workspace in Simple Mode
            • Creating a Workspace in Simple Mode
            • Setting Workspace Quotas
            • (Optional) Changing the Job Log Storage Path
          • (Optional) Creating and Using a Workspace in Enterprise Mode
            • Introduction to the Enterprise Mode
            • Creating a Workspace in Enterprise Mode
            • Operations Supported for Different Roles in Enterprise Mode
              • Service Process in Enterprise Mode
              • Admin Operations
              • Developer Operations
              • Deployer Operations
              • Operator Operations
          • Managing DataArts Studio Resources
            • Associating a Real-Time Migration Resource Group with Workspaces
        • Authorizing Users to Use DataArts Studio
          • Creating an IAM User and Assigning DataArts Studio Permissions
          • Authorizing the Use of Real-Time Data Migration
          • Adding Workspace Members and Assigning Roles
        • Management Center
          • Data Sources Supported by DataArts Studio
          • Creating a DataArts Studio Data Connection
          • Configuring DataArts Studio Data Connection Parameters
            • DWS Connection Parameters
            • DLI Connection Parameters
            • MRS Hive Connection Parameters
            • MRS HBase Connection Parameters
            • MRS Kafka Connection Parameters
            • MRS Spark Connection Parameters
            • MRS ClickHouse Connection Parameters
            • MRS Hetu Connection Parameters
            • MRS Impala Connection Parameters
            • MRS Ranger Connection Parameters
            • MRS Presto Connection Parameters
            • Doris Connection Parameters
            • OpenSource ClickHouse Connection Parameters
            • RDS Connection Parameters
            • Oracle Connection Parameters
            • Host Connection Parameters
            • Rest Client Connection Parameters
            • Redis Connection Parameters
            • SAP HANA Connection Parameters
            • LTS Connection Parameters
          • Configuring DataArts Studio Resource Migration
          • Configuring Environment Isolation for a DataArts Studio Workspace in Enterprise Mode
          • Typical Scenarios for Using Management Center
            • Creating a Connection Between DataArts Studio and an MRS Hive Data Lake
            • Creating a Connection Between DataArts Studio and a GaussDB(DWS) Data Lake
            • Creating a Connection Between DataArts Studio and a MySQL Database
        • DataArts Migration (CDM Jobs)
          • Overview
          • Notes and Constraints
          • Supported Data Sources
            • Supported Data Sources (2.10.0.300)
            • Supported Data Sources (2.9.3.300)
            • Supported Data Sources (2.9.2.200)
            • Supported Data Types
          • Creating and Managing a CDM Cluster
            • Creating a CDM Cluster
            • Binding or Unbinding an EIP
            • Restarting a CDM Cluster
            • Deleting a CDM Cluster
            • Downloading CDM Cluster Logs
            • Viewing and Modifying CDM Cluster Configurations
            • Managing and Viewing CDM Metrics
              • CDM Metrics
              • Configuring CDM Alarm Rules
              • Querying CDM Metrics
          • Creating a Link in a CDM Cluster
            • Creating a Link Between CDM and a Data Source
            • Configuring Link Parameters
              • OBS Link Parameters
              • PostgreSQL/SQLServer Link Parameters
              • GaussDB(DWS) Link Parameters
              • RDS for MySQL/MySQL Database Link Parameters
              • Oracle Database Link Parameters
              • DLI Link Parameters
              • Hive Link Parameters
              • HBase Link Parameters
              • HDFS Link Parameters
              • FTP/SFTP Link Parameters
              • Redis Link Parameters
              • DDS Link Parameters
              • CloudTable Link Parameters
              • MongoDB Link Parameters
              • Cassandra Link Parameters
              • Kafka Link Parameters
              • DMS Kafka Link Parameters
              • CSS Link Parameters
              • Elasticsearch Link Parameters
              • Dameng Database Link Parameters
              • SAP HANA Link Parameters
              • Shard Link Parameters
              • MRS Hudi Link Parameters
              • MRS ClickHouse Link Parameters
              • ShenTong Database Link Parameters
              • LogHub (SLS) Link Parameters
              • CloudTable OpenTSDB Link Parameters
              • GBASE Link Parameters
              • YASHAN Link Parameters
            • Uploading a CDM Link Driver
            • Creating a Hadoop Cluster Configuration
          • Creating a Job in a CDM Cluster
            • Table/File Migration Jobs
            • Creating an Entire Database Migration Job
            • Configuring CDM Source Job Parameters
              • From OBS
              • From HDFS
              • From HBase/CloudTable
              • From Hive
              • From DLI
              • From FTP/SFTP
              • From HTTP
              • From PostgreSQL/SQL Server
              • From DWS
              • From SAP HANA
              • From MySQL
              • From Oracle
              • From a Database Shard
              • From MongoDB/DDS
              • From Redis
              • From Kafka/DMS Kafka
              • From Elasticsearch or CSS
              • From OpenTSDB
              • From MRS Hudi
              • From MRS ClickHouse
              • From LogHub (SLS)
              • From a ShenTong Database
              • From a Dameng Database
              • From YASHAN
            • Configuring CDM Destination Job Parameters
              • To OBS
              • To HDFS
              • To HBase/CloudTable
              • To Hive
              • To MySQL/SQL Server/PostgreSQL
              • To Oracle
              • To DWS
              • To DDS
              • To Redis
              • To Elasticsearch/CSS
              • To DLI
              • To OpenTSDB
              • To MRS Hudi
              • To MRS ClickHouse
              • To MongoDB
            • Configuring CDM Job Field Mapping
            • Configuring a Scheduled CDM Job
            • Managing CDM Job Configuration
            • Managing a CDM Job
            • Managing CDM Jobs
          • Using Macro Variables of Date and Time
          • Improving Migration Performance
            • How Migration Jobs Work
            • Performance Tuning
            • Reference: Job Splitting Dimensions
            • Reference: CDM Performance Test Data
          • Key Operation Guide
            • Incremental Migration
              • Incremental File Migration
              • Incremental Migration of Relational Databases
              • HBase/CloudTable Incremental Migration
              • MongoDB/DDS Incremental Migration
            • Migration in Transaction Mode
            • Encryption and Decryption During File Migration
            • MD5 Verification
            • Configuring Field Converters
            • Adding Fields
            • Migrating Files with Specified Names
            • Regular Expressions for Separating Semi-structured Text
            • Recording the Time When Data Is Written to the Database
            • File Formats
            • Converting Unsupported Data Types
            • Auto Table Creation
          • Tutorials
            • Creating an MRS Hive Link
            • Creating a MySQL Link
            • Migrating Data from MySQL to MRS Hive
            • Migrating Data from MySQL to OBS
            • Migrating Data from MySQL to DWS
            • Migrating an Entire MySQL Database to RDS
            • Migrating Data from Oracle to CSS
            • Migrating Data from Oracle to DWS
            • Migrating Data from OBS to CSS
            • Migrating Data from OBS to DLI
            • Migrating Data from MRS HDFS to OBS
            • Migrating the Entire Elasticsearch Database to CSS
          • Error Codes
        • DataArts Migration (Offline Jobs)
          • Overview of Offline Jobs
          • Supported Data Sources
          • Creating an Offline Processing Migration Job
          • Configuring an Offline Processing Migration Job
          • Configuring Source Job Parameters
            • From MySQL
            • From Hive
            • From HDFS
            • From Hudi
            • From PostgreSQL
            • From SQLServer
            • From Oracle
            • From DLI
            • From OBS
            • From SAP HANA
            • From Kafka
            • From Rest Client
            • From DWS
            • From FTP/SFTP
            • From Doris
            • From HBase
            • From ClickHouse
            • From Elasticsearch
            • From MongoDB
            • From RestApi
            • From GBase
            • From Redis
            • From LTS
          • Configuring Destination Job Parameters
            • To PostgreSQL
            • To Oracle
            • To MySQL
            • To SQLServer
            • To Hudi
            • To Hive
            • To DLI
            • To Elasticsearch
            • To DWS
            • To OBS
            • To SAP HANA
            • To ClickHouse
            • To Doris
            • To HBase
            • To MongoDB
            • To MRS Kafka
            • To GBase
            • To Redis
            • To HDFS
          • Configuring Field Converters
          • Adding Fields
        • DataArts Migration (Real-Time Jobs)
          • Overview of Real-Time Jobs
          • Notes and Constraints
          • Supported Data Sources
          • Check Before Use
          • Enabling Network Communications
            • Database Deployed in an On-premises IDC
              • Using Direct Connect to Enable Network Communications
              • Using VPN to Enable Network Communications
              • Using a Public Network to Enable Network Communications
            • Database Deployed on Another Cloud
              • Using Direct Connect to Enable Network Communications
              • Using VPN to Enable Network Communications
              • Using a Public Network to Enable Network Communications
            • Database Deployed on Huawei Cloud
              • Enabling Network Communications Directly for the Same Region and Tenant
              • Using a VPC Peering Connection to Enable Network Communications for the Same Region but Different Tenants
              • Using an Enterprise Router to Enable Network Communications for the Same Region but Different Tenants
              • Using a Cloud Connection to Enable Cross-Region Network Communications
          • Creating a Real-Time Migration Job
          • Configuring a Real-Time Migration Job
          • Real-Time Migration Job O&M
            • Viewing Monitoring Metrics
            • Viewing Synchronization Logs
            • Creating an Alarm Rule
            • Modifying Job Configurations
          • Field Type Mapping
            • Mapping Between MySQL and MRS Hudi Field Types
            • Mapping Between MySQL and GaussDB(DWS) Field Types
            • Mapping Between PostgreSQL and GaussDB(DWS) Field Types
            • Mapping Between PostgreSQL and MRS Hudi Field Types
            • Mapping Between Centralized/Distributed GaussDB and GaussDB(DWS) Field Types
            • Mapping Between Centralized/Distributed GaussDB and MRS Hudi Field Types
            • Mapping Between SQL Server and GaussDB(DWS) Field Types
            • Mapping Between SQL Server and Hudi Field Types
            • Mapping Between Oracle and MRS Hudi Field Types
          • Job Performance Optimization
            • Overview
            • Optimizing Job Parameters
            • Optimizing the Parameters of a Job for Migrating Data from MRS Kafka to MRS Hudi
            • Optimizing the Parameters of a Job for Migrating Data from MySQL to MRS Hudi
            • Optimizing the Parameters of a Job for Migrating Data from MySQL to GaussDB(DWS)
            • Optimizing the Parameters of a Job for Migrating Data from MySQL to
            • Optimizing the Parameters of a Job for Migrating Data from DMS for Kafka to OBS
            • Optimizing the Parameters of a Job for Migrating Data from Apache Kafka to MRS Kafka
            • Optimizing the Parameters of a Job for Migrating Data from SQL Server to MRS Hudi
            • Optimizing the Parameters of a Job for Migrating Data from PostgreSQL to GaussDB(DWS)
            • Optimizing the Parameters of a Job for Migrating Data from PostgreSQL to MRS Hudi
            • Optimizing the Parameters of a Job for Migrating Data from Oracle to GaussDB(DWS)
            • Optimizing the Parameters of a Job for Migrating Data from Oracle to MRS Hudi
            • Optimizing the Parameters of a Job for Migrating Data from SQL Server to GaussDB(DWS)
            • Optimizing the Parameters of a Job for Migrating Data from a Centralized/Distributed GaussDB to GaussDB(DWS)
            • Optimizing the Parameters of a Job for Migrating Data from a Centralized/Distributed GaussDB to MRS Hudi
          • Tutorials
            • Overview
            • Migrating a DRS Task to DataArts Migration
            • Configuring a Job for Synchronizing Data from MySQL to MRS Hudi
            • Configuring a Job for Synchronizing Data from MySQL to GaussDB(DWS)
            • Configuring a Job for Synchronizing Data from MySQL to Kafka
            • Configuring a Job for Synchronizing Data from DMS for Kafka to OBS
            • Configuring a Job for Synchronizing Data from Apache Kafka to MRS Kafka
            • Configuring a Job for Synchronizing Data from to Hudi
            • Configuring a Job for Synchronizing Data from SQL Server to MRS Hudi
            • Configuring a Job for Synchronizing Data from SQL Server to GaussDB(DWS)
            • Configuring a Job for Synchronizing Data from PostgreSQL to GaussDB(DWS)
            • Configuring a Job for Synchronizing Data from PostgreSQL to MRS Hudi
            • Configuring a Job for Synchronizing Data from PostgreSQL to
            • Configuring a Job for Synchronizing Data from Oracle to GaussDB(DWS)
            • Configuring a Job for Synchronizing Data from Oracle to MRS Hudi
            • Configuring a Job for Synchronizing Data from Oracle to
            • Configuring a Job for Synchronizing Data from MongoDB to GaussDB(DWS)
            • Configuring a Job for Migrating Data from a Centralized/Distributed GaussDB to GaussDB(DWS)
            • Configuring a Job for Migrating Data from a Centralized/Distributed GaussDB to MRS Hudi
            • Configuring a Job for Synchronizing Data from a Centralized/Distributed GaussDB to
        • DataArts Architecture
          • Overview
          • DataArts Architecture Use Process
          • Adding Reviewers
          • Data Survey
            • Designing Processes
            • Designing Subjects
            • Logical Models
          • Standards Design
            • Creating a Lookup Table
            • Creating Data Standards
          • Model Design
            • Data Warehouse Planning
            • ER Modeling
            • Dimensional Modeling
              • Creating Dimensions
              • Managing Dimension Tables
              • Creating Fact Tables
            • Data Mart
          • Metric Design
            • Business Metrics
            • Technical Metrics
              • Creating Atomic Metrics
              • Creating Derivative Metrics
              • Creating Compound Metrics
              • Creating Time Filters
          • Common Operations
            • Reversing a Database (ER Modeling)
            • Reversing a Database (Dimensional Modeling)
            • Importing/Exporting Data
            • Associating Quality Rules
            • Viewing Tables
            • Modifying Subjects, Directories, and Processes
            • Managing the Configuration Center
            • Review Center
          • Tutorials
            • DataArts Architecture Example
        • DataArts Factory
          • Overview
          • Data Management
            • Data Management Process
            • Creating a Data Connection
            • Creating a Database
            • (Optional) Creating a Database Schema
            • Creating a Table
          • Script Development
            • Script Development Process
            • Creating a Script
            • Developing Scripts
              • Developing an SQL Script
              • Developing a Shell Script
              • Developing a Python Script
            • Submitting a Version
            • Releasing a Script Task
            • (Optional) Managing Scripts
              • Copying a Script
              • Copying the Script Name and Renaming a Script
              • Moving a Script or Script Directory
              • Exporting and Importing Scripts
              • Viewing Script References
              • Deleting a Script
              • Unlocking a Script
              • Changing the Script Owner
              • Unlocking Scripts
          • Job Development
            • Job Development Process
            • Creating a Job
            • Developing a Pipeline Job
            • Developing a Batch Processing Single-Task SQL Job
            • Developing a Real-Time Processing Single-Task MRS Flink SQL Job
            • Developing a Real-Time Processing Single-Task MRS Flink Jar Job
            • Developing a Real-Time Processing Single-Task DLI Spark Job
            • Setting Up Scheduling for a Job
            • Submitting a Version
            • Releasing a Job Task
            • (Optional) Managing Jobs
              • Copying a Job
              • Copying the Job Name and Renaming a Job
              • Moving a Job or Job Directory
              • Exporting and Importing Jobs
              • Configuring Jobs
              • Deleting a Job
              • Unlocking a Job
              • Viewing a Job Dependency Graph
              • Changing the Job Owner
              • Unlocking Jobs
              • Going to the Monitor Job Page
          • Solution
          • Execution History
          • O&M and Scheduling
            • Overview
            • Monitoring a Job
              • Monitoring a Batch Job
              • Monitoring a Real-Time Job
            • Instance Monitoring
            • Monitoring PatchData
            • Notification Management
              • Managing Notifications
              • Cycle Overview
              • Managing Terminal Subscriptions
            • Managing Backups
            • Operation History
          • Configuration and Management
            • Configuring Resources
              • Configuring Environment Variables
              • Configuring an OBS Bucket
              • Managing Job Tags
              • Configuring a Scheduling Identity
              • Configuring the Number of Concurrently Running Nodes
              • Configuring a Template
              • Configuring a Scheduling Calendar
              • Configuring a Default Item
              • Configuring Task Groups
            • Managing Resources
          • Review Center
          • Download Center
          • Node Reference
            • Node Overview
            • Node Lineages
              • Data Lineage Overview
              • Configuring Data Lineages
              • Viewing Data Lineages
            • CDM Job
            • Data Migration
            • Rest Client
            • Import GES
            • MRS Kafka
            • Kafka Client
            • ROMA FDI Job
            • DLI Flink Job
            • DLI SQL
            • DLI Spark
            • DWS SQL
            • MRS Spark SQL
            • MRS Hive SQL
            • MRS Presto SQL
            • MRS Spark
            • MRS Spark Python
            • MRS ClickHouse
            • MRS Impala SQL
            • MRS Flink Job
            • MRS MapReduce
            • CSS
            • Shell
            • RDS SQL
            • ETL Job
            • Python
            • DORIS SQL
            • ModelArts Train
            • Create OBS
            • Delete OBS
            • OBS Manager
            • Open/Close Resource
            • Data Quality Monitor
            • Subjob
            • For Each
            • SMN
            • Dummy
          • EL Expression Reference
            • Expression Overview
            • Basic Operators
            • Date and Time Mode
            • Env Embedded Objects
            • Job Embedded Objects
            • StringUtil Embedded Objects
            • DateUtil Embedded Objects
            • JSONUtil Embedded Objects
            • Loop Embedded Objects
            • OBSUtil Embedded Objects
            • Examples of Common EL Expressions
            • EL Expression Use Examples
          • Simple Variable Set
          • Usage Guidance
            • Referencing Parameters in Scripts and Jobs
            • Setting the Job Scheduling Time to the Last Day of Each Month
            • Configuring a Yearly Scheduled Job
            • Using PatchData
            • Obtaining the Output of an SQL Node
            • Obtaining the Maximum Value and Transferring It to a CDM Job Using a Query SQL Statement
            • IF Statements
            • Obtaining the Return Value of a Rest Client Node
            • Using For Each Nodes
            • Using Script Templates and Parameter Templates
            • Developing a Python Job
            • Developing a DWS SQL Job
            • Developing a Hive SQL Job
            • Developing a DLI Spark Job
            • Developing an MRS Flink Job
            • Developing an MRS Spark Python Job
        • DataArts Quality
          • Metric Monitoring (Pending Offline)
            • Overview
            • Creating a Metric
            • Creating a Rule
            • Creating a Scenario
            • Viewing a Scenario Instance
          • Monitoring Data Quality
            • Overview
            • Creating a Data Quality Rule
            • Creating a Data Quality Job
            • Creating a Data Comparison Job
            • Viewing Job Instances
            • Viewing Data Quality Reports
          • Tutorials
            • Creating a Business Scenario
            • Creating a Quality Job
            • Creating a Comparison Job
        • DataArts Catalog
          • Viewing the Workspace Data Map
            • Viewing Data Assets in a Workspace
            • Viewing the Asset Overview
            • Viewing Data Assets
            • Managing Asset Tags
          • Configuring Data Access Permissions (To Be Removed)
            • Introduction to Data Permissions (To Be Removed)
            • Configuring Data Catalog Permissions (To Be Removed)
            • Configuring Table Permissions (To Be Removed)
            • Managing Review Center (To Be Removed)
          • Configuring Data Security Policies (To Be Removed)
            • Introduction to Data Security (To Be Removed)
            • Creating a Data Security Level (To Be Removed)
            • Creating a Data Classification (To Be Removed)
            • Creating a Data Masking Policy (To Be Removed)
          • Collecting Metadata of Data Sources
            • Overview
            • Configuring a Metadata Collection Task
            • Viewing Task Monitoring Information
          • Tutorial for Typical Scenarios of DataArts Catalog
            • Configuring an Incremental Metadata Collection Task
            • Viewing Data Lineages Through DataArts Catalog
              • Data Lineage Overview
              • Configuring Data Lineages
              • Viewing Data Lineages
        • DataArts Security
          • Overview
          • Dashboard
          • Unified Permission Governance
            • Permission Governance Process
            • Authorizing dlg_agency
            • Checking the Cluster Version and Permissions
            • Synchronizing IAM Users to the Data Source
            • Controlling Data Access Using Permissions
              • Configuring Workspace Permission Sets
              • Configuring Permission Sets
              • Configuring Roles
              • Managing Members
              • Configuring Row-level Access Control
              • Synchronizing MRS Hive and Hetu Permissions
              • Applying for Permissions and Approving Permissions
              • Enabling Fine-grained Authentication
            • Controlling Service Resource Access
              • Configuring Queue Permissions
              • Configuring Workspace Resource Permission Policies
            • Controlling Ranger Access Using Permissions
              • Configuring Resource Permissions
              • Viewing Permission Reports
          • Sensitive Data Governance
            • Sensitive Data Governance Process
            • Creating Data Security Levels
            • Creating Data Classifications
            • Creating Identification Rules
            • Creating Identification Rule Groups
            • Discovering Sensitive Data
            • Viewing Sensitive Data Distribution
            • Managing Sensitive Data
          • Sensitive Data Protection
            • Overview
            • Static Masking Tasks
              • Managing Masking Algorithms
              • Managing Sample Libraries
              • Managing Masking Policies
              • Managing Static Masking Tasks
            • Dynamic Masking Tasks
              • Managing Dynamic Masking Policies
              • Subscribing to Dynamic Masking Policies
            • Data Watermarks
              • Embedding Data Watermarks
              • Tracing Data Using Watermarks
            • File Watermarks
            • Dynamic Watermarks
          • Data Security Operations
            • Viewing Audit Logs
            • Diagnosing Data Security Risks
          • Managing the Recycle Bin
        • DataArts DataService
          • Overview
          • Specifications
          • Developing APIs in DataArts DataService
            • Buying and Managing an Exclusive Cluster
            • Creating a Reviewer in DataArts DataService
            • Creating an API
              • Generating an API Using Configuration
              • Generating an API Using a Script or MyBatis
            • Debugging an API
            • Publishing an API
            • Managing APIs
              • Managing API Versions
              • Displaying an API
              • Suspending/Restoring an API
              • Unpublishing/Deleting APIs
              • Copying an API
              • Synchronizing APIs
              • Exporting All/Exporting/Importing APIs
            • Orchestrating APIs
              • Overview
              • Configuring an Entry API Operator
              • Configuring a Conditional Branch Operator
              • Configuring a Parallel Processing Operator
              • Configuring an Output Processing Operator
              • Typical API Orchestration Configuration
            • Configuring a Throttling Policy for API Calling
            • Authorizing API Calling
              • Authorizing an API Which Uses App Authentication to Apps
              • Authorizing an API Which Uses IAM Authentication to Apps
              • Authorizing an API Which Uses IAM Authentication Through a Whitelist
          • Calling APIs in DataArts DataService
            • Applying for API Authorization
            • Calling APIs Using Different Methods
              • API Calling Methods
              • (Recommended) Using an SDK to Call an API Which Uses App Authentication
              • Using an API Tool to Call an API Which Uses App Authentication
              • Using an API Tool to Call an API Which Uses IAM Authentication
              • Using an API Tool to Call an API Which Requires No Authentication
              • Using a Browser to Call an API Which Requires No Authentication
          • Viewing API Access Logs
          • Configuring Review Center
        • Audit Log
          • Viewing Traces
          • Key Operations Recorded by CTS
            • Management Center Operations
            • Key CDM Operations Recorded by CTS
            • DataArts Architecture Operations
            • DataArts Factory Operations
            • DataArts Quality Operations
            • DataArts Catalog Operations
            • DataArts DataService Operations
      • Best Practices
        • Advanced Data Migration Guidance
          • Incremental Migration
            • Incremental File Migration
            • Incremental Migration of Relational Databases
            • HBase/CloudTable Incremental Migration
            • MongoDB/DDS Incremental Migration
          • Using Macro Variables of Date and Time
          • Migration in Transaction Mode
          • Encryption and Decryption During File Migration
          • MD5 Verification
          • Configuring Field Converters
          • Adding Fields
          • Migrating Files with Specified Names
          • Regular Expressions for Separating Semi-structured Text
          • Recording the Time When Data Is Written to the Database
          • File Formats
          • Converting Unsupported Data Types
        • Advanced Data Development Guidance
          • Dependency Policies for Periodic Scheduling
            • Comparison Between Traditional Periodic Scheduling Dependency and Natural Periodic Scheduling Dependency
            • Traditional Periodic Scheduling
            • Natural Periodic Scheduling
            • Natural Periodic Scheduling: Same-Period Dependency
            • Natural Periodic Scheduling: Dependency on the Previous Period
          • Scheduling by Discrete Hours and Scheduling by the Nearest Job Instance
          • Using PatchData
          • Setting the Job Scheduling Time to the Last Day of Each Month
          • Obtaining the Output of an SQL Node
          • IF Statements
          • Obtaining the Return Value of a Rest Client Node
          • Using For Each Nodes
          • Invoking DataArts Quality Operators Using DataArts Factory and Transferring Quality Parameters During Job Running
          • Scheduling Jobs Across Workspaces
          • Developing a Flink Jar Job
        • DataArts Studio Data Migration Configuration
          • Overview
          • Management Center Data Migration Configuration
          • DataArts Migration Data Migration Configuration
          • DataArts Architecture Data Migration Configuration
          • DataArts Factory Data Migration Configuration
          • DataArts Quality Data Migration Configuration
          • DataArts Catalog Data Migration Configuration
          • DataArts Security Data Migration Configuration
          • DataArts DataService Data Migration Configuration
        • Least Privilege Authorization
        • How Do I View the Number of Table Rows and Database Size?
        • Comparing Data Before and After Data Migration Using DataArts Quality
        • Configuring Alarms for Jobs in DataArts Factory of DataArts Studio
        • Scheduling a CDM Job by Transferring Parameters Using DataArts Factory
        • Enabling Incremental Data Migration Through DataArts Factory
        • Creating Table Migration Jobs in Batches Using CDM Nodes
        • Automatic Construction and Analysis of Graph Data
          • Scenario
          • Operating Environment and Data Preparation
          • Creating a Data Integration Job
          • Developing and Scheduling an Import GES Job
          • Analyzing Graph Data
        • Simplified Migration of Trade Data to the Cloud and Analysis
          • Scenario
          • Analysis Process
          • Using CDM to Upload Data to OBS
            • Uploading Inventory Data
            • Uploading Incremental Data
          • Analyzing Data
      • SDK Reference
        • SDK Overview
        • REST API SDK Reference
        • DataArts DataService SDK Reference
          • DataArts DataService SDK Overview
          • Preparations for Using an SDK
          • Common Error Codes and Messages for SDK Invocation
          • Calling APIs Through App Authentication
            • Preparation
            • Java
            • Go
            • Python
            • C#
            • JavaScript
            • PHP
            • C++
            • C
            • Android
            • curl
            • Other Programming Languages
      • API Reference
        • Before You Start
          • Overview
          • API Calling
          • Concepts
          • Endpoints
          • Project ID and Account ID
          • Instance ID and Workspace ID
          • Data Asset GUID
          • Constraints
        • API Overview
          • DataArts Migration API Overview
          • DataArts Factory API Overview
          • Management Center API Overview
          • DataArts Architecture API Overview
          • DataArts Quality API Overview
          • DataArts Catalog API Overview
          • DataArts DataService API Overview
          • DataArts Security API Overview
        • Calling APIs
          • Making an API Request
          • Authentication
          • Response
        • DataArts Migration APIs
          • Cluster Management
            • Querying Cluster Details
            • Deleting a Cluster
            • Querying All AZs
            • Querying Supported Versions
            • Querying Version Specifications
            • Querying Details About a Flavor
            • Querying the Enterprise Project IDs of All Clusters
            • Querying the Enterprise Project ID of a Specified Cluster
            • Querying a Specified Instance in a Cluster
            • Modifying a Cluster
            • Restarting a Cluster
            • Starting a Cluster
            • Stopping a Cluster (To Be Taken Offline)
            • Creating a Cluster
            • Querying the Cluster List
          • Job Management
            • Querying a Job
            • Deleting a Job
            • Modifying a Job
            • Creating and Executing a Job in a Random Cluster
            • Stopping a Job
            • Creating a Job in a Specified Cluster
            • Starting a Job
            • Querying Job Status
            • Querying Job Execution History
          • Link Management
            • Creating a Link
            • Querying a Link
            • Deleting a Link
            • Modifying a Link
          • Public Data Structures
            • Link Parameter Description
              • Link to a Relational Database
              • Link to OBS
              • Link to HDFS
              • Link to HBase
              • Link to CloudTable
              • Link to Hive
              • Link to an FTP or SFTP Server
              • Link to MongoDB
              • Link to Redis
              • Link to Kafka
              • Link to Elasticsearch/Cloud Search Service
              • Link to DLI
              • Link to DMS Kafka
            • Source Job Parameters
              • From a Relational Database
              • From Object Storage
              • From HDFS
              • From Hive
              • From HBase/CloudTable
              • From FTP/SFTP
              • From HTTP/HTTPS
              • From MongoDB/DDS
              • From Redis
              • From DIS
              • From Kafka
              • From Elasticsearch/Cloud Search Service
            • Destination Job Parameters
              • To a Relational Database
              • To OBS
              • To HDFS
              • To Hive
              • To HBase/CloudTable
              • To DDS
              • To Elasticsearch/Cloud Search Service
              • To DLI
              • To DIS
            • Job Parameter Description
        • DataArts Factory APIs
          • Script Development APIs
            • Creating a Script
            • Modifying a Script
            • Querying Script Details
            • Querying a Script List
            • Querying the Execution Result of a Script Instance
            • Deleting a Script
            • Executing a Script
            • Stopping the Execution of a Script Instance
          • Resource Management APIs
            • Creating a Resource
            • Modifying a Resource
            • Querying Resource Details
            • Deleting a Resource
            • Querying a Resource List
          • Job Development APIs
            • Creating a Job
            • Modifying a Job
            • Viewing a Job List
            • Viewing Job Details
            • Viewing a Job File
            • Exporting a Job
            • Batch Exporting Jobs
            • Importing a Job
            • Executing a Job Immediately
            • Starting a Job
            • Stopping a Job
            • Deleting a Job
            • Stopping a Job Instance
            • Rerunning a Job Instance
            • Viewing Running Status of a Real-Time Job
            • Viewing a Job Instance List
            • Viewing Job Instance Details
            • Querying System Task Details
        • DataArts Factory APIs (V2)
          • Job Development APIs
            • Creating a PatchData Instance
            • Querying PatchData Instances
            • Stopping a PatchData Instance
            • Changing a Job Name
            • Querying Release Packages
            • Querying Details About a Release Package
            • Configuring Job Tags
            • Querying Alarm Notifications
            • Releasing Task Packages
            • Canceling Task Packages
            • Querying the Instance Execution Status
            • Querying Completed Tasks
            • Querying Instances of a Specified Job
        • Manager API
          • Data Connection Management
            • Querying the Data Source List
            • Creating a Data Connection
            • Testing Data Connection Creation
            • Querying Information About a Data Connection
            • Updating Data Connection Information
            • Deleting a Data Connection
          • Buying an Instance
            • Buying a DataArts Studio Instance
          • Workspace Management
            • Obtaining the Workspace List
            • Creating a Workspace
            • Obtaining Information About a Workspace
          • Instance Management
            • Obtaining the Instance List
          • Workspace User Management
            • Obtaining Workspace Roles
            • Editing a Workspace User or User Group
            • Obtaining Workspace User Information
            • Adding a Workspace User
            • Deleting a Workspace User
          • Metadata Acquisition
            • Obtaining the Database List
            • Obtaining Schemas
            • Obtaining Tables in the Data Source
            • Obtaining Table Fields in the Data Source
          • Instance Specifications Change
            • Changing Specifications
        • DataArts Architecture APIs
          • Overview
            • Overview Statistics
            • Model Statistics
            • Relational Modeling Statistics
            • Standard Coverage Statistics
          • Information Architecture
            • Querying Information About Multiple Types of Tables
          • Data Standards
            • Obtaining Data Standards
            • Creating a Data Standard
            • Deleting Data Standards
            • Modifying Data Standards
            • Viewing Data Standard Details
            • Associated Attributes and Data Standards
          • Data Sources
            • Obtaining Data Connection Information
          • Process Architecture
            • Obtaining All BPA Directory Trees
            • Querying the BPA List
            • Creating a BPA
            • Modifying a BPA
            • Deleting a BPA
            • Querying BPA Details
          • Data Standard Templates
            • Querying a Data Standard Template
            • Creating a Data Standard Template
            • Modifying a Data Standard Template
            • Deleting a Data Standard Template
            • Initializing Data Standard Templates
          • Approval Management
            • Obtaining an Application
            • Withdrawing an Application
            • Processing an Application
            • Publishing in Batches
            • Taking Services Offline in Batches
            • Creating an Approver
            • Querying the Approver List
            • Deleting an Approver
            • Deleting an Entity
            • Obtaining the Differences Between Displayed Information and the Released Entity
          • Subject Management
            • Querying the Topic List
            • Deleting a Topic
            • Creating a Topic
            • Modifying a Topic
            • Obtaining Topic Tree Information
            • Querying the Topic List (New)
            • Deleting a Topic (New)
            • Creating a Topic (New)
            • Modifying a Topic (New)
          • Subject Levels
            • Obtaining Topic Levels
            • Modifying or Deleting a Topic Level
          • Catalog Management
            • Obtaining All Directories
            • Creating a Directory
            • Modifying a Directory
            • Deleting a Directory
          • Atomic Metrics
            • Searching for Atomic Metrics
            • Creating Atomic Metrics
            • Updating an Atomic Metric
            • Deleting Atomic Metrics
            • Viewing Atomic Metric Details
          • Derivative Metrics
            • Searching for Derivative Metrics
            • Creating a Derivative Metric
            • Updating a Derivative Metric
            • Deleting Derivative Metrics
            • Viewing Derivative Metric Details
          • Compound Metrics
            • Searching for Compound Metrics
            • Creating a Compound Metric
            • Updating a Compound Metric
            • Deleting Compound Metrics
            • Viewing Compound Metric Details
          • Dimensions
            • Searching for Dimensions
            • Creating a Dimension
            • Updating a Dimension
            • Deleting Dimensions
            • Viewing Dimension Details
            • Viewing Dimension Granularity
            • Viewing a Reverse Dimension Table Task
          • Filters
            • Searching for a Service Filter
            • Viewing Filter Details
          • Dimension Tables
            • Searching for a Dimension Table
            • Deleting Dimension Tables
            • Viewing Dimension Table Details
          • Fact Tables
            • Searching for a Fact Table
            • Deleting Fact Tables
            • Viewing Fact Table Details
            • Viewing a Reverse Fact Table Task
          • Summary Tables
            • Searching for a Summary Table
            • Creating Summary Tables
            • Updating a Summary Table
            • Deleting Summary Tables
            • Viewing Summary Table Details
          • Business Metrics
            • Querying Business Metric Information
            • Creating a Business Metric
            • Updating Business Metrics
            • Deleting Business Metrics
            • Viewing Business Metric Details
            • Viewing Business Metric Dimension Information
            • Viewing Metric Owner Information
            • Obtaining Metric Association Information
          • Version Information
            • Searching for Version Information
            • Comparing Versions
          • ER Modeling
            • Querying the Table Model List
            • Creating a Table Model
            • Updating a Table Model
            • Deleting a Table Model
            • Querying a Relationship
            • Viewing Relationship Details
            • Querying All Relationships in a Model
            • Viewing Table Model Details
            • Obtaining a Model
            • Creating a Model Workspace
            • Updating the Model Workspace
            • Deleting a Model Workspace
            • Viewing Details About a Model
            • Querying Destination Tables and Fields (To Be Taken Offline)
            • Exporting DDL Statements of Tables in a Model
            • Converting a Logical Model to a Physical Model
            • Obtaining the Operation Result
          • Import and Export
            • Importing Models, ER Models, Dimensional Models, Lookup Tables, Business Metrics, and Process Architectures
            • Importing Subjects
            • Querying the Import Result
            • Exporting Service Data
            • Obtaining the Excel Export Result
          • Customized Items
            • Querying Customized Items
            • Modifying a Customized Item
          • Quality Rules
            • Updating the Abnormal Data Output Configuration of a Table
            • Clearing Quality Rules
          • Data Warehouse Layers
            • Obtaining Data Warehouse Layers
            • Modifying or Deleting a Data Warehouse Layer
          • SQL Statement Previewing
            • Previewing SQL Statements
          • Tag API
            • Adding a Tag
            • Deleting a Tag
          • Lookup Table Management
            • Querying the Lookup Table List
            • Creating a Lookup Table
            • Deleting Lookup Tables
            • Modifying a Lookup Table
            • Viewing Lookup Table Details
            • Viewing Field Values in a Lookup Table
            • Editing Lookup Table Field Values
        • DataArts Quality APIs
          • Catalogs
            • Obtaining Job Catalogs
          • Rule Templates
            • Obtaining the Rule Template List
            • Creating a Rule Template
            • Obtaining Rule Template Details
            • Updating a Rule Template
            • Deleting Rule Templates
          • Quality Jobs
            • Obtaining the Quality Job List
            • Obtaining Quality Job Details
            • Deleting Quality Jobs
          • Comparison Jobs
            • Obtaining the Comparison Job List
            • Obtaining Comparison Job Details
            • Deleting Comparison Jobs
          • O&M
            • Obtaining the Task Execution Result List
            • Obtaining the Instance Result
            • Processing and Recording DataArts Quality O&M Operations
          • Quality Reports
            • Obtaining a Quality Report Scoring System
            • Obtaining the Quality Report Overview
            • Obtaining the Quality Report Trend
            • Obtaining the Quality Report Rules
            • Obtaining Sub-rule Fields of a Quality Report
            • Obtaining the Technical Report Data
            • Obtaining the Business Report Data
          • Import/Export
            • Exporting Resources
            • Obtaining the Status of an Import/Export Task
            • Downloading a Resource File
            • Uploading a Resource File
            • Importing Resources
          • Job Instances
            • Stopping Instances
        • DataArts Catalog APIs
          • Logical Assets
            • Querying Logical Assets
            • Querying the Logical Asset Directory Tree
          • Metric Assets
            • Querying the Metric Asset Directory Tree
            • Querying Metric Assets
          • Asset Statistics
            • Obtaining Technical Asset Statistics
            • Obtaining Logical Asset Statistics
          • Asset Management
            • Querying Technical Assets
            • Obtaining Asset Details by GUID
            • Adding or Modifying an Asset
            • Deleting an Asset
          • Asset Classifications
            • Associating Assets with a Classification
            • Associating a Classification with One or More Assets
            • Dissociating a Classification from an Asset
          • Asset Security Levels
            • Associating a Security Level with Assets
            • Associating a Security Level with an Asset
            • Dissociating a Security Level from an Asset
          • Lineages
            • Creating a Lineage
          • Metadata Collection Tasks
            • Querying the Metadata Collection Task List
            • Creating a Metadata Collection Task
            • Querying Details About a Metadata Collection Task
            • Modifying a Metadata Collection Task
            • Deleting a Metadata Collection Task
            • Starting, Scheduling, or Stopping a Metadata Collection Task
            • Obtaining Logs of a Metadata Collection Task
          • Data Map
            • Synchronizing Metadata in Real Time (Invitational Test)
            • Displaying Search and Query Tags on Multiple Pages (Invitational Test)
            • User Behavior Analysis (Invitational Test)
            • Asset Search (Invitational Test)
            • Creating or Modifying an Asset (Invitational Test)
            • Asset Details (Invitational Test)
            • Deleting an Asset (Invitational Test)
            • Obtaining an Asset Lineage (Invitational Test)
            • Batch Lineage (Invitational Test)
            • Querying Operators Associated with a Table (Invitational Test)
            • Output Information (Invitational Test)
            • Tagging Assets (Invitational Test)
            • Obtaining Queues (Invitational Test)
            • Previewing Data (Invitational Test)
          • Tags
            • Querying the Tag List
            • Associating Tags with an Asset
          • Lineage Information
            • Querying Tables Without Lineage
            • Querying Lineages
            • Importing Lineages
          • Asset Information
            • Previewing Table Data
          • Asset
            • Querying a Profile
            • Querying the Summary of Specified Fields
        • DataArts DataService APIs
          • API Management
            • Creating an API
            • Querying an API List
            • Updating an API
            • Querying API Information
            • Deleting APIs
            • Publishing an API
            • API Operations (Unpublishing, Suspending, and Resuming)
            • Authorizing APIs in Batches (Exclusive Edition)
            • Debugging an API
            • API Authorization Operations (Authorizing, Canceling Authorization, Applying, and Renewing)
            • Querying API Publishing Messages in DLM Exclusive
            • Querying Instances for API Operations in DLM Exclusive
            • Querying API Debugging Messages in DLM Exclusive
            • Importing an Excel File Containing APIs
            • Exporting an Excel File Containing APIs
            • Exporting a .zip File Containing All APIs
            • Downloading an Excel Template
          • Application Management
            • Querying the Application List
            • Reviewing Applications
            • Obtaining Application Details
          • Message Management
            • Querying the Message List
            • Processing Messages
            • Obtaining Message Details
          • Authorization Management
            • Querying Apps Bound to an API
            • Querying Authorized APIs of an App
          • Service Catalog Management
            • Obtaining the List of APIs and Catalogs in a Catalog
            • Obtaining the List of APIs in a Catalog
            • Obtaining the List of Sub-Catalogs in a Catalog
            • Updating a Service Catalog
            • Querying a Service Catalog
            • Creating a Service Catalog
            • Deleting Directories in Batches
            • Moving a Catalog to Another Catalog
            • Moving APIs to Another Catalog
            • Obtaining the ID of a Catalog Through Its Path
            • Obtaining the Path of a Catalog Through Its ID
            • Obtaining the Paths to a Catalog Through Its ID
            • Querying the Service Catalog API List
          • Gateway Management
            • Obtaining a Gateway Instance (Exclusive Edition)
            • Obtaining a Gateway Group
          • App Management
            • Creating an App
            • Querying the App List
            • Updating an App
            • Deleting an App
            • Querying the Details About an App
          • Overview
            • Querying and Collecting Statistics on User-related Overview Development Metrics
            • Querying and Collecting Statistics on User-related Overview Invocation Metrics
            • Querying Top N API Services Invoked
            • Querying Top N Services Used by an App
            • Querying API Statistics Details
            • Querying App Statistics
            • Querying API Dashboard Data Details
            • Querying Data Details of a Specified API Dashboard
            • Querying App Dashboard Data Details
            • Querying Top N APIs Called by a Specified App
          • Cluster Management
            • Querying the List of Cluster Overview Information
            • Querying the List of Cluster Details
            • Querying Cluster Details
            • Querying Access Logs of a DataArts DataService Cluster
            • Enabling Log Dump to OBS for a DataArts DataService Cluster
            • Enabling Log Dump to LTS for a DataArts DataService Cluster
        • DataArts Security APIs
          • Permission Management
            • Creating a Permission Set
            • Querying the Permission Set List
            • Querying a Permission Set
            • Deleting a Permission Set
            • Updating Permission Sets
            • Adding a Member to a Permission Set
            • Querying the Member List of a Permission Set
            • Adding Members to a Permission Set
            • Deleting Permission Set Members in Batches
            • Adding Permissions to a Permission Set
            • Querying the Permission List of a Permission Set
            • Deleting a Permission from a Permission Set
            • Updating the Permissions of a Permission Set
            • Querying Permissions Available for a Data Source
            • Querying the URL Information Configured in the Permission Set
            • Querying Data Operations
            • Querying a User's Permissions on a Table
            • Querying Table Permissions
          • Identification Rules
            • Querying the Identification Rule List
            • Adding an Identification Rule
            • Querying a Specific Identification Rule
            • Deleting a Sensitive Data Identification Rule
            • Modifying an Identification Rule
            • Deleting Identification Rules in Batches
            • Modifying the Identification Rule Status
            • Creating a Combined Identification Rule
            • Modifying a Combined Identification Rule
            • Testing a Combined Identification Rule
          • Rule Groups
            • Querying the Rule Group List
            • Creating a Rule Group
            • Modifying a Rule Group
            • Querying a Rule Group
            • Deleting a Rule Group
          • Data Permission Query
            • Querying the Configurable Operation Permission of a Role on a Group of Databases and Tables
          • Data Security Levels
            • Obtaining Data Security Levels
            • Creating a Data Security Level
            • Querying a Data Security Level by ID
            • Deleting a Data Security Level by ID
            • Modifying a Data Security Level by ID
            • Deleting Data Security Levels in Batches
          • Permission Applications
            • Querying DataArts Factory Data Connections That Support Fine-grained Authentication
            • Updating the Fine-grained Authentication Status of Data Development Connections in Batches
            • Testing Fine-grained Connectivity of Data Development Connections
          • Sensitive Data Distribution
            • Querying the Sensitive Data Discovery Overview Result (by Category and Security Level)
          • User Synchronization
            • Querying User Synchronization Tasks
            • Querying a User Synchronization Task
          • Queue Permissions
            • Querying Queues Allocated to the Current Workspace
            • Allocating Queues to a Workspace
            • Modifying a Queue in the Current Workspace
            • Deleting a Queue from the Current Workspace
          • Data Classification
            • Importing a Preset Classification
          • Data Security Diagnosis
            • Diagnosing Data Security
            • Querying Improper Permission Configurations
            • Querying the Diagnosis Result of the Data Permission Control Module
            • Querying the Diagnosis Result of the Sensitive Data Protection Module
            • Querying Tables for Which No Static Masking Task Has Been Performed
            • Querying the Diagnosis Result of the Data Source Protection Module
          • Workspace Resource Permission Policy Management
            • Querying Workspace Resource Permission Policies
            • Creating a Workspace Resource Permission Policy
            • Querying a Workspace Resource Permission Policy
            • Updating a Workspace Resource Permission Policy
            • Deleting Workspace Resource Permission Policies
          • Security Administrator
            • Obtaining the Security Administrator
            • Creating or Updating a Security Administrator
          • Dynamic Data Masking
            • Querying Dynamic Data Masking Policies
            • Creating a Dynamic Data Masking Policy
            • Querying Details About a Dynamic Data Masking Policy
            • Updating a Dynamic Data Masking Policy
            • Deleting Dynamic Masking Policies
          • Permission Approval
            • Obtaining the List of Table Permission Approvers
            • Submitting a Table Permission Application
            • Approving a Service Ticket
            • Rejecting a Service Ticket
            • Obtaining the Approval Service Ticket List
          • Permission Application
            • Withdrawing a Permission Application Service Ticket
          • Data Classifications
            • Querying the Data Classification List
          • Permission Review
            • Approving Service Tickets
            • Rejecting Service Tickets
        • Application Cases
          • Example of Using DataArts Migration APIs
          • Example of Using DataArts Factory APIs
        • Appendix
          • Common Message Headers
          • Parsing a Stream in a Response Message
          • Status Codes
          • Error Codes
            • DataArts Migration Error Codes
            • DataArts Factory Error Codes
      • FAQs
        • Consultation and Billing
          1. How Do I Select a Region and an AZ?
          2. What Is the Relationship Between DataArts Studio and Huawei Horizon Digital Platform?
          3. What Are the Differences Between DataArts Studio and ROMA?
          4. Can DataArts Studio Be Deployed in a Local Data Center or on a Private Cloud?
          5. How Do I Create a Fine-Grained Permission Policy in IAM?
          6. How Do I Isolate Workspaces So That Users Cannot View Unauthorized Workspaces?
          7. What Should I Do If a User Cannot View Workspaces After I Have Assigned the Required Policy to the User?
          8. What Should I Do If Insufficient Permissions Are Prompted When I Am Trying to Perform an Operation as an IAM User?
          9. Can I Delete DataArts Studio Workspaces?
          10. Can I Transfer a Purchased or Trial Instance to Another Account?
          11. Does DataArts Studio Support Version Upgrade?
          12. Does DataArts Studio Support Version Downgrade?
          13. How Do I View the DataArts Studio Instance Version?
          14. Why Can't I Select a Specified IAM Project When Purchasing a DataArts Studio Instance?
          15. What Is the Session Timeout Period of DataArts Studio? Can the Session Timeout Period Be Modified?
          16. Will My Data Be Retained If My Package Expires or My Pay-per-Use Resources Are in Arrears?
          17. How Do I Check the Remaining Validity Period of a Package?
          18. Why Isn't the CDM Cluster in a DataArts Studio Instance Billed?
          19. Why Does the System Display a Message Indicating that the Number of Daily Executed Nodes Has Reached the Upper Limit? What Should I Do?
        • Management Center
          1. Which Data Sources Can DataArts Studio Connect To?
          2. What Are the Precautions for Creating Data Connections?
          3. What Should I Do If Database or Table Information Cannot Be Obtained Through a GaussDB(DWS)/Hive/HBase Data Connection?
          4. Why Are MRS Hive/HBase Clusters Not Displayed on the Page for Creating Data Connections?
          5. What Should I Do If a GaussDB(DWS) Connection Test Fails When SSL Is Enabled for the Connection?
          6. Can I Create Multiple Connections to the Same Data Source in a Workspace?
          7. Should I Select the API or Proxy Connection Type When Creating a Data Connection in Management Center?
          8. How Do I Migrate the Data Development Jobs and Data Connections from One Workspace to Another?
        • DataArts Migration (CDM Jobs)
          1. What Are the Differences Between CDM and Other Data Migration Services?
          2. What Are the Advantages of CDM?
          3. What Are the Security Protection Mechanisms of CDM?
          4. How Do I Reduce the Cost of Using CDM?
          5. Will I Be Billed If My CDM Cluster Does Not Use the Data Transmission Function?
          6. Why Am I Billed on a Pay-per-Use Basis When I Have Purchased a Yearly/Monthly CDM Incremental Package?
          7. How Do I Check the Remaining Validity Period of a Package?
          8. Can CDM Be Shared by Different Tenants?
          9. Can I Upgrade a CDM Cluster?
          10. What Is the Migration Performance of CDM?
          11. What Is the Number of Concurrent Jobs for Different CDM Cluster Versions?
          12. Does CDM Support Incremental Data Migration?
          13. Does CDM Support Field Conversion?
          14. What Component Versions Are Recommended for Migrating Hadoop Data Sources?
          15. What Data Formats Are Supported When the Data Source Is Hive?
          16. Can I Synchronize Jobs to Other Clusters?
          17. Can I Create Jobs in Batches?
          18. Can I Schedule Jobs in Batches?
          19. How Do I Back Up CDM Jobs?
          20. What Should I Do If Only Some Nodes in a HANA Cluster Can Communicate with the CDM Cluster?
          21. How Do I Use Java to Invoke CDM RESTful APIs to Create Data Migration Jobs?
          22. How Do I Connect the On-Premises Intranet or Third-Party Private Network to CDM?
          23. Does CDM Support Parameters or Variables?
          24. How Do I Set the Number of Concurrent Extractors for a CDM Migration Job?
          25. Does CDM Support Real-Time Migration of Dynamic Data?
          26. Can I Stop CDM Clusters?
          27. How Do I Obtain the Current Time Using an Expression?
          28. What Should I Do If the Log Prompts that the Date Format Fails to Be Parsed?
          29. What Can I Do If the Map Field Tab Page Cannot Display All Columns?
          30. How Do I Select Distribution Columns When Using CDM to Migrate Data to GaussDB(DWS)?
          31. What Do I Do If the Error Message "value too long for type character varying" Is Displayed When I Migrate Data to DWS?
          32. What Can I Do If Error Message "Unable to execute the SQL statement" Is Displayed When I Import Data from OBS to SQL Server?
          33. What Should I Do If the Cluster List Is Empty, I Have No Access Permission, or My Operation Is Denied?
          34. Why Is Error ORA-01555 Reported During Migration from Oracle to DWS?
          35. What Should I Do If Migration Using a MongoDB Connection Fails?
          36. What Should I Do If a Hive Migration Job Is Suspended for a Long Period of Time?
          37. What Should I Do If an Error Is Reported Because the Field Type Mapping Does Not Match During Data Migration Using CDM?
          38. What Should I Do If a JDBC Connection Timeout Error Is Reported During MySQL Migration?
          39. What Should I Do If a CDM Migration Job Fails After a Link from Hive to GaussDB(DWS) Is Created?
          40. How Do I Use CDM to Export MySQL Data to an SQL File and Upload the File to an OBS Bucket?
          41. What Should I Do If CDM Fails to Migrate Data from OBS to DLI?
          42. What Should I Do If a CDM Connector Reports the Error "Configuration Item [linkConfig.iamAuth] Does Not Exist"?
          43. What Should I Do If Error "Configuration Item [linkConfig.createBackendLinks] Does Not Exist" or "Configuration Item [throttlingConfig.concurrentSubJobs] Does Not Exist" Is Reported?
          44. What Should I Do If Message "CORE_0031:Connect time out. (Cdm.0523)" Is Displayed During the Creation of an MRS Hive Link?
          45. What Should I Do If Message "CDM Does Not Support Auto Creation of an Empty Table with No Column" Is Displayed When I Enable Auto Table Creation?
          46. What Should I Do If I Cannot Obtain the Schema Name When Creating an Oracle Relational Database Migration Job?
          47. What Should I Do If invalid input syntax for integer: "true" Is Displayed During MySQL Database Migration?
        • DataArts Architecture
          1. What Is the Relationship Between Lookup Tables and Data Standards?
          2. What Are the Differences Between ER Modeling and Dimensional Modeling?
          3. What Data Modeling Methods Are Supported by DataArts Architecture?
          4. How Can I Use Standardized Data?
          5. Does DataArts Architecture Support Database Reversing?
          6. What Are the Differences Between the Metrics in DataArts Architecture and DataArts Quality?
          7. Why Doesn't the Table in the Database Change After I Have Modified Fields in an ER or Dimensional Model?
          8. Can I Configure Lifecycle Management for Tables?
          9. How Should I Select a Subject When a Public Dimension (Date, Region, Supplier, or Product) Is Shared by Multiple Subject Areas?
          10. How Can I Create an Atomic Metric Based on a Dimension Table When Only Fact Tables Can Be Selected for Atomic Metric Creation?
        • DataArts Factory
          1. How Many Jobs Can Be Created in DataArts Factory? Is There a Limit on the Number of Nodes in a Job?
          2. Does DataArts Studio Support Custom Python Scripts?
          3. How Can I Quickly Rectify a Deleted CDM Cluster Associated with a Job?
          4. Why Is There a Large Difference Between a Job's Execution Time and Its Start Time?
          5. Will Subsequent Jobs Be Affected If a Job Fails to Be Executed During Scheduling of Dependent Jobs? What Should I Do?
          6. What Should I Pay Attention to When Using DataArts Studio to Schedule Big Data Services?
          7. What Are the Differences and Relationships Between Environment Variables, Job Parameters, and Script Parameters?
          8. What Should I Do If a Job Log Cannot Be Opened and Error 404 Is Reported?
          9. What Should I Do If the Agency List Fails to Be Obtained During Agency Configuration?
          10. Why Can't I Select Specified Peripheral Resources When Creating a Data Connection in DataArts Factory?
          11. Why Can't I Receive Job Failure Alarm Notifications After I Have Configured SMN Notifications?
          12. Why Is There No Job Running Scheduling Log on the Monitor Instance Page After Periodic Scheduling Is Configured for a Job?
          13. Why Isn't the Error Cause Displayed on the Console When a Hive SQL or Spark SQL Script Fails?
          14. What Should I Do If the Token Is Invalid During the Execution of a Data Development Node?
          15. How Do I View Run Logs After a Job Is Tested?
          16. Why Does a Job Scheduled by Month Start Running Before the Job Scheduled by Day Is Complete?
          17. What Should I Do If Invalid Authentication Is Reported When I Run a DLI Script?
          18. Why Can't I Select a Desired CDM Cluster in Proxy Mode When Creating a Data Connection?
          19. Why Is There No Job Running Scheduling Record After Daily Scheduling Is Configured for the Job?
          20. What Do I Do If No Content Is Displayed in Job Logs?
          21. Why Do I Fail to Establish a Dependency Between Two Jobs?
          22. What Should I Do If an Error Is Reported During Job Scheduling in DataArts Studio, Indicating that the Job Has Not Been Submitted?
          23. What Should I Do If an Error Is Reported During Job Scheduling in DataArts Studio, Indicating that the Script Associated with Node XXX in the Job Has Not Been Submitted?
          24. What Should I Do If a Job Fails to Be Executed After Being Submitted for Scheduling and the Error "Depend Job [XXX] Is Not Running Or Pause" Is Displayed?
          25. How Do I Create Databases and Data Tables? Do Databases Correspond to Data Connections?
          26. Why Is No Result Displayed After a Hive Task Is Executed?
          27. Why Is the Last Instance Status on the Monitor Instance Page Either Successful or Failed?
          28. How Do I Configure Notifications for All Jobs?
          29. What Is the Maximum Number of Nodes That Can Be Executed Simultaneously?
          30. Can I Change the Time Zone of a DataArts Studio Instance?
          31. How Do I Synchronize the Changed Names of CDM Jobs to DataArts Factory?
          32. Why Does the Execution of an RDS SQL Statement Fail with an Error Indicating That hll Does Not Exist?
          33. What Should I Do If Error Message "The account has been locked" Is Displayed When I Am Creating a DWS Data Connection?
          34. What Should I Do If a Job Instance Is Canceled and Message "The node start execute failed, so the current node status is set to cancel." Is Displayed?
          35. What Should I Do If Error Message "Workspace does not exists" Is Displayed When I Call a DataArts Factory API?
          36. Why Don't the URL Parameters for Calling an API Take Effect in the Test Environment When the API Can Be Called Properly Using Postman?
          37. What Should I Do If Error Message "Agent need to be updated?" Is Displayed When I Run a Python Script?
          38. Why Is an Execution Failure Displayed for a Node in the Log When the Node Status Is Successful?
          39. What Should I Do If an Unknown Exception Occurs When I Call a DataArts Factory API?
          40. Why Is an Error Message Indicating an Invalid Resource Name Displayed When I Call a Resource Creation API?
          41. Why Does a PatchData Task Fail When All PatchData Job Instances Are Successful?
          42. Why Is a Table Unavailable When an Error Message Indicating that the Table Already Exists Is Displayed During Table Creation from a DWS Data Connection?
          43. What Should I Do If Error Message "The throttling threshold has been reached: policy user over ratelimit,limit:60,time:1 minute." Is Displayed When I Schedule an MRS Spark Job?
          44. What Should I Do If Error Message "UnicodeEncodeError: 'ascii' codec can't encode characters in position 63-64: ordinal not in range(128)" Is Displayed When I Run a Python Script?
          45. What Should I Do If an Error Message Is Displayed When I View Logs?
          46. What Should I Do If a Shell/Python Node Fails and Error "session is down" Is Reported?
          47. What Should I Do If a Parameter Value in a Request Header Contains More Than 512 Characters?
          48. What Should I Do If a Message Is Displayed Indicating that the ID Does Not Exist During the Execution of a DWS SQL Script?
          49. How Do I Check Which Jobs Invoke a CDM Job?
          50. What Should I Do If Error Message "The request parameter invalid" Is Displayed When I Use Python to Call the API for Executing Scripts?
          51. What Should I Do If the Default Queue of a New DLI SQL Script in DataArts Factory Has Been Deleted?
          52. Does the Event-based Scheduling Type in DataArts Factory Support Offline Kafka?
          53. What Should I Do If an Error Is Reported When I Submit an MRS Job with an Agency Through an MRS Cluster Connected Using an MRS API or the MRS Tenant Plane?
        • DataArts Quality
          1. What Are the Differences Between Quality Jobs and Comparison Jobs?
          2. How Can I Confirm that a Quality Job or Comparison Job Is Blocked?
          3. How Do I Manually Restart a Blocked Quality Job or Comparison Job?
          4. How Do I View Jobs Associated with a Quality Rule Template?
          5. What Should I Do If the System Displays a Message Indicating that I Do Not Have the MRS Permission to Perform a Quality Job?
        • DataArts Catalog
          1. What Are the Functions of the DataArts Catalog Module?
          2. What Assets Can Be Collected by DataArts Catalog?
          3. What Is Data Lineage?
          4. How Do I Visualize Data Lineages in a Data Catalog?
        • DataArts Security
          1. Why Isn't Data Masked Based on a Specified Rule After a Data Masking Task Is Executed?
          2. What Should I Do If a Message Is Displayed Indicating that Necessary Request Parameters Are Missing When I Approve a GaussDB(DWS) Permission Application?
          3. What Should I Do If Error Message "FATAL: Invalid username/password,login denied" Is Displayed During the GaussDB(DWS) Connectivity Check When Fine-grained Authentication Is Enabled?
          4. What Should I Do If Error Message "Failed to obtain the database" Is Displayed When I Select a Database in DataArts Factory After Fine-grained Authentication Is Enabled?
          5. Why Does the System Display a Message Indicating Insufficient Permissions During Permission Synchronization to DLI?
        • DataArts DataService
          1. What Languages Do DataArts DataService SDKs Support?
          2. What Can I Do If the System Displays a Message Indicating that the Proxy Fails to Be Invoked During API Creation?
          3. What Should I Do If the Backend Reports an Error When I Access the Test App Through the Data Service API and Set Related Parameters?
          4. How Many Times Can a Subdomain Name Be Accessed Using APIs Every Day?
          5. Can Operators Be Included When API Parameters Are Passed?
          6. What Should I Do If No More APIs Can Be Created When the API Quota in the Workspace Is Used Up?
          7. How Can I Access APIs of DataArts DataService Exclusive from the Internet?
          8. How Can I Access APIs of DataArts DataService Exclusive Using Domain Names?
          9. What Should I Do If It Takes a Long Time to Obtain the Total Number of Data Records of a Table Through an API If the Table Contains a Large Amount of Data?
      • Videos