
Versions

Updated on 2024-12-05 GMT+08:00
Select a DataArts Studio version with caution based on the functions and specifications you need.
  • After you buy an instance of a specific version, you cannot directly downgrade it. For example, if you have bought an enterprise version instance, you cannot directly downgrade it to the starter version. Instead, you need to back up the instance data, unsubscribe from the instance, buy a new instance, and migrate the backup data to the new instance.
  • If your business volume keeps growing and the version you have bought no longer meets your requirements, you can upgrade the instance. To do so, log in to the DataArts Studio console, locate the target DataArts Studio instance, click Upgrade, and buy a package with higher specifications.

Version Scenarios

The version mode of DataArts Studio has changed in some regions to provide flexible resource configuration and lightweight data governance capabilities. For details about the old and new version modes, see Version Mode.
  • You can now purchase DataArts Studio instances of the starter, expert, or enterprise version.
  • The version mode change does not affect the DataArts Studio instances you have purchased before, which may be of the starter, basic, advanced, professional, or enterprise version.

Compared with the old version mode, the new version mode provides more favorable prices and more flexible resource scaling. If you want to experience the new version mode, you are advised to buy a new DataArts Studio instance, migrate service data from the original instance to the new instance by referring to DataArts Studio Data Migration Configuration, and then unsubscribe from the original instance.

Table 1 lists the recommended application scenarios of each version.

Table 1 Recommended application scenarios for each DataArts Studio version

| Version Mode | Version | Application Scenario |
| --- | --- | --- |
| New version mode | Starter | In the initial construction phase, the data lake project mainly manages data ETL tasks in big data development scenarios and does not involve data governance. |
| New version mode | Expert | Small- and medium-sized enterprises (SMEs) with full-time data development and governance personnel that require lightweight data governance capabilities, such as data quality, data assets, and data services, and prioritize cost-effectiveness. |
| New version mode | Enterprise | Medium- and large-sized enterprises with a complete data management team and system that need to implement the enterprise information architecture, data standards, data models, and data metrics to match the complete DAYU data governance methodology. |
| Old version mode | Starter | An early-stage data lake project with no full-time data development engineers and no data governance needs. |
| Old version mode | Basic | One or two full-time data development engineers and up to 1,000 data tables. |
| Old version mode | Advanced | Five to ten full-time data development engineers, clear data standards, efficient data quality management, and up to 2,000 data tables. |
| Old version mode | Professional | Medium and large enterprises with a team of 10 to 30 full-time data development engineers and well-designed systems. |
| Old version mode | Enterprise | Large enterprises and enterprises with multiple branches. |

Version Specifications (New Version Mode)

The new version mode provides the starter, expert, and enterprise versions. Table 2 and Table 3 list the components and specifications of each version, respectively.

Table 2 Components supported by DataArts Studio (√: supported; x: not supported)

| DataArts Studio Component | Starter | Expert | Enterprise |
| --- | --- | --- | --- |
| DataArts Migration | √ | √ | √ |
| Management Center | √ | √ | √ |
| DataArts Architecture | x | x | √ |
| DataArts Factory | √ | √ | √ |
| DataArts Quality | x | Available, but does not support business metric monitoring, comparison jobs, or quality reports. | √ |
| DataArts Catalog | x | Supported. However, data directories (categories, tags, and collection tasks) cannot be exported through resource migration in Management Center. | √ |
| DataArts DataService | x | √ | √ |
| DataArts Security | x | Available, but does not support data watermarking or source tracing. | √ |

Table 3 DataArts Studio version specifications (a single instance)

| Specification | Starter | Expert | Enterprise |
| --- | --- | --- | --- |
| DataArts Studio CDM cluster[1] | 1 cluster (cdm.medium: 4 vCPUs, 8 GB memory) | 1 cluster (cdm.medium: 4 vCPUs, 8 GB memory) | 1 cluster (cdm.medium: 4 vCPUs, 8 GB memory) |
| Job scheduling times/day[2] | 5,000 | 5,000 | 5,000 |
| Number of technical assets[3] | Not supported | 500 | 5,000 |
| Number of data models[4] | Not supported | Not supported | 100 |

Notes:

[1] DataArts Studio CDM cluster: a free cluster provided with the DataArts Studio instance. It can be used as an agent for the data connections in Management Center. However, you are not advised to run data migration jobs on the cluster while it serves as an agent. To obtain a CDM cluster for running CDM jobs, buy a CDM incremental package. For details, see Buying a CDM Incremental Package.

[2] Job scheduling times/day: the total number of scheduling times per day of data development jobs, quality jobs, comparison jobs, service scenarios, and metadata collection jobs. You can expand the capacity using the job node scheduling times/day incremental package. For details, see Buying a Job Node Scheduling Times/Day Incremental Package. The number of scheduling times of a data development job per day is measured by node (including the Dummy node) and covers PatchData tasks but not test runs or retries upon failure. For example, if a job contains two DWS SQL nodes and one Dummy node, starts to run at 00:00 every day, is scheduled every 10 hours, and a PatchData task is performed on the current day to patch data of the last 10 days, the job's number of scheduling times is 66 (2 x 3 + 2 x 3 x 10) for the current day and 6 (2 x 3) for every following day.

In addition, if the total number of scheduling times already used, currently in use, and still to be used by job nodes on the current day exceeds the quota of this version, a message indicating that the number of job node scheduling times/day exceeds the quota is displayed when a batch processing job is scheduled or a real-time job is started.
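The arithmetic in note [2] can be sketched as a small helper. This is a hypothetical illustration, not part of any DataArts Studio API: the function name and parameters are invented here, and the factor names simply mirror the "2 x 3" figures in the note's example. A day's count is the base product, and a PatchData task adds that base multiplied by the number of patched days.

```python
# Hypothetical helper illustrating the scheduling-times arithmetic in note [2].
# Not a DataArts Studio API; the factors mirror the "2 x 3" example in the note.

def daily_scheduling_times(factor_a: int, factor_b: int, patchdata_days: int = 0) -> int:
    """Return the number of scheduling times counted against the daily quota."""
    base = factor_a * factor_b            # e.g. 2 x 3 = 6 in the note's example
    return base + base * patchdata_days   # PatchData adds base x patched days

# Reproducing the note's example: 66 on the day the 10-day PatchData
# task runs, and 6 on every following day.
print(daily_scheduling_times(2, 3, patchdata_days=10))  # 66
print(daily_scheduling_times(2, 3))                     # 6
```

Running the helper against the example in the note reproduces its two figures, which makes it easy to estimate whether a planned job fits within the 5,000 scheduling times/day quota of this version.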

[3] Number of technical assets: the number of tables and OBS files in DataArts Catalog. You can expand the capacity using the technical asset quantity incremental package. For details about how to purchase the package, see Buying an Incremental Package for Technical Asset Quantity.

[4] Number of data models: the number of logical models, physical models, dimension tables, fact tables, and summary tables in DataArts Architecture. You can expand the capacity using the data model quantity incremental package. For details about how to purchase the package, see Buying an Incremental Package for Data Model Quantity.

Version Specifications (Old Version Mode)

Table 4 Components supported by DataArts Studio (√: supported; x: not supported)

| DataArts Studio Component | Starter | Basic | Advanced | Professional | Enterprise |
| --- | --- | --- | --- | --- | --- |
| DataArts Migration | √ | √ | √ | √ | √ |
| Management Center | √ | √ | √ | √ | √ |
| DataArts Architecture | x | √ | √ | √ | √ |
| DataArts Factory | √ | √ | √ | √ | √ |
| DataArts Quality | x | √ | √ | √ | √ |
| DataArts Catalog | x | √ | √ | √ | √ |
| DataArts DataService | x | √ | √ | √ | √ |
| DataArts Security | x | √ | √ | √ | √ |

Table 5 DataArts Studio version specifications (a single instance)

| Specification | Starter | Basic | Advanced | Professional | Enterprise |
| --- | --- | --- | --- | --- | --- |
| DataArts Studio CDM cluster[1] | 1 cluster (cdm.medium: 4 vCPUs, 8 GB memory) | 1 cluster (cdm.medium: 4 vCPUs, 8 GB memory) | 1 cluster (cdm.large: 8 vCPUs, 16 GB memory) | 1 cluster (cdm.xlarge: 16 vCPUs, 32 GB memory) | 1 cluster (cdm.xlarge: 16 vCPUs, 32 GB memory) |
| Job node scheduling times/day[2] | 5,000 | 20,000 | 40,000 | 80,000 | 200,000 |
| Number of technical assets[3] | Not supported | 1,000 | 2,000 | 4,000 | 10,000 |
| Number of data models[4] | Not supported | 1,000 | 2,000 | 4,000 | 10,000 |

Notes:

[1] DataArts Studio CDM cluster: Due to specification restrictions, the free CDM cluster provided with a DataArts Studio instance can be used only for informal scenarios such as testing and trial use. To run your migration workloads, buy a CDM incremental package. In addition, you are not advised to run data migration jobs on a CDM cluster that serves as a data connection agent. For details, see Buying a CDM Incremental Package.

[2] Job node scheduling times/day: the total number of scheduling times per day of data development jobs, quality jobs, comparison jobs, service scenarios, and metadata collection jobs. The number of scheduling times of a data development job per day is measured by node (including the Dummy node) and covers PatchData tasks but not test runs or retries upon failure. For example, if a job contains two DWS SQL nodes and one Dummy node, starts to run at 00:00 every day, is scheduled every 10 hours, and a PatchData task is performed on the current day to patch data of the last 10 days, the job's number of scheduling times is 66 (2 x 3 + 2 x 3 x 10) for the current day and 6 (2 x 3) for every following day.

In addition, if the total number of scheduling times already used, currently in use, and still to be used by job nodes on the current day exceeds the quota of this version, a message indicating that the number of job node scheduling times/day exceeds the quota is displayed when a batch processing job is scheduled or a real-time job is started.

[3] Number of technical assets: the number of tables and OBS files in DataArts Catalog.

[4] Number of data models: the number of logical models, physical models, dimension tables, fact tables, and summary tables in DataArts Architecture.

Version Mode

The version mode of DataArts Studio has changed in some regions to provide flexible resource configuration and lightweight data governance capabilities.

For details about the differences between the new and old version modes, see Table 6.

Table 6 Comparison between the old and new version modes

| Difference | Old Version Mode | New Version Mode |
| --- | --- | --- |
| Provided versions | Starter: data integration + data development<br>Basic, advanced, professional, and enterprise: data integration + data development + data governance | Starter: data integration + data development<br>Expert: data integration + data development + lightweight data governance<br>Enterprise: data integration + data development + data governance |
| Lightweight data governance capabilities | No. Except the starter version, all versions provide full-function data governance capabilities, which are costly. | Yes. The expert version provides lightweight data governance capabilities to meet the data governance requirements of small- and medium-sized enterprises. |
| Supported incremental packages | Only function incremental packages are provided:<br>- CDM incremental package<br>- DataArts DataService incremental package | Both function and specification incremental packages are available. (For details about how to buy them, see Buying a DataArts Studio Incremental Package.)<br>Function incremental packages:<br>- CDM incremental package<br>- DataArts Migration resource group incremental package<br>- DataArts DataService incremental package<br>Specification incremental packages:<br>- Job scheduling times/day incremental package<br>- Technical asset quantity incremental package<br>- Data model quantity incremental package |
