Updated on 2024-11-05 GMT+08:00

Introduction to Incremental Packages

DataArts Studio provides basic packages and incremental packages, and billing is based on both. If the basic package cannot meet your demands, you can buy an incremental package.

DataArts Studio Incremental Packages

Table 1 lists the incremental packages provided by DataArts Studio.

Table 1 Incremental packages

Each package below is described by its package type, description, application scenario, and purchase mode.

DataArts Migration incremental package

Description: A DataArts Migration (that is, CDM) incremental package provides resources for a CDM cluster.
  • When you buy a pay-per-use CDM incremental package, the system automatically creates a CDM cluster based on the specifications you select for the incremental package.
  • When you buy a CDM incremental package which is billed based on a package, the system does not automatically create a CDM cluster. Instead, you can use a CDM cluster you have obtained on the DataArts Studio console for 745 hours each month within the validity period of the incremental package.
CDM clusters can be used in the following scenarios:
  • Data migration jobs can be created and run in CDM clusters to migrate data to the cloud or import data to the data lake.
  • CDM clusters can be used as agents of data connections in Management Center, which enable communications between DataArts Studio instances and data sources.
Scenario: A DataArts Studio instance contains a CDM cluster that can be used for informal scenarios such as testing and trial use.
  • If the cluster meets your needs, you do not need to buy a CDM incremental package.
  • If you need another CDM cluster that can meet your needs, buy a pay-per-use CDM incremental package.
  • If you want to reduce the costs of your CDM cluster, you can buy a CDM incremental package billed based on a package.
NOTE:

Due to specifications restrictions, the free CDM cluster provided by a DataArts Studio instance can only be used for informal scenarios such as testing and trial use. To run your migration workloads, buy a CDM incremental package. In addition, you are advised not to use a CDM cluster that serves as a data connection agent to run data migration jobs.

Purchase Mode: Pay-per-use or Package

DataArts Migration resource group incremental package

Description: This type of incremental package provides resource groups for real-time jobs in DataArts Migration. DataArts Migration resource groups can be used to migrate data to the cloud and to ingest data into and export data out of a data lake. They provide wizard-based configuration and management and can integrate full, incremental, and real-time data from a single table, an entire database, or database and table shards.
  • When you buy a pay-per-use DataArts Migration resource group incremental package, the system automatically creates a resource group required by real-time data integration jobs based on the specifications you set for the incremental package.
  • When you buy a DataArts Migration resource group incremental package which is billed based on a package, the system does not automatically create a resource group. Instead, you can use a resource group you have obtained on the DataArts Studio console for 745 hours each month within the validity period of the incremental package.

DataArts Migration resource groups can be used in the following scenarios:

Data migration jobs can be created and run using DataArts Migration resource groups to migrate data to the cloud or import data to the data lake.

Scenario: By default, a DataArts Studio instance does not contain DataArts Migration resource groups. If you want to migrate data offline or in real time, create a DataArts Migration resource group incremental package.

Purchase Mode: Pay-per-use or Package

DataArts DataService Exclusive cluster incremental package

Description: This package corresponds to a DataArts DataService Exclusive cluster. When you buy a DataArts DataService Exclusive cluster incremental package, the system automatically creates a DataArts DataService Exclusive cluster based on your selected specifications.

Scenario: DataArts DataService is a standard data service platform that allows you to generate data APIs quickly from data tables. Using the APIs, you can open your data in a simple, fast, low-cost, and secure way. To use DataArts DataService, you must create a DataArts DataService Exclusive cluster first.

A DataArts Studio instance does not contain a DataArts DataService Exclusive cluster. To use DataArts DataService, you must create a DataArts DataService Exclusive cluster incremental package.

Purchase Mode: Yearly/Monthly

Job node scheduling times/day incremental package

Description: This package is used to increase the quota of job node scheduling times/day.

Scenario: The quota of job node scheduling times/day varies depending on the DataArts Studio instance version. This quota refers to the total number of times data development jobs, quality jobs, comparison jobs, scenarios, and metadata collection jobs are scheduled per day. The number of scheduling times of a data development job per day is measured by node (including Dummy nodes) and includes PatchData tasks, but excludes test runs and retries upon failure. You can locate a DataArts Studio instance, click More, and select Quota Usage to view this quota.
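For illustration, the daily consumption of this quota can be estimated as in the following sketch. The job names, node counts, and run counts are hypothetical, and the helper function is not a DataArts Studio API; only the counting rules (per-node counting including Dummy nodes, PatchData runs included, test runs and retries excluded) come from the description above.

```python
# Illustrative only: estimates daily consumption of the job node
# scheduling times/day quota, following the counting rules above.
# The job figures below are hypothetical, not queried from DataArts Studio.

def data_development_times(node_count: int, runs_per_day: int) -> int:
    """Each node (including Dummy nodes) counts once per scheduled run.
    PatchData runs count; test runs and retries upon failure do not."""
    return node_count * runs_per_day

# Example: one batch job with 5 nodes scheduled hourly (24 runs/day),
# plus a quality job and a metadata collection job run once a day each.
daily_total = (
    data_development_times(node_count=5, runs_per_day=24)  # 120
    + 1   # quality job scheduled once per day
    + 1   # metadata collection job scheduled once per day
)
print(daily_total)  # 122 scheduling times consumed per day
```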
NOTE:

The maximum number of concurrent data development job nodes of a DataArts Studio instance is related to the job node scheduling times/day quota of the instance.

  • When the job node scheduling times/day quota is less than or equal to 500, the maximum number of concurrent nodes is 10.
  • When the quota is greater than 500 and less than or equal to 5,000, the maximum number of concurrent nodes is 50.
  • When the quota is greater than 5,000 and less than or equal to 20,000, the maximum number of concurrent nodes is 100.
  • When the quota is greater than 20,000 and less than or equal to 40,000, the maximum number of concurrent nodes is 200.
  • When the quota is greater than 40,000 and less than or equal to 80,000, the maximum number of concurrent nodes is 300.
  • When the quota is greater than 80,000, the maximum number of concurrent nodes is 400.
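The list above is effectively a step function from the daily scheduling quota to the maximum concurrency. The following is a minimal sketch of that mapping; the function itself is illustrative and not a DataArts Studio API, but the thresholds and limits are taken from the list above.

```python
# Illustrative sketch of the quota-to-concurrency mapping listed above.
# Thresholds and limits come from the documentation; the function is
# not part of any DataArts Studio API.

def max_concurrent_nodes(daily_quota: int) -> int:
    """Return the maximum number of concurrent data development job
    nodes for a given job node scheduling times/day quota."""
    thresholds = [
        (500, 10),
        (5_000, 50),
        (20_000, 100),
        (40_000, 200),
        (80_000, 300),
    ]
    for quota_limit, concurrency in thresholds:
        if daily_quota <= quota_limit:
            return concurrency
    return 400  # quota greater than 80,000

print(max_concurrent_nodes(4_000))    # 50
print(max_concurrent_nodes(100_000))  # 400
```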
If the number of job node scheduling times per day is close to or has reached the upper limit, or if you want to increase the maximum number of concurrent nodes, you are advised to purchase a job node scheduling times/day incremental package.
NOTE:

If the sum of the used, in-use, and to-be-used scheduling times of job nodes on the current day exceeds the upper limit of your version, a message indicating that the job node scheduling times/day quota has been exceeded is displayed when a batch processing job is scheduled or a real-time job is started.

Purchase Mode: Yearly/Monthly

Technical asset quantity incremental package

Description: This package is used to increase the quota of the technical asset quantity.

Scenario: The maximum number of technical assets varies depending on the DataArts Studio instance version. This quota is calculated based on the total number of tables and OBS files in DataArts Catalog. You can locate a DataArts Studio instance, click More, and select Quota Usage to view this quota.

If the number of your technical assets is close to or has reached the upper limit, you are advised to purchase a technical asset quantity incremental package.

Purchase Mode: Yearly/Monthly

Data model quantity incremental package

Description: This package is used to increase the quota of the data model quantity.

Scenario: The maximum number of data models varies depending on the DataArts Studio instance version. This quota is calculated based on the total number of logical models, physical models, dimension tables, fact tables, and summary tables in DataArts Architecture. You can locate a DataArts Studio instance, click More, and select Quota Usage to view this quota.

If the number of your data models is close to or has reached the upper limit, you are advised to purchase a data model quantity incremental package.

Purchase Mode: Yearly/Monthly