Managing Resources

Updated on 2024-08-30 GMT+08:00

You can upload custom code or text files as resources on the Manage Resource page and schedule them when running nodes. Nodes that can invoke resources include DLI Spark, MRS Spark, DLI Flink Job, and MRS MapReduce.

When you create a resource, you associate a file with it. Jobs can then reference the resource directly. If the resource file changes, you only need to update the file location that the resource points to; the job configuration does not need to be modified. For details about resource usage examples, see Developing a DLI Spark Job.
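The value of this indirection can be sketched with a small model (the names below are hypothetical and illustrative only; DataArts Studio does not expose such a Python API): jobs store a resource name, and only the registry entry changes when the file moves.

```python
# Minimal model of resource indirection (illustrative only; not a real
# DataArts Studio API). Jobs store a resource *name*; the registry maps
# that name to the current file location.

resources = {"etl-udfs": "obs://my-bucket/v1/udfs.jar"}  # name -> file path

jobs = [
    {"name": "daily_spark_job", "resource": "etl-udfs"},
    {"name": "hourly_flink_job", "resource": "etl-udfs"},
]

def resolve(job):
    """Return the file path a job actually loads at run time."""
    return resources[job["resource"]]

# Updating the resource file requires one change in the registry...
resources["etl-udfs"] = "obs://my-bucket/v2/udfs.jar"

# ...and every job picks up the new location without being edited.
for job in jobs:
    print(job["name"], "->", resolve(job))
```

Both jobs resolve to the new path even though neither job definition was touched, which is why a resource-file change does not require editing job configurations.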

Constraints

This function depends on OBS or MRS HDFS.

(Optional) Creating a Directory

If a directory exists, you do not need to create one.

  1. Log in to the DataArts Studio console by following the instructions in Accessing the DataArts Studio Instance Console.
  2. On the DataArts Studio console, locate a workspace and click DataArts Factory.
  3. In the left navigation pane, choose Configuration > Manage Resource.
  4. In the directory list, click the button for creating a directory. In the displayed dialog box, configure the directory parameters described in Table 1.
    Table 1 Resource directory parameters

    | Parameter | Description |
    |---|---|
    | Directory Name | Name of the resource directory. The name must contain 1 to 32 characters, including only letters, numbers, underscores (_), and hyphens (-). |
    | Select Directory | Parent directory of the resource directory. The parent directory is the root directory by default. |

  5. Click OK.
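The naming rule in Table 1 can be checked with a regular expression before you type a name into the dialog box. The helper below is illustrative only (it is not part of any DataArts Studio API); the console performs its own validation.

```python
import re

# Naming rule from Table 1: 1 to 32 characters, only letters, digits,
# underscores (_), and hyphens (-). Illustrative helper, not a product API.
_NAME_RE = re.compile(r"^[A-Za-z0-9_-]{1,32}$")

def is_valid_directory_name(name: str) -> bool:
    return bool(_NAME_RE.fullmatch(name))

print(is_valid_directory_name("spark_jobs-v2"))   # valid
print(is_valid_directory_name(""))                # too short
print(is_valid_directory_name("a" * 33))          # too long
print(is_valid_directory_name("bad name"))        # space not allowed
```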

Creating a Resource

Ensure that you have enabled OBS before creating a resource.

  1. In the left navigation pane, choose Configuration > Manage Resource.
  2. Click Create Resource. In the displayed dialog box, configure resource parameters. Table 2 describes the resource parameters. Click OK.
    Table 2 Resource management parameters

    | Parameter | Mandatory | Description |
    |---|---|---|
    | Name | Yes | Name of the resource. The name must contain 1 to 32 characters, including only letters, numbers, underscores (_), and hyphens (-). |
    | Type | Yes | File type of the resource. Possible values: jar (JAR file), pyFile (user Python file), file (user file), or archive (user AI model file; supported file name extensions are zip, tgz, tar.gz, tar, and jar). |
    | Resource Location | Yes | Location of the resource. OBS and HDFS are supported. HDFS supports only MRS Spark, MRS Flink Job, and MRS MapReduce nodes. |
    | File Path | Yes | When Resource Location is set to OBS, select an OBS file path. When Resource Location is set to HDFS, select an MRS cluster name. |
    | Depended Package | No | JAR package that the resource depends on and that has been uploaded to OBS. This parameter is available only for DLI Spark nodes and is required when Type is set to jar or pyFile. |
    | Select Directory | Yes | Directory to which the resource belongs. The root directory is selected by default. |
    | Description | No | Descriptive information about the resource. |
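The constraints in Table 2 can be sanity-checked client-side before filling in the dialog box. The function below is an illustrative sketch only (not a DataArts Studio API), and the console performs its own validation.

```python
import re

# Constraints taken from Table 2. Illustrative only; not a product API.
ARCHIVE_EXTS = (".zip", ".tgz", ".tar.gz", ".tar", ".jar")
HDFS_NODES = {"MRS Spark", "MRS Flink Job", "MRS MapReduce"}

def check_resource(name, rtype, location, path, node_type=None):
    """Return a list of rule violations; an empty list means OK."""
    errors = []
    if not re.fullmatch(r"[A-Za-z0-9_-]{1,32}", name):
        errors.append("name must be 1-32 letters, digits, _ or -")
    if rtype not in {"jar", "pyFile", "file", "archive"}:
        errors.append("unknown type")
    if rtype == "archive" and not path.endswith(ARCHIVE_EXTS):
        errors.append("archive must end with zip/tgz/tar.gz/tar/jar")
    if location == "HDFS" and node_type not in HDFS_NODES:
        errors.append("HDFS supports only MRS Spark, MRS Flink Job, "
                      "and MRS MapReduce nodes")
    return errors

print(check_resource("udfs", "jar", "OBS", "obs://bkt/udfs.jar"))            # []
print(check_resource("model", "archive", "HDFS", "/m.tar.gz", "MRS Spark"))  # []
print(check_resource("model", "archive", "HDFS", "/m.bin", "DLI Spark"))
```

The last call reports two violations: the archive extension is invalid, and an HDFS resource cannot be used with a DLI Spark node.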

Editing a Resource

After a resource is created, you can modify resource parameters.

  1. In the left navigation pane, choose Configuration > Manage Resource.
  2. In the Operation column of the resource, click Edit. In the displayed dialog box, modify the resource parameters. For details, see Table 2.
  3. Click OK.

Deleting a Resource

You can delete resources that are no longer needed.

Before deleting a resource, ensure that it is not used by any jobs.

NOTICE:

If you attempt to delete a resource that is in use by jobs, the Delete Resource dialog box is displayed. When you click OK, the Reference List dialog box appears, listing the jobs that use the resource. Click View in the Operation column of a job to go to its details page.

  1. In the left navigation pane, choose Configuration > Manage Resource.
  2. In the Operation column of the resource, click Delete. The Delete Resource dialog box is displayed.
  3. Click Yes.

Importing a Resource

To import a resource, perform the following operations:

  1. In the left navigation pane, choose Configuration > Manage Resource.
  2. In the resource directory, click the menu button and select Import Resource. The Import Resource dialog box is displayed.
  3. Select the resource file that has been uploaded to OBS and click Next. After the import is complete, click Close.

Exporting a Resource

To export a resource, perform the following operations:

  1. In the left navigation pane, choose Configuration > Manage Resource.
  2. In the resource directory, select a resource, click the menu button, and select Export Resource. The system starts downloading the resource to the local PC.

Viewing Resource References

To view the references of a resource, perform the following operations:

  1. In the left navigation pane, choose Configuration > Manage Resource.
  2. Right-click a resource in the list and select View Reference.
  3. In the displayed Reference List dialog box, view the references of the resource.
    Figure 1 Reference List dialog box
