Creating a Resource
Function
This API is used to create a resource. Nodes of types such as DLI Spark, MRS Spark, and MRS MapReduce can reference JAR packages and properties files through resources.
URI

POST /v1/{project_id}/resources

- Parameter description
Table 1 URI parameter

Parameter | Mandatory | Type | Description
---|---|---|---
project_id | Yes | String | Project ID. For details about how to obtain a project ID, see Project ID and Account ID.
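For illustration, a minimal sketch of how the request URI is assembled from a project ID. The endpoint host below is a placeholder, not part of this document; only the /v1/{project_id}/resources path comes from this API.

```python
# Minimal sketch: building the request URI from a project ID.
# The endpoint host is hypothetical; only the /v1/{project_id}/resources path is from this API.
endpoint = "https://dataartsstudio.example.com"
project_id = "b384b9e9ab9b4ee8994c8633aabc9505"  # sample project ID reused from the Example section
uri = f"{endpoint}/v1/{project_id}/resources"
print(uri)  # send a POST request with a JSON body to this URI to create a resource
```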
Request
Table 2 Request header parameter

Parameter | Mandatory | Type | Description
---|---|---|---
workspace | No | String | Workspace ID.
Table 3 Request body parameters

Parameter | Mandatory | Type | Description
---|---|---|---
name | Yes | String | Name of the resource. The name contains a maximum of 32 characters, including only letters, numbers, underscores (_), and hyphens (-).
type | Yes | String | Resource type.
location | Yes | String | OBS path for storing the resource file. When type is set to jar, location is the path for storing the main JAR package. The path contains a maximum of 1,023 characters. For example, obs://myBucket/test.jar.
dependFiles | No | List&lt;String&gt; | JAR packages and properties files that the main JAR package depends on. The value contains a maximum of 10,240 characters.
desc | No | String | Description of the resource. The description contains a maximum of 255 characters.
directory | No | String | Directory for storing the resource. Access the DataArts Studio console and choose Data Development. In the left navigation pane, you can view the created directories in the directory tree of the script. The default directory is the root directory.
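The following sketch shows one way the request might be assembled and sent with Python's standard library. The endpoint host, token value, and the X-Auth-Token and workspace headers are assumptions for illustration, not definitions from this document; the body fields mirror the table above and the Example section.

```python
import json
import urllib.request

# Placeholders: the endpoint host and token are not defined in this document.
endpoint = "https://dataartsstudio.example.com"
project_id = "b384b9e9ab9b4ee8994c8633aabc9505"   # sample project ID from the Example section
token = "<IAM token>"

# Request body fields as documented in the table above.
body = {
    "name": "test",                                # up to 32 chars: letters, digits, _ and -
    "type": "jar",                                 # resource type
    "location": "obs://00000000dlf-test/hadoop-mapreduce-examples-2.4.1.jar",
    "dependFiles": [                               # files the main JAR package depends on
        "obs://00000000dlf-test/depend1.jar",
        "obs://00000000dlf-test/depend2.jar",
    ],
    "desc": "test",                                # up to 255 chars
    "directory": "/resource",                      # omit to use the root directory
}

request = urllib.request.Request(
    url=f"{endpoint}/v1/{project_id}/resources",
    data=json.dumps(body).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "X-Auth-Token": token,                     # assumed authentication header
        "workspace": "<workspace ID>",             # optional; sent as a header here (assumption)
    },
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(json.loads(response.read()))             # expected to contain "resourceId"
```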
Response
Table 4 Response parameter

Parameter | Mandatory | Type | Description
---|---|---|---
resourceId | Yes | String | Resource ID.
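A successful call returns a JSON body containing resourceId. A minimal sketch of extracting it; the value shown is a placeholder, not real output.

```python
import json

# Hypothetical response body; the resourceId value is a placeholder, not actual output.
raw_body = '{"resourceId": "<resource ID returned by the service>"}'
resource_id = json.loads(raw_body)["resourceId"]  # ID of the newly created resource
print(resource_id)
```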
Example
- Request
POST /v1/b384b9e9ab9b4ee8994c8633aabc9505/resources
{
  "name": "test",
  "type": "jar",
  "location": "obs://00000000dlf-test/hadoop-mapreduce-examples-2.4.1.jar",
  "dependFiles": [
    "obs://00000000dlf-test/depend1.jar",
    "obs://00000000dlf-test/depend2.jar"
  ],
  "desc": "test",
  "directory": "/resource"
}
Status Codes
See Status Codes.