Step 3: Sending Data to DIS
Function
Local data is continuously uploaded to DIS.
Data can be stored in MRS, DIS, OBS, and DLI. For details about how to configure a storage location, see Creating a Dump Task.
DIS retains data only for the period specified by Data Retention (days).
Sample Code
The sample code is in the ProducerDemo.java file in the \dis-sdk-demo\src\main\java\com\bigdata\dis\sdk\demo directory, which is obtained by decompressing the huaweicloud-sdk-dis-java-X.X.X.zip package downloaded from the DIS SDK.
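The following is a minimal producer sketch illustrating the same flow as ProducerDemo.java: build a DIS client from access credentials, assemble a batch of records, and upload them with putRecords. The package, class, and method names shown here (com.huaweicloud.dis.DISClient, DISConfig, PutRecordsRequest, PutRecordsRequestEntry, putRecords) are assumed from the open-source DIS Java SDK and may differ from the com.bigdata.dis.sdk package that appears in the log output below; the AK/SK, project ID, region, endpoint, and stream name are placeholders. Use the decompressed ProducerDemo.java as the authoritative reference.

// Producer sketch under assumed names (see the note above); ProducerDemo.java in the
// downloaded package is the authoritative example.
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

import com.huaweicloud.dis.DISClient;
import com.huaweicloud.dis.DISConfig;
import com.huaweicloud.dis.iface.data.request.PutRecordsRequest;
import com.huaweicloud.dis.iface.data.request.PutRecordsRequestEntry;
import com.huaweicloud.dis.iface.data.response.PutRecordsResult;

public class SimpleProducer {
    public static void main(String[] args) {
        // Placeholder credentials and endpoint; replace with your own values
        // (see Obtaining Authentication Information).
        DISConfig disConfig = DISConfig.buildDefaultConfig()
                .setAK("YOUR_AK")
                .setSK("YOUR_SK")
                .setProjectId("YOUR_PROJECT_ID")
                .setRegion("YOUR_REGION")
                .setEndpoint("YOUR_DIS_ENDPOINT");

        DISClient dis = new DISClient(disConfig);

        // Assemble a batch of three records, matching the sample run shown below.
        List<PutRecordsRequestEntry> entries = new ArrayList<>();
        for (int i = 0; i < 3; i++) {
            PutRecordsRequestEntry entry = new PutRecordsRequestEntry();
            entry.setData(ByteBuffer.wrap("hello world.".getBytes(StandardCharsets.UTF_8)));
            // A random partition key spreads records across partitions.
            entry.setPartitionKey(String.valueOf((int) (Math.random() * 1000000)));
            entries.add(entry);
        }

        PutRecordsRequest request = new PutRecordsRequest();
        request.setStreamName("YOUR_STREAM_NAME"); // the stream created in the previous step
        request.setRecords(entries);

        // Upload the batch; the response reports per-record success or failure.
        PutRecordsResult result = dis.putRecords(request);
        System.out.println("Put " + entries.size() + " records, failed: "
                + result.getFailedRecordCount());
    }
}

Replace the placeholders with the authentication information and the stream name created in the previous steps before running the program.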
Running the Producer Program

14:40:20.090 [main] INFO com.bigdata.dis.sdk.DISConfig - get from classLoader
14:40:20.093 [main] INFO DEMOT - ========== BEGIN PUT ============
14:40:21.186 [main] INFO com.bigdata.dis.sdk.util.config.ConfigurationUtils - get from classLoader
14:40:21.187 [main] INFO com.bigdata.dis.sdk.util.config.ConfigurationUtils - propertyMapFromFile size : 2
14:40:22.092 [main] INFO com.bigdata.dis.sdk.demo.ProducerDemo - Put 3 records[3 successful / 0 failed].
14:40:22.092 [main] INFO com.bigdata.dis.sdk.demo.ProducerDemo - [hello world.] put success, partitionId [shardId-0000000000], partitionKey [964885], sequenceNumber [0]
14:40:22.092 [main] INFO com.bigdata.dis.sdk.demo.ProducerDemo - [hello world.] put success, partitionId [shardId-0000000000], partitionKey [910960], sequenceNumber [1]
14:40:22.092 [main] INFO com.bigdata.dis.sdk.demo.ProducerDemo - [hello world.] put success, partitionId [shardId-0000000000], partitionKey [528377], sequenceNumber [2]
14:40:22.092 [main] INFO com.bigdata.dis.sdk.demo.ProducerDemo - ========== PUT OVER ============