Testing a DIS Flume Plugin
Testing DIS Source
- Start PuTTY and log in to the server on which Flume is installed.
- Ensure that the configuration file containing information about the DIS Source is ready.
You can create the configuration file by modifying the flume-conf.properties.template provided by Flume. The following is a sample file:
agent.sources = dissource
agent.channels = memoryChannel
agent.sinks = loggerSink

# Define DIS Source (which is used to obtain data from DIS).
agent.sources.dissource.channels = memoryChannel
agent.sources.dissource.type = com.cloud.dis.adapter.flume.source.DISSource
agent.sources.dissource.streams = YOU_DIS_STREAM_NAME
agent.sources.dissource.ak = YOU_ACCESS_KEY_ID
agent.sources.dissource.sk = YOU_SECRET_KEY_ID
agent.sources.dissource.region = YOU_Region
agent.sources.dissource.projectId = YOU_PROJECT_ID
agent.sources.dissource.endpoint = https://dis.${region}.cloud.com
agent.sources.dissource.group.id = YOU_APP_NAME

# Define a memory channel.
agent.channels.memoryChannel.type = memory
agent.channels.memoryChannel.capacity = 10000

# Define Logger Sink (which is used to output data to the console).
agent.sinks.loggerSink.type = logger
agent.sinks.loggerSink.channel = memoryChannel
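For example, you can edit the template in place from the Flume installation directory (a sketch; ${FLUME_HOME} stands for your actual installation path):
cd ${FLUME_HOME}
# Add the DIS Source properties shown above; the sample start command
# in the next step points at this file.
vi conf/flume-conf.properties.template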
- Start Flume. For details about the startup command, see the instructions at the Apache Flume official website.
To start Flume from the Flume installation directory, run the following sample command:
bin/flume-ng agent --conf-file conf/flume-conf.properties.template --name agent --conf conf/ -Dflume.root.logger=INFO,console
In this command, bin/flume-ng agent starts the Flume agent; --conf-file specifies the path of the configuration file you wrote; --name specifies the agent name defined in the configuration file; --conf specifies Flume's conf/ directory.
Check the log output. If it contains a message similar to "source dissource started", DIS Source has started normally; dissource is the DIS Source name set in your configuration file. A quick way to check this is sketched below.
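A minimal way to capture and search the startup log, assuming you redirect the console output to a file (flume.log is an illustrative name):
bin/flume-ng agent --conf-file conf/flume-conf.properties.template --name agent --conf conf/ -Dflume.root.logger=INFO,console > flume.log 2>&1 &
# Search for the DIS Source startup message; the source name matches your configuration.
grep -F "source dissource started" flume.log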
- Check that DIS Source can successfully download data from DIS.
Upload data to the stream that DIS Source points to. If Flume reports no errors and the sink (Logger Sink in the sample configuration) receives the data, the download is successful.
NOTE:
If the sample configuration in Step 2 is used, the data obtained from DIS is output to the console in byte array format.
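For reference, Flume's Logger Sink prints each event body as a hex dump followed by a printable preview, so the console output for a downloaded record looks similar to the following (the payload "hello DIS" is illustrative):
Event: { headers:{} body: 68 65 6C 6C 6F 20 44 49 53 hello DIS }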
- Log in to the DIS console, wait about two minutes, and then check the monitoring data of the DIS stream specified in Table 1. If data download traffic (indicated by blue lines) is displayed, DIS Source is running successfully.
Testing DIS Sink
- Start PuTTY and log in to the server on which Flume is installed.
- Ensure that the configuration file containing information about DIS Sink is ready.
You can create the configuration file by modifying the flume-conf.properties.template provided by Flume. The following is a sample file:
agent.sources = exec
agent.channels = memoryChannel
agent.sinks = dissink

# Define EXEC Source (which is used to monitor the dis.txt file in the /tmp directory).
agent.sources.exec.type = exec
agent.sources.exec.command = tail -F /tmp/dis.txt
agent.sources.exec.shell = /bin/bash -c
agent.sources.exec.channels = memoryChannel

# Define a memory channel.
agent.channels.memoryChannel.type = memory
agent.channels.memoryChannel.capacity = 10000

# Define DIS Sink (which is used to upload data to the DIS stream).
agent.sinks.dissink.channel = memoryChannel
agent.sinks.dissink.type = com.cloud.dis.adapter.flume.sink.DISSink
agent.sinks.dissink.streamName = YOU_DIS_STREAM_NAME
agent.sinks.dissink.ak = YOU_ACCESS_KEY_ID
agent.sinks.dissink.sk = YOU_SECRET_KEY_ID
agent.sinks.dissink.region = YOU_Region
agent.sinks.dissink.projectId = YOU_PROJECT_ID
agent.sinks.dissink.endpoint = https://dis.${region}.myhuaweicloud.com
agent.sinks.dissink.resultLogLevel = INFO
- Start Flume. For details about the startup command, see the instructions at the Apache Flume official website.
To start Flume from the Flume installation directory, run the following sample command:
bin/flume-ng agent --conf-file conf/flume-conf.properties.template --name agent --conf conf/ -Dflume.root.logger=INFO,console
In this command, bin/flume-ng agent starts the Flume agent; --conf-file specifies the path of the configuration file you wrote; --name specifies the agent name defined in the configuration file; --conf specifies Flume's conf/ directory.
Check the log output. If it contains a message similar to "Dis flume sink [dissink] start", DIS Sink has started normally; dissink is the DIS Sink name set in your configuration file. You can search for it as sketched below.
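As in the DIS Source test, you can redirect the console output to a file and search for the sink startup message (flume.log is an illustrative name):
grep -F "Dis flume sink [dissink] start" flume.log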
- Check that the DIS Sink can successfully upload data to DIS.
Ingest data into the Flume source. Set resultLogLevel of DIS Sink to a level other than OFF and no lower than the log4j log level. If a log similar to the following is displayed, DIS Flume Sink has successfully uploaded the data to DIS.
CurrentPut 5 events[success 5 / failed 0] spend 131 ms.
NOTE:
If the sample configuration described in Step 2 is used, you can create a dis.txt file in the /tmp directory and append content to it, as sketched below. After Flume is started, each appended line is read by Flume and sent to the DIS stream through DIS Sink.
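A minimal sketch of generating test data under the sample configuration (the message text is illustrative):
# Each appended line is picked up by the EXEC source (tail -F /tmp/dis.txt)
# and handed to DIS Sink for upload.
echo "hello DIS $(date +%s)" >> /tmp/dis.txt
echo "hello DIS $(date +%s)" >> /tmp/dis.txt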
- Log in to the DIS console, wait about two minutes, and then check the monitoring data of the DIS stream specified in Table 2. If data upload traffic (indicated by green lines) is displayed, DIS Sink is running successfully.