
Typical Scenario: Collecting Local Static Logs and Uploading Them to Kafka

Updated on 2022-12-14 GMT+08:00

Scenario

This section describes how to use the Flume client to collect static logs from a local host and upload them to a Kafka topic (test1).

This section applies to MRS 3.x or later clusters.

NOTE:

By default, the cluster network environment is secure, and SSL authentication is not enabled during data transmission. For details about how to use encrypted transmission, see Configuring the Encrypted Transmission. This configuration applies to scenarios where only Flume is configured, for example, Spooldir Source + Memory Channel + Kafka Sink.

Prerequisites

  • The cluster has been installed, including the Kafka and Flume services.
  • The Flume client has been installed. For details, see .
  • The network environment of the cluster is secure.
  • You have understood the service requirements and created the Kafka administrator account flume_kafka.

Procedure

  1. Set Flume parameters.

    Use the Flume configuration tool on Manager to configure the Flume role client parameters and generate a configuration file.
    1. Log in to FusionInsight Manager. Choose Cluster > Services > Flume > Configuration Tool.
    2. Set Agent Name to client. Select and drag the source, channel, and sink to be used to the GUI on the right, and connect them.

      Use SpoolDir Source, Memory Channel, and Kafka Sink.

    3. Double-click the source, channel, and sink. Set corresponding configuration parameters by referring to Table 1 based on the actual environment.
      NOTE:
      • If you want to modify and continue using an existing properties.properties file, log in to FusionInsight Manager, choose Cluster > Services > Flume, click the Configuration Tool tab, click Import to import the file, and then modify the configuration items related to non-encrypted transmission.
      • When importing a configuration file, it is recommended that the total number of Sources, Channels, and Sinks not exceed 40. Otherwise, the response time may be very long.
      Table 1 Parameters to be modified for the Flume role client

      | Parameter               | Description                                                                                                                                                                                | Example Value                      |
      |-------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------------------------------|
      | Name                    | The value must be unique and cannot be left blank.                                                                                                                                           | test                               |
      | spoolDir                | Specifies the directory where the files to be collected reside. This parameter cannot be left blank. The directory must exist, and the Flume running user must have read, write, and execute permissions on it. | /srv/BigData/hadoop/data1/zb       |
      | trackerDir              | Specifies the path for storing the metadata of files collected by Flume.                                                                                                                     | /srv/BigData/hadoop/data1/tracker  |
      | batchSize               | Specifies the number of events (data pieces) that Flume sends in a batch. A larger value means higher throughput but lower timeliness.                                                       | 61200                              |
      | kafka.topics            | Specifies the list of subscribed Kafka topics, separated by commas (,). This parameter cannot be left blank.                                                                                 | test1                              |
      | kafka.bootstrap.servers | Specifies the bootstrap address and port list of the Kafka brokers. The default value is all Kafka brokers in the Kafka cluster.                                                             | 192.168.101.10:21007               |

    4. Click Export to save the properties.properties configuration file to the local server.
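    For reference, the exported properties.properties for this scenario might look like the following. This is a minimal sketch, assuming the agent name client set in 1.b; the component names (static_src, mem_ch, kafka_sink) and the memory channel sizing keys are illustrative, and kafka.topics follows Table 1 (the open-source Flume Kafka sink uses kafka.topic instead).

      # Agent "client" with one SpoolDir source, one memory channel, and one Kafka sink
      client.sources = static_src
      client.channels = mem_ch
      client.sinks = kafka_sink

      # SpoolDir source: collects files dropped into spoolDir
      client.sources.static_src.type = spooldir
      client.sources.static_src.spoolDir = /srv/BigData/hadoop/data1/zb
      client.sources.static_src.trackerDir = /srv/BigData/hadoop/data1/tracker
      client.sources.static_src.batchSize = 61200
      client.sources.static_src.channels = mem_ch

      # Memory channel: buffers events between source and sink
      # (transactionCapacity must be at least the source batchSize)
      client.channels.mem_ch.type = memory
      client.channels.mem_ch.capacity = 100000
      client.channels.mem_ch.transactionCapacity = 61200

      # Kafka sink: publishes events to topic test1
      client.sinks.kafka_sink.type = org.apache.flume.sink.kafka.KafkaSink
      client.sinks.kafka_sink.kafka.topics = test1
      client.sinks.kafka_sink.kafka.bootstrap.servers = 192.168.101.10:21007
      client.sinks.kafka_sink.channel = mem_ch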

  2. Upload the configuration file.

    Upload the file exported in 1.d to the Flume client installation directory/fusioninsight-flume-Flume component version number/conf directory of the cluster.
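    For example, if the Flume client is installed in /opt/FlumeClient and the component version is 1.9.0 (both hypothetical values), the upload from the local server could look like this:

      scp properties.properties root@192.168.101.10:/opt/FlumeClient/fusioninsight-flume-1.9.0/conf/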

  3. Verify log transmission.

    1. Log in to the Kafka client.

      cd Kafka client installation directory/Kafka/kafka

      kinit flume_kafka (Enter the password.)
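      For example, with a hypothetical client installation directory of /opt/hadoopclient:

      cd /opt/hadoopclient/Kafka/kafka
      kinit flume_kafka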

    2. Read data from a Kafka topic.

      bin/kafka-console-consumer.sh --topic topic name --bootstrap-server Kafka service IP address of the node where the role instance is located:21007 --consumer.config config/consumer.properties --from-beginning

      The system displays the contents of the file to be collected.

      [root@host1 kafka]# bin/kafka-console-consumer.sh --topic test1 --bootstrap-server 192.168.101.10:21007 --consumer.config config/consumer.properties --from-beginning
      Welcome to flume
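
      To generate data to verify, copy a file into the monitored spoolDir directory on the host where the Flume client runs; the file name and content here are illustrative:

      echo "Welcome to flume" > /tmp/test.log
      cp /tmp/test.log /srv/BigData/hadoop/data1/zb/

      In open-source Flume, the SpoolDir source renames a file with the .COMPLETED suffix after it has been collected.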
