
Flume Data Fails to Be Written to the Component

Symptom

After the Flume process is started, Flume cannot write data to the destination component. (Writing data from the Flume server to HDFS is used as an example in the following.)

Cause Analysis

  1. HDFS is not started or is faulty. The Flume run logs contain errors similar to the following (a connectivity check sketch follows this list):
    2019-02-26 11:16:33,564 | ERROR | [SinkRunner-PollingRunner-DefaultSinkProcessor] |  opreation the hdfs file errors.  | org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:414)
    2019-02-26 11:16:33,747 | WARN  | [hdfs-CCCC-call-runner-4] |  A failover has occurred since the start of call #32795 ClientNamenodeProtocolTranslatorPB.getFileInfo over 192-168-13-88/192.168.13.88:25000  | org.apache.hadoop.io.retry.RetryInvocationHandler$ProxyDescriptor.failover(RetryInvocationHandler.java:220)
    2019-02-26 11:16:33,748 | ERROR | [hdfs-CCCC-call-runner-4] |  execute hdfs error. {}  | org.apache.flume.sink.hdfs.HDFSEventSink$3.call(HDFSEventSink.java:744)
    java.net.ConnectException: Call From 192-168-12-221/192.168.12.221 to 192-168-13-88:25000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
  2. The HDFS sink is not started. The "flume current metrics" information printed in the Flume run log contains no sink entry, as shown below:
    2019-02-26 11:46:05,501 | INFO  | [pool-22-thread-1] |  flume current metrics:{"CHANNEL.BBBB":{"ChannelCapacity":"10000","ChannelFillPercentage":"0.0","Type":"CHANNEL","ChannelStoreSize":"0","EventProcessTimedelta":"0","EventTakeSuccessCount":"0","ChannelSize":"0","EventTakeAttemptCount":"0","StartTime":"1551152734999","EventPutAttemptCount":"0","EventPutSuccessCount":"0","StopTime":"0"},"SOURCE.AAAA":{"AppendBatchAcceptedCount":"0","EventAcceptedCount":"0","AppendReceivedCount":"0","MonTime":"0","StartTime":"1551152735503","AppendBatchReceivedCount":"0","EventReceivedCount":"0","Type":"SOURCE","TotalFilesCount":"1001","SizeAcceptedCount":"0","UpdateTime":"605410241202740","AppendAcceptedCount":"0","OpenConnectionCount":"0","MovedFilesCount":"1001","StopTime":"0"}}  | org.apache.flume.node.Application.getRestartComps(Application.java:467)
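
If HDFS is suspected to be the cause, first confirm that HDFS is reachable from the node where Flume runs. The following is a minimal check sketch; the client installation directory /opt/client and the user name flume_user are assumptions, and the NameNode address 192-168-13-88:25000 is taken from the example log above. Replace them with the actual values of your cluster.

    # Load the client environment variables (/opt/client is an assumed client path).
    source /opt/client/bigdata_env
    # In a security-mode cluster, authenticate first (flume_user is an example user).
    kinit flume_user
    # Check that the NameNode RPC port shown in the error log is reachable from this node.
    nc -zv 192-168-13-88 25000
    # Check that basic HDFS operations succeed.
    hdfs dfs -ls /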

Solution

  1. If the component to which Flume writes data is not started, start the component. If the component is abnormal, contact technical support.
  2. If the sink is not started, check whether the sink is correctly defined in the Flume configuration file. If the configuration is incorrect, modify it and restart the Flume process. If the configuration is correct, locate the error information in the run log and rectify the fault accordingly. A minimal sink configuration sketch follows this list.
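
The sink is started only when it is declared in the agent's sink list and bound to a channel in the Flume configuration file. The following is a minimal sketch of an HDFS sink definition in the properties file; the agent name server, the HDFS path, and the batch/roll values are assumptions for illustration, while the component names AAAA, BBBB, and CCCC follow the example log above. A security-mode cluster additionally requires the Kerberos-related sink parameters.

    # Component lists (the sink must be listed here; otherwise it is never started
    # and no SINK entry appears in the "flume current metrics" output).
    server.sources = AAAA
    server.channels = BBBB
    server.sinks = CCCC

    # HDFS sink definition (only HDFS-related keys are shown).
    server.sinks.CCCC.type = hdfs
    server.sinks.CCCC.hdfs.path = hdfs://hacluster/flume/data
    server.sinks.CCCC.hdfs.filePrefix = flume
    server.sinks.CCCC.hdfs.fileType = DataStream
    server.sinks.CCCC.hdfs.batchSize = 1000
    server.sinks.CCCC.hdfs.rollInterval = 60

    # Bind the sink to the channel; a sink without a channel binding fails to start.
    server.sinks.CCCC.channel = BBBB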