schedule-tool Usage Guide
Overview
schedule-tool is used to submit jobs whose data source is SFTP. Before submitting a job, you can modify the input path and the file filtering criteria. If the target is HDFS, you can also modify the output path.
Parameters
| Configuration parameters | Description | Example Value |
|---|---|---|
| job.jobName | Job name. | job1 |
| file.fileName.prefix | File name prefix. | table1 |
| file.fileName.posfix | File name suffix. | .txt |
| file.filter | File filter, which filters files by matching file names. | true |
| date.day | Number of delayed days, which is matched against the date in the name of an imported file. For example, if the input date is 20160202 and the number of delayed days is 3, files whose names contain the 20160205 date field in the input path are matched. For details, see schedule-tool Usage Example. | 3 |
| file.date.format | Date format included in the name of the file to be imported. | yyyyMMdd |
| parameter.date.format | Date format entered when the script is invoked, which is usually consistent with file.date.format. | yyyyMMdd |
| file.format.iscompressed | Whether the file to be imported is a compressed file. | false |
| storage.type | Storage type. The final storage of the imported file can be HDFS, HBase, or Hive. | HDFS |
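The following is a minimal sketch of how these parameters might appear in a schedule-tool job configuration file. The file layout and names are illustrative assumptions; only the keys and example values come from the table above.

```properties
# Hypothetical single-job configuration for schedule-tool (layout is an assumption)

# Job name
job.jobName = job1
# File name prefix and suffix used to match input files
file.fileName.prefix = table1
file.fileName.posfix = .txt
# Filter files by matching file names
file.filter = true
# Number of delayed days matched against the date in the file name
date.day = 3
# Date format contained in the file name
file.date.format = yyyyMMdd
# Date format passed in when the script is invoked
parameter.date.format = yyyyMMdd
# Whether the imported file is compressed
file.format.iscompressed = false
# Final storage type: HDFS, HBase, or Hive
storage.type = HDFS
```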
schedule-tool supports configuring multiple jobs at the same time. In this case, job.jobName, file.fileName.prefix, and file.fileName.posfix in the preceding table must each be set to multiple values, separated by commas (,), as shown in the sketch below.
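As a hedged illustration of the multi-job case, the snippet below configures two jobs by listing two comma-separated values for each of the three parameters. The job names and file names are hypothetical; the values are matched positionally.

```properties
# Two jobs in one configuration: the first value of each key belongs to job1, the second to job2
job.jobName = job1,job2
file.fileName.prefix = table1,table2
file.fileName.posfix = .txt,.csv
```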
Precautions
server.url must be set to a string containing two IP addresses and port numbers, and the IP addresses and ports must be separated by commas (,).
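For example, assuming each entry takes the common IP:port form (the addresses and port below are placeholders, not defaults; check your environment for the exact format):

```properties
# Two IP address and port pairs, separated by a comma (placeholder values)
server.url = 192.168.0.1:21351,192.168.0.2:21351
```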