schedule-tool Usage Guide
Overview
schedule-tool is used to submit jobs for SFTP data sources. Before submitting a job, you can modify the input path and the file filtering criteria, and, if the target is HDFS, the output path.
Parameters
Table 1 Connection and authentication parameters

| Configuration Parameter | Description | Example Value |
|---|---|---|
| server.url | Floating IP address and port of the Loader service. The default port is 21351. For compatibility, multiple IP address and port pairs can be configured, separated by commas (,). The first pair must be the Loader floating IP address and port; the others can be configured as required. | 10.96.26.111:21351,127.0.0.2:21351 |
| authentication.type | Login authentication mode. | kerberos |
| authentication.user | Login user when the normal mode or password authentication is used. This parameter is not required for keytab login. | bar |
| authentication.password | Login password when password authentication is used. This parameter is not required in normal mode or for keytab login. The password must be stored in encrypted form. | - |
| use.keytab | Whether to log in in keytab mode. | true |
| client.principal | User principal for accessing the Loader service when keytab authentication is used. This parameter is not required in normal mode or for password login. | loader/hadoop.*System domain name* |
| client.keytab | Path of the keytab file when keytab authentication is used. This parameter is not required in normal mode or for password login. | /opt/client/conf/loader.keytab |
| krb5.conf.file | Path of the krb5.conf file when keytab authentication is used. This parameter is not required in normal mode or for password login. | /opt/client/conf/krb5.conf |

NOTE: To obtain the current system domain name, log in to FusionInsight Manager, choose System > Permission > Domain and Mutual Trust, and view the value of Local Domain.
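Putting the connection and authentication parameters together, a keytab-based configuration might look like the following sketch. The key names come from the table above; the file name and exact layout are assumptions, and the principal and paths are the table's example values.

```properties
# Loader connection: the first address must be the Loader floating IP and port
server.url=10.96.26.111:21351,127.0.0.2:21351

# Keytab-based Kerberos login; authentication.user and
# authentication.password are not needed in this mode
authentication.type=kerberos
use.keytab=true
client.principal=loader/hadoop.<system domain name>
client.keytab=/opt/client/conf/loader.keytab
krb5.conf.file=/opt/client/conf/krb5.conf
```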
Table 2 Job parameters

| Configuration Parameter | Description | Example Value |
|---|---|---|
| job.jobName | Job name. | job1 |
| file.fileName.prefix | File name prefix. | table1 |
| file.fileName.posfix | File name suffix. | .txt |
| file.filter | Whether to filter files by matching file names. | true |
| date.day | Number of delayed days, matched against the date in the name of an imported file. For example, if the input date is 20160202 and the number of delayed days is 3, files whose names contain the date field 20160205 in the input path are matched. For details, see schedule-tool Usage Example. | 3 |
| file.date.format | Date format in the name of the file to be imported. | yyyyMMdd |
| parameter.date.format | Date format entered when the script is invoked, usually the same as file.date.format. | yyyyMMdd |
| file.format.iscompressed | Whether the file to be imported is a compressed file. | false |
| storage.type | Storage type of the imported file: HDFS, HBase, or Hive. | HDFS |
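The date.day matching rule can be illustrated with a short Python sketch. The function names are hypothetical and this is not the tool's implementation; note that the Java-style pattern yyyyMMdd corresponds to Python's %Y%m%d.

```python
from datetime import datetime, timedelta


def matched_date(input_date: str, delay_days: int,
                 date_format: str = "%Y%m%d") -> str:
    """Shift the input date forward by date.day days.

    With input date 20160202 and date.day = 3, the matched
    date field is 20160205.
    """
    base = datetime.strptime(input_date, date_format)
    return (base + timedelta(days=delay_days)).strftime(date_format)


def file_matches(file_name: str, input_date: str, delay_days: int) -> bool:
    # A file is selected when its name contains the shifted date field.
    return matched_date(input_date, delay_days) in file_name


print(matched_date("20160202", 3))                          # 20160205
print(file_matches("table1_20160205.txt", "20160202", 3))   # True
```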
schedule-tool supports configuring multiple jobs at the same time. In this case, job.jobName, file.fileName.prefix, and file.fileName.posfix in Table 2 must each be set to multiple values, separated by commas (,).
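For example, three jobs could be configured in one run as sketched below. The job and file names are illustrative; the values of the three parameters are matched by position.

```properties
# Three jobs submitted together; the first value of each
# parameter belongs to job1, the second to job2, and so on
job.jobName=job1,job2,job3
file.fileName.prefix=table1,table2,table3
file.fileName.posfix=.txt,.txt,.csv
```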
Precautions
server.url must be set to two IP address and port pairs, separated by a comma (,).