To Hive
If the destination link of a job is a Hive link, configure the destination job parameters based on Table 1.
| Parameter | Description | Example Value |
|---|---|---|
| Database Name | Database name. Click the icon next to the text box to open the dialog box for selecting a database. | default |
| Auto Table Creation | This parameter is displayed only when both the migration source and destination are relational databases. | Non-auto creation |
| Table Name | Destination table name. Click the icon next to the text box to open the dialog box for selecting a table. This parameter can be configured as a macro variable of date and time, and a path name can contain multiple macro variables. When the macro variable of date and time works with a scheduled job, incremental data can be synchronized periodically. For details, see Incremental Synchronization Using the Macro Variables of Date and Time. | TBL_X |
| Clear Data Before Import | Whether the data in the destination table is cleared before data import. | Yes |
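As an illustration of the macro variables mentioned for Table Name, a destination table name could embed a date macro so that each scheduled run writes to the table for that day. The `${dateformat(...)}` syntax shown here is an assumption based on the macro-variable reference linked above; verify the exact syntax there before use:

```
TBL_X_${dateformat(yyyyMMdd)}
```

When the job runs on, for example, 2024-01-15, the macro would expand the destination table name to `TBL_X_20240115`, letting a periodic job synchronize each day's incremental data into a separate table.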
- When Hive serves as the migration destination, the storage format selected during table creation, such as ORC or Parquet, is used automatically.
- When Hive serves as the migration destination and the storage format is TEXTFILE, delimiters must be explicitly specified in the statement for creating the Hive table. The following gives an example:

```sql
CREATE TABLE csv_tbl(
  smallint_value smallint,
  tinyint_value tinyint,
  int_value int,
  bigint_value bigint,
  float_value float,
  double_value double,
  decimal_value decimal(9, 7),
  timestamp_value timestamp,
  date_value date,
  varchar_value varchar(100),
  string_value string,
  char_value char(20),
  boolean_value boolean,
  binary_value binary,
  varchar_null varchar(100),
  string_null string,
  char_null char(20),
  int_null int
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
  "separatorChar" = "\t",
  "quoteChar" = "'",
  "escapeChar" = "\\"
)
STORED AS TEXTFILE;
```