To Doris
Table 1 lists the destination job parameters when the destination link is a Doris link.
Type | Parameter | Description | Example Value |
---|---|---|---|
Basic parameters | Schema/Tablespace | Name of the database to which data will be written. The schema can be created automatically. Click the icon next to the text box to select a schema or tablespace. | schema |
 | Table Name | Name of the table to which data will be written. Click the icon next to the text box to open the table selection dialog box. This parameter can be set to a macro variable of date and time, and a path name can contain multiple macro variables. When the macro variable of date and time is used with a scheduled job, incremental data can be synchronized periodically (see the example after the table). For details, see Incremental Synchronization Using the Macro Variables of Date and Time. NOTE: If you configure a macro variable of date and time and schedule a CDM job through DataArts Factory of DataArts Studio, the system replaces the macro variable of date and time with the planned start time of the data development job minus the offset, rather than the actual start time of the CDM job minus the offset. | table |
 | Clear Data Before Import | Whether to clear the data in the destination table before data import. If this parameter is set to Clear part of data, also configure WHERE Clause. | Clear part of data |
 | WHERE Clause | When Clear Data Before Import is set to Clear part of data, data in the destination table that matches this WHERE clause is deleted after the configuration is complete and before the import starts. | age > 18 and age <= 60 |
 | stream load config properties | Stream load parameters applied when data is written (see the example after the table). | max_filter_ratio=0 |
 | Number of failed retries | Maximum number of retries after a write failure. | 3 |
Advanced attributes | Prepare for Data Import | SQL statement executed before the task runs. Currently, only one SQL statement can be executed in wizard mode (see the sketch after the table). | create temp table |
 | Complete Statement After Data Import | SQL statement executed after the task is complete. Currently, only one SQL statement can be executed. | merge into |
 | Loader Threads | Number of threads started in each loader. A larger value allows more concurrent write operations. The Unique Key model and the REPLACE aggregation function depend on the insertion sequence; do not use concurrent writing with them. Conflict handling policies do not support replace into or on duplicate key update. | 1 |
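For the Table Name macro variables, the following is a minimal sketch, assuming the ${dateformat(...)} macro described in Incremental Synchronization Using the Macro Variables of Date and Time; the base name tbl_ and the one-day offset are hypothetical choices for illustration.

```
tbl_${dateformat(yyyyMMdd, -1, DAY)}
```

With a daily scheduled job, a name like this would resolve to the previous day's table on each run, so each execution writes that day's incremental data to its own table.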
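For stream load config properties, the snippet below only illustrates the key=value form used in the example value. max_filter_ratio (maximum tolerated ratio of filtered-out rows) and strict_mode are standard Doris stream load properties; they are shown one per line for readability, and how multiple properties are delimited in this field is an assumption to verify against your CDM version.

```
max_filter_ratio=0
strict_mode=true
```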
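The example values create temp table and merge into hint at a common staging pattern: load into an intermediate table, then fold the rows into the target afterwards. The sketch below is a hypothetical illustration only; the table names ods_user_stage and ods_user are assumptions, and CREATE TABLE ... LIKE with INSERT INTO ... SELECT is used instead of a literal MERGE INTO because the statements available depend on the destination database version. The first statement would go in Prepare for Data Import and the second in Complete Statement After Data Import, with the job's Table Name pointing at the staging table.

```sql
-- Prepare for Data Import (hypothetical example):
-- create an empty staging table with the same schema as the target.
CREATE TABLE ods_user_stage LIKE ods_user;

-- Complete Statement After Data Import (hypothetical example):
-- fold the freshly loaded rows into the target table after the CDM task
-- has finished writing to the staging table.
INSERT INTO ods_user SELECT * FROM ods_user_stage;
```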