What Can I Do If the Map Field Tab Page Cannot Display All Columns?
Symptom
When CDM is used to export data from HBase/CloudTable, some fields of the HBase/CloudTable table are occasionally missing from the Map Field tab page and cannot be mapped to the fields on the migration destination. As a result, the data imported to the migration destination is incomplete.
Possible Cause
HBase and CloudTable are schema-less, so the number of columns in each row is not fixed. The Map Field tab page obtains the column list by sampling example values, so it is likely to miss some columns. In that case, the data on the migration destination is incomplete after the job is executed. You can use any of the following methods to solve this problem:
- Add fields on the Map Field tab page.
- Edit the JSON file of the job on the Job Management page (modify the fromJobConfig.columns and toJobConfig.columnList parameters).
- Export the JSON file of the job to the local PC, modify the parameters in the JSON file (the principle is the same as that in method 2), and then import the JSON file back to CDM.
Method 1 is recommended. The following uses data migration from HBase to DWS as an example.
Solution 1: Adding Fields on the Map Field Tab Page
- Obtain all fields in the tables to be migrated from source HBase. Use colons (:) to separate column families and columns. The following gives an example:
rowkey:rowkey g:DAY_COUNT g:CATEGORY_ID g:CATEGORY_NAME g:FIND_TIME g:UPLOAD_PEOPLE g:ID g:INFOMATION_ID g:TITLE g:COORDINATE_X g:COORDINATE_Y g:COORDINATE_Z g:CONTENT g:IMAGES g:STATE
- On the Job Management page, locate the job for exporting data from HBase to DWS, click Edit in the row where the job resides, and go to the Map Field tab page.
Figure 1 Field mapping
- Click the add field button. In the dialog box that is displayed, select Add a new field.
Figure 2 Adding a field
- After a field is added, the example value of the new field is not displayed on the console. This does not affect the transmission of field values. CDM directly writes the field values to the migration destination.
- To add new fields, the migration source must be MongoDB, HBase, a relational database, or Redis (data in Redis must be in the Hash format).
- After all fields are added, check whether the mapping between the migration source and destination is correct. If the mapping is incorrect, drag the fields to adjust the field mapping.
- Click Next and Save.
Solution 2: Modifying a JSON File
- Obtain all fields in the tables to be migrated from source HBase. Use colons (:) to separate column families and columns. The following gives an example:
rowkey:rowkey g:DAY_COUNT g:CATEGORY_ID g:CATEGORY_NAME g:FIND_TIME g:UPLOAD_PEOPLE g:ID g:INFOMATION_ID g:TITLE g:COORDINATE_X g:COORDINATE_Y g:COORDINATE_Z g:CONTENT g:IMAGES g:STATE
- In the DWS destination table, obtain the fields corresponding to the HBase table fields.
If any field name corresponding to the HBase field does not exist in the DWS destination table, add it to the DWS table schema. Suppose that the fields in the DWS table are complete and are displayed as follows:
rowkey day_count category category_name find_time upload_people id information_id title coordinate_x coordinate_y coordinate_z content images state
- On the Job Management page, locate the job for exporting data from HBase to DWS, and choose the option for editing the job JSON in the row where the job resides.
- On the page that is displayed, edit the JSON file of the job.
- Modify the fromJobConfig.columns parameter of the migration source to the HBase fields obtained in 1. Use ampersands (&) to separate columns and colons (:) to separate column families from column names. The following gives an example:
"from-config-values": { "configs": [ { "inputs": [ { "name": "fromJobConfig.table", "value": "HBase" }, { "name": "fromJobConfig.columns", "value": "rowkey:rowkey&g:DAY_COUNT&g:CATEGORY_ID&g:CATEGORY_NAME&g:FIND_TIME&g:UPLOAD_PEOPLE&g:ID&g:INFOMATION_ID&g:TITLE&g:COORDINATE_X&g:COORDINATE_Y&g:COORDINATE_Z&g:CONTENT&g:IMAGES&g:STATE" }, { "name": "fromJobConfig.formats", "value": { "2": "yyyy-MM-dd", "undefined": "yyyy-MM-dd" } } ], "name": "fromJobConfig" } ] }
- Modify the toJobConfig.columnList parameter of the migration destination to the field list of DWS obtained in 2.
The sequence must be the same as that of HBase to ensure correct field mapping. Use & to separate field names. The following gives an example:
"to-config-values": { "configs": [ { "inputs": [ { "name": "toJobConfig.schemaName", "value": "dbadmin" }, { "name": "toJobConfig.tablePreparation", "value": "DO_NOTHING" }, { "name": "toJobConfig.tableName", "value": "DWS " }, { "name": "toJobConfig.columnList", "value": "rowkey&day_count&category&category_name&find_time&upload_people&id&information_id&title&coordinate_x&coordinate_y&coordinate_z&content&images&state" }, { "name": "toJobConfig.shouldClearTable", "value": "true" } ], "name": "toJobConfig" } ] }
- Retain the settings of other parameters, and then click Save and Run.
- After the job is completed, check whether the data in the DWS table matches the data in HBase. If the mapping is incorrect, check whether the sequences of the HBase and DWS fields in the JSON file are the same.