Error Occurred While Synchronizing Using DM

Note:
This topic has been translated from a Chinese forum by GPT and might contain errors.

Original topic: 使用dm进行同步,报错

| username: 烂番薯0

Dear experts, I first used TiDB Lightning to import data, and now a few tables are missing a few rows. I want to use DM to synchronize them, but DM reported an error. Could you please take a look at why this is happening?

Here is the YAML file for the task:

[screenshot of the task configuration]

Here is the result of querying the task status:

Also, could you please tell me how to check the tasks that were previously started?

| username: wzf0072 | Original post link

Duplicate primary key

| username: tidb狂热爱好者 | Original post link

First, clear the data. It’s duplicated.

| username: wakaka | Original post link

The task mode “all” means full plus incremental replication, while “increment” is incremental only. You have already imported the data before, so the full load conflicts with the existing rows.
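For reference, a minimal sketch of a DM task file showing where task-mode is set; the task name, source-id, and target-database values are placeholders, not taken from the original post:

```yaml
name: "test-task"           # hypothetical task name
task-mode: "increment"      # "all" = full + incremental; "full" = full only; "increment" = incremental only

target-database:
  host: "127.0.0.1"         # placeholder TiDB address
  port: 4000
  user: "root"
  password: ""

mysql-instances:
  - source-id: "mysql-replica-01"   # placeholder upstream source name
```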

| username: 烂番薯0 | Original post link

Oh, okay, thank you.

| username: vcdog | Original post link

What is the approximate scale of the table data?

| username: 烂番薯0 | Original post link

Yes, I first used TiDB Lightning to import the data, but a few rows were missing, which might be the reason.

| username: 烂番薯0 | Original post link

Not much, just a few hundred thousand rows.

| username: 烂番薯0 | Original post link

After making the changes and restarting the task, it still reports an error.

| username: 烂番薯0 | Original post link

[The original reply was an image and could not be translated.]

| username: 烂番薯0 | Original post link

Huh? Do I need to clear this? I just imported it.

| username: wakaka | Original post link

You need to confirm which mode you want. For a full synchronization, set the mode to “all” and make sure the target end has no data. For an incremental synchronization where the target already has data, you need to specify a starting point, that is, a binlog file and position, by configuring the meta items in the task file.
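As a rough sketch of what that meta block looks like under mysql-instances in the task YAML: the source-id and binlog coordinates below are placeholders, and the key names (binlog-name / binlog-pos) should be checked against the DM version in use:

```yaml
task-mode: "increment"

mysql-instances:
  - source-id: "mysql-replica-01"      # placeholder upstream source name
    # meta only takes effect for an incremental task that has no
    # existing checkpoint; otherwise DM resumes from its checkpoint
    meta:
      binlog-name: "mysql-bin.000001"  # placeholder binlog file
      binlog-pos: 4                    # placeholder position within that file
```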

| username: liuis | Original post link

It won’t take much time to clear and re-import hundreds of thousands of records.

| username: 烂番薯0 | Original post link

I looked at the log for this task, and it says that a specific time needs to be specified.

| username: 烂番薯0 | Original post link

May I ask how to specify this? I didn’t see it mentioned on the official website. Should the binlog file and position be written in the meta item?

| username: wakaka | Original post link

| username: liuis | Original post link

Of course, you’ll get an error if you import it repeatedly. Just clear the historical data and then import it again. It will only take a moment to import hundreds of thousands of records.

| username: system | Original post link

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.