tidb-lightning restoring CSV data reports syntax error: cannot have consecutive fields without separator

Note:
This topic has been translated from a Chinese forum by GPT and might contain errors.

Original topic: tidb-lightning 恢复csv数据提示syntax error: cannot have consecutive fields without separator

| username: jerome

[TiDB Usage Environment] Testing
[TiDB Version] v6.5.0
[Encountered Issue: Problem Phenomenon and Impact]
When importing CSV data with tidb-lightning v6.5, the following error occurs: syntax error: cannot have consecutive fields without separator
[Resource Configuration]
Deployed via k8s, node resources 16c64G


[Attachments: Screenshots/Logs/Monitoring]
Logs: [screenshot not preserved in the translation]

CSV data
"774889765","1","778301","香一组","01132","罗xx婷","2","1","111","4xxxxxxxxxxx","\N","42","2018-12-31 00:00:00","1","\N","\N","北流市清水xxxxxx","-","\N","\N","-1","\N","\N","1","1","2021-12-29 04:47:21","4140"

| username: tidb菜鸟一只 | Original post link

What does the table structure look like?

| username: jerome | Original post link

[Image: table structure screenshot, not preserved in the translation]

| username: caiyfc | Original post link

You need to check the settings in the lightning configuration file; it is probably an issue with the escape-character and \N settings.
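
For reference, a minimal sketch of the CSV-related settings in the lightning configuration file (the values shown are the v6.5 defaults; the actual file may differ):

```toml
[mydumper.csv]
# Field separator between values.
separator = ','
# Quote character wrapped around each field.
delimiter = '"'
# Whether the first row of the file is a header.
header = true
# String that represents NULL; '\N' matches the sample data above.
null = '\N'
not-null = false
# Interpret backslash escapes such as \" inside quoted fields.
backslash-escape = true
```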

| username: jerome | Original post link

[Image: lightning configuration screenshot, not preserved in the translation]

| username: caiyfc | Original post link

Don’t comment out the delimiter!

| username: jerome | Original post link

The default delimiter is a comma, and the delimiter in the data is also a comma, so it doesn’t have any impact.
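
Note that lightning’s CSV settings distinguish the field separator from the quoting delimiter, and the earlier advice refers to the latter. A hedged sketch of the two options:

```toml
[mydumper.csv]
separator = ','  # splits fields (the comma between values)
delimiter = '"'  # wraps each field; if commented out, the quotes are read as data
```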

| username: caiyfc | Original post link

Is this data normal?

| username: jerome | Original post link

[Image: data screenshot, not preserved in the translation]

| username: jerome | Original post link

It is normal.

| username: caiyfc | Original post link

So my understanding is that the \n I highlighted is a line terminator, right? It shouldn’t be set to '' in the configuration file.
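
If this refers to lightning’s terminator option, note that the empty-string default makes lightning treat \r, \n, and \r\n all as line terminators, so it usually does not need to be set explicitly:

```toml
[mydumper.csv]
# Empty (the default): \r, \n, and \r\n are all accepted as line terminators.
terminator = ''
```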

| username: jerome | Original post link

The issue has been resolved; it was a problem with escaping in the data.
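
For future readers, a constructed illustration (not from the original post) of how an escaping problem triggers this exact error: with delimiter = '"', an unescaped quote inside a field closes the field early, and the parser then encounters more field content with no separator in between.

"bad "name"","2"      <- the inner quote ends the field; name follows without a separator
"bad \"name\"","2"    <- parses correctly when backslash-escape = true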