Error 1193: Unknown system variable 'tidb_current_ts' when syncing TiDB data to MySQL using the TiCDC tool

Note:
This topic has been translated from a Chinese forum by GPT and might contain errors.

Original topic: TIDB 通过ticdc 工具tidb数据同步到mysql报错Error 1193: Unknown system variable 'tidb_current_ts

| username: Hacker_nTkchcIf

[TiDB Usage Environment] Test Environment
[TiDB Version] 5.2.0
[Problem Encountered] When using the TiCDC tool to synchronize TiDB data to MySQL, the data is not replicated downstream and the following error is reported:
Error 1193: Unknown system variable 'tidb_current_ts'
[Reproduction Path] What operations were performed to encounter the problem
The general steps to configure ticdc data synchronization to MySQL are as follows:

  1. Create an empty table t1 on both the source and target ends.
  2. Execute the command: tiup ctl:v4.0.13 cdc changefeed create --pd=http://127.0.0.1:2379 --sink-uri="mysql://root:root@10.221.13.111:3308" --config=ticdc_config.toml --changefeed-id="tidb2msyql" --sync-point=true --sync-interval=10s
  3. Insert data on the source end, but no data appears on the target end.
  4. Check the CDC task and find the error messages below; asking here about the cause of this issue.
    Running cdc cli changefeed query --pd=http://127.0.0.1:2379 --changefeed-id="tidb2msyq" returns the following error:
"error": {
  "addr": "10.21.15.26:8300",
  "code": "CDC:ErrOwnerUnknown",
  "message": "[CDC:ErrMySQLTxnError]Error 1193: Unknown system variable 'tidb_current_ts'"
}

[Problem Phenomenon and Impact]
Data inserted on the upstream is not synchronized to the downstream MySQL; querying the changefeed returns the same ErrMySQLTxnError shown above.

[Attachments]

# more scale-out-cdc.yaml
cdc_servers:
  - host: 10.221.13.206
    gc-ttl: 86400
    data_dir: /tidb-data/cdc-8300

The changefeed configuration file (passed via --config) is as follows:

[root@tidb-test1 ~]# more ticdc_config.toml
case-sensitive = true
enable-old-value = true
[filter]
rules = ['testdb1.*']
[mounter]
worker-num = 8
  • TiUP Cluster Display Information
Starting component `cluster`: /root/.tiup/components/cluster/v1.8.1/tiup-cluster display tidb-pb-test
Cluster type:       tidb
Cluster name:       tidb-pb-test
Cluster version:    v5.2.0
Deploy user:        tidb
SSH type:           builtin
Dashboard URL:      http://10.221.13.206:2379/dashboard
ID                   Role          Host           Ports        OS/Arch       Status  Data Dir                      Deploy Dir
--                   ----          ----           -----        -------       ------  --------                      ----------
10.221.13.208:9093   alertmanager  10.221.13.208  9093/9094    linux/x86_64  Up      /tidb-data/alertmanager-9093  /tidb-deploy/alertmanager-9093
10.221.13.206:8300   cdc           10.221.13.206  8300         linux/x86_64  Up      /tidb-data/cdc-8300           /tidb-deploy/cdc-8300
10.221.13.208:3000   grafana       10.221.13.208  3000         linux/x86_64  Up      -                             /tidb-deploy/grafana-3000
10.221.13.206:2379   pd            10.221.13.206  2379/2380    linux/x86_64  Up|UI   /tidb-data/pd-2379            /tidb-deploy/pd-2379
10.221.13.207:2379   pd            10.221.13.207  2379/2380    linux/x86_64  Up      /tidb-data/pd-2379            /tidb-deploy/pd-2379
10.221.13.208:2379   pd            10.221.13.208  2379/2380    linux/x86_64  Up|L    /tidb-data/pd-2379            /tidb-deploy/pd-2379
10.221.13.208:9090   prometheus    10.221.13.208  9090         linux/x86_64  Up      /tidb-data/prometheus-9090    /tidb-deploy/prometheus-9090
10.221.13.206:4000   tidb          10.221.13.206  4000/10080   linux/x86_64  Up      -                             /tidb-deploy/tidb-4000
10.221.13.207:4000   tidb          10.221.13.207  4000/10080   linux/x86_64  Up      -                             /tidb-deploy/tidb-4000
10.221.13.208:4000   tidb          10.221.13.208  4000/10080   linux/x86_64  Up      -                             /tidb-deploy/tidb-4000
10.221.13.206:20160  tikv          10.221.13.206  20160/20180  linux/x86_64  Up      /tidb-data/tikv-20160         /tidb-deploy/tikv-20160
10.221.13.207:20160  tikv          10.221.13.207  20160/20180  linux/x86_64  Up      /tidb-data/tikv-20160         /tidb-deploy/tikv-20160
10.221.13.208:20160  tikv          10.221.13.208  20160/20180  linux/x86_64  Up      /tidb-data/tikv-20160         /tidb-deploy/tikv-20160
Total nodes: 13

Found cluster newer version:

The latest version:         v1.11.0
Local installed version:    v1.8.1
Update current component:   tiup update cluster
Update all components:      tiup update --all
| username: wuxiangdong | Original post link

Change sync-point to false.
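
A minimal sketch of how this could be applied (the remove-and-recreate approach and the ctl version v5.2.0, chosen to match the cluster, are assumptions, not from the thread); the PD address, sink URI, config file, and changefeed ID are reused from the original command. Since sync-point is off by default, recreating the changefeed without the sync-point flags is equivalent to setting it to false:

# Remove the changefeed that is stuck on the sync-point error.
tiup ctl:v5.2.0 cdc changefeed remove --pd=http://127.0.0.1:2379 --changefeed-id="tidb2msyql"
# Recreate it without --sync-point / --sync-interval (sync-point defaults to false).
tiup ctl:v5.2.0 cdc changefeed create --pd=http://127.0.0.1:2379 --sink-uri="mysql://root:root@10.221.13.111:3308" --config=ticdc_config.toml --changefeed-id="tidb2msyql"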

| username: wuxiangdong | Original post link

If the downstream is TiDB, the tidb_current_ts system variable exists; if the downstream is MySQL, it does not.
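
For illustration (these commands are not from the thread, and the host placeholders are hypothetical), the difference is visible by running the same statement against each downstream:

# Against a TiDB downstream, tidb_current_ts is a valid session variable:
mysql -h <tidb-host> -P 4000 -u root -p -e "SELECT @@tidb_current_ts;"
# Against a MySQL downstream, the same statement fails with the error TiCDC surfaces:
# ERROR 1193 (HY000): Unknown system variable 'tidb_current_ts'
mysql -h <mysql-host> -P 3308 -u root -p -e "SELECT @@tidb_current_ts;"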

| username: Hacker_nTkchcIf | Original post link

My downstream is a standalone MySQL.

| username: neilshen | Original post link

Currently, TiCDC’s sync-point only supports synchronization tasks with TiDB as the downstream.
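
For reference, a hypothetical sketch (the downstream host is a placeholder, not from the thread) of where the sync-point flags would apply, i.e. when the sink is another TiDB cluster rather than MySQL:

# --sync-point / --sync-interval are only meaningful with a TiDB cluster as the sink:
tiup ctl:v5.2.0 cdc changefeed create --pd=http://127.0.0.1:2379 --sink-uri="mysql://root:root@<downstream-tidb-host>:4000" --config=ticdc_config.toml --changefeed-id="tidb2tidb" --sync-point=true --sync-interval=10s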

| username: system | Original post link

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.