When Flink SQL Reads Large Tables, Data Beyond a Certain Number of Columns is Read as Null

Note:
This topic has been translated from a Chinese forum by GPT and might contain errors.

Original topic: flink sql读取大表时超过一定列读出数据为Null

| username: llllllll

When Flink SQL reads a large table, columns beyond a certain count come back as Null. Specifically, we have observed that once a table has more than 125 columns, every column after the 125th is read as Null. Has anyone seen a similar issue before, and how can it be resolved?
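One useful first step is to confirm that the data is intact on the TiDB side, so the Nulls can be attributed to the read path rather than the stored rows. A quick check with plain SQL run directly against TiDB (the table name `t` and column name `c126` are placeholders for your own table and one of the affected columns):

```sql
-- Run directly against TiDB (e.g. via the mysql client), NOT through Flink.
-- If this returns non-NULL values while the same column reads as NULL in
-- Flink SQL, the problem is in the Flink/TiKV read path, not the data.
SELECT c126, COUNT(*) AS cnt
FROM t
WHERE c126 IS NOT NULL
GROUP BY c126
LIMIT 10;
```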

| username: Lucien-卢西恩 | Original post link

How is Flink connecting to TiDB to read the data, and what is the corresponding Flink configuration?
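For reference, a common way to read TiDB from Flink SQL is the `tidb-cdc` connector from the flink-cdc project. A minimal source-table definition might look like the sketch below; the PD address, database, table, and column names are placeholders for your environment:

```sql
-- Minimal sketch of a Flink SQL source table backed by the
-- flink-cdc TiDB connector; adjust addresses and names to your setup.
CREATE TABLE orders_src (
  id BIGINT,
  amount DECIMAL(10, 2),
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'tidb-cdc',
  'pd-addresses' = 'pd0:2379',
  'database-name' = 'mydb',
  'table-name' = 'orders'
);
```

Sharing the actual `WITH (...)` options in use (and the flink-cdc version) would help narrow down whether the Null columns come from the connector configuration or from the client library underneath.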

| username: mxd-321 | Original post link

The TiDB CDC connector (`tidb-cdc`) in flink-cdc is implemented on top of the Java version of the TiKV client, tikv-client-java: https://github.com/tikv/client-java/tree/release-3.2