Note:
This topic has been translated from a Chinese forum by GPT and might contain errors.
Original topic: token-limit connection count control
【Encountered Problem】
I set the token-limit to 3 and used sysbench for stress testing with threads=100. During the stress test, I observed that the number of connections to TiDB was 100, and the TPS was quite high. It seems that the token-limit did not control the connections. I checked the tidb.log and did not see any related connection errors.
The number of connections is controlled by max-server-connections, which is unlimited by default, so all 100 clients can connect. token-limit limits the number of active connections; that is the setting you should be looking at.
Tokens control whether SQL can be executed, not the number of connections; to execute SQL, a session must first obtain a token.
Where can we see the monitoring for this aspect?
- The number of concurrent sessions allowed to run in TiDB, used for traffic control
- Default: 1000
- If the number of currently running connections exceeds token-limit, new requests are blocked until completed operations release their tokens
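For reference, both settings are configured in TiDB's configuration file (tidb.toml). A minimal sketch, assuming the top-level key names used in older TiDB versions (in newer releases max-server-connections has moved under the instance section):

```toml
# Maximum number of client connections tidb-server accepts.
# 0 means unlimited (the default), so all 100 sysbench threads can connect.
max-server-connections = 0

# Number of sessions allowed to execute SQL concurrently.
# Default is 1000; set to 3 here to reproduce the test in this thread.
token-limit = 3
```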
Some people say this controls concurrency, while others say it controls the number of sessions. What is this parameter actually used for?
My sysbench run uses a concurrency of 100, and token-limit is set to 3. Even if it only manages tokens, I wouldn't expect the test to actually reach a concurrency of 100. Yet when I look at Grafana, everything appears normal.
The token-limit means that of those 100 threads, only 3 can execute concurrently. Even if you start 10,000 threads, only 3 will be parsing SQL, sending KV requests, and receiving data at any given moment. You can see the token acquisition time on the TiDB panels in Grafana. If your SQL executes very quickly, you might not notice any change. It's similar to a PC with only 8 cores running thousands of threads across all its processes: why doesn't it feel slow? Because they take turns.
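The "only 3 of 100 run at once" behavior described above can be modeled with a bounded semaphore. A minimal sketch in plain Python (not TiDB's actual code), with the names TOKEN_LIMIT and run_sql invented for illustration:

```python
import threading

TOKEN_LIMIT = 3   # analogous to token-limit = 3
THREADS = 100     # analogous to sysbench threads=100

tokens = threading.BoundedSemaphore(TOKEN_LIMIT)
lock = threading.Lock()
running = 0
peak = 0          # highest concurrency actually observed

def run_sql():
    """Each 'session' must obtain a token before executing."""
    global running, peak
    with tokens:                 # blocks until a token is free
        with lock:
            running += 1
            peak = max(peak, running)
        # ... parse / compile / execute would happen here ...
        with lock:
            running -= 1         # token released when the with-block exits

workers = [threading.Thread(target=run_sql) for _ in range(THREADS)]
for w in workers:
    w.start()
for w in workers:
    w.join()

print(peak)  # never exceeds TOKEN_LIMIT, no matter how many threads connect
```

All 100 threads "connect" (are created and start), but the semaphore guarantees that at most 3 are inside the execution section at the same time — the rest simply queue, which is why no error appears in tidb.log.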
Is there any related alert information in the backend? I didn’t find it in tidb.log.
Why would there be an alert? There won't be one. The SQL still executes normally; it just queues up and waits.
Understood, I increased the limit, and there was a noticeable change in the time to get the token. The QPS also changed significantly.
SQL processing inside TiDB is divided into four stages: get token, parse, compile, and execute. If token-limit=1, does that mean one SQL statement must go through all four stages and release its token before the next queued statement can execute? Is that correct?
Yes, the token is acquired at the entrance. Honestly, I find this feature somewhat redundant: if you limit only execution and not the number of connections, memory usage will still be high once concurrency reaches the thousands, and picking 3 runnable goroutines out of a large pool adds its own overhead. I tried it and it didn't work well. Limiting the number of connections works better: allow only as many connections as you can actually handle. The proxy in front should also use least-connections balancing rather than round-robin; with 10 TiDB nodes, it should route each new connection to the node with the fewest connections. That stays balanced, whereas round-robin gradually drifts out of balance.
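The least-connections routing mentioned above is straightforward to state. A hypothetical sketch (node names and connection counts invented for illustration):

```python
# Current active-connection count per TiDB node (hypothetical values
# a proxy would track from its own open connections).
connections = {
    "tidb-0": 42,
    "tidb-1": 17,
    "tidb-2": 58,
}

def pick_least_connections(conns):
    """Return the node with the fewest active connections."""
    return min(conns, key=conns.get)

chosen = pick_least_connections(connections)
print(chosen)  # -> tidb-1
```

Unlike round-robin, this self-corrects: a node holding long-lived connections stops receiving new ones until its count drops back below the others.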
Finally understood this parameter, I was confused before. Thank you!
So what is token-limit set to in your environment? Still the default 1000?
This topic was automatically closed 60 days after the last reply. No new replies are allowed.