The query has been canceled due to exceeding the memory limit allowed for a single SQL query. Please try to narrow the query scope or increase the tidb_mem_quota_query limit, and then try again. [conn=3607482257570389131]

Note:
This topic has been translated from a Chinese forum by GPT and might contain errors.

Original topic: 由于超出单个SQL查询所允许的内存限制,查询已被取消。请尝试缩小查询范围或增加tidb_mem_quota_query限制,然后再试一次。[conn=3607482257570389131]

| username: TiDBer_hUfk8uMB

Your query has been cancelled due to exceeding the allowed memory limit for a single SQL query. Please try narrowing your query scope or increase the tidb_mem_quota_query limit and try again. [conn=3607482257570389131]
I ran SET GLOBAL tidb_mem_quota_query = 14G, but the error still occurs. How exactly should I troubleshoot this?
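
For reference, a minimal sketch of setting the quota, assuming TiDB v6.1 or later (where tidb_mem_quota_query gained GLOBAL scope). The variable is measured in bytes, and a GLOBAL change only takes effect for sessions opened afterwards:

-- 14 GiB written out in bytes (14 * 1024 * 1024 * 1024)
SET GLOBAL tidb_mem_quota_query = 15032385536;

-- Reconnect (or also set it at SESSION scope), then verify before retrying the query:
SHOW VARIABLES LIKE 'tidb_mem_quota_query';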

| username: FutureDB | Original post link

What is there to troubleshoot? With a memory quota that large, the first thing to consider is optimizing the SQL.

| username: zhaokede | Original post link

You can reduce the query scope.

| username: zhanggame1 | Original post link

Check the SQL's memory usage during execution.

| username: 有猫万事足 | Original post link

Are you trying to find out what this SQL is?

Look for expensive queries, and then see if conn=3607482257570389131 matches.

This number is the conn_id in the log.
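
If the statement is still running or recurs, a rough sketch for spotting the biggest memory consumers (assuming a TiDB version whose processlist exposes a MEM column); the ID column is the same conn_id that appears in the log:

SELECT instance, id, db, time, mem, info
FROM information_schema.cluster_processlist
ORDER BY mem DESC
LIMIT 10;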

| username: TiDBer_hUfk8uMB | Original post link

This SQL was working fine before, but suddenly it stopped working. It doesn’t seem to be an issue with the SQL itself.

| username: 小龙虾爱大龙虾 | Original post link

It might still be an SQL issue. Post the execution plan and let’s take a look.

| username: 小龙虾爱大龙虾 | Original post link

You can find the execution plan of this failed SQL in the slow query log.
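
For example, a rough sketch of pulling the plan by connection id (assuming the statement was actually recorded in the slow log; without a time filter this only scans the current slow log file):

SELECT time, query_time, mem_max, plan, query
FROM information_schema.cluster_slow_query
WHERE conn_id = 3607482257570389131
ORDER BY time DESC
LIMIT 5;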

| username: zhanggame1 | Original post link

First, take a look at this SQL's execution plan.
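
A small sketch of looking at the plan; the table and columns below are hypothetical placeholders, not from the original post. Note that EXPLAIN ANALYZE actually executes the statement, while plain EXPLAIN does not:

EXPLAIN
SELECT customer_id, SUM(amount)   -- placeholder statement, substitute the real SQL
FROM orders
WHERE created_at >= '2024-01-01'
GROUP BY customer_id;

-- If the statement can finish under the quota, EXPLAIN ANALYZE additionally reports
-- actRows and per-operator memory, which shows exactly where the memory goes.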

| username: Miracle | Original post link

Growth in the table's data or inaccurate statistics could both lead to this issue. Identify the SQL, then check its execution plan to see what is going on.
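
If stale statistics are suspected, a minimal sketch for checking and refreshing them (the table name orders is again a placeholder):

SHOW STATS_HEALTHY WHERE Table_name = 'orders';  -- a low Healthy value suggests stale statistics
ANALYZE TABLE orders;  -- rebuild statistics, then compare the execution plan again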

| username: 我是吉米哥 | Original post link

With the quota set that high, doesn't the server risk going OOM (Out of Memory)?

| username: TIDB-Learner | Original post link

Increasing memory can temporarily solve the problem. However, you may frequently encounter similar issues later on. It is recommended to analyze the SQL and see if it can be optimized.

| username: juecong | Original post link

There are slow queries in the console, right? If that doesn't help, set up a TiDB node with plenty of memory dedicated to running this statement.

| username: zhaokede | Original post link

It is also possible that the data has changed; for example, a query that previously returned only 10 rows might now return 1 billion.

| username: tidb菜鸟一只 | Original post link

Just post the SQL execution plan… even 14 GB of memory isn't enough for it.

| username: TiDBer_QKDdYGfz | Original post link

If the SQL hasn't changed, then the data itself must have changed, most likely by growing in volume.

| username: lemonade010 | Original post link

Have the business side narrow the query range. If 14 GB already isn't enough, raising the limit further might affect the stable operation of the database.

| username: Kongdom | Original post link

You should optimize the SQL statement; you can't just keep consuming memory without limit.