How to Use the BR Tool to Back Up Data to Public Cloud BOS in TiDB

Note:
This topic has been translated from a Chinese forum by GPT and might contain errors.

Original topic: TIDB 如何实现用br工具进行数据备份到公有云bos上

| username: residentevil

[TiDB Usage Environment] Production Environment
[TiDB Version] V6.5.8
[Encountered Problem: Problem Description and Impact] How to use the br tool to back up data to the public cloud BOS in TiDB? Are there any detailed commands? The official documentation only covers the S3 scenario.

| username: 像风一样的男子 | Original post link

Currently, it does not support backing up data directly to OSS. You can back up to local storage first and then transfer it to OSS.

| username: dba远航 | Original post link

The official documentation states that it supports Amazon S3, Google Cloud Storage (GCS), and Azure Blob Storage storage systems.

| username: residentevil | Original post link

Because multiple TiKV instances are deployed on one server, each instance produces its own backup set during the backup. The solution I currently have in mind is to mount a large-disk machine via NFS or similar, write the backup sets there, and then upload them to BOS. This amounts to an intermediate transfer, which consumes bandwidth and doubles the time.
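The two-step intermediate-transfer idea described above could be sketched roughly as follows. This is only a sketch under assumptions: the NFS mount path, bucket name, and the rclone remote name `bos` are all hypothetical, and rclone would need a remote configured against BOS's S3-compatible gateway beforehand.

```shell
# Step 1: BR writes the full backup to shared storage that is mounted
# at the same path on every TiKV node (here: a hypothetical NFS mount).
tiup br backup full -u "<pd_addr>:2379" \
    --storage "local:///mnt/nfs/backup/$(date +%F)" \
    --log-file br_full.log

# Step 2: push the finished backup set to BOS with a sync tool.
# "bos" is a hypothetical rclone remote; "my-tidb-backups" is a made-up bucket.
rclone sync "/mnt/nfs/backup/$(date +%F)" "bos:my-tidb-backups/$(date +%F)" --progress
```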

| username: residentevil | Original post link

There is no BOS from our domestic cloud vendors in that list, hehe.

| username: 像风一样的男子 | Original post link

I am doing this now: backing up and then uploading to OSS is very fast and doesn't double the time. However, it does require an additional NFS server with a large disk.

| username: residentevil | Original post link

For large amounts of data, such as storage capacities of several hundred TB, I’m not sure if there would be any issues with using NFS.

| username: heiwandou | Original post link

Backup to object storage

| username: residentevil | Original post link

The intention is to back up to object storage, such as the BOS product from cloud providers, but it seems that it is not supported at the moment.
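For what it's worth, BOS advertises an S3-compatible interface, so BR's `s3://` scheme may work against it even without native support. The sketch below is untested and the endpoint `s3.bj.bcebos.com`, region, and bucket name are assumptions to verify against the BOS documentation before relying on it.

```shell
# BR picks up S3 credentials from the standard AWS environment variables.
export AWS_ACCESS_KEY_ID="<bos_ak>"
export AWS_SECRET_ACCESS_KEY="<bos_sk>"

# Point BR's S3 backend at BOS's S3-compatible gateway
# (the endpoint below is an assumption, not a confirmed address).
tiup br backup full -u "<pd_addr>:2379" \
    --storage "s3://<bucket_name>/full-backup" \
    --s3.endpoint "https://s3.bj.bcebos.com" \
    --send-credentials-to-tikv=true \
    --log-file br_bos.log
```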

| username: lemonade010 | Original post link

A NAS with several hundred terabytes is quite a challenge for the network, right? See if you can manage with a full backup plus incremental backups.

| username: residentevil | Original post link

Yes, a base backup is definitely necessary. The backup cycle can be extended, for example, one base backup per week and one incremental backup per day. However, regardless, the data transfer at this scale is quite challenging. So, I was thinking if BR could directly transfer the data to BOS storage, it would be relatively better, at least reducing one intermediate transfer.
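The weekly-full/daily-incremental cycle described above can be sketched with BR's `--lastbackupts` flag, which takes the end timestamp of the previous backup (readable with `tiup br validate decode`). The bucket, paths, and endpoint below are placeholders; this assumes a COS-style S3 target as in the earlier reply.

```shell
# Weekly base (full) backup.
tiup br backup full -u "<pd_addr>:2379" \
    --storage "s3://<bucket_name>/full" \
    --s3.endpoint "https://cos.ap-beijing.myqcloud.com" \
    --log-file br_full.log

# Daily incremental: read the end TSO of the previous backup, then
# back up only the changes made after that timestamp.
LAST_TS=$(tiup br validate decode --field="end-version" \
    --storage "s3://<bucket_name>/full" | tail -n1)
tiup br backup full -u "<pd_addr>:2379" \
    --storage "s3://<bucket_name>/inc-$(date +%F)" \
    --s3.endpoint "https://cos.ap-beijing.myqcloud.com" \
    --lastbackupts "$LAST_TS" \
    --log-file br_inc.log
```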

| username: 有猫万事足 | Original post link

I use Tencent Cloud, and I can directly back up to COS using BR. Not sure about other clouds.

| username: 江湖故人 | Original post link

Will BOS be cheaper? The official documentation doesn’t specify, so using it rashly might be risky.

| username: 小于同学 | Original post link

Currently, it should not support direct backup.

| username: 哈喽沃德 | Original post link

Backup locally and then transfer it.

| username: TiDBer_5Vo9nD1u | Original post link

It seems not supported.

| username: residentevil | Original post link

Currently, we haven’t considered the price issue yet. We’re exploring solutions, haha.

| username: residentevil | Original post link

If using Tencent Cloud's BOS, you also need to configure the endpoint, AK, SK, and other information. How should the command be written when backing up with br? Could you please provide a demo?

| username: 有猫万事足 | Original post link

tiup br backup full -u "<pd_addr>:2379" --storage "s3://<bucket_name>" --s3.endpoint="https://cos.ap-beijing.myqcloud.com" --send-credentials-to-tikv=true --ratelimit 100 --log-file backuptable.log

--s3.endpoint should be set to the address closest to your machines.
--ratelimit should have a reasonable limit; otherwise the backup can easily saturate the bandwidth and cause repeated errors.

--storage "s3://<bucket_name>?access-key=xxxxxx&secret-access-key=xxxxx&endpoint=https://cos.ap-beijing.myqcloud.com/"

The storage URL can also be written this way, with the access-key and secret-access-key embedded. However, this is strongly discouraged because the credentials appear in plain text; it is better to put them in a configuration file or environment variables.
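If the credentials must go into the URL anyway (say, for a one-off job), they should at least be percent-encoded, since access keys can contain characters like `/` and `+` that would break the query string. A small Python sketch; the bucket name and key values are made up:

```python
from urllib.parse import urlencode

def build_storage_url(bucket: str, access_key: str,
                      secret_key: str, endpoint: str) -> str:
    """Build a BR --storage URL with percent-encoded credentials."""
    query = urlencode({
        "access-key": access_key,
        "secret-access-key": secret_key,
        "endpoint": endpoint,
    })
    return f"s3://{bucket}?{query}"

# Hypothetical values; "AKID/x+y" shows why encoding matters.
url = build_storage_url("my-bucket", "AKID/x+y", "s3cr3t",
                        "https://cos.ap-beijing.myqcloud.com")
print(url)
```

The `/` and `+` in the example key come out as `%2F` and `%2B`, so the URL parses cleanly.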

| username: 随便改个用户名 | Original post link

Dumpling also supports exporting data to S3 object storage.