Rclone Data Migration Complete Guide / Chapter 3 - Remote Storage Configuration

Chapter 3 - Remote Storage Configuration


3.1 The Remote Concept

In Rclone, a remote is a storage backend reached through a particular protocol. Each remote has a unique name that you use to refer to it in commands.

Basic syntax:

remote:path/to/dir/
remote:bucket/key

For example:

rclone ls s3:my-bucket/data/
rclone ls gdrive:Documents/
rclone ls sftp-server:/home/user/data/

3.2 Amazon S3 and Compatible Storage

3.2.1 AWS S3 Configuration

rclone config
# n) New remote
# name> aws-s3
# Storage> s3
# provider> AWS
# access_key_id> AKIAIOSFODNN7EXAMPLE
# secret_access_key> wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
# region> us-east-1
# endpoint> (leave blank)
# location_constraint> (leave blank, or match the region)

Or edit the config file directly:

[aws-s3]
type = s3
provider = AWS
access_key_id = AKIAIOSFODNN7EXAMPLE
secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
region = us-east-1
acl = private
storage_class = STANDARD
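Hardcoded keys are not the only option: the s3 backend's env_auth setting tells rclone to pick up credentials from the standard AWS sources (environment variables, shared credentials file, or instance metadata). A sketch; the remote name here is illustrative:

```ini
; Variant of the above that avoids embedding keys in the config file
[aws-s3-env]
type = s3
provider = AWS
env_auth = true
region = us-east-1
```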

3.2.2 Alibaba Cloud OSS Configuration

[ali-oss]
type = s3
provider = Alibaba
access_key_id = LTAI5txxxxxxxx
secret_access_key = xxxxxxxxxxxxxxxxxx
endpoint = oss-cn-hangzhou.aliyuncs.com
acl = private

3.2.3 Tencent Cloud COS Configuration

[tencent-cos]
type = s3
provider = TencentCOS
access_key_id = AKIDxxxxxxxx
secret_access_key = xxxxxxxxxxxxxxxxxx
endpoint = cos.ap-guangzhou.myqcloud.com
acl = private

3.2.4 MinIO Configuration

[minio]
type = s3
provider = Minio
access_key_id = minioadmin
secret_access_key = minioadmin
endpoint = http://192.168.1.100:9000

3.2.5 Cloudflare R2 Configuration

[cloudflare-r2]
type = s3
provider = Cloudflare
access_key_id = xxxxxxxx
secret_access_key = xxxxxxxxxxxxxxxxxx
endpoint = https://<ACCOUNT_ID>.r2.cloudflarestorage.com
acl = private

3.2.6 S3 Storage Class Reference

AWS Storage Class   Description                  Use Case
STANDARD            Standard storage             Frequent access
STANDARD_IA         Infrequent Access            Occasional access
ONEZONE_IA          One-Zone Infrequent Access   Low-cost, infrequent access
GLACIER             Glacier                      Archival
GLACIER_IR          Glacier Instant Retrieval    Archives needing fast access
DEEP_ARCHIVE        Deep Archive                 Long-term retention

# Specify a storage class on upload
rclone copy ./data/ s3:my-bucket/ --s3-storage-class STANDARD_IA

# Move data older than 90 days to Glacier storage
rclone move s3:my-bucket/data/ s3:my-bucket/archive/ \
  --min-age 90d \
  --s3-storage-class GLACIER
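The age-based tiering behind the commands above can be sketched as a small helper that maps file age to a storage class. This is illustrative only; the function name and thresholds are ours, not rclone behavior:

```python
# Pick an S3 storage class from file age in days (example thresholds).
def pick_storage_class(age_days: int) -> str:
    if age_days >= 365:
        return "DEEP_ARCHIVE"
    if age_days >= 90:
        return "GLACIER"
    if age_days >= 30:
        return "STANDARD_IA"
    return "STANDARD"

print(pick_storage_class(10))   # STANDARD
print(pick_storage_class(120))  # GLACIER
```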

3.3 Google Drive Configuration

3.3.1 Basic Configuration

rclone config
# n) New remote
# name> gdrive
# Storage> drive
# client_id> (leave blank for the default, or enter your own OAuth Client ID)
# client_secret> (leave blank for the default)
# scope> 1 (full access)
# service_account_file> (leave blank)
# Edit advanced config? n
# Use auto config? y (complete the OAuth authorization in the browser)

3.3.2 Using Your Own OAuth Client

Google Drive uses OAuth 2.0 for authentication. Creating your own Client ID is recommended to avoid the rate limits of the shared default:

  1. Go to the Google Cloud Console
  2. Create a project → enable the Google Drive API
  3. Create an OAuth 2.0 Client ID (type: Desktop app)
  4. Put the Client ID and Secret into the Rclone config

[gdrive]
type = drive
client_id = 123456789.apps.googleusercontent.com
client_secret = GOCSPX-xxxxxxxxxxxxxxx
scope = drive
token = {"access_token":"ya29.xxx","token_type":"Bearer","refresh_token":"1//xxx","expiry":"2026-01-01T00:00:00Z"}
team_drive = 

3.3.3 Shared Drives

# List shared drives
rclone backend drives gdrive:

# To use a shared drive, set team_drive to its ID (rclone config asks
# "Configure this as a Shared Drive (Team Drive)?" during setup);
# root_folder_id can further narrow the root to a specific folder
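For reference, a finished drive remote bound to a shared drive might look like this; the team_drive value is a placeholder for an ID obtained from rclone backend drives:

```ini
[gdrive-shared]
type = drive
scope = drive
team_drive = 0ABCDxxxxxxxxxxUk9PVA
```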

3.3.4 Google Drive Service Accounts

Suitable for unattended, server-side scenarios:

# 1. Create a service account in the Google Cloud Console and download its JSON key
# 2. Store the key file in a safe location

# Configuration
[gdrive-sa]
type = drive
service_account_file = /path/to/service-account.json
impersonate = user@example.com

3.4 OneDrive Configuration

3.4.1 Personal OneDrive

rclone config
# n) New remote
# name> onedrive
# Storage> onedrive
# client_id> (leave blank for the default)
# client_secret> (leave blank for the default)
# region> 1 (Microsoft Cloud Global)
# Edit advanced config? n
# Use auto config? y (complete the OAuth authorization in the browser)

3.4.2 OneDrive for Business

[onedrive-biz]
type = onedrive
client_id = xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
client_secret = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
token = {"access_token":"xxx","token_type":"Bearer","refresh_token":"xxx","expiry":"2026-01-01T00:00:00Z"}
drive_type = business
drive_id = xxxxxxxxxxxxxxxxxxxxxx

3.4.3 SharePoint Configuration

[sharepoint]
type = onedrive
token = {"access_token":"xxx","token_type":"Bearer","refresh_token":"xxx","expiry":"2026-01-01T00:00:00Z"}
drive_type = documentLibrary
drive_id = xxxxxxxxxxxxxxxxxxxxxx

3.5 SFTP Configuration

3.5.1 Basic Configuration (Password Authentication)

[sftp-basic]
type = sftp
host = 192.168.1.100
user = admin
port = 22
pass = $encrypted$xxxxxxx

⚠️ Note: writing a password directly into the config file is insecure. Prefer key-based authentication, or store a value obfuscated with rclone obscure.

3.5.2 Key Authentication (Recommended)

[sftp-key]
type = sftp
host = 192.168.1.100
user = admin
port = 22
key_file = ~/.ssh/id_rsa
key_file_pass = 
known_hosts_file = ~/.ssh/known_hosts
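If you do not have a key pair yet, one way to create a dedicated one for rclone, assuming OpenSSH's ssh-keygen is installed (the file name and comment are examples):

```shell
# Generate a passphrase-less ed25519 key pair for the SFTP remote
ssh-keygen -t ed25519 -N "" -C "rclone-sftp" -f ./rclone_sftp_key

# key_file points at the private key; append the .pub file to
# ~/.ssh/authorized_keys on the server
ls -l ./rclone_sftp_key ./rclone_sftp_key.pub
```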

3.5.3 Using an SSH Agent

[sftp-agent]
type = sftp
host = 192.168.1.100
user = admin
key_use_agent = true

3.5.4 Advanced SFTP Options

[sftp-advanced]
type = sftp
host = 192.168.1.100
user = admin
key_file = ~/.ssh/id_rsa
md5sum_command = md5sum
sha1sum_command = sha1sum
shell_type = unix
path_override = /data/storage
set_modtime = true

3.6 WebDAV Configuration

3.6.1 Nextcloud / ownCloud

[nextcloud]
type = webdav
url = https://cloud.example.com/remote.php/dav/files/admin/
vendor = nextcloud
user = admin
pass = $encrypted$xxxxxxx

3.6.2 Jianguoyun (Nutstore) WebDAV

[jianguoyun]
type = webdav
url = https://dav.jianguoyun.com/dav/
vendor = other
user = user@example.com
pass = $encrypted$xxxxxxx

Note: Jianguoyun's WebDAV access uses an application password generated in its security settings, not your account login password.

3.6.3 Generic WebDAV

[webdav-generic]
type = webdav
url = https://webdav.example.com/dav/
vendor = other
user = username
pass = $encrypted$xxxxxxx
bearer_token = 

3.7 Dropbox Configuration

rclone config
# n) New remote
# name> dropbox
# Storage> dropbox
# client_id> (leave blank for the default)
# client_secret> (leave blank for the default)
# Edit advanced config? n
# Use auto config? y

Advanced configuration (with your own app):

[dropbox]
type = dropbox
client_id = xxxxxxxxxx
client_secret = xxxxxxxxxxxxxxxxxx
token = {"access_token":"sl.xxx","token_type":"bearer","refresh_token":"xxx","expiry":"2026-01-01T00:00:00Z"}

3.8 Local Storage Configuration

3.8.1 Basic Usage

# Local paths can be used directly; no remote configuration is needed
rclone ls /path/to/dir/
rclone copy /source/ /dest/

# Or reference the filesystem through a configured local remote
rclone ls local:/home/user/data/

3.8.2 Configuring as a Remote

[local]
type = local
nounc = true
copy_links = false
links = true
skip_links = false
one_file_system = false

3.8.3 Cross-Platform Path Notes

Platform      Path Format                        Example
Linux         /path/to/dir                       /home/user/data/
macOS         /path/to/dir                       /Users/user/data/
Windows       C:\path\to\dir or C:/path/to/dir   C:\Users\user\data\
Windows UNC   \\server\share\path                \\NAS\backup\

3.9 Configuration Management Tips

3.9.1 Exporting and Importing the Config File

# Show the config file path
rclone config file
# Output: Configuration file is stored at: /home/user/.config/rclone/rclone.conf

# Back up the config
cp ~/.config/rclone/rclone.conf ~/rclone.conf.bak

# Restore it on another machine
scp ~/rclone.conf.bak user@server:~/.config/rclone/rclone.conf
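Since rclone.conf is plain INI, it can be inspected with standard tooling. A minimal sketch using Python's configparser on an inline sample (assumes an unencrypted config; the remotes shown are stand-ins):

```python
import configparser

# A trimmed stand-in for ~/.config/rclone/rclone.conf
sample = """
[aws-s3]
type = s3
provider = AWS

[gdrive]
type = drive
"""

cfg = configparser.ConfigParser()
cfg.read_string(sample)
for name in cfg.sections():
    print(f"{name}: type={cfg[name]['type']}")
# aws-s3: type=s3
# gdrive: type=drive
```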

3.9.2 Temporary Overrides via Environment Variables

# Temporarily use different S3 credentials
export RCLONE_CONFIG_MYS3_ACCESS_KEY_ID=AKIAxxxxxxxxx
export RCLONE_CONFIG_MYS3_SECRET_ACCESS_KEY=xxxxxxxxxxxx
rclone ls mys3:my-bucket/
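The naming scheme is mechanical: RCLONE_CONFIG_ plus the upper-cased remote name plus the upper-cased option key. A tiny helper to build such names (the function is ours, not part of rclone; remote names containing non-alphanumeric characters may need extra care):

```python
def rclone_env_var(remote: str, key: str) -> str:
    """Build the environment variable that overrides `key` for `remote`."""
    return f"RCLONE_CONFIG_{remote.upper()}_{key.upper()}"

print(rclone_env_var("mys3", "access_key_id"))
# RCLONE_CONFIG_MYS3_ACCESS_KEY_ID
```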

3.9.3 Managing Multiple Config Files

# Point at different config files
rclone --config /path/to/config1.conf ls remote1:
rclone --config /path/to/config2.conf ls remote2:

# Set up shell aliases
alias rclone1='rclone --config ~/.config/rclone/work.conf'
alias rclone2='rclone --config ~/.config/rclone/personal.conf'

3.9.4 Remote Aliases

Create an alias to shorten long paths:

rclone config
# n) New remote
# name> photos
# Storage> alias
# remote> gdrive:My Drive/Photos/2026/

[photos]
type = alias
remote = gdrive:My Drive/Photos/2026/

Now you can simply run:

rclone ls photos:
# Same as rclone ls "gdrive:My Drive/Photos/2026/"

3.9.5 Union (Merged Storage)

Merge multiple remotes into a single logical view:

[all-storage]
type = union
upstreams = gdrive: onedrive: dropbox:
action_policy = epall
create_policy = epmfs
search_policy = ff

3.9.6 Encrypted Remotes (Crypt)

Layer encryption on top of an existing remote (covered in detail in Chapter 9):

[encrypted-s3]
type = crypt
remote = s3:my-bucket/encrypted/
password = $encrypted$xxxxxxx
password2 = $encrypted$xxxxxxx
filename_encryption = standard
directory_name_encryption = true

3.10 Testing Remote Connections

Verifying the Configuration

# List all remotes
rclone listremotes
# Output:
# aws-s3:
# gdrive:
# onedrive:
# sftp-server:

# List a remote's top-level directories
rclone lsd aws-s3:
rclone lsd gdrive:

# Rough speed test: copy a file and watch throughput with -P
# (rclone has no dedicated speed-test command)
rclone copy ./testfile aws-s3:my-bucket/ -P

# Show quota and usage details
rclone about gdrive:

Diagnosing Connection Issues

# Debug with verbose output
rclone lsd myremote: -vv

# Dump HTTP request details
rclone lsd myremote: --dump headers,bodies

# List the backend's supported features (also confirms connectivity and auth)
rclone backend features myremote:

Notes

⚠️ OAuth Token Refresh

  • OAuth access tokens are typically valid for one hour; Rclone refreshes them automatically
  • After a long period of disuse, the refresh token may expire, requiring re-authorization
  • On servers, prefer a service account to avoid interactive authorization

⚠️ API Quotas

  • Google Drive: roughly 1 billion queries per project per day, and 12,000 queries per 100 seconds
  • OneDrive: roughly 6,000 requests per application per minute
  • Use --tpslimit to cap the request rate

💡 Security Recommendations

  1. Use interactive rclone config; sensitive values are automatically stored obscured
  2. Or encrypt the config file and supply its password via the RCLONE_CONFIG_PASS environment variable
  3. In CI/CD environments, pass credentials via environment variables
  4. Rotate access keys / secret keys regularly

← Chapter 2 - Installation and Configuration | Chapter 4 - Basic Operations →