Problem description
How can I handle disconnects in ftplib?
I wrote a Python script that I will use to upload very large files to an FTP server using ftplib.
My question is: seeing as the upload will probably take a long time due to the file's size, what if the internet disconnects in the middle and then reconnects, say, after 1 minute? How can I handle such an issue in the script? Any ideas?
What I thought of is a try/except block that keeps checking whether the internet connection is available. Any ideas?
Thanks
Recommended answer
A simple implementation for handling disconnects while uploading with Python ftplib:
from ftplib import FTP
import time

host = "ftp.example.com"    # replace with your server's details
user = "username"
passwd = "password"

local_path = "/local/source/path/file.zip"
remote_path = "/remote/desti/path/file.zip"

ftp = None
finished = False

with open(local_path, 'rb') as f:
    while not finished:
        try:
            if ftp is None:
                print("Connecting...")
                ftp = FTP(host, user, passwd)

            if f.tell() > 0:
                # A previous attempt was interrupted: ask the server how much
                # it already received and continue from that offset.
                rest = ftp.size(remote_path)
                print(f"Resuming transfer from {rest}...")
                f.seek(rest)
            else:
                print("Starting from the beginning...")
                rest = None

            # The rest argument makes the server append from the given offset.
            ftp.storbinary(f"STOR {remote_path}", f, rest=rest)

            print("Done")
            finished = True
        except Exception as e:
            # Drop the (possibly broken) connection and retry after a delay.
            ftp = None
            sec = 5
            print(f"Transfer failed: {e}, will retry in {sec} seconds...")
            time.sleep(sec)
A more fine-grained exception handling is advisable.
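For instance, a minimal sketch that narrows the broad except clause above to errors that actually look like connection problems (it reuses the ftp, f, remote_path and rest variables from the loop above):

import socket
from ftplib import error_reply, error_temp

try:
    ftp.storbinary(f"STOR {remote_path}", f, rest=rest)
except (error_temp, error_reply, socket.error, EOFError) as e:
    # Only network/protocol errors are treated as retryable;
    # anything else propagates so genuine bugs are not hidden.
    ftp = None
    print(f"Connection problem: {e}, will retry...")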
And similarly for downloads:
Resume FTP download after timeout
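A minimal sketch of the same retry-and-resume pattern applied to a download (the server details and paths below are placeholders) could look like this:

from ftplib import FTP
import time

host, user, passwd = "ftp.example.com", "username", "password"  # placeholders
remote_path = "/remote/path/file.zip"
local_path = "/local/path/file.zip"

ftp = None
finished = False

with open(local_path, 'ab') as f:          # append mode keeps any partial data
    while not finished:
        try:
            if ftp is None:
                print("Connecting...")
                ftp = FTP(host, user, passwd)

            rest = f.tell() or None        # resume from what we already have
            print(f"Downloading, resuming at offset {f.tell()}...")
            ftp.retrbinary(f"RETR {remote_path}", f.write, rest=rest)

            print("Done")
            finished = True
        except Exception as e:
            ftp = None
            print(f"Transfer failed: {e}, will retry in 5 seconds...")
            time.sleep(5)

Opening the local file in append mode means its current size doubles as the resume offset, so no separate bookkeeping is needed between retries.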