Python: resumable uploads to a GCS bucket, sending the file in chunks
Environment: Python 3.6
I have a question about upload speed when using resumable uploads to Google Cloud Storage. I wrote a Python client for uploading large files to GCS (it has some special features, which is why gsutil is not an option for my company). In tests run about two months ago it made good use of the available bandwidth, reaching roughly 20 Mbps on a 25 Mbps connection. The project was then frozen for almost two months, and now that it has been reopened the same client uploads very slowly, at about 1.4 Mbps out of 25 Mbps. I wrote a simple Python script to check whether it hits the same problem; it is slightly faster, but still only about 2 Mbps. The gsutil tool performs almost exactly the same as my Python script. I also ran the test on a different network infrastructure with an upload speed of over 50 Mbps, and there it performed very well.
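For reference, the raw-bandwidth check mentioned above only needs a few lines. A minimal sketch, timing a single POST of a fixed payload (https://httpbin.org/post here is just a stand-in for whatever test endpoint you use):

import time
import requests

payload = b'\0' * (10 * 1024 * 1024)  # 10 MB of test data
start = time.perf_counter()
requests.post('https://httpbin.org/post', data=payload)
elapsed = time.perf_counter() - start
print('%.2f Mbps' % (len(payload) * 8 / elapsed / 1e6))  # payload bits / seconds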
Reference: Requests Utilities — google-resumable-media documentation
import google.auth
import google.auth.transport.requests as tr_requests

# Uploads need write access; the read_only scope shown in the library's
# download example would be rejected with a 403 here.
rw_scope = u'https://www.googleapis.com/auth/devstorage.read_write'
credentials, _ = google.auth.default(scopes=(rw_scope,))
# AuthorizedSession wraps requests.Session and refreshes the token as needed.
transport = tr_requests.AuthorizedSession(credentials)
from google.resumable_media.requests import ResumableUpload
import io

bucket_name = 'xxxxxxx'  # bucket name
csvfile_name = 'xxxxxxxxxxxxxxxxxxxx'  # path of the file to upload
blob_name = csvfile_name  # object name inside the bucket

url_template = (
    u'https://www.googleapis.com/upload/storage/v1/b/{bucket}/o?'
    u'uploadType=resumable')
upload_url = url_template.format(bucket=bucket_name)

# Chunk size for the transfer; it must be a multiple of 256 KB.
chunk_size = 1024 * 1024 * 33  # 33 MB

upload = ResumableUpload(upload_url, chunk_size)

# Read the file and initiate the upload session. The original snippet
# printed `response` without defining it; it comes from initiate().
with open(csvfile_name, 'rb') as f:
    data = f.read()
stream = io.BytesIO(data)
metadata = {u'name': blob_name}
response = upload.initiate(transport, stream, metadata, u'text/csv')
print(response)
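# How many transmit_next_chunk() calls the file will need; e.g. a
# 50 MB file at 33 MB per chunk takes ceil(50/33) = 2 calls.
import math
print(math.ceil(upload.total_bytes / upload.chunk_size))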
# Sanity checks after initiate(): the session URL comes back in the
# Location header, and the upload now knows the total size.
print(upload.resumable_url == response.headers[u'Location'])
print(upload.total_bytes == len(data))
upload_id = response.headers[u'X-GUploader-UploadID']
print(upload_id)
print(upload.resumable_url == upload_url + u'&upload_id=' + upload_id)
# Send the chunks one at a time; each transmit_next_chunk() call sends at
# most one chunk_size slice, so the number of calls depends on the file
# size (see the loop sketch after this listing for the general form).
response0 = upload.transmit_next_chunk(transport)
print(response0)
print(upload.finished)
print(upload.bytes_uploaded == upload.chunk_size)
response1 = upload.transmit_next_chunk(transport)
print(response1)
print(upload.finished)
print(upload.bytes_uploaded == 2 * upload.chunk_size)
response2 = upload.transmit_next_chunk(transport)
print(response2)
print(upload.finished)
print(upload.bytes_uploaded == upload.total_bytes)
# The response to the final chunk carries the object resource as JSON.
json_response = response2.json()
print(json_response[u'bucket'] == bucket_name)
print(json_response[u'name'] == blob_name)
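The fixed sequence of three transmit_next_chunk() calls above only matches a file that happens to span exactly three chunks. The general form, a minimal sketch reusing the upload and transport objects from above, loops until the library reports completion and uses recover() to re-synchronize the session after a rejected chunk:

from google.resumable_media import common

# Keep sending until the upload reports completion; on a rejected chunk
# the upload becomes invalid, and recover() re-syncs the stream position
# with the server so transmission can continue.
while not upload.finished:
    try:
        response = upload.transmit_next_chunk(transport)
        print('sent %d of %d bytes' % (upload.bytes_uploaded, upload.total_bytes))
    except common.InvalidResponse:
        upload.recover(transport)
print(response.json()[u'name'])

Note that this sketch retries indefinitely; a production client would cap the number of recovery attempts.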
Questions about bugs in the code, or other technical points that need clarifying, are welcome.