Question
I'm looking for a way to split up any text/data file on the front end in the browser before being uploaded as multiple files. My limit is 40KB per upload. So if a user uploads a 400KB file, it would split this file into 10 separate chunks or 10 separate files on the front end before uploading it to the server.
Currently, I'm doing it by converting this file into a base64-formatted string, then splitting that string into 40KB pieces, which comes out to 10 separate chunks. From there I upload each chunk with a filename of chunk-1-of-10, chunk-2-of-10...
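For reference, the base64 approach described above could be sketched roughly like this (splitAsBase64 is a hypothetical helper name for illustration, not something from the original post):

// Rough sketch of the base64 approach described in the question (names are illustrative only)
function splitAsBase64 (file, chunkSize = 40000) {
  return new Promise(resolve => {
    const reader = new FileReader()
    reader.onload = () => {
      const base64 = reader.result.split(',')[1] // drop the "data:...;base64," prefix
      const chunks = []
      for (let i = 0; i < base64.length; i += chunkSize) {
        chunks.push(base64.slice(i, i + chunkSize))
      }
      resolve(chunks) // each entry would then be uploaded as chunk-1-of-n, chunk-2-of-n, ...
    }
    reader.readAsDataURL(file)
  })
}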
When pulling down these files, I just concatenate all these chunks back together and decode the result from base64 into its original file format.
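Putting the file back together would then amount to joining the downloaded chunk strings and decoding the base64, roughly along these lines (again, the names are only for illustration):

// Hypothetical sketch: join downloaded base64 chunks and decode back into a Blob
function reassembleFromBase64 (base64Chunks, mimeType = 'application/octet-stream') {
  const binary = atob(base64Chunks.join(''))                  // base64 string -> binary string
  const bytes = Uint8Array.from(binary, c => c.charCodeAt(0)) // binary string -> raw bytes
  return new Blob([bytes], { type: mimeType })
}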
Is there a better way of doing this? Is there a library that handles all of this instead of writing it from scratch? I'm not sure if the base64 route is the best way to do this.
Recommended Answer
There is no need to read the content into RAM with a FileReader. Using base64 will only increase the size of what you need to upload; base64 takes up ~33% more space.
Use Blob.slice to get the chunks:
// simulate a file coming from an <input type="file">
const file = new File(['a'.repeat(1000000)], 'test.txt')
const chunkSize = 40000 // 40KB per upload
const url = 'https://httpbin.org/post'

// await needs an async context (an async function or an ES module)
async function uploadInChunks () {
  for (let start = 0; start < file.size; start += chunkSize) {
    // slice's end index is exclusive, so each chunk is at most chunkSize bytes
    const chunk = file.slice(start, start + chunkSize)
    const fd = new FormData()
    fd.set('data', chunk)
    await fetch(url, { method: 'post', body: fd }).then(res => res.text())
  }
}

uploadInChunks()
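Since the chunks uploaded this way are raw binary rather than base64, reassembling them later needs no decoding step; a sketch, assuming the chunk URLs are known:

// Hypothetical sketch: fetch the chunks in order and stitch them into a single Blob
async function downloadAndJoin (chunkUrls) {
  const parts = []
  for (const u of chunkUrls) {
    parts.push(await fetch(u).then(res => res.blob()))
  }
  return new Blob(parts) // same bytes as the original file
}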