Convert the file to binary so it can be sliced later:
filepParse(file, type) {
  // Map the caller's type to the matching FileReader method
  const caseType = {
    'base64': 'readAsDataURL',
    'buffer': 'readAsArrayBuffer'
  }
  const fileRead = new FileReader()
  return new Promise(resolve => {
    // Register the handler before starting the read
    fileRead.onload = (res) => {
      resolve(res.target.result)
    }
    fileRead[caseType[type]](file)
  })
}
To keep the same file (even after a rename) from being uploaded more than once, we bring in spark-md5 and generate a hash from the file's actual content:
const buffer = await this.filepParse(file, 'buffer')
const sparkMD5 = new SparkMD5.ArrayBuffer()
sparkMD5.append(buffer)
this.hash = sparkMD5.end() // the content hash of this file
Now slice the file, naming each chunk after that hash, in the form hash_0, hash_1, and so on:
// Slicing by a fixed chunk count; slicing by a fixed chunk size also works
const partCount = 10
const partSize = Math.ceil(file.size / partCount)
const partList = []
let current = 0
for (let i = 0; i < partCount; i++) {
  let reqItem = {
    // slice() clamps the end index, so the last chunk may be smaller
    chunk: file.slice(current, current + partSize),
    // suffix is the file's extension, e.g. derived from file.name
    filename: `${this.hash}_${i}.${suffix}`
  }
  current += partSize
  // collect every chunk in an array
  partList.push(reqItem)
}
this.partList = partList
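The fixed-count math above can be isolated into a small sketch. `chunkRanges` is a hypothetical helper, not part of the original code; it shows why `Math.ceil` matters when the size does not divide evenly by the chunk count:

```javascript
// Compute [start, end) byte ranges for slicing `size` bytes into
// at most `count` chunks of equal size (the last may be shorter).
function chunkRanges(size, count) {
  const partSize = Math.ceil(size / count);
  const ranges = [];
  for (let current = 0; current < size; current += partSize) {
    // clamp the end the same way Blob.slice would
    ranges.push({ start: current, end: Math.min(current + partSize, size) });
  }
  return ranges;
}

console.log(chunkRanges(105, 10));
// ten ranges: the first nine are 11 bytes each, the last covers 99..105
```

Had we used plain division, a 105-byte file split into 10.5-byte chunks would leave fractional offsets for `slice`, and flooring instead would silently drop the trailing bytes.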
Create a request for each chunk:
createSendQeq() {
  const reqPartList = []
  this.partList.forEach((item, index) => {
    const reqFn = () => {
      const formData = new FormData();
      formData.append("chunk", item.chunk);
      formData.append("filename", item.filename);
      // When axios is given a FormData body in the browser, it sets
      // Content-Type: multipart/form-data (including the boundary)
      // automatically; setting the header by hand would omit the boundary
      return axios.post("/upload", formData).then(res => {
        console.log(res)
      })
    }
    reqPartList.push(reqFn)
  })
  return reqPartList
}
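Note that the array holds *functions that create requests*, not requests themselves, so nothing is sent until a thunk is called. A minimal sketch of this pattern, with a hypothetical `fakePost` standing in for `axios.post`:

```javascript
// Stand-in for axios.post: resolves with the url and payload it was given.
function fakePost(url, payload) {
  return Promise.resolve({ url, payload });
}

// Wrap each part in a thunk; calling the thunk fires the request.
function createThunks(partList) {
  return partList.map(item => () => fakePost('/upload', item.filename));
}

const thunks = createThunks([{ filename: 'h_0.png' }, { filename: 'h_1.png' }]);
console.log(thunks.length); // 2 -- but no request has been sent yet
thunks[0]().then(res => console.log(res.payload)); // 'h_0.png'
```

Deferring the requests this way is what lets the next step decide *when* each one runs (sequentially, with a merge at the end).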
Send the requests one after another; once they have all finished, send a merge request:
sendQeq() {
  const reqPartList = this.createSendQeq()
  let i = 0
  let send = async () => {
    if (i >= reqPartList.length) {
      // every chunk is uploaded; ask the server to merge them
      this.mergeUpload()
      return
    }
    await reqPartList[i]()
    i++
    send()
  }
  send()
}
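The recursion above is equivalent to awaiting each thunk in a loop. A self-contained sketch of the "send all, then merge" flow, using fake resolve-immediately thunks instead of real network calls (all names here are hypothetical):

```javascript
// Run request thunks strictly one after another, then fire the merge
// step; returns the results in completion order.
async function sendAll(thunks, merge) {
  const done = [];
  for (const req of thunks) {
    done.push(await req()); // wait for this chunk before sending the next
  }
  done.push(await merge()); // all chunks done: merge on the server
  return done;
}

// Fake thunks that each "upload" one chunk.
function makeThunks(n) {
  return Array.from({ length: n }, (_, i) => () => Promise.resolve(`chunk-${i}`));
}

sendAll(makeThunks(3), () => Promise.resolve('merge'))
  .then(names => console.log(names));
// ['chunk-0', 'chunk-1', 'chunk-2', 'merge']
```

Sending chunks strictly in sequence is the simplest scheme; a bounded concurrent pool would be faster, but sequencing makes the merge trigger trivial.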
Resumable upload
To resume an interrupted upload, we only need to send the chunks that are still pending. Since we remove each request from the list as soon as it succeeds, whatever remains in the list is exactly what is left to upload:
if (res.data.code === 0) {
  this.count += 1;
  // remove a chunk from the list once it has uploaded successfully;
  // note: splicing by index while iterating partList shifts the later
  // indices, so filtering out finished chunks afterwards is safer
  this.partList.splice(index, 1);
}
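The filtering alternative mentioned in the comment can be sketched as follows. `remaining` is a hypothetical helper: given every chunk and the names the server reports as already stored, it returns only the chunks that still need uploading, without mutating the list mid-iteration:

```javascript
// Keep only the chunks whose filenames the server has not seen yet.
function remaining(partList, uploadedNames) {
  const uploaded = new Set(uploadedNames);
  return partList.filter(item => !uploaded.has(item.filename));
}

const parts = [
  { filename: 'abc_0.png' },
  { filename: 'abc_1.png' },
  { filename: 'abc_2.png' },
];
console.log(remaining(parts, ['abc_1.png']).map(p => p.filename));
// ['abc_0.png', 'abc_2.png']
```

Because chunk names are derived from the content hash, the server can answer "which chunks of this hash do I already have?" and the client resumes with just the difference.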