feat(blob): Add multipart option (#517)
It turns out that uploading large files with Vercel Blob has been a struggle for users.
Before this change, file uploads were limited to around 200MB for technical reasons.
Before this change, even uploading a file of 100MB could fail for various reasons (network being one of them).

To solve this for good, we're introducing a new option for `put` and `upload` calls: `multipart: true`. This option makes sure your file is uploaded to Vercel Blob part by part, and if some parts fail, we retry them. It is available for both server and client uploads.

Usage:
```ts
const blob = await put('file.png', file, {
  access: 'public',
  multipart: true // `false` by default
});

// and:
const blob = await upload('file.png', file, {
  access: 'public',
  handleUploadUrl: '/api/upload',
  multipart: true
});
```
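
For client uploads, the `upload` call above expects a matching server route at `handleUploadUrl`. The sketch below is not part of this commit; it is a rough illustration of what such a route could look like with the updated `handleUpload` API (where `onBeforeGenerateToken` now also receives `clientPayload` and `multipart`). The Next.js App Router wiring, the route path, and the size limits are assumptions for illustration only.

```ts
// app/api/upload/route.ts — hypothetical route path, for illustration only
import { handleUpload, type HandleUploadBody } from '@vercel/blob/client';
import { NextResponse } from 'next/server';

export async function POST(request: Request): Promise<NextResponse> {
  const body = (await request.json()) as HandleUploadBody;

  try {
    const jsonResponse = await handleUpload({
      body,
      request,
      // `clientPayload` and `multipart` are the arguments added in this commit
      onBeforeGenerateToken: async (pathname, clientPayload, multipart) => {
        return {
          allowedContentTypes: ['image/png', 'video/mp4'], // illustrative
          // allow bigger files when the client asked for a multipart upload (illustrative limits)
          maximumSizeInBytes: multipart ? 500 * 1024 * 1024 : 50 * 1024 * 1024,
          tokenPayload: clientPayload,
        };
      },
      onUploadCompleted: async ({ blob, tokenPayload }) => {
        // Called by Vercel Blob once the upload (multipart or not) has completed
        console.log('upload completed', blob.pathname, tokenPayload);
      },
    });

    return NextResponse.json(jsonResponse);
  } catch (error) {
    return NextResponse.json(
      { error: (error as Error).message },
      { status: 400 },
    );
  }
}
```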

If your `file` is a Node.js stream or a [ReadableStream](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream), then we will gradually read and upload it without exhausting your server's or browser's memory.

More examples:

```ts
import { createReadStream } from 'node:fs';

const blob = await vercelBlob.put(
  'elon.mp4',
  // this works 👍, it will gradually read the file from the file system and upload it
  createReadStream('/users/Elon/me.mp4'),
  { access: 'public', multipart: true }
);
```

```ts
const response = await fetch(
  'https://example-files.online-convert.com/video/mp4/example_big.mp4',
);

const blob = await vercelBlob.put(
  'example_big.mp4',
  // this works too 👍, it will gradually read the file from the internet and upload it
  response.body,
  { access: 'public', multipart: true },
);
```
vvo committed Jan 12, 2024
1 parent fd1781f commit 898c14a
Showing 36 changed files with 914 additions and 73 deletions.
55 changes: 55 additions & 0 deletions .changeset/six-melons-doubt.md
@@ -0,0 +1,55 @@
---
"@vercel/blob": minor
"vercel-storage-integration-test-suite": patch
---

feat(blob): Add multipart option to reliably upload medium and large files

It turns out that uploading large files with Vercel Blob has been a struggle for users.
Before this change, file uploads were limited to around 200MB for technical reasons.
Before this change, even uploading a file of 100MB could fail for various reasons (network being one of them).

To solve this for good, we're introducing a new option for `put` and `upload` calls: `multipart: true`. This option makes sure your file is uploaded to Vercel Blob part by part, and if some parts fail, we retry them. It is available for both server and client uploads.

Usage:
```ts
const blob = await put('file.png', file, {
  access: 'public',
  multipart: true // `false` by default
});

// and:
const blob = await upload('file.png', file, {
  access: 'public',
  handleUploadUrl: '/api/upload',
  multipart: true
});
```

If your `file` is a Node.js stream or a [ReadableStream](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream), then we will gradually read and upload it without exhausting your server's or browser's memory.

More examples:

```ts
import { createReadStream } from 'node:fs';

const blob = await vercelBlob.put(
  'elon.mp4',
  // this works 👍, it will gradually read the file from the file system and upload it
  createReadStream('/users/Elon/me.mp4'),
  { access: 'public', multipart: true }
);
```

```ts
const response = await fetch(
'https://example-files.online-convert.com/video/mp4/example_big.mp4',
);

const blob = await vercelBlob.put(
  'example_big.mp4',
  // this works too 👍, it will gradually read the file from the internet and upload it
  response.body,
  { access: 'public', multipart: true },
);
```
9 changes: 7 additions & 2 deletions packages/blob/package.json
@@ -25,7 +25,8 @@
"module": "./dist/index.js",
"browser": {
"undici": "./dist/undici-browser.js",
"crypto": "./dist/crypto-browser.js"
"crypto": "./dist/crypto-browser.js",
"stream": "./dist/stream-browser.js"
},
"typesVersions": {
"*": {
@@ -39,7 +40,7 @@
],
"scripts": {
"build": "tsup && pnpm run copy-shims",
"copy-shims": "cp src/undici-browser.js dist/undici-browser.js && cp src/crypto-browser.js dist/crypto-browser.js",
"copy-shims": "cp src/*-browser.js dist/",
"dev": "pnpm run copy-shims && tsup --watch --clean=false",
"lint": "eslint --max-warnings=0 .",
"prepublishOnly": "pnpm run build",
@@ -59,11 +60,15 @@
}
},
"dependencies": {
"async-retry": "1.3.3",
"bytes": "3.1.2",
"undici": "5.28.2"
},
"devDependencies": {
"@edge-runtime/jest-environment": "2.3.7",
"@edge-runtime/types": "2.2.7",
"@types/async-retry": "1.4.8",
"@types/bytes": "3.1.4",
"@types/jest": "29.5.11",
"@types/node": "20.10.4",
"eslint": "8.55.0",
2 changes: 1 addition & 1 deletion packages/blob/src/client.browser.test.ts
@@ -55,7 +55,7 @@ describe('upload()', () => {
1,
'http://localhost:3000/api/upload',
{
body: '{"type":"blob.generate-client-token","payload":{"pathname":"foo.txt","callbackUrl":"http://localhost:3000/api/upload"}}',
body: '{"type":"blob.generate-client-token","payload":{"pathname":"foo.txt","callbackUrl":"http://localhost:3000/api/upload","clientPayload":null,"multipart":false}}',
headers: { 'content-type': 'application/json' },
method: 'POST',
},
12 changes: 8 additions & 4 deletions packages/blob/src/client.node.test.ts
@@ -87,6 +87,8 @@ describe('client uploads', () => {
payload: {
pathname: 'newfile.txt',
callbackUrl: 'https://example.com',
multipart: false,
clientPayload: null,
},
},
onBeforeGenerateToken: async (pathname) => {
@@ -102,7 +104,7 @@
});
expect(jsonResponse).toMatchInlineSnapshot(`
{
"clientToken": "vercel_blob_client_12345fakeStoreId_ODBiNjcyZDgyZTNkOTYyNTcwMTQ4NTFhNzJlOTEzZmI0MzQ4NWEzNzE0NzhjNGE0ZGRlN2IxMzRmYjI0NTkxOS5leUowYjJ0bGJsQmhlV3h2WVdRaU9pSnVaWGRtYVd4bExuUjRkQ0lzSW5CaGRHaHVZVzFsSWpvaWJtVjNabWxzWlM1MGVIUWlMQ0p2YmxWd2JHOWhaRU52YlhCc1pYUmxaQ0k2ZXlKallXeHNZbUZqYTFWeWJDSTZJbWgwZEhCek9pOHZaWGhoYlhCc1pTNWpiMjBpTENKMGIydGxibEJoZVd4dllXUWlPaUp1WlhkbWFXeGxMblI0ZENKOUxDSjJZV3hwWkZWdWRHbHNJam94TmpjeU5UTXhNak13TURBd2ZRPT0=",
"clientToken": "vercel_blob_client_12345fakeStoreId_Y2JhNTlmNWM3MmZmMGZmM2I2YzVlYzgwNTU3MDgwMWE1YTA4ZGU2MjIyNTFkNjRiYTI1NjVjNmRjYmFkYmQ5Yy5leUowYjJ0bGJsQmhlV3h2WVdRaU9pSnVaWGRtYVd4bExuUjRkQ0lzSW5CaGRHaHVZVzFsSWpvaWJtVjNabWxzWlM1MGVIUWlMQ0p2YmxWd2JHOWhaRU52YlhCc1pYUmxaQ0k2ZXlKallXeHNZbUZqYTFWeWJDSTZJbWgwZEhCek9pOHZaWGhoYlhCc1pTNWpiMjBpTENKMGIydGxibEJoZVd4dllXUWlPaUp1WlhkbWFXeGxMblI0ZENKOUxDSjJZV3hwWkZWdWRHbHNJam94TmpjeU5UTTBPREF3TURBd2ZRPT0=",
"type": "blob.generate-client-token",
}
`);
@@ -117,7 +119,7 @@
tokenPayload: 'newfile.txt',
},
pathname: 'newfile.txt',
validUntil: 1672531230000,
validUntil: 1672534800000,
});
});

@@ -176,6 +178,7 @@ describe('client uploads', () => {
pathname: 'newfile.txt',
callbackUrl: 'https://example.com',
clientPayload: 'custom-metadata-from-client',
multipart: false,
},
},
onBeforeGenerateToken: async () => {
@@ -191,7 +194,7 @@
});
expect(jsonResponse).toMatchInlineSnapshot(`
{
"clientToken": "vercel_blob_client_12345fakeStoreId_YjgzZDU4YzFkZjM3MmNlN2JhMTk1MmVlYjE4YWMwOTczNGI3NjhlOTljMmE0ZTdiM2M0MTliOGJlNDg5YTFiZS5leUpoWkdSU1lXNWtiMjFUZFdabWFYZ2lPbVpoYkhObExDSndZWFJvYm1GdFpTSTZJbTVsZDJacGJHVXVkSGgwSWl3aWIyNVZjR3h2WVdSRGIyMXdiR1YwWldRaU9uc2lZMkZzYkdKaFkydFZjbXdpT2lKb2RIUndjem92TDJWNFlXMXdiR1V1WTI5dElpd2lkRzlyWlc1UVlYbHNiMkZrSWpvaVkzVnpkRzl0TFcxbGRHRmtZWFJoTFdaeWIyMHRZMnhwWlc1MEluMHNJblpoYkdsa1ZXNTBhV3dpT2pFMk56STFNekV5TXpBd01EQjk=",
"clientToken": "vercel_blob_client_12345fakeStoreId_NThhZGE3YTVkODBjNTcxMmIyMzJlMTAzMDM3MTgwYzI5NzVlMjUzYjhkYzU4MzFkZTZjMzk4ZmEwNmY2ODI5Ny5leUpoWkdSU1lXNWtiMjFUZFdabWFYZ2lPbVpoYkhObExDSndZWFJvYm1GdFpTSTZJbTVsZDJacGJHVXVkSGgwSWl3aWIyNVZjR3h2WVdSRGIyMXdiR1YwWldRaU9uc2lZMkZzYkdKaFkydFZjbXdpT2lKb2RIUndjem92TDJWNFlXMXdiR1V1WTI5dElpd2lkRzlyWlc1UVlYbHNiMkZrSWpvaVkzVnpkRzl0TFcxbGRHRmtZWFJoTFdaeWIyMHRZMnhwWlc1MEluMHNJblpoYkdsa1ZXNTBhV3dpT2pFMk56STFNelE0TURBd01EQjk=",
"type": "blob.generate-client-token",
}
`);
@@ -207,7 +210,7 @@
"tokenPayload": "custom-metadata-from-client",
},
"pathname": "newfile.txt",
"validUntil": 1672531230000,
"validUntil": 1672534800000,
}
`);
});
@@ -228,6 +231,7 @@
pathname: 'newfile.txt',
callbackUrl: 'https://example.com',
clientPayload: 'custom-metadata-from-client-we-expect',
multipart: false,
},
},
onBeforeGenerateToken: async (pathname, clientPayload) => {
43 changes: 34 additions & 9 deletions packages/blob/src/client.ts
@@ -62,6 +62,10 @@ export interface UploadOptions {
* Additional data which will be sent to your `handleUpload` route.
*/
clientPayload?: string;
/**
* Whether to use multipart upload. Use this when uploading large files. It will split the file into multiple parts, upload them in parallel and retry failed parts.
*/
multipart?: boolean;
}

/**
@@ -103,7 +107,8 @@ export const upload = createPutMethod<UploadOptions>({
const clientToken = await retrieveClientToken({
handleUploadUrl: options.handleUploadUrl,
pathname,
clientPayload: options.clientPayload,
clientPayload: options.clientPayload ?? null,
multipart: options.multipart ?? false,
});
return clientToken;
},
@@ -211,13 +216,18 @@ const EventTypes = {

interface GenerateClientTokenEvent {
type: (typeof EventTypes)['generateClientToken'];
payload: { pathname: string; callbackUrl: string; clientPayload?: string };
payload: {
pathname: string;
callbackUrl: string;
multipart: boolean;
clientPayload: string | null;
};
}
interface UploadCompletedEvent {
type: (typeof EventTypes)['uploadCompleted'];
payload: {
blob: PutBlobResult;
tokenPayload?: string;
tokenPayload?: string | null;
};
}

@@ -229,7 +239,8 @@ export interface HandleUploadOptions {
body: HandleUploadBody;
onBeforeGenerateToken: (
pathname: string,
clientPayload?: string,
clientPayload: string | null,
multipart: boolean,
) => Promise<
Pick<
GenerateClientTokenOptions,
@@ -238,7 +249,7 @@
| 'validUntil'
| 'addRandomSuffix'
| 'cacheControlMaxAge'
> & { tokenPayload?: string }
> & { tokenPayload?: string | null }
>;
onUploadCompleted: (body: UploadCompletedEvent['payload']) => Promise<void>;
token?: string;
@@ -260,10 +271,21 @@ export async function handleUpload({
const type = body.type;
switch (type) {
case 'blob.generate-client-token': {
const { pathname, callbackUrl, clientPayload } = body.payload;
const payload = await onBeforeGenerateToken(pathname, clientPayload);
const { pathname, callbackUrl, clientPayload, multipart } = body.payload;
const payload = await onBeforeGenerateToken(
pathname,
clientPayload,
multipart,
);
const tokenPayload = payload.tokenPayload ?? clientPayload;

// one hour
const oneHourInSeconds = 60 * 60;
const now = new Date();
const validUntil =
payload.validUntil ??
now.setSeconds(now.getSeconds() + oneHourInSeconds);

return {
type,
clientToken: await generateClientTokenFromReadWriteToken({
Expand All @@ -274,6 +296,7 @@ export async function handleUpload({
callbackUrl,
tokenPayload,
},
validUntil,
}),
};
}
@@ -309,7 +332,8 @@ export async function handleUpload({
async function retrieveClientToken(options: {
pathname: string;
handleUploadUrl: string;
clientPayload?: string;
clientPayload: string | null;
multipart: boolean;
}): Promise<string> {
const { handleUploadUrl, pathname } = options;
const url = isAbsoluteUrl(handleUploadUrl)
@@ -322,6 +346,7 @@ async function retrieveClientToken(options: {
pathname,
callbackUrl: url,
clientPayload: options.clientPayload,
multipart: options.multipart,
},
};

@@ -400,7 +425,7 @@ export interface GenerateClientTokenOptions extends BlobCommandOptions {
pathname: string;
onUploadCompleted?: {
callbackUrl: string;
tokenPayload?: string;
tokenPayload?: string | null;
};
maximumSizeInBytes?: number;
allowedContentTypes?: string[];
21 changes: 21 additions & 0 deletions packages/blob/src/debug.ts
@@ -0,0 +1,21 @@
let debugIsActive = false;

// wrapping this code in a try/catch in case some env doesn't support process.env (vite by default)
try {
if (
process.env.DEBUG?.includes('blob') ||
process.env.NEXT_PUBLIC_DEBUG?.includes('blob')
) {
debugIsActive = true;
}
} catch (error) {
// noop
}

// Set process.env.DEBUG = 'blob' to enable debug logging
export function debug(message: string, ...args: unknown[]): void {
if (debugIsActive) {
// eslint-disable-next-line no-console -- Ok for debugging
console.debug(`vercel-blob: ${message}`, ...args);
}
}
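
As a quick, hypothetical illustration of the new helper (the message below is made up, not an actual log line from the package): calls to `debug` only print when `DEBUG=blob` (or `NEXT_PUBLIC_DEBUG=blob` in the browser) is set in the environment; otherwise they are no-ops.

```ts
import { debug } from './debug';

const partNumber = 3; // illustrative value

// With DEBUG=blob set, this prints "vercel-blob: uploading part 3" via
// console.debug; without it, the call does nothing.
debug('uploading part', partNumber);
```
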
15 changes: 10 additions & 5 deletions packages/blob/src/helpers.ts
@@ -32,6 +32,11 @@ export interface CreateBlobCommandOptions extends BlobCommandOptions {
* @defaultvalue 365 * 24 * 60 * 60 (1 Year)
*/
cacheControlMaxAge?: number;
/**
* Whether to use multipart upload. Use this when uploading large files. It will split the file into multiple parts, upload them in parallel and retry failed parts.
* @defaultvalue false
*/
multipart?: boolean;
}

export function getTokenFromOptionsOrEnv(options?: BlobCommandOptions): string {
@@ -56,25 +61,25 @@ export class BlobError extends Error {

export class BlobAccessError extends BlobError {
constructor() {
super('Access denied, please provide a valid token for this resource');
super('Access denied, please provide a valid token for this resource.');
}
}

export class BlobStoreNotFoundError extends BlobError {
constructor() {
super('This store does not exist');
super('This store does not exist.');
}
}

export class BlobStoreSuspendedError extends BlobError {
constructor() {
super('This store has been suspended');
super('This store has been suspended.');
}
}

export class BlobUnknownError extends BlobError {
constructor() {
super('Unknown error, please visit https://vercel.com/help');
super('Unknown error, please visit https://vercel.com/help.');
}
}

@@ -86,7 +91,7 @@ export class BlobNotFoundError extends BlobError {

export class BlobServiceNotAvailable extends BlobError {
constructor() {
super('The blob service is currently not available. Please try again');
super('The blob service is currently not available. Please try again.');
}
}
