AWS S3 Upload TypeError: List Argument Must Be an Array of Buffer or Uint8Array Instances

Many people have trouble with the AWS SDK for JavaScript v3's GetObjectCommand when trying to get Buffer output from S3. In this post, I will cover the reason why this happens, solutions, and related information. This post is strongly related to issue #1877.

TLDR: For those who came here just to find a working version, here it is. This is the working version in Node.js. If you are running in a browser, see the details in the rest of the post.

    import {GetObjectCommand, S3Client} from '@aws-sdk/client-s3'
    import type {Readable} from 'stream'

    const s3Client = new S3Client({
        apiVersion: '2006-03-01',
        region: 'us-west-2',
        credentials: {
            accessKeyId: '<access key>',
            secretAccessKey: '<access secret>',
        },
    })
    const response = await s3Client
        .send(new GetObjectCommand({
            Key: '<key>',
            Bucket: '<bucket>',
        }))
    const stream = response.Body as Readable
    // if you are using node version < 17.5.0
    return new Promise<Buffer>((resolve, reject) => {
        const chunks: Buffer[] = []
        stream.on('data', chunk => chunks.push(chunk))
        stream.once('end', () => resolve(Buffer.concat(chunks)))
        stream.once('error', reject)
    })

    // if you are using node version >= 17.5.0
    return Buffer.concat(await stream.toArray())

JavaScript version (CommonJS)

    const {GetObjectCommand, S3Client} = require('@aws-sdk/client-s3')

    const s3Client = new S3Client({
        apiVersion: '2006-03-01',
        region: 'us-west-2',
        credentials: {
            accessKeyId: '<access key>',
            secretAccessKey: '<access secret>',
        },
    })
    const response = await s3Client
        .send(new GetObjectCommand({
            Key: '<key>',
            Bucket: '<bucket>',
        }))
    const stream = response.Body
    // if you are using node version < 17.5.0
    return new Promise((resolve, reject) => {
        const chunks = []
        stream.on('data', chunk => chunks.push(chunk))
        stream.once('end', () => resolve(Buffer.concat(chunks)))
        stream.once('error', reject)
    })

    // if you are using node version >= 17.5.0
    return Buffer.concat(await stream.toArray())

Why this post?

Recently, I migrated my storage from AWS S3 to DigitalOcean Spaces to save data transfer costs, which included upgrading the storage adapter for this blog (s3-ghost). At the time of the upgrade, the AWS SDK for JavaScript v3 looked mature, so I decided to upgrade it from v2 as well.

Initially, everything went fine and I released the update. However, 2 days after the release, I realized that my blog was dead (actually, it was dead for 2 days until this post). I checked the server log and saw the following error.

    The "data" argument must be of type string or an instance of Buffer, TypedArray, or DataView. Received an instance of IncomingMessage

This error happened in the call to the AWS SDK GetObjectCommand. It turned out that getting the Buffer output from the SDK command is not trivial, and there is much interesting information I want to share in this post, also for my future reference.


How?

Here is sample code that sends a GetObjectCommand request.

    import {GetObjectCommand, S3Client} from "@aws-sdk/client-s3";

    const s3Client = new S3Client({
        apiVersion: '2006-03-01',
        region: 'us-west-2',
        credentials: {
            accessKeyId: '<access key>',
            secretAccessKey: '<access secret>',
        },
    })
    const response = await s3Client
        .send(new GetObjectCommand({
            Key: '<key>',
            Bucket: '<bucket>',
        }))
    const body = response.Body

From the official docs of GetObjectCommandOutput.Body, the body's type is Readable | ReadableStream | Blob. Why these 3 types?

GetObjectCommandOutput.Body's type is Readable | ReadableStream | Blob
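Before digging into why, here is a small helper of my own (not part of the SDK) that shows how the three members of this union can be told apart at runtime. The function name is hypothetical; the checks follow the environments described below: Node hands back a Readable, while browsers hand back a ReadableStream, or a Blob on old fetch implementations.

```typescript
import {Readable} from 'stream'

// Hypothetical helper: classify a GetObjectCommandOutput.Body value at runtime.
const classifyBody = (body: unknown): 'Readable' | 'Blob' | 'ReadableStream' => {
    // Node environment: the SDK yields a stream.Readable (an IncomingMessage)
    if (body instanceof Readable) return 'Readable'
    // Old browsers/polyfills without response.body: the SDK yields a Blob
    if (typeof Blob !== 'undefined' && body instanceof Blob) return 'Blob'
    // Modern browsers: a WHATWG ReadableStream
    return 'ReadableStream'
}
```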

Let's start digging into the source code of the AWS SDK for JavaScript v3.

The @aws-sdk/client-s3 package uses @aws-sdk/node-http-handler (source) and @aws-sdk/fetch-http-handler (source) as its requestHandler.

In the browser environment

Looking at the source code of the SDK's @aws-sdk/fetch-http-handler package, the SDK uses the global fetch to send network requests.

AWS SDK's fetch-http-handler uses fetch internally

Because the global fetch is used, a polyfill is required if your browser does not support fetch. whatwg-fetch is a common choice of polyfill.

caniuse's fetch API

In the browser, response.body is a ReadableStream.

The standard fetch response.body's type is ReadableStream

Where and why does the Blob type come into the output?

If we look again at the source code of the SDK: when response.body is not available, the SDK returns a blob as a workaround for old browsers/polyfills.

    const hasReadableStream = response.body !== undefined;

    // Return the response with a buffered body
    if (!hasReadableStream) {
        return response.blob().then(/*...*/);
    }

If your browser is modern, you can just skip the Blob type and cast the output type to Readable | ReadableStream in TypeScript.

Previously, response.body was not supported in many browsers
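If you do have to handle the Blob branch, the bytes can still be recovered. This is a sketch of my own, assuming the environment provides Blob.arrayBuffer() (all current browsers, and Node >= 18 for its global Blob):

```typescript
// Hypothetical helper: turn a Blob body into raw bytes.
// Assumes Blob.prototype.arrayBuffer() is available.
const blobToUint8Array = async (blob: Blob): Promise<Uint8Array> =>
    new Uint8Array(await blob.arrayBuffer())
```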

In the Node environment

In the Node environment, @aws-sdk/node-http-handler is used to send network requests. From the source code of the package:

@aws-sdk/node-http-handler source code

Suppose that the request does not use SSL; then Node.js's http module is used. Following the flow of the source code, GetObjectCommandOutput.Body is assigned the http.IncomingMessage object passed as the first parameter of the callback for the 'response' event on the ClientRequest class.

IncomingMessage extends stream.Readable, which is why we get the Readable type for GetObjectCommandOutput.Body.

This also explains why I initially got the Received an instance of IncomingMessage error described earlier in this post.
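The inheritance can be checked directly, without any network call. A quick illustration of my own, constructing an IncomingMessage by hand:

```typescript
import {IncomingMessage} from 'http'
import {Socket} from 'net'
import {Readable} from 'stream'

// An http.IncomingMessage is a stream.Readable, which is why the Body
// received in Node can be consumed as a Readable.
const message = new IncomingMessage(new Socket())
console.log(message instanceof Readable) // true
```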

When the request uses SSL, the https module is used instead. However, the Node.js docs do not describe the type of the response/request in detail. Presumably, the type is the same as in the http module.


Conclusion

The GetObjectCommandOutput.Body type is

  • In Node: Readable (or more precisely, a subclass of Readable, namely IncomingMessage).
  • In the browser:
    - If the fetch API in your browser does not support response.body, the Blob type is returned.
    - Otherwise (which is most of the time), the ReadableStream type is returned.

How to handle the output stream

I will introduce 3 ways: an isomorphic way, a Node-only way, and a browser-only way.

The isomorphic way

The trick is to use the Response class.

In the Node environment, import it with import {Response} from 'node-fetch'.

In the browser environment, the Response object is available in the global scope. Note: you always need to polyfill fetch if your browser does not support the fetch API natively.

    const res = new Response(body)

Response is a very handy class with which you can convert the stream to many types. For example:

    // blob type
    const blob = await res.blob()

    // json
    const json = await res.json()

    // string
    const text = await res.text()

    // buffer
    const buffer = await res.arrayBuffer() // note: res.buffer() is deprecated

The buffer's type is the Node.js Buffer in Node (node-fetch), and ArrayBuffer in the browser (native fetch).

The Node-only way

Use this implementation to convert a Readable to a buffer in the Node environment.

    import type { Readable } from "stream"

    const streamToBuffer = (stream: Readable) =>
        new Promise<Buffer>((resolve, reject) => {
            const chunks: Buffer[] = []
            stream.on('data', chunk => chunks.push(chunk))
            stream.once('end', () => resolve(Buffer.concat(chunks)))
            stream.once('error', reject)
        })

If you are using Node.js version >= 17.5.0, Readable.toArray provides a shorter version.

    import type { Readable } from "stream"

    const streamToBuffer = async (stream: Readable) =>
        Buffer.concat(await stream.toArray())

Note that, at the time of writing (Feb 13, 2022), Readable.toArray is an experimental feature.
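As a self-contained illustration of the toArray approach, here is a sketch that uses an in-memory stream instead of a real S3 response (the chunk contents are mine, chosen for the example):

```typescript
import {Readable} from 'stream'

// Build a Readable from in-memory chunks and collect it with toArray
// (Node >= 17.5), exactly as one would with GetObjectCommandOutput.Body.
const readAll = async (): Promise<Buffer> => {
    const stream = Readable.from([Buffer.from('hello '), Buffer.from('world')])
    return Buffer.concat(await stream.toArray())
}
```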

The browser-only way

Use this implementation to convert a ReadableStream to a buffer in the browser environment.

    // Buffer is a subclass of Uint8Array, so it can be used as a ReadableStream's source
    // https://nodejs.org/api/buffer.html
    export const concatBuffers = (buffers: Uint8Array[]) => {
        const totalLength = buffers.reduce((sum, buffer) => sum + buffer.byteLength, 0)
        const result = new Uint8Array(totalLength)
        let offset = 0
        for (const buffer of buffers) {
            result.set(buffer, offset)
            offset += buffer.byteLength
        }
        return result
    }

    const streamToBuffer = (stream: ReadableStream<Uint8Array>) =>
        new Promise<Uint8Array>(async (resolve, reject) => {
            const reader = stream.getReader()
            const chunks: Uint8Array[] = []

            try {
                while (true) {
                    const {done, value} = await reader.read()
                    if (done) break
                    chunks.push(value!)
                }
            } catch (error) {
                reject(error)
                return
            } finally {
                // Safari (iOS and macOS) doesn't support .releaseLock()
                // https://developer.mozilla.org/en-US/docs/Web/API/ReadableStreamDefaultReader/releaseLock#browser_compatibility
                reader?.releaseLock()
            }
            resolve(concatBuffers(chunks))
        })


Source: https://transang.me/modern-fetch-and-how-to-get-buffer-output-from-aws-sdk-v3-getobjectcommand/
