ReadableStream
Note: This feature is available in Web Workers.
The ReadableStream interface of the Streams API represents a readable stream of byte data. The Fetch API offers a concrete instance of a ReadableStream through the body property of a Response object.
ReadableStream is a transferable object.
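Because it is transferable, a ReadableStream can be moved to a worker with postMessage(). A minimal sketch, assuming a hypothetical worker.js that consumes the stream:
// main.js
const worker = new Worker("worker.js");
const stream = new Blob(["hello"]).stream();
// Listing the stream in the transfer list moves ownership to the worker;
// it can no longer be read in this context afterwards.
worker.postMessage(stream, [stream]);

// worker.js
self.onmessage = async (event) => {
  for await (const chunk of event.data) {
    console.log(chunk); // Uint8Array chunks of the blob's data
  }
};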
Constructor
ReadableStream()
Creates and returns a readable stream object from the given handlers (see the sketch below).
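For example, a stream can be built from an underlying source object whose start(), pull(), and cancel() handlers feed data through a controller. A minimal sketch (the chunk values are made up):
// A stream that emits two chunks and then closes.
const stream = new ReadableStream({
  start(controller) {
    // Runs immediately when the stream is constructed.
    controller.enqueue("first chunk");
  },
  pull(controller) {
    // Runs whenever the internal queue has room for more data.
    controller.enqueue("second chunk");
    controller.close();
  },
  cancel(reason) {
    // Runs if the consumer cancels the stream.
    console.log("Stream canceled:", reason);
  },
});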
Instance properties
ReadableStream.locked (Read only)
Returns a boolean indicating whether or not the readable stream is locked to a reader.
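For instance, acquiring a reader locks the stream, and releasing it unlocks it again; a brief sketch:
const stream = new ReadableStream();
console.log(stream.locked); // false: no reader has been acquired
const reader = stream.getReader();
console.log(stream.locked); // true: locked to this reader
reader.releaseLock();
console.log(stream.locked); // false: the lock has been released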
Static methods
ReadableStream.from() (Experimental)
Returns a ReadableStream from a provided iterable or async iterable object, such as an array, a set, an async generator, and so on (as sketched below).
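A minimal sketch, assuming a browser that supports the experimental from() method; the async generator is made up for illustration:
// Hypothetical async generator producing a few string chunks.
async function* generateChunks() {
  yield "alpha";
  yield "beta";
  yield "gamma";
}

// Wrap the async iterable in a ReadableStream.
const chunkStream = ReadableStream.from(generateChunks());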
Instance methods
ReadableStream.cancel()
Returns a Promise that resolves when the stream is canceled. Calling this method signals a loss of interest in the stream by a consumer. The supplied reason argument will be given to the underlying source, which may or may not use it.
ReadableStream.getReader()
Creates a reader and locks the stream to it. While the stream is locked, no other reader can be acquired until this one is released.
ReadableStream.pipeThrough()
Provides a chainable way of piping the current stream through a transform stream or any other writable/readable pair (see the sketch after this list).
ReadableStream.pipeTo()
Pipes the current ReadableStream to a given WritableStream and returns a Promise that fulfills when the piping process completes successfully, or rejects if any errors were encountered.
ReadableStream.tee()
The tee method tees this readable stream, returning a two-element array containing the two resulting branches as new ReadableStream instances. Each of those streams receives the same incoming data.
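For example, pipeThrough() can decode a fetched byte stream into text, and tee() can split a body so that two consumers receive the same data. A brief sketch, assuming a module or other async context (the URL is a placeholder):
const response = await fetch("https://www.example.org");

// Split the body into two branches that receive the same chunks.
const [branchA, branchB] = response.body.tee();

// Pipe one branch through a transform stream that decodes bytes to text.
for await (const text of branchA.pipeThrough(new TextDecoderStream())) {
  console.log(text);
}

// Pipe the other branch into a WritableStream that just counts bytes.
let total = 0;
await branchB.pipeTo(
  new WritableStream({
    write(chunk) {
      total += chunk.length;
    },
  }),
);
console.log(`Received ${total} bytes`);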
Async iteration
ReadableStream implements the async iterable protocol. This enables asynchronous iteration over the chunks in a stream using the for await...of syntax:
const stream = new ReadableStream(getSomeSource());

for await (const chunk of stream) {
  // Do something with each 'chunk'
}
The async iterator consumes the stream until it runs out of data or otherwise terminates. The loop can also exit early due to a break, throw, or return statement.
While iterating, the stream is locked to prevent other consumers from acquiring a reader (attempting to iterate over a stream that is already locked will throw a TypeError). This lock is released when the loop exits.
By default, exiting the loop will also cancel the stream, so that it can no longer be used. To continue to use a stream after exiting the loop, pass { preventCancel: true } to the stream's values() method:
for await (const chunk of stream.values({ preventCancel: true })) {
  // Do something with 'chunk'
  break;
}
// Acquire a reader for the stream and continue reading ...
Examples
Fetch stream
In the following example, an artificial Response is created to stream HTML fragments fetched from another resource to the browser. It demonstrates the usage of a ReadableStream in combination with a Uint8Array.
fetch("https://www.example.org")
  .then((response) => response.body)
  .then((rb) => {
    const reader = rb.getReader();

    return new ReadableStream({
      start(controller) {
        // The following function handles each data chunk
        function push() {
          // "done" is a Boolean and value a "Uint8Array"
          reader.read().then(({ done, value }) => {
            // If there is no more data to read
            if (done) {
              console.log("done", done);
              controller.close();
              return;
            }
            // Get the data and send it to the browser via the controller
            controller.enqueue(value);
            // Check chunks by logging to the console
            console.log(done, value);
            push();
          });
        }

        push();
      },
    });
  })
  .then((stream) =>
    // Respond with our stream
    new Response(stream, { headers: { "Content-Type": "text/html" } }).text(),
  )
  .then((result) => {
    // Do things with result
    console.log(result);
  });
Convert an iterator or async iterator to a stream
The from() static method can convert an iterable, such as an Array or Map, or an (async) iterator to a readable stream:
const myReadableStream = ReadableStream.from(iteratorOrAsyncIterator);
On browsers that don't support the from() method, you can instead create your own custom readable stream to achieve the same result:
function iteratorToStream(iterator) {
  return new ReadableStream({
    async pull(controller) {
      const { value, done } = await iterator.next();
      if (value) {
        controller.enqueue(value);
      }
      if (done) {
        controller.close();
      }
    },
  });
}
Warning: This example assumes that the return value (value when done is true), if present, is also a chunk to be enqueued. Some iterator APIs may use the return value for different purposes. You may need to adjust the code based on the API you are interacting with.
Async iteration of a stream using for await...of
This example shows how you can process the fetch() response using a for await...of loop to iterate through the arriving chunks.
const response = await fetch("https://www.example.org");
let total = 0;
// Iterate response.body (a ReadableStream) asynchronously
for await (const chunk of response.body) {
  // Do something with each chunk
  // Here we just accumulate the size of the response.
  total += chunk.length;
}
// Do something with the total
console.log(total);
Specifications
Specification: Streams Standard (#rs-class)
See also
- Streams API concepts
- Using readable streams
- Using readable byte streams
- WHATWG Stream Visualizer, for a basic visualization of readable, writable, and transform streams.
- Web-streams-polyfill or sd-streams - polyfills