
stream: Readable iterator unhandled error when piping #28194

Closed

Description

@marcosc90
  • Version: 12.4
  • Platform: Ubuntu 18.04

When using an async iterator on a piped stream, the error is not handled correctly.

const fs = require('fs');
const { PassThrough } = require('stream');

async function print() {

	const read = fs.createReadStream('file');
	const iterator = read.pipe(new PassThrough());

	for await (const k of iterator) {
		console.log(k);
	}
}

print()
  .then(() => console.log('done')) // never called
  .catch(console.log); // never called

In the example above, .catch never catches the error: pipe does not forward errors, so the read stream's 'error' event goes unhandled and the script crashes.

I know that I should handle the error on each stream, but if I do:

read.on('error', console.log);

Then the promise returned by print never resolves or rejects, because the PassThrough never ends. The only workaround I've found is to re-emit the error on the piped stream:

async function print() {

	const read = fs.createReadStream('file');
	const stream = new PassThrough();
	
	read.on('error', (err) => stream.emit('error', err));

	const iterator = read.pipe(stream);

	for await (const k of iterator) {
		console.log(k);
	}
}

With multiple pipes this gets very ugly. I don't know whether this is the intended behaviour, but it makes async iterators hard and awkward to work with.

Labels: stream (Issues and PRs related to the stream subsystem)