[Feature Request] Allow Stream transforms to be chained at the chunk level #15828
Labels
area-library
closed-obsolete
Closed as the reported issue is no longer relevant
type-enhancement
A request for a change that isn't a bug
This issue was originally filed by gruyere.emmen...@gmail.com
What steps will reproduce the problem?
This is based on a question I asked on Stack Overflow:
http://stackoverflow.com/q/20815913/1085699
Currently:
import 'dart:convert';
import 'dart:io';

Stream<List<int>> stream = new File('Data.txt').openRead();
stream
    .transform(const Utf8Decoder())
    .transform(const LineSplitter())
    .listen((line) {
      stdout.writeln(line);
    })
    .asFuture()
    .catchError((error) => print(error));
Will call each transform after reading the entire contents of the file first. This gives the user very little control over the amount of work being performed and is not conducive to chunking the data into efficient pieces that can be handled quickly.
What would be better?
The current "Chain of Responsibility" pattern does not give the user enough control. It would be better to pass the next downstream transform as a function parameter, so that it can be invoked repeatedly each time a data chunk of the desired size is reached. Some transforms already have the chunk size implicit in their definition (e.g. LineSplitter). I think this library needs to be rethought. As prior art, see the Node.js streaming libraries, for example: https://github.com/jahewson/node-byline
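For what it's worth, something close to chunk-level chaining can already be sketched with `StreamTransformer.fromHandlers` from `dart:async`, which hands each incoming chunk to a handler along with a sink for pushing results downstream immediately, rather than buffering the whole input. This is only an illustrative sketch, not the API change requested above; the transformer and the sample data are made up for the example.

```dart
import 'dart:async';

Future<void> main() async {
  // A transformer that forwards each chunk downstream as soon as it
  // arrives, uppercased. handleData is called once per chunk, and
  // sink.add emits to the next stage without waiting for the stream
  // to finish.
  final upper = new StreamTransformer<String, String>.fromHandlers(
    handleData: (chunk, sink) => sink.add(chunk.toUpperCase()),
  );

  final result = await new Stream.fromIterable(['abc', 'def'])
      .transform(upper)
      .toList();

  print(result); // [ABC, DEF]
}
```

Each stage still only sees chunks as its upstream emits them, so the chunk boundaries themselves remain implicit in the source and in transforms like LineSplitter, which is exactly the limitation this request is about.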
Dart Editor version 1.0.0_r30798 (STABLE)
Dart SDK version 1.0.0.10_r30798