3. Events
Basic primitives of asynchronous
programming.
Support a publisher/subscriber model.
Publisher: emits an object.
Subscriber: registers a callback.
4. Subscribers
All receive the emitted object.
All independent from each other.
A callback can:
Stop DOM propagation.
Stop processing by others.
6. Events: semantics
No info on an event's frequency.
Is it singular, or multiple?
No info on valid sequence of events.
Are all events independent?
Is any order valid?
7. Events: example
A file reader can emit three types of
events:
OPEN
DATA (with read data)
DONE
A valid sequence:
OPEN, DATA, DATA, DONE.
8. Events: example fail
How to interpret these:
DONE, DATA, OPEN.
OPEN, OPEN, DATA.
OPEN, DONE, OPEN, DONE.
9. Events: example fix
Document event emitters?
Use better primitives?
Go high-level?
Possible fix: add a state machine.
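A minimal sketch of such a fix for the file-reader events above (the transition table and helper name are made up for illustration):

```javascript
// Allowed next events for each state of the file reader.
const transitions = {
  start: ['OPEN'],
  OPEN:  ['DATA', 'DONE'],
  DATA:  ['DATA', 'DONE'],
  DONE:  []
};

function makeValidator() {
  let state = 'start';
  return function accept(event) {
    if (!transitions[state].includes(event)) {
      throw new Error(`Invalid event ${event} in state ${state}`);
    }
    state = event;
  };
}

const accept = makeValidator();
['OPEN', 'DATA', 'DATA', 'DONE'].forEach((e) => accept(e)); // valid: no throw
```

An emitter wrapped this way turns the nonsensical sequences above into immediate errors.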
12. Promise
AKA future, AKA deferred.
A proxy object for a not-yet-available
result.
Most implementations do not allow
observing the result (or lack thereof)
other than by registering a callback.
13. Promise vs. Event
Promise is a one-off deal.
Callbacks for success and failure are
different.
Subscribers can be chained.
They can pass different results down a
pipe.
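A sketch of both points with a standard ES6 Promise:

```javascript
// Separate callbacks for success and failure.
Promise.resolve(1).then(
  (value) => console.log('success:', value), // called with 1
  (error) => console.log('failure:', error)  // not called
);

// Rejections skip success callbacks and flow down the chain
// to the nearest failure handler.
Promise.reject(new Error('boom'))
  .then((v) => v + 1)                    // skipped
  .catch((e) => console.log(e.message)); // prints "boom"
```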
14. Promise: independent
promise.then(x => x + 1);
promise.then(x => x + 2);
promise.then(x => x + 3);
If promise is resolved with 0, each
subscriber will receive 0.
15. Promise: pipeline
promise.
then(x => x + 1).
then(x => x + 2).
then(x => x + 3);
If promise is resolved with 0, subscribers
will receive 0, 1, 3.
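The pipeline behaviour can be checked directly:

```javascript
const promise = Promise.resolve(0);
const seen = [];

promise
  .then((x) => { seen.push(x); return x + 1; }) // sees 0
  .then((x) => { seen.push(x); return x + 2; }) // sees 1
  .then((x) => { seen.push(x); })               // sees 3
  .then(() => console.log(seen));               // [ 0, 1, 3 ]
```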
18. Promise: alternatives
var count = 2;
function countdown() {
  if (!--count) console.log("Done!");
}
a.addCallback(countdown);
b.addCallback(countdown);
Done! will be printed only when both a
and b are finished (in any order).
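With promises the same coordination needs no counter; a sketch assuming a and b are promises:

```javascript
const a = Promise.resolve('a');
const b = Promise.resolve('b');

// Resolves only when both are finished, in any order.
Promise.all([a, b]).then(() => console.log('Done!'));
```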
20. Promise: counter-argument
But we can wrap our function too!
And we can make it accept N
functions!
And it will call one callback too!
21. Promise: not so easy
This aggregate function will depend on
a convention:
E.g., the last argument of a
function is always a callback.
No way to use synchronous,
callback-less functions.
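That last-argument convention is what a wrapper has to rely on; a hand-rolled sketch (node.js later shipped util.promisify for exactly this):

```javascript
// Wrap fn(...args, callback) into a function returning a Promise.
// Assumes the node.js convention: callback(err, result) is the last argument.
function promisify(fn) {
  return (...args) =>
    new Promise((resolve, reject) => {
      fn(...args, (err, result) => (err ? reject(err) : resolve(result)));
    });
}

// Hypothetical callback-style function, for illustration only.
function addLater(a, b, cb) {
  setTimeout(() => cb(null, a + b), 0);
}

promisify(addLater)(2, 3).then((sum) => console.log(sum)); // 5
```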
22. Promise: strikes back!
If we do it right, at the end we will get
… a Promise object.
It may have a different API, yet a
similar functionality.
26. Streams: why?
They represent sequential
asynchronous I/O.
Can process an unlimited amount of
data with a small predictable buffer.
Can read/process/write in parallel,
saving time.
27. Streams: node.js
node.js includes streams.
Modelled after Unix streams.
Can be files, pipes, network
connections.
Any sequential I/O can be
modelled.
32. node.js: Transform
Implements both APIs.
Reads from a readable side.
Does something with it.
Writes it to a writable side.
Example: encoding.
33. node.js: example
Copy all data from one stream to another:
readable.on('end',
() => console.log('Done!'));
readable.pipe(writable);
34. node.js: pipeline
Compress a file:
var r = fs.createReadStream('f.txt'),
    z = zlib.createGzip(),
    w = fs.createWriteStream('f.txt.gz');
r.pipe(z).pipe(w);
36. node.js: implementation
How to implement custom streams?
Just implement necessary
methods.
Relax, and watch them being
called.
Custom transform rulez!
39. node.js: null example (part 1)
const util = require('util'),
T = require('stream').Transform;
util.inherits(Null, T);
function Null() { T.call(this); }
40. node.js: null example (part 2)
Null.prototype._transform =
function (data, _, cb) {
this.push(data);
cb();
};
41. node.js: upper case example
UpperCase.prototype._transform =
function (data, enc, cb) {
let str = data.toString(enc);
this.push(str.toUpperCase());
cb();
};
42. Novices: thinking
Aha! Stream is a concept!
Why do we even need streams as
implementation?
We can create an arbitrary object with
necessary methods.
No need for heavy-weight libraries.
43. node.js: reality
What if our pipe components have
different throughputs?
What happens if we push more water
into a pipe than it can drain?
We have to manage buffers by
regulating a throughput.
44. node.js: more API
It is OK to drain fast.
It is not OK to pump in too fast.
Readable has more:
pause() — stop pumping.
resume() — start pumping.
45. node.js: like a plumber
Now when we send data, we pause().
When we are done, we can resume().
Otherwise we may have too many I/O
requests “in flight”.
46. Novices: thinking
“Cool concept, but not for me!”
“Who needs to process text?”
“I deal with SQL, and OOP!”
“I don’t save binaries on disk!”
47. node.js: object mode
We read and/or write arbitrary objects,
not strings or buffers.
It is a userland feature:
Nothing in node.js library.
npm offers a lot.
48. node.js: object streams
Any part (or both) of Duplex or
Transform can be in object mode.
Set a required option to true:
readableObjectMode
writableObjectMode
It should be set at construction.
49. node.js: filter example (part 1)
const util = require('util'),
T = require('stream').Transform;
util.inherits(Filter, T);
function Filter() {
T.call(this, {
readableObjectMode: true,
writableObjectMode: true
});
}
50. node.js: filter example (part 2)
Filter.prototype._transform =
function (data, _, cb) {
if (data.color === 'red') {
this.push(data);
}
cb();
};
51. node.js: object sources
Parsing text files: XML, JSON, CSV.
Reading from database.
There are npm modules for that.
I wrote two:
stream-json
stream-csv-enhanced
55. Philosophy
Remember data flow programming?
Pipelines of object mode streams
can be used for that.
Remember event streams?
Object mode streams again.
Remember Array Extras?
56. node.js: back to earth
Pipeline has a “tax” per object.
The smaller the objects, the
higher the relative overhead.
Doesn’t make any sense to process
bytes in object mode.
Group small objects.
58. Streams: WHATWG
WHATWG got us covered:
New Streams standard is under
development.
Will be used in Fetch —
replacement for XHR.
Likely to be used in File API.
For use client- and server-side.
59. WHATWG vs. node.js (part 1)
Most functionality is preserved.
Modern API:
Promises instead of callbacks.
BYOB mode:
“Bring your own buffer”.
Zero-copy pipelines!
60. WHATWG vs. node.js (part 2)
Backpressure is introduced.
Automated management of a
throughput.
Explicit splitting of a stream.
Handles backpressure too.
A pipeline can branch into a tree!
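The explicit split is tee(); a sketch using the WHATWG globals (available in browsers and in node.js 18+):

```javascript
const source = new ReadableStream({
  start(controller) {
    controller.enqueue('chunk');
    controller.close();
  }
});

// tee() splits one stream into two independent branches;
// backpressure is tracked separately for each branch.
const [branch1, branch2] = source.tee();
```

Each branch gets its own reader and sees the same chunks.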
61. WHATWG vs. node.js (part 3)
Reader is separated from a stream.
No accidental reads to disrupt a
pipeline.
“Revealing constructor pattern” is
used.
The same advantages and
drawbacks as the ES6 Promise
constructor.
62. WHATWG vs. node.js (part 4)
No special Duplex stream.
Transform stream is a wrapped pair
of Readable and Writable.
Totally separate API.
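A sketch of that pair with the WHATWG globals (node.js 18+ or a browser):

```javascript
// A WHATWG TransformStream is just a { readable, writable } pair.
const upper = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  }
});

const writer = upper.writable.getWriter();
writer.write('hi'); // not awaited: the reader below provides the demand
writer.close();
```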