Forgive me for being "that guy", but I really think JavaScript is ill-suited for this paradigm!
Streams, honestly, are hard to keep straight without a stronger type system once the program gets big. IMHO.
Some really sharp people have been working on stream computing software in Haskell for a while - Gabriel's Pipes package is a good example of generalized stream computing with strong equational reasoning as its foundation.
Here's the main enlightenment of becoming a node.js guy:
Node follows the unix way. Everything is a stream. It's just Buffers and JS objects flying around. It's really stupid, and sometimes it's nasty. This isn't helped by Javascript's warts.
But there's an enormous upside to this: following the stupid Unix way means that no matter what you need to do with your data, there's an npm module for it. Just .pipe() your stream in and your code is done. This is amazing. And it's possible only because of how bare-bones and loose the Buffer stream API is.
Strong typing has its place, but it would ruin node's biggest selling point. It's hard to realize this without trying it.
I don't know Node very well, and JavaScript only minimally, but I understand what streaming I/O is and have used generators / iteratees for a long time in many languages.
I understand the benefits, and that's why Pipes in Haskell is such an exciting thing: it gives us a formally reasoned and general set of stream computing tools - you can compute anything with type-level guarantees. It's just as flexible and general as, say, Unix pipes, but better, because you get guarantees about the library's tooling and guarantees about the programs you produce! You can't say that in Node / JS, Python, Ruby, etc...
JS's lack of strong typing limits your ability to reason about streams (and a lot more than just streams), and it further limits your ability to write performant stream computing software. Pipes, in Haskell, gives you the big three: effects (I/O), composition (function composition with fusion), and streams (generators and iteratees). Because of the type system, Haskell (with some nudges here and there by the library author) can fuse and optimize that code to a ridiculous degree, in addition to all of the other nice guarantees you get from the type system (separation of I/O from pure code, etc...)
I personally don't think dynamic typing is a selling point, ever - I write software faster and with fewer bugs in Haskell than I ever have before in Python, Ruby, Erlang, or Scheme. But that's a totally different topic and I don't want to derail this one.
Don't misunderstand me as being aggressive, please. I fully respect what people decide to like and work on, I'm just trying to expand the awareness that there are tools in existence that do it better.
> JS's lack of strong typing limits your ability to reason about streams (a lot more than just streams, too) and further limits your ability to write performant stream computing software
I think you missed my point so I'll restate: in practice you don't reason about streams in Node, because the community (a product of the simplicity of the streams API) has a packaged solution to your problem. It plugs right in. And this ecosystem exists because of the simplicity and dynamicity of the constructs used.
I actually agree with you that Haskell does it "better". It's purer and cleaner. You'll probably have fewer bugs if you write everything in Haskell.
Except it doesn't matter to me, because Haskell doesn't have anything close to the plug-and-playability of npm modules -- and this is a pure social product of the stupid interface that Node exposes compared to Haskell. Node is shittier, and that's why it's more capable at solving the problem I have -- constructing powerful apps in close to no time, and zero lines of my own code.
I guess what I'm saying is that sometimes worse is better.
Can you qualify "Haskell doesn't have anything close to the plug-and-playability of npm modules" because I don't quite get what you mean? Maybe it is because I can't think of anything plug-and-play in node that isn't plug-and-play in Haskell.
I think that has much more to do with the size of the Node ecosystem than with the methods of streaming being used. Such things are easily accomplishable atop libraries in Haskell, but pushover hasn't been written as a library because the ecosystem isn't at the point where it specializes that far yet.
But the API provided is trivially replicated, and, yes, easily built atop pipes.
Yes, and "the unix way" isn't just "everything's a stream", but rather "everything's a text stream." In that regard, node's version of "everything's a stream" is actually a half-step up in abstraction.
The default built-in streams found in the stdlib are byte streams (byte-chunk streams, to be more exact). But at the user level there are object streams as well; those are also mentioned in the docs: http://nodejs.org/api/stream.html#stream_object_mode - so they are kind-of official.
The absolute most compelling thing about Node is how it has hacked organizational dynamics in large companies. Walmart and PayPal basically used it to completely liberate their frontend groups from their backend systems via a facade layer, with huge improvements to their customer-facing systems.
What is sad is that all the other high concurrency systems are going to end up implementing much of the GHC runtime without the reliability of Haskell...
Streams are being used in node.js already with a lot of success. Both basic byte chunk streams and object streams. As long as components are well behaved and only emit uniform data (everything they emit is of the same type), it's not better or worse than any function call.
Right, but the burden is on the programmer to ensure those types. If the stream composition is of simple things, like:
    cat some.txt | sort | uniq > yay.txt
Then it isn't a problem - it's obvious and simple - but I think there will be difficulties as the programs get larger and type-level bookkeeping occupies more space in the programmer's brain instead of being handled by the compiler...
I didn't say loose/dynamic typing wouldn't put the burden on the programmer to ensure the types. And it definitely has its problems. I just don't think this is specific to streams; it's true for every kind of composition.
Interesting example. The fact that you know you have to call sort before uniq already shows that even with plain GNU Unix utils the user must know what they're doing.
Maybe if you really want to try to do this in Node you can gain some inspiration from Gabriel's journey with Pipes: http://hackage.haskell.org/package/pipes