We might be able to cross one more language off your wishlist soon: JavaScript is on the way to getting a pipeline operator, and the proposal is currently at Stage 2.
It also has barely seen any activity in years. It is going nowhere. The TC39 committee is utterly dysfunctional and anti-progress, and will not let this or any other new syntax into JavaScript. Records and Tuples has just been killed, despite being cited in surveys as a major missing feature [1]. Pattern matching is stuck at Stage 1 and hasn't been presented since 2022. Ditto for type annotations and a million other things.
Our only hope is if TypeScript finally gives up on the broken TC39 process and starts to implement its own syntax enhancements again.
I wouldn’t hold your breath for TypeScript introducing any new supra-JS features. In the old days they did a little bit, but now those features (namely enums) are considered harmful.
More specifically, with the type syntax proposal (also, ironically, gummed up in TC39) [1], and importantly Node introducing the --strip-types option [2], TS is only ever going to look more and more like standards-compliant JS.
Aren't the engine devs all part of the TC39 committee? I know they stopped SIMD in JS because they were more interested in shipping WASM, and then adding SIMD to it.
I would say representatives of the engine teams are involved. Clearly not involved enough, though, because it should have been withdrawn way before now due to this issue.
I was excited for that proposal, but it veered off course some years ago – some TC39 members have stuck to the position that without member property support or async/await support, they will not let the feature move forward.
It seems like most people are just asking for the simple function piping everyone expects from the |> syntax, but that doesn't look likely to happen.
My guess is the TC39 committee would want this to be more seamless.
This also gets weird because if the `|>` is a special function that sends in a magic `%` parameter, it'd have to be context sensitive to whether or not an `async` thing happens within the bounds. Whether or not it does will determine if the subsequent pipes are dealing with a future of % or just % directly.
It wouldn't though? The first await would... await the value out of the future. You still do the syntactic transformation with the magic parameter. In your example you're awaiting the future returned by getFuture twice and improperly awaiting the output of baz (which isn't async in the example).
(assuming getFuture and bat are both async). You do need |> to be aware of the case where the await keyword is present, but that's about it. The above would effectively transform to:
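(The example being referred to isn't quoted in this thread, so here's a rough sketch of the shape, with getFuture and bat as stand-in async functions:)

```
// stand-in functions, just to make the sketch concrete
async function getFuture(x) { return x.toUpperCase(); }
async function bat(x) { return x + "!"; }

// a Hack-style pipeline along the lines of
//   const out = "bar" |> await getFuture(%) |> await bat(%);
// effectively becomes:
const _step1 = await getFuture("bar"); // the await unwraps the promise right here
const out = await bat(_step1);         // so each later step sees a plain value, not a promise
```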
I don't think |> really can support applying the result of one of its composite applications in general, so it's not ambiguous.
Given this example:
(await getFutureAsyncFactory("bar"))("input")
the getFutureAsyncFactory function is async, but the function it returns is not (or it may be and we just don't await it). Basically, using |> like you stated above doesn't do what you want. If you wanted the same semantics, you would have to do something like:
("bar" |> await getFutureAsyncFactory())("input")
to invoke the returned function.
The whole pipeline takes on the value of the last function specified.
It's still not ambiguous. Your second example would be a syntax error (probably, if I were designing it at least) because you're missing the invocation parentheses after the wrapped value:
a |> (await f())()
which removes any sort of ambiguity. Your first example calls f() with a as its first argument while the second (after my fix) calls and awaits f() and then invokes that result with a as its first argument.
For the last example, it would look like:
a |> (await f())() |> g()
assuming f() is still async and returns a function. g() must be a function, so the parentheses have to be added.
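Spelled out as plain JS, that pipeline would read roughly like this (a sketch; a, f and g are the same placeholders as above, given throwaway definitions so it runs):

```
// throwaway definitions for the placeholders
const a = "input";
async function f() { return s => s.length; }
function g(n) { return n * 2; }

// a |> (await f())() |> g()  reads roughly as:
const fn = await f();     // f() is async and resolves to a function
const step1 = fn(a);      // the piped value is applied as its argument
const result = g(step1);  // and that result flows on into g()
```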
I worry about "soon" here. I've been excited for this proposal for years now (8 maybe? I forget), and I'm not sure it'll ever actually get traction at this point.
A while ago, I wondered how close you could get to a pipeline operator using existing JavaScript features. In case anyone might like to have a look, I wrote a proof-of-concept function called "Chute" [1]. It chains function and method calls in a dot-notation style like the basic example below.
```
chute(7)          // set up a chute and give it a seed value
  .toString       // call methods of the current data (parens optional)
  .parseInt       // send the current data through global native Fns
  .do(x=>[x])     // through a chain of one or more local / inline Fns
  .JSON.stringify // through nested global functions (native / custom)
  .JSON.parse
  .do(x=>x[0])
  .log            // through built-in Chute methods
  .add_one        // global custom Fns (e.g. const add_one=x=>x+1)
  ()              // end a chute with '()' and get the result
```
All of their examples are wordier than just function chaining and I worry they’ve lost the plot somewhere.
They list this as a con of F# (also Elixir) pipes:
value |> x=> x.foo()
The insistence on an arrow function is pure hallucination
value |> x.foo()
Should be perfectly achievable as it is in these other languages. What’s more, doing so removes all of the handwringing about await. And I’m frankly at a loss why you would want to put yield in the middle of one of these chains instead of after.
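For context, the con being referred to is roughly this comparison (a sketch; % stands for the Hack-style placeholder token the proposal uses in its examples, which is still subject to bikeshedding):

```
const value = { foo: () => 42 };

// F#-style: the right-hand side has to evaluate to a function,
// so a method call needs a wrapper arrow:
//   value |> (x => x.foo())
// Hack-style: the right-hand side is an ordinary expression with a placeholder:
//   value |> %.foo()
// Either way it's sugar for:
const result = value.foo();
```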
Letting your JS/TS compiler convert it into a supported form. Not really a polyfill, but it allows you to use new features in the source and still support older targets. This was done a lot when ES6 was new, as I remember.
Polyfills are for runtime behavior that can't be replicated with a simple syntax transformation, such as adding new functions to built-in objects (like String.prototype.contains, the Symbol constructor and prototype, or custom elements).
I haven't looked at the member properties bits but I suspect the pipeline syntax just needs the transform to be supported in build tools, rather than adding yet another polyfill.
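As a rough illustration of the difference (getUser and formatName are made-up helpers, and the rewrite shown is a sketch, not literal Babel output):

```
const data = { user: { first: "Ada", last: "Lovelace" } };
const getUser = d => d.user;
const formatName = u => `${u.first} ${u.last}`;

// a pipeline like
//   const label = data |> getUser(%) |> formatName(%);
// only needs a build-time rewrite into plain JS, roughly:
const _tmp = getUser(data);
const label = formatName(_tmp);

// a polyfill, by contrast, patches runtime behavior that a syntax
// rewrite can't express:
if (!String.prototype.contains) {
  String.prototype.contains = function (search) {
    return this.indexOf(search) !== -1;
  };
}
```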
Even more concise, and it doesn't even require a special language feature; it's just regular syntax of the language (|> is a method like .get(...), so you could even write `params.get("user").|>(create_user)` if you wanted to).
In Elixir, ```Map.get("user") |> create_user |> notify_admin``` would also be valid, standard Elixir, just not idiomatic (parens are optional, but preferred in most cases, and one-line pipes are also frowned upon except for scripting).
With the disclaimer that I don't know Elixir and haven't programmed with the pipeline operator before: I don't like that special () syntax. That syntax denotes application of the function without passing any arguments, but the whole point here is that an argument is being passed. It seems clearer to me to just put the pipeline operator and the name of the function that it's being used with. I don't see how it's unclear that application is being handled by the pipeline operator.
Also, what if the function you want to use is returned by some nullary function? You couldn't just do |> getfunc(), as presumably the pipeline operator will interfere with the usual meaning of the parentheses and will try to pass something to getfunc. Would |> ( getfunc() ) work? This is the kind of problem that can arise when one language feature is permitted to change the ordinary behaviour of an existing feature in the name of convenience. (Unless of course I'm just missing something.)
I am also confused by such "passing as first argument" pipe syntax. Having to write `x |> foo` instead of `x |> foo()` does not solve much, because you have the same lack of clarity if you need to pass a second argument. I.e. `x |> foo(y)` in this case means `foo(x, y)`, but if `foo(y)` actually gives you a function to apply to `x`, presumably you should write `x |> foo(y)()` or `x |> (foo(y))()`, as I understand it? If that even makes sense in a language. In any case, you have the same issue as before: in different contexts `foo(y)` is interpreted differently.
I just find this syntax too inconsistent and vague, and hence actually annoying. Which is why I prefer defining pipes as compositions of functions which can then be applied to whatever data. Then, e.g., one can write something like `(|> foo1 foo2 (foo3) #(foo4 % y))` and know that foo1 and foo2 are references to functions, foo3 evaluates to another function, and when one needs more arguments in foo4 they have to state that explicitly. This gives another function, and there is no ambiguity here whatsoever.
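In JS terms, that composition-first style would look something like the sketch below (pipe, foo1 through foo4, and y are all made up for illustration):

```
// a tiny left-to-right composition helper
const pipe = (...fns) => x => fns.reduce((acc, fn) => fn(acc), x);

// made-up stages mirroring the shape of the example above
const foo1 = s => s.trim();
const foo2 = s => s.split(",");
const foo3 = () => parts => parts.length; // evaluates to another function
const foo4 = (n, y) => n + y;
const y = 10;

const process = pipe(foo1, foo2, foo3(), n => foo4(n, y));
process(" a,b,c "); // => 13
```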
> if `foo(y)` actually gives you a function to apply to `x` prob you should write `x |> foo(y)()`
If foo(y) returned a function, then to call it with x, you would have to write foo(y).(x) or x |> foo(y).(), so the syntax around calling the anonymous function isn't affected by the pipe. Also, you're not generally going to be using pipelines with functions that return functions so much as with functions that return data which is then consumed as the first argument by the next function in the pipeline. See my previous comment on this thread for more on that point.
There's no inconsistency or ambiguity in the pipeline operator's behavior. It's just syntactic sugar that's handy for making your code easier to read.
> Having to write `x |> foo` instead of `x |> foo()` does not solve much, because you have the same lack of clarity if you need to pass a second argument
That's actually true. In Scala that is not so nice, because then it becomes `x |> foo(_, arg2)` or, even worse, `x |> (param => foo(param, arg2))`. I have a few such cases in my source code and I really don't like it. Haskell and PureScript do a much better job of keeping the code clean in such cases.
I've been using Elixir for a long time and had that same hope, after having experienced how clear, concise and maintainable apps can be when the core is all a bunch of pipelines (and the boundary does error handling using cases and withs). But having seen the pipe operator in Ruby, I now think it was a bad idea.
The problem is that method chaining is common in several OO languages, including Ruby. This means the functions on an object return an object, which can then call other functions on itself. In contrast, the pipe operator calls a function, passing in what's on the left side of it as the first argument. To work properly, this means you'll need functions that take the data as the first argument and return the same kind of shape, whether that's a list, a map, a string, a struct, etc.
When you add a pipe operator to an OO language where method-chaining is common, you'll start getting two different types of APIs and it ends up messier than if you'd just stuck with chaining method calls. I much prefer passing immutable data into a pipeline of functions as Elixir does it, but I'd pick method chaining over a mix of method chaining and pipelines.
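A small JS-flavored sketch of that clash (users, filterActive and mapNames are made up for illustration):

```
const users = [
  { name: "Ada", active: true },
  { name: "Bob", active: false },
];
const isActive = u => u.active;
const toName = u => u.name;

// method chaining: behavior hangs off the object and returns another object
users.filter(isActive).map(toName); // => ["Ada"]

// an Elixir-style pipe instead wants free functions that take the data first
const filterActive = list => list.filter(isActive);
const mapNames = list => list.map(toName);
//   users |> filterActive() |> mapNames()
// which is roughly:
mapNames(filterActive(users)); // => ["Ada"]
// once both styles coexist, every new helper has to pick which camp it belongs to
```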
I'm a big fan of the Elixir operator, and it should be standard in all functional programming languages. You need it because everything is just a function and you can't do anything like method chaining, because none of the return values have anything like methods. The |> is "just" syntax sugar for a load of nested functions. Whereas the Rust style method chaining doesn't need language support - it's more of a programming style.
Note also that it works well in Elixir because it was created at the same time as most of the standard library. That means that the standard library takes the relevant argument in the first position all the time. Very rarely do you need to pipe into the second argument (and you need a lambda or convenience function to make that work).
Agree. This is absolutely my fave part of Elixir. Whenever I can get something to flow elegantly thru a pipeline like that, I feel like it’s a win against chaos.
Yes, a small feature set is important, and adding the functional-style pipe to languages that already have chaining with the dot seems to clutter up the design space. However, dot-chaining has the severe limitation that you can only pass to the first or "this" argument.
Is there any language with a single feature that gives the best of both worlds?
The pipe operator relies on the first argument being the subject of the operation. A lot of languages have the arguments in a different order, and OO languages sometimes use function chaining to get a similar result.
IIRC the usual workaround in Elixir involves a small lambda that rearranges things:
"World"
|> then(&concat("Hello ", &1))
I imagine a shorter syntax could someday be possible, where some special placeholder expression could be used, e.g.:
"World"
|> concat("Hello ", &1)
However, that creates a new problem: if the implicit-first-argument form is still permitted (foo() instead of foo(&1)), it becomes confusing which function arity is being called. A human could easily fail to notice the absence or presence of the special placeholder on some lines and invoke the wrong thing.
Yeah, R (tidyverse) has `.` as such a placeholder. It is useful, but I still find the syntax a bit off, with or without it. I would rather define pipes as compositions of functions, which are pretty unambiguous in terms of what arguments they get, and then apply these to whatever I want.
Last time I checked (2020) there were already a few rejected proposals to shorten the syntax for this. It seemed like they were pretty exasperated by them at the time.
Yes, `&` (reverse apply) is equivalent to `|>`, but it is interesting that there is no common operator for reversed compose `.`, so function compositions are still read right-to-left.
In my programming language, I added `.>` as a reverse-compose operator, so pipelines of function compositions can also be read uniformly left-to-right, e.g.
process = map validate .> catMaybes .> mapM persist
Elm (whose compiler is written in Haskell) uses |> and <| for pipelining forwards and backwards, and function composition is >> and <<. These have made it into Haskell via nri-prelude https://hackage.haskell.org/package/nri-prelude (written by a company that uses a lot of Elm, in order to make writing Haskell look more like writing Elm).
EDIT: in no way do I want to claim the originality of these things in Elm or the Haskell package inspired by it. AFAIK |> came from F# but it could be miles earlier.
I disagree, because then it can be very ambiguous with an existing `|` operator. The language has to be able to tell that this is a pipeline and not a bitwise OR on the output of multiple functions.
Yes, I’m talking about a language where `|` would be the pipe operator and nothing else, like in a shell. Retrofitting a new operator into an existing language tends to be suboptimal.
However.
I would be lying if I didn't secretly wish that all languages adopted the `|>` syntax from Elixir.
```
params
|> Map.get("user")
|> create_user()
|> notify_admin()
```