Why don't I see pipe operators in most high-level languages?

In Unix shell programming the pipe operator is an extremely powerful tool. With a small set of core utilities, a systems language (like C), and a scripting language (like Python), you can construct very compact and powerful shell scripts that are automatically parallelized by the operating system.

Obviously this is a very powerful programming paradigm, but I haven't seen pipes as first-class abstractions in any language other than a shell script. The code needed to replicate the functionality of scripts using pipes always seems to be quite complex.

So my question is: why don't I see something similar to Unix pipes in modern high-level languages like C#, Java, etc.? Are there languages (other than shell scripts) which do support first-class pipes? Aren't they a convenient and safe way to express concurrent algorithms?

Just in case someone brings it up, I looked at the F# pipe-forward operator (forward pipe operator), and it looks more like a function application operator. It applies a function to data, rather than connecting two streams together, as far as I can tell, but I am open to corrections.

Postscript: While doing some research on implementing coroutines, I realized that there are certain parallels. In a blog post, Martin Wolf describes a problem similar to mine, but in terms of coroutines instead of pipes.


Haha! Thanks to my Google-fu, I have found an SO answer that may interest you. Basically, the answer goes against the "don't overload operators unless you really have to" argument by overloading the bitwise-OR operator to provide shell-like piping, resulting in Python code like this:

for i in xrange(2,100) | sieve(2) | sieve(3) | sieve(5) | sieve(7):
    print i

What it does, conceptually, is pipe the list of numbers from 2 to 99 (xrange(2, 100)) through a sieve function that removes multiples of a given number (first 2, then 3, then 5, then 7). This is the start of a prime-number generator, though generating prime numbers this way is a rather bad idea. But we can do more:

for i in xrange(2,100) | strify() | startswith(5):
    print i

This generates the range, converts each number to a string, and then filters out anything that doesn't start with 5.

The post shows a basic parent class that allows you to override two methods, map and filter, to describe the behavior of your pipe. So strify() uses the map method to convert everything to a string, while sieve() uses the filter method to weed out the multiples of the given number.
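
To make that concrete, here is a minimal sketch of what such a parent class might look like (the names and details are my own illustration, not the code from the linked answer):

    class Pipe(object):
        """Base class: subclasses override map() and/or filter() to define a stage."""
        def __ror__(self, iterable):            # invoked by: iterable | pipe_object
            for item in iterable:
                if self.filter(item):
                    yield self.map(item)
        def map(self, item):
            return item                         # identity by default
        def filter(self, item):
            return True                         # keep everything by default

    class sieve(Pipe):
        """Drop multiples of n (but keep n itself)."""
        def __init__(self, n):
            self.n = n
        def filter(self, item):
            return item == self.n or item % self.n != 0

    class strify(Pipe):
        """Turn every item into a string."""
        def map(self, item):
            return str(item)

Because neither xrange objects nor the generators returned by __ror__ define __or__ for these pipe objects, Python falls back to the pipe object's __ror__ at each | in the chain, which is what makes loops like the ones above work.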

It's quite clever, though perhaps that means it's not very Pythonic. Still, it demonstrates what you are after, and the technique can probably be applied easily in other languages.


You can do pipeline-style parallelism quite easily in Erlang. Below is a shameless copy/paste from my blog post of Jan 2008.

Also, Glasgow Parallel Haskell allows for parallel function composition, which amounts to the same thing, giving you implicit parallelisation.

You already think in terms of pipelines - how about "gzcat foo.tar.gz | tar xf -"? You may not have known it, but the shell is running the unzip and untar in parallel - the stdin read in tar just blocks until data is sent to stdout by gzcat.

Well, a lot of tasks can be expressed in terms of pipelines, and if you can do that, then getting some level of parallelisation is simple with David King's helper code (even across Erlang nodes, i.e. machines):

pipeline:run([pipeline:generator(BigList),
          {filter,fun some_filter/1},
          {map,fun some_map/1},
          {generic,fun some_complex_function/2},
          fun some_more_complicated_function/1,
          fun pipeline:collect/1]).

So basically what he's doing here is making a list of the steps - each step being implemented in a fun that accepts as input whatever the previous step outputs (the funs can even be defined inline of course). Go check out David's blog entry for the code and more detailed explanation.
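
For comparison only, here is a sequential Python sketch of the same list-of-steps idea (illustrative names, not David King's code); the Erlang version gets its parallelism by running each step in its own process, possibly on another node:

    def run_pipeline(source, steps):
        """Apply a list of ("map"/"filter", fun) steps to a data source, in order."""
        data = source
        for kind, fn in steps:
            if kind == "map":
                data = map(fn, data)        # each step consumes the previous step's output
            elif kind == "filter":
                data = filter(fn, data)
        return list(data)

    result = run_pipeline(range(2, 100), [
        ("filter", lambda n: n % 3 == 0),   # keep multiples of 3
        ("map",    lambda n: n * n),        # then square them
    ])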


The magrittr package provides something similar to F#'s pipe-forward operator in R:

rnorm(100) %>% abs %>% mean

Combined with the dplyr package, it makes for a neat data-manipulation tool:

iris %>%
  filter(Species == "virginica") %>%
  select(-Species) %>%
  colMeans

You can find something like pipes in C# and Java, for example, where you wrap one stream inside another by passing it to the constructor of the outer stream.

So, you have in Java:

new BufferedReader(new InputStreamReader(System.in));

You may want to look up chaining input streams or output streams.