CITS2002 Systems Programming  

Inter-process communication - filters

"This is the Unix philosophy: write programs that do one thing and do it well; write programs to work together; write programs that handle text streams, because that is a universal interface."  (Douglas McIlroy, Bell System Technical Journal, 1978)

One of the most successful ideas introduced in early Unix systems was the interprocess communication mechanism termed a pipe. Pipes enable shells (or other programs) to connect the output of one program to the input of another, and for arbitrary sequences of pipes - a pipeline - to filter a data-stream with a number of transformations.


A great pipeline example, providing a rudimentary spell-checker:

prompt> tr -cs 'A-Za-z' '\n' < inputfilename | sort -u | comm -23 - /usr/share/dict/words
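The pipeline above can be read stage by stage (the comments are annotations, not part of the commands):

```shell
tr -cs 'A-Za-z' '\n' < inputfilename |  # replace each run of non-letters with a
                                        # newline, leaving one word per line
sort -u |                               # sort the words, discarding duplicates
comm -23 - /usr/share/dict/words        # print only the words (from stdin, '-')
                                        # not present in the sorted dictionary
```

Each stage neither knows nor cares what program produced its input or will consume its output; each simply transforms one text stream into another.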


Programs typically used in pipelines are termed filters. They work well in combination because of their simple communication scheme: a filter adds no 'unexpected detail' to its output, so a program reading that output as its input has only the expected data-stream to process.


Unix-based systems provide a huge number of utility programs that filter their standard input, writing their results to their standard output:

comm, cut, grep, gzip, head, join, merge, paste, sort, tail, tee, tr, uniq, wc, zcat


It is for this reason that filter programs do not produce verbose natural-language descriptions of their output, nor headings for tables of data, unless a specific command-line option requests them. Just the facts.

CITS2002 Systems Programming, Lecture 18, p9, 2nd October 2023.