What blew my mind was this whole piping of input-output thing. It is so useful.
Holy thread necro!
But since we're here...
It isn't the piping alone that makes it so useful (the Windows command prompt supports piping too, after all). The real power is in the fact that the entire UNIX/Linux CLI ecosystem is designed from the ground up to leverage it seamlessly.
One example of the sort of thing I'm talking about: You can clone a raw disk (or partition) across a network to a remote disk with a one-line CLI command that uses dd to read the local disk and pipes its output to ssh; ssh in turn pipes its output to a remote copy of dd to write the remote disk. Want to create a compressed remote disk image (in a regular file) instead of cloning to a raw disk? Just add gzip to the end of the pipeline and redirect gzip's output to a file. Nearly all of the common CLI tools "just work" together like this.
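A rough sketch of what that looks like (the hostnames, device paths, and block sizes here are placeholders, not something you should run as-is), followed by a harmless local demonstration of the same read-pipe-compress pattern using a scratch file in place of a raw disk:

```shell
# Sketch only -- hypothetical hosts and devices, destructive if run for real:
#   dd if=/dev/sda bs=64K | ssh user@backup-host 'dd of=/dev/sdb bs=64K'
# Compressed image in a regular file instead of a raw remote disk:
#   dd if=/dev/sda bs=64K | ssh user@backup-host 'gzip > sda.img.gz'

# Safe local demo of the same pattern, using a scratch file as the "disk":
dd if=/dev/zero of=disk.img bs=1024 count=64 2>/dev/null  # make a fake 64 KiB disk
dd if=disk.img bs=1024 2>/dev/null | gzip > disk.img.gz   # read it, compress the stream
gunzip -t disk.img.gz && echo "image OK"                  # verify the archive's integrity
```

The ssh variant works because ssh simply connects its local stdin/stdout to the remote command's stdin/stdout, so the pipeline spans the network transparently.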
You can also rapidly develop complex pipelines by taking an incremental approach. Start with the first command of the pipeline, and run it to see if it does what you want. Then hit up-arrow (command recall), add the second command in the pipeline, run and view the output again. Lather, rinse, repeat until you've got the whole thing constructed.

A pipeline can even produce additional shell commands (by leveraging tools like sed), which are in turn piped to another copy of the shell... or additional commands can be invoked based on data in the pipeline via the xargs tool.
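To make that concrete, here's a toy illustration (the filenames and the word-counting task are invented for the example): the first three lines are the kind of sequence you'd build up with up-arrow, each adding one stage; the last two show sed generating commands for a shell versus xargs building them directly from the data:

```shell
# Incremental construction -- run each stage, inspect, recall, extend:
printf 'alpha\nbeta\nalpha\ngamma\n' > words.txt
sort words.txt                        # step 1: just sort
sort words.txt | uniq -c              # step 2: add duplicate counts
sort words.txt | uniq -c | sort -rn   # step 3: most frequent first

# A pipeline that emits shell commands, piped into another shell:
printf 'a.txt\nb.txt\n' | sed 's/^/touch /' | sh

# The same effect via xargs, which appends the data as arguments:
printf 'a.txt\nb.txt\n' | xargs touch
```

The sed-to-sh form is the more general (you can synthesize arbitrary command lines), while xargs is the safer, more idiomatic choice when you're just mapping input lines onto a command's arguments.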