The pattern itself has many more benefits, and performance scalability is only one of them (and in many cases not the most important one). The principle of processing a flow of information in atomic, semantically closed steps is a pattern that Unix itself was built on.
If you have ever seen how easily even complex processes can be created from pretty simple, atomic executables, you can imagine what pipelining means in terms of software architecture (see also http://en.wikipedia.org/wiki/Pipeline_(Unix)).
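To make the composition principle concrete, here is a small sketch (in Python rather than shell, purely for illustration): each step is an atomic, reusable unit, and complex behavior emerges from chaining them, just as `grep` and `wc -l` are chained on a Unix command line.

```python
# Each step is a small, self-contained function that consumes a stream
# and yields a new one -- the essence of `... | grep ERROR | wc -l`.
def grep(pattern, lines):
    """Keep only lines containing `pattern` (like grep)."""
    return (line for line in lines if pattern in line)

def count(lines):
    """Count the remaining lines (like wc -l)."""
    return sum(1 for _ in lines)

log = [
    "INFO started",
    "ERROR disk full",
    "INFO retry",
    "ERROR timeout",
]

# Compose the atomic steps into a pipeline.
n_errors = count(grep("ERROR", log))
print(n_errors)  # 2
```

Note that neither step knows anything about the other; the pipeline is defined purely by how they are chained.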
Unix (and nowadays many other similar infrastructures) uses an abstract stream of data passed through the pipeline, so the flow of information carries little semantics; it is only an abstract stream of bytes. The usage of pipelines can be improved if more shared semantics are part of the overall pipeline, meaning each step / operation has more information about what is expected to flow through than just raw data.
XML pipelining intends to use XML as the information flow. This ensures that data processing / transformation can be done with fewer low-level byte stream operations. Processing can be done in declarative languages like XPath, XQuery, XSLT, ..., which again reduces the complexity of information access and transformation.
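The kind of declarative access this enables can be sketched with Python's standard library (the `<order>` document and its attribute names are invented for illustration; ElementTree supports only a subset of XPath):

```python
# One XML pipeline step: instead of byte-stream operations, the step
# selects and transforms nodes declaratively via a path expression.
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<order>
  <item price="10.0" group="books"/>
  <item price="25.5" group="media"/>
  <item price="4.5"  group="books"/>
</order>
""")

# Declarative selection (ElementTree's XPath subset) instead of
# manual parsing of a raw character stream:
book_prices = [float(i.get("price"))
               for i in doc.findall(".//item[@group='books']")]
print(sum(book_prices))  # 14.5
```

The step states *what* to select, not *how* to scan the input, which is exactly the complexity reduction described above.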
XML itself can express an endless amount of user data using different data models. A pipeline defined for a subset of data models can again reduce complexity and therefore improve the benefit of a particular pipeline infrastructure.
Imagine pipelines dedicated to transforming content created against a DITA data model into countless distribution formats. The semantics of a particular pipeline step, which can be reused in several pipes, are much richer than on the "general purpose XML" level.
If we look at the second term, "SOA", in the mentioned book title, we have to distinguish:
- Pipelining to orchestrate dedicated services (macro pipelining)
Because each business process can be expressed using the pipeline paradigm, it is natural to implement SOA orchestration with the pipeline pattern.
To do so, you have to define the sequence / control flow of the services and the corresponding message transformations.
Languages like BPEL provide a model to express this kind of pipeline.
This layer often requires persistence of pipeline state, because the execution of such processes can take anywhere from hours to years.
This layer often requires human steps, meaning the complete pipeline cannot be executed by a machine without human interaction. Languages like BPEL4People are extensions that cover this standard requirement.
There are many frameworks out there trying to provide an easy-to-start infrastructure. Nevertheless, the usage and complexity of current implementations must still not be underestimated.
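Why this layer needs persisted state can be illustrated with a toy engine (a sketch, not BPEL; all step names and the state-file location are invented for the example): progress is written out after every step, so a process can be suspended, e.g. while waiting for a human approval, and resumed later without repeating finished steps.

```python
# Toy macro-pipeline engine: persist progress after every step so a
# long-running process survives interruption and can be resumed.
import json
import pathlib
import tempfile

STATE = pathlib.Path(tempfile.gettempdir()) / "order_process_state.json"
STATE.unlink(missing_ok=True)  # start the example from a clean slate

def run_process(steps, state_file=STATE):
    # Load previously persisted progress, if any.
    state = json.loads(state_file.read_text()) if state_file.exists() else {"done": []}
    for name, fn in steps:
        if name in state["done"]:
            continue  # finished in an earlier run; skip on resume
        fn(state)
        state["done"].append(name)
        state_file.write_text(json.dumps(state))  # persist after every step

steps = [
    ("receive_order", lambda s: s.update(order="o-1")),
    ("approve",       lambda s: s.update(approved=True)),  # a human task in reality
    ("ship",          lambda s: s.update(shipped=True)),
]
run_process(steps)  # first run executes all three steps
run_process(steps)  # a second run finds everything done and skips it
```

Real orchestration engines add compensation, correlation, and timeouts on top of this, but the persistence-per-step core is the same.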
- Pipelining to solve one or more dedicated business steps (micro pipelining)
Within each business process, a given amount of data conforming to specification A must be transformed into data conforming to specification B.
E.g. extracting order data from an ERP system and adding sums for particular product groups before the result is rendered to HTML for later display to the person in charge.
Those operations can of course also be defined as a sequence of steps in which the input data is transformed into output data over multiple stages. Those definitions are mainly derived from business rules.
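The ERP example above can be sketched as three atomic steps chained into one micro pipeline (the record layout and field names are invented for illustration):

```python
# Micro pipeline: extract -> aggregate -> render, each step a small
# transformation from one data shape to the next.
def extract(records):
    """Step 1: pull only the fields we need from the raw ERP rows."""
    return [(r["group"], r["price"]) for r in records]

def aggregate(pairs):
    """Step 2: sum prices per product group."""
    sums = {}
    for group, price in pairs:
        sums[group] = sums.get(group, 0.0) + price
    return sums

def render_html(sums):
    """Step 3: render the result for the person in charge."""
    rows = "".join(f"<tr><td>{g}</td><td>{v:.2f}</td></tr>"
                   for g, v in sorted(sums.items()))
    return f"<table>{rows}</table>"

erp_rows = [
    {"group": "books", "price": 10.0},
    {"group": "media", "price": 25.5},
    {"group": "books", "price": 4.5},
]

html = render_html(aggregate(extract(erp_rows)))
print(html)
```

Each step only knows its own input and output shape, so the same `aggregate` could be reused in a pipeline that renders PDF instead of HTML.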
This layer requires neither human interaction nor persistence and can therefore be implemented on fully automated frameworks. Using XML as the data backbone results in XML pipelining, which combines most of the advantages required for micro pipelining.
Languages like XProc, XPL, ... and their corresponding implementations can be used in this area.
In general, a micro pipeline transforms one input of a macro pipeline step into the corresponding output(s).
Pipelining is one of the most powerful paradigms we have for today's common IT problems. But this pattern is neither new nor magic; it is more of a "back to the roots" approach.