Title
Components in the Pipeline
Abstract
Scientists commonly describe their data-processing systems metaphorically as software pipelines. These pipelines take one or more data sources as input and apply a series of processing steps that transform the data into useful results. Although conceptually simple, pipelines often adopt complex topologies and must meet stringent quality-of-service requirements that stress the software infrastructure used to construct them. The Middleware for Data-Intensive Computing (MeDICi) Integration Framework (MIF) is a component-based framework for constructing complex software pipelines. It supports composing pipelines from distributed, heterogeneous software components and provides mechanisms for controlling quality of service to meet demanding performance, reliability, and communication requirements.
Year
2011
DOI
10.1109/MS.2011.23
Venue
Software, IEEE
Keywords
middleware, pipeline processing, software quality, software reliability, MeDICi integration framework, data-processing pipelines, massive datasets, middleware for data-intensive computing, pipe-and-filter architecture, qualities of service, reliability, state-of-the-art scientific instruments, components, pipelines, scientific software, software engineering
Field
Middleware, Pipeline transport, Data processing, Software engineering, Computer science, Scientific instrument, Software, Pipeline (software), Software quality, Executable
DocType
Journal
Volume
28
Issue
3
ISSN
0740-7459
Citations
10
PageRank
0.89
References
3
Authors
4
Name, Order, Citations, PageRank
Ian Gorton, 1, 1488, 134.37
Adam Wynne, 2, 68, 9.41
Yan Liu, 3, 2551, 189.16
Jian Yin, 4, 861, 97.01