Title
Synthetic designs: a new form of true experimental design for use in information systems development
Abstract
Computer scientists and software engineers seldom use experimental methods despite frequent calls to do so. The problem may lie with the shortcomings of traditional experimental methods. We introduce a new form of experimental design, the synthetic design, which addresses these shortcomings. Compared with classical experimental designs (between-subjects, within-subjects, and matched-subjects), synthetic designs can offer substantial reductions in sample size, cost, time, and effort, together with increased statistical power and fewer threats to validity (internal, external, and statistical conclusion). The new design is a variation of the within-subjects design in which each system user serves in only a single treatment condition; system performance scores for all other treatment conditions are derived synthetically, without repeated testing of each subject. Although not applicable in all situations, the design can be used in the development and testing of some computer systems, provided that user behavior is unaffected by the version of the computer system being used. We justify synthetic designs on three grounds: they have been used successfully in the development of computerized mug shot systems, showing marked advantages over traditional designs; a detailed comparison with traditional designs shows they hold the advantage on 17 of the 18 criteria considered; and an assessment shows that they satisfy all the requirements of true experiments (albeit in a novel way).
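The mechanism described in the abstract (each subject is tested live only once, and scores for every system version are derived by replaying that single recorded session offline) can be illustrated with a small simulation. The following Python sketch is a minimal illustration and is not code from the paper; the system versions, scoring rule, and data are hypothetical stand-ins, assuming only that the subject's recorded behavior does not depend on which version it is later scored against.

```python
# Minimal sketch of a synthetic design (hypothetical example, not from the paper).
# Each subject completes ONE live session; the recorded input is then replayed
# offline against every system version, yielding a full subjects-by-versions
# score matrix without repeated testing of any subject.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def record_session(subject_id: int) -> np.ndarray:
    """One live session per subject: capture the subject's input
    (a hypothetical feature vector standing in for, e.g., a witness's
    description entered into a mug shot retrieval system)."""
    return rng.normal(size=8)

def score_version(recorded_input: np.ndarray, version: str) -> float:
    """Replay the recorded input through a given system version and
    return a performance score (hypothetical scoring rule)."""
    gain = {"v1": 0.8, "v2": 1.0, "v3": 1.2}[version]
    return float(gain * np.abs(recorded_input).mean())

versions = ["v1", "v2", "v3"]
scores = []
for subject in range(20):
    recorded = record_session(subject)          # single live session per subject
    scores.append([score_version(recorded, v) for v in versions])
scores = np.array(scores)                       # shape: (subjects, versions)

# Every subject now has a score under every condition, so the data can be
# analyzed as repeated-measures (within-subjects) data.
chi2, p = stats.friedmanchisquare(*(scores[:, j] for j in range(len(versions))))
print(f"Friedman chi-square = {chi2:.2f}, p = {p:.4f}")
```

Because every subject ends up with a score for every version, the resulting matrix can be analyzed as a within-subjects design even though no subject was ever tested more than once.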
Year
2007
DOI
10.1145/1254882.1254904
Venue
Proceedings of the ACM SIGMETRICS international conference on Measurement and modeling of computer systems
Keywords
software engineering, system performance, satisfiability, sample size, statistical power, experimental designs, experimental design
Field
Information system, Single-subject research, Computer science, Simulation, Software, Statistical power, Sample size determination, Reliability engineering, Design of experiments, Distributed computing, Statistical conclusion validity
DocType
Conference
Volume
35
Issue
1
ISSN
0163-5999
Citations
0
PageRank
0.34
References
15
Authors
2
Name
Order
Citations
PageRank
Eric S. Lee12011.90
Thomas Whalen211532.39