Abstract |
---|
Experimental network research is challenging because experiment outcomes can be influenced by undesired effects from other activities in the network. In shared experiment networks, control over resources is often limited and QoS guarantees might not be available. When network conditions vary during a series of experiments, unwanted artifacts can be introduced into the experimental results, reducing the reliability of the experiments. We propose a novel, systematic methodology in which network conditions are monitored during the experiments and information about the network is collected. This information, known as metadata, is analyzed statistically to identify periods during the experiments when network conditions have been similar. Data points collected during these periods are valid for comparison. Our hypothesis is that this methodology can make experiments more reliable. We present a proof-of-concept implementation of our method, deployed in the FEDERICA and PlanetLab networks. |
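The core idea of the abstract — statistically grouping measurement windows whose network conditions were similar, so that only comparable data points are analyzed together — could be sketched roughly as below. The metadata fields (mean RTT, loss rate), the choice of k-means, and the window values are illustrative assumptions, not the paper's exact algorithm.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means over small metadata vectors (lists of floats)."""
    rng = random.Random(seed)
    centroids = [list(p) for p in rng.sample(points, k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each window to the nearest centroid (squared Euclidean).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Recompute centroids; keep the old one if a cluster went empty.
        centroids = [[sum(dim) / len(c) for dim in zip(*c)] if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Hypothetical per-window metadata: (mean RTT in ms, loss in %).
windows = [[10.1, 0.0], [10.3, 0.1], [9.9, 0.0],   # stable periods
           [55.0, 2.1], [54.2, 1.9], [56.1, 2.3]]  # congested periods
_, clusters = kmeans(windows, k=2)
# Data points are compared only with others from the same cluster,
# i.e. from periods with similar network conditions.
```

With clearly separated conditions, the two clusters recover the stable and congested periods regardless of initialization; a real deployment would also need to choose the number of clusters and the metadata features from the monitoring data.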
Year | DOI | Venue |
---|---|---|
2012 | 10.1007/978-3-642-28534-9_15 | TMA |
Keywords | Field | DocType |
---|---|---|
experiment unwanted artifact, qos guarantee, experiment outcome, experiment reliability, experimental network research, planetlab network, network condition, shared experiment network, data point, shared environment, proof-of-concept implementation, measuring, metadata, communication systems, clustering | Data point, Data mining, Metadata, PlanetLab, Computer science, Quality of service, Communications system, Cluster analysis, Network conditions | Conference |
Citations | PageRank | References |
---|---|---|
1 | 0.36 | 17 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Pehr Söderman | 1 | 21 | 3.20 |
Markus Hidell | 2 | 84 | 10.90 |
Peter Sjödin | 3 | 127 | 14.87 |