Abstract |
---|
Video quality assessment with subjective testing is both time-consuming and expensive. An interesting alternative to traditional testing is crowdsourcing, which moves the testing effort onto the internet. In this contribution, we therefore propose the QualityCrowd framework for effortlessly performing subjective quality assessment with crowdsourcing. QualityCrowd allows codec-independent quality assessment through a simple web interface, usable with common web browsers. We compared the results of an online subjective test using this framework with the results of a test in a standardized environment. This comparison shows that QualityCrowd delivers equivalent results within the acceptable inter-lab correlation. While we only consider video quality in this contribution, QualityCrowd can also be used for multimodal quality assessment. |
Year | DOI | Venue |
---|---|---|
2012 | 10.1109/PCS.2012.6213338 | 2012 Picture Coding Symposium (PCS) |
Keywords | Field | DocType |
---|---|---|
servers, testing, correlation, user interfaces, internet, codecs, crowdsourcing, web interface, video quality | Usability, Computer science, Crowdsourcing, Theoretical computer science, PEVQ, Subjective video quality, Human–computer interaction, User interface, Video quality, Multimedia, Codec, The Internet | Conference |
Citations | PageRank | References |
---|---|---|
27 | 1.51 | 4 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Christian Keimel | 1 | 150 | 12.00 |
Julian Habigt | 2 | 79 | 6.90 |
Clemens Horch | 3 | 35 | 2.48 |
Klaus Diepold | 4 | 437 | 56.47 |