Title
Assessing the readability of ClinicalTrials.gov.
Abstract
Objective: ClinicalTrials.gov serves the critical functions of disseminating trial information to the public and helping trials recruit participants. This study assessed the readability of trial descriptions at ClinicalTrials.gov using multiple quantitative measures.
Materials and Methods: The analysis included all 165 988 trials registered at ClinicalTrials.gov as of April 30, 2014. To obtain benchmarks, the authors also analyzed 2 other medical corpora: (1) all 955 Health Topics articles from MedlinePlus and (2) a random sample of 100 000 clinician notes, intended for internal communication among medical professionals, retrieved from an electronic health records system. The authors characterized each corpus using 4 surface metrics and then applied 5 different scoring algorithms to assess readability. The authors hypothesized that clinician notes would be the most difficult to read, followed by trial descriptions and MedlinePlus Health Topics articles.
Results: Trial descriptions have the longest average sentence length (26.1 words) across all corpora, and 65% of the words they use are not covered by a basic medical English dictionary. In comparison, the average sentence length of MedlinePlus Health Topics articles is 61% shorter, their vocabulary is 95% smaller, and their dictionary coverage is 46% higher. All 5 scoring algorithms consistently rated ClinicalTrials.gov trial descriptions the most difficult corpus to read, even harder than clinician notes. According to these algorithms, properly understanding a trial description requires, on average, 18 years of education.
Discussion and Conclusion: Trial descriptions at ClinicalTrials.gov are extremely difficult to read. Significant work is warranted to improve their readability in order to achieve ClinicalTrials.gov's goal of facilitating information dissemination and subject recruitment.
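The abstract does not name the 5 scoring algorithms used, but grade-level readability formulas of this kind typically combine the surface metrics mentioned above (sentence length, word complexity). As a minimal sketch, assuming one such formula, the widely used Flesch-Kincaid Grade Level, the computation looks like this; the syllable counter is a crude vowel-group heuristic, not the dictionary-based method a production tool would use:

```python
import re

def count_syllables(word):
    # Crude heuristic: count contiguous vowel groups; every word >= 1 syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Flesch-Kincaid Grade Level: approximate US school years of
    education needed to understand the text on first reading."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

plain = "The drug helps you sleep. It is safe."
dense = ("Participants meeting eligibility criteria will undergo "
         "randomization to interventional pharmacotherapy.")
```

Longer sentences and more polysyllabic words both push the score up, which is why the long, jargon-heavy sentences of trial descriptions yield grade levels near 18.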
Year: 2016
DOI: 10.1093/jamia/ocv062
Venue: JOURNAL OF THE AMERICAN MEDICAL INFORMATICS ASSOCIATION
Keywords: readability, comprehension, clinical trial, ClinicalTrials.gov, electronic health records, natural language processing
Field: MedlinePlus, Data mining, Computer science, Clinical trial, Readability, Artificial intelligence, Natural language processing, Information Dissemination, Vocabulary, Sentence, Comprehension
DocType: Journal
Volume: 23
Issue: 2
ISSN: 1067-5027
Citations: 3
PageRank: 0.45
References: 6
Authors (10):
Order | Name | Citations | PageRank
1 | Danny T. Y. Wu | 13 | 4.19
2 | David A. Hanauer | 195 | 18.96
3 | Qiaozhu Mei | 4395 | 207.09
4 | Patricia M. Clark | 8 | 1.01
5 | Lawrence C. An | 3 | 0.45
6 | Joshua Proulx | 23 | 4.34
7 | Qing T. Zeng | 3 | 0.45
8 | V. G. Vinod Vydiswaran | 162 | 19.68
9 | Kevyn Collins-Thompson | 1121 | 65.69
10 | Kai Zheng | 141 | 12.37