Title
Exploiting referential context in spoken language interfaces for data-poor domains
Abstract
This paper describes an implementation of a shell-like programming interface that uses referential context (that is, information about the current state of the interfaced application) to achieve accurate recognition, even in user-defined domains with no available domain-specific training corpora. The interface incorporates knowledge of context into its model of syntax, yielding a referential semantic language model. Unlike other recent systems, this referential semantic language model exploits context dynamically, using incremental processing and the limited stack memory of an HMM-like time series model.
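The core idea of the abstract, conditioning an incremental, HMM-like decoder on the current state of the interfaced application, can be sketched roughly as follows. This is a minimal illustration only, not the authors' implementation: the names (ContextLM, decode_incrementally, app_state) and the toy counts are hypothetical, and a simple bigram model with a fixed context boost stands in for the paper's referential semantic language model.

```python
# Minimal sketch (assumed, not from the paper): an incremental beam decoder
# whose word scores are rescored by referential context, i.e. the entities
# currently visible in the interfaced application's state.
from collections import defaultdict
import math

class ContextLM:
    """Toy referential semantic language model: a smoothed bigram LM whose
    word probabilities are boosted when the word denotes an entity that
    exists in the current application state."""

    def __init__(self, bigram_counts, context_boost=4.0):
        self.bigram_counts = bigram_counts              # {(prev, word): count}
        self.unigram_totals = defaultdict(int)
        for (prev, _), c in bigram_counts.items():
            self.unigram_totals[prev] += c
        self.context_boost = context_boost

    def log_prob(self, prev, word, app_state):
        """Add-one-smoothed bigram log-probability, rescored by context."""
        count = self.bigram_counts.get((prev, word), 0) + 1
        total = self.unigram_totals[prev] + len(self.unigram_totals) + 1
        lp = math.log(count / total)
        if word in app_state:                           # word refers to a live entity
            lp += math.log(self.context_boost)
        return lp

def decode_incrementally(word_lattice, lm, app_state, beam=3):
    """Viterbi-style incremental search over a word lattice, keeping only a
    small beam of hypotheses (standing in for the bounded stack memory of an
    HMM-like time series model)."""
    hyps = [(0.0, ["<s>"])]                             # (score, word sequence)
    for alternatives in word_lattice:                   # one time step at a time
        expanded = []
        for score, seq in hyps:
            for word in alternatives:
                expanded.append((score + lm.log_prob(seq[-1], word, app_state),
                                 seq + [word]))
        hyps = sorted(expanded, reverse=True)[:beam]    # prune to the beam
    return hyps[0][1][1:]

# Usage: "rm" vs. "run" is ambiguous in the lattice; the file "notes.txt"
# present in the shell's state pushes the decoder toward the referring reading.
counts = {("<s>", "rm"): 2, ("<s>", "run"): 3,
          ("rm", "notes.txt"): 1, ("run", "tests"): 2}
lm = ContextLM(counts)
lattice = [["rm", "run"], ["notes.txt", "tests"]]
print(decode_incrementally(lattice, lm, app_state={"notes.txt"}))   # ['rm', 'notes.txt']
```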
Year
2008
DOI
10.1145/1378773.1378811
Venue
IUI
Keywords
hmm-like time series model, language interface, interfaced application, shell-like programming interface, referential context, context dynamically, accurate recognition, available domain-specific training corpus, incremental processing, referential semantic language model, data-poor domain, current state, time series model, language model
Field
Time series, Computer science, Speech recognition, Exploit, Artificial intelligence, Natural language processing, Syntax, Language model, Speech summarization, Spoken language, Stack-based memory allocation
DocType
Conference
Citations
1
PageRank
0.37
References
14
Authors
3
Name             Order  Citations  PageRank
Stephen Wu       1      147        11.73
Lane Schwartz    2      209        18.01
William Schuler  3      125        17.78