Abstract |
---|
In this paper we propose an extension of sequence kernels to the case where the symbols defining the sequences have multiple representations. This configuration occurs, for instance, in natural language processing, where words can be characterized along different linguistic dimensions. The core of our contribution is to integrate the different representations early in the kernel computation, in a way that generates rich composite features defined across the various symbol dimensions. |
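The early-integration idea described in the abstract can be illustrated with a toy kernel. This is a hedged sketch, not the paper's exact formulation (the paper's kernel is more general, e.g. with gap and decay handling): each symbol is a tuple of representations (such as surface form, lemma, part of speech), per-position similarity counts the dimensions on which two symbols agree, and an n-gram's score is the product of per-position agreements, so matches can mix dimensions across positions.

```python
def factored_ngram_kernel(s, t, n=2):
    """Toy factored sequence kernel over contiguous n-grams (an illustrative
    sketch; assumes gap-free n-grams and unit weights per dimension).

    s, t: sequences of symbols, each symbol a tuple of representations.
    The product over positions generates composite features that combine
    different dimensions, e.g. matching the lemma at one position and the
    part of speech at the next.
    """
    total = 0
    for i in range(len(s) - n + 1):
        for j in range(len(t) - n + 1):
            score = 1
            for k in range(n):
                # Per-position similarity: number of agreeing dimensions.
                agree = sum(a == b for a, b in zip(s[i + k], t[j + k]))
                score *= agree
                if score == 0:
                    break  # one non-matching position kills the n-gram
            total += score
    return total

# Toy example: symbols are (word, part-of-speech) pairs.
s = [("cats", "NOUN"), ("sleep", "VERB")]
t = [("dogs", "NOUN"), ("sleep", "VERB")]
# One aligned bigram: position 1 agrees on POS only (1 dimension),
# position 2 agrees on both (2 dimensions), so the score is 1 * 2 = 2.
print(factored_ngram_kernel(s, t, n=2))  # prints 2
```

Note that a purely word-level kernel would score this pair 0 (no shared bigram of surface forms); the factored variant recovers a nonzero similarity through the shared part-of-speech pattern.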
Year | DOI | Venue |
---|---|---|
2009 | 10.1016/j.neucom.2008.11.025 | Neurocomputing |
Keywords | Field | DocType |
factored sequence kernel, sequence kernels, kernel methods, multiple representations, composite features, linguistic dimensions, natural language processing, language modeling, machine learning | Kernel (linear algebra), Pattern recognition, Symbol, Tree kernel, Artificial intelligence, Natural language processing, Kernel method, Language model, Machine learning, Mathematics | Journal |
Volume | Issue | ISSN |
72 | 7-9 | 0925-2312 |
Citations | PageRank | References |
4 | 0.40 | 7 |
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Nicola Cancedda | 1 | 261 | 20.27 |
Pierre Mahé | 2 | 257 | 13.98 |