Abstract |
---|
Convolution kernels are essential in signal processing: they are used to model sensors, to define filters, to bridge continuous and discrete domains, etc. A classic difficulty is deciding which kernel is suitable for a particular application. In previous articles, we relied on a simple analogy between positive convolution kernels and probability distributions to define the notion of a maxitive kernel. A maxitive kernel represents a convex set of positive convolution kernels and therefore models imprecise knowledge of the suitable kernel to be used. However, in many applications, such as filtering, it may be necessary to use signed convolution kernels. In this article, we propose to extend the notion of maxitive kernel domination to signed convolution kernels. This leads us to a concept little used until now: signed, and thus non-monotonic, set functions. |
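The domination idea in the abstract can be made concrete on a discrete domain: in possibility theory, a maxitive kernel (a possibility distribution π) dominates a summative kernel (a probability distribution p) when, for every event A, P(A) ≤ Π(A) = max over A of π. A minimal sketch of this check, with hypothetical example kernels (all names and values are illustrative, not taken from the paper):

```python
from itertools import combinations

def dominates(pi, p, tol=1e-9):
    """Check whether the maxitive kernel pi dominates the summative
    kernel p on a finite domain: for every event A (subset of indices),
    P(A) = sum of p over A must not exceed Pi(A) = max of pi over A."""
    idx = range(len(pi))
    for r in range(1, len(pi) + 1):
        for A in combinations(idx, r):
            if sum(p[i] for i in A) > max(pi[i] for i in A) + tol:
                return False
    return True

# Hypothetical discrete kernels on a 3-point support
pi = [1.0, 0.8, 0.5]   # maxitive (possibility) kernel, max value 1
p1 = [0.5, 0.3, 0.2]   # a positive summative kernel, dominated by pi
p2 = [0.2, 0.2, 0.6]   # not dominated: P({2}) = 0.6 > pi[2] = 0.5
print(dominates(pi, p1))  # True
print(dominates(pi, p2))  # False
```

The brute-force enumeration over all subsets is exponential and only meant to make the definition tangible on small supports; extending this domination to signed kernels, as the paper proposes, requires signed (non-monotonic) set functions in place of Π.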
Year | DOI | Venue |
---|---|---|
2019 | 10.1109/FUZZ-IEEE.2019.8858814 | 2019 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE) |
Keywords | Field | DocType |
---|---|---|
Possibility theory, signed kernels, maxitive kernels | Kernel (linear algebra), Set function, Signal processing, Convolution, Computer science, Algorithm, Filter (signal processing), Convex set, Probability distribution, Artificial intelligence, Analogy, Machine learning | Conference |
ISSN | ISBN | Citations |
---|---|---|
1544-5615 | 978-1-5386-1729-8 | 0 |
PageRank | References | Authors |
---|---|---|
0.34 | 9 | 2 |
Name | Order | Citations | PageRank |
---|---|---|---|
O. Strauss | 1 | 153 | 21.17 |
Agnès Rico | 2 | 129 | 20.74 |