Abstract |
---|
Halfspace depth (HD), also known as Tukey depth, is one of the most prevalent depth notions among its competitors. Exactly computing the HD in R^d is nevertheless a challenging task, because its definition involves infinitely many directional projections. Existing algorithms for computing HD in R^d (d > 2), which more or less involve projecting the data onto directions perpendicular to hyperplanes (and therefore involve polytopes or polyhedral cones), are limited in the dimension d or the sample size n, or are slow. Building on the work of Merkle and of Bogicevic and Merkle, algorithms for the fast computation of HD in ultra-high dimension d with large n are proposed. They are purely ball-based (essentially pointwise-distance) computations. The worst-case time complexity for approximately computing the depth m/n of a single point is O((d + m + log n)n^2), which appears to be among the best known results for very large d and n. Unlike most previous algorithms, they do not require the underlying dataset to be in general position. As by-products of the algorithm, some depth regions (depth level sets) and the depths of sample points can also be obtained during the depth computation of a given point. |
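Since the abstract defines halfspace depth via infinitely many directional projections, a minimal sketch may help fix the idea. The following is *not* the paper's ball-based algorithm; it is a naive random-projection approximation of Tukey depth (the function name `approx_halfspace_depth` and all parameters are assumptions for illustration): sample unit directions, and for each direction count the fraction of sample points lying on the same closed side of the hyperplane through the query point; the minimum over directions approximates the depth from above.

```python
import numpy as np

def approx_halfspace_depth(x, data, n_dirs=1000, seed=0):
    """Crudely approximate the halfspace (Tukey) depth of point x w.r.t. data.

    For each random unit direction u, count the fraction of sample points p
    with <p, u> <= <x, u> (the closed halfspace containing x); the depth is
    approximated by the minimum such fraction over the sampled directions.
    This yields an upper bound on the true depth, since the true definition
    minimizes over ALL directions.
    """
    rng = np.random.default_rng(seed)
    n, d = data.shape
    u = rng.standard_normal((n_dirs, d))
    u /= np.linalg.norm(u, axis=1, keepdims=True)  # unit directions
    proj_data = data @ u.T   # (n, n_dirs): projections of all sample points
    proj_x = x @ u.T         # (n_dirs,): projections of the query point
    counts = (proj_data <= proj_x).sum(axis=0)     # points on x's side
    return counts.min() / n  # minimal fraction = approximate depth m/n
```

A deep point (near the center of a symmetric cloud) gets depth near 1/2, while an outlying point gets depth near 0. This brute-force scheme costs O(n_dirs * n * d) and degrades in high dimension, which is precisely the regime the paper's ball-based algorithms target.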
Year | DOI | Venue
---|---|---
2019 | 10.1080/03610918.2017.1402040 | COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION

Keywords | Field | DocType
---|---|---
Algorithm, Definition and computation, Halfspace depth, High dimension and large sample size, Time complexity | Discrete mathematics, Polytope, Data projection, Hyperplane, Time complexity, Statistics, Sample size determination, Mathematics, Computation | Journal

Volume | Issue | ISSN
---|---|---
48 | 3 | 0361-0918

Citations | PageRank | References
---|---|---
0 | 0.34 | 14

Authors |
---|
1 |