{"ID":71456,"CreatedAt":"2026-02-27T13:00:40Z","UpdatedAt":"2026-02-27T13:00:40Z","DeletedAt":null,"paper_url":"https://paperswithcode.com/paper/log-hilbert-schmidt-metric-between-positive","title":"Log-Hilbert-Schmidt metric between positive definite operators on Hilbert spaces","abstract":"This paper introduces a novel mathematical and computational framework, namely {\\it Log-Hilbert-Schmidt metric} between positive definite operators on a Hilbert space. This is a generalization of the Log-Euclidean metric on the Riemannian manifold of positive definite matrices to the infinite-dimensional setting. The general framework is applied in particular to compute distances between covariance operators on a Reproducing Kernel Hilbert Space (RKHS), for which we obtain explicit formulas via the corresponding Gram matrices. Empirically, we apply our formulation to the task of multi-category image classification, where each image is represented by an infinite-dimensional RKHS covariance operator. On several challenging datasets, our method significantly outperforms approaches based on covariance matrices computed directly on the original input features, including those using the Log-Euclidean metric, Stein and Jeffreys divergences, achieving new state-of-the-art results.","url_abs":"http://papers.nips.cc/paper/5457-log-hilbert-schmidt-metric-between-positive-definite-operators-on-hilbert-spaces","url_pdf":"http://papers.nips.cc/paper/5457-log-hilbert-schmidt-metric-between-positive-definite-operators-on-hilbert-spaces.pdf","authors":"[\"Minh Ha Quang\", \"Marco San Biagio\", \"Vittorio Murino\"]","published":"2014-12-01T00:00:00Z","proceeding":"NeurIPS 2014 12","tasks":"[\"image-classification\", \"Image Classification\"]","methods":"[]","has_code":false}
